WorldWideScience

Sample records for optimize cellular processes

  1. Synthetic Biology: Tools to Design, Build, and Optimize Cellular Processes

    Science.gov (United States)

    Young, Eric; Alper, Hal

    2010-01-01

    The general central dogma frames the emergent properties of life, which make biology both necessary and difficult to engineer. In a process engineering paradigm, each biological process stream and process unit is heavily influenced by regulatory interactions and interactions with the surrounding environment. Synthetic biology is developing the tools and methods that will increase control over these interactions, eventually resulting in an integrative synthetic biology that will allow ground-up cellular optimization. In this review, we attempt to contextualize the areas of synthetic biology into three tiers: (1) the process units and associated streams of the central dogma, (2) the intrinsic regulatory mechanisms, and (3) the extrinsic physical and chemical environment. Efforts at each of these three tiers attempt to control cellular systems and take advantage of emerging tools and approaches. Ultimately, it will be possible to integrate these approaches and realize the vision of integrative synthetic biology when cells are completely rewired for biotechnological goals. This review will highlight progress towards this goal as well as areas requiring further research. PMID:20150964

  2. Synthetic Biology: Tools to Design, Build, and Optimize Cellular Processes

    Directory of Open Access Journals (Sweden)

    Eric Young

    2010-01-01

    The general central dogma frames the emergent properties of life, which make biology both necessary and difficult to engineer. In a process engineering paradigm, each biological process stream and process unit is heavily influenced by regulatory interactions and interactions with the surrounding environment. Synthetic biology is developing the tools and methods that will increase control over these interactions, eventually resulting in an integrative synthetic biology that will allow ground-up cellular optimization. In this review, we attempt to contextualize the areas of synthetic biology into three tiers: (1) the process units and associated streams of the central dogma, (2) the intrinsic regulatory mechanisms, and (3) the extrinsic physical and chemical environment. Efforts at each of these three tiers attempt to control cellular systems and take advantage of emerging tools and approaches. Ultimately, it will be possible to integrate these approaches and realize the vision of integrative synthetic biology when cells are completely rewired for biotechnological goals. This review will highlight progress towards this goal as well as areas requiring further research.

  3. Synthetic biology: tools to design, build, and optimize cellular processes.

    Science.gov (United States)

    Young, Eric; Alper, Hal

    2010-01-01

    The general central dogma frames the emergent properties of life, which make biology both necessary and difficult to engineer. In a process engineering paradigm, each biological process stream and process unit is heavily influenced by regulatory interactions and interactions with the surrounding environment. Synthetic biology is developing the tools and methods that will increase control over these interactions, eventually resulting in an integrative synthetic biology that will allow ground-up cellular optimization. In this review, we attempt to contextualize the areas of synthetic biology into three tiers: (1) the process units and associated streams of the central dogma, (2) the intrinsic regulatory mechanisms, and (3) the extrinsic physical and chemical environment. Efforts at each of these three tiers attempt to control cellular systems and take advantage of emerging tools and approaches. Ultimately, it will be possible to integrate these approaches and realize the vision of integrative synthetic biology when cells are completely rewired for biotechnological goals. This review will highlight progress towards this goal as well as areas requiring further research.

  4. Navigating neurites utilize cellular topography of Schwann cell somas and processes for optimal guidance

    Science.gov (United States)

    Lopez-Fagundo, Cristina; Mitchel, Jennifer A.; Ramchal, Talisha D.; Dingle, Yu-Ting L.; Hoffman-Kim, Diane

    2013-01-01

    The path created by aligned Schwann cells (SCs) after nerve injury underlies peripheral nerve regeneration. We developed geometric bioinspired substrates to extract key information needed for axon guidance by deconstructing the topographical cues presented by SCs. We have previously reported materials that directly replicate SC topography with micro- and nanoscale resolution, but a detailed explanation of how SC topography directs axon extension has not yet been described. Here, using neurite tracing and time-lapse microscopy, we analyzed the SC features that influence axon guidance. Novel poly(dimethylsiloxane) materials, fabricated via photolithography, incorporated bioinspired topographical components with the shapes and sizes of aligned SCs, namely somas and processes, where the lengths of the processes were varied while the soma geometry and dimensions were kept constant. Rat dorsal root ganglia neurites aligned to all materials presenting bioinspired topography after 5 days in culture, and to bioinspired materials presenting soma and process features after only 17 hours in culture. The key finding of this study was that neurite response to the underlying bioinspired topographical features was time dependent: at 5 days, neurites aligned most strongly to materials presenting combinations of soma and process features with higher than average density of either feature, whereas at 17 hours they aligned more strongly to materials presenting average densities of soma and process features and to materials presenting process features only. These studies elucidate the influence of SC topography on axon guidance in a time-dependent setting and have implications for the optimization of nerve regeneration strategies. PMID:23557939

  5. Genetic Dominance & Cellular Processes

    Science.gov (United States)

    Seager, Robert D.

    2014-01-01

    In learning genetics, many students misunderstand and misinterpret what "dominance" means. Understanding is easier if students realize that dominance is not a mechanism, but rather a consequence of underlying cellular processes. For example, metabolic pathways are often little affected by changes in enzyme concentration. This means that…

  6. Cellular scanning strategy for selective laser melting: Generating reliable, optimized scanning paths and processing parameters

    DEFF Research Database (Denmark)

    Mohanty, Sankhya; Hattel, Jesper Henri

    2015-01-01

    method based uncertainty and reliability analysis. The reliability of the scanning paths are established using cumulative probability distribution functions for process output criteria such as sample density, thermal homogeneity, etc. A customized genetic algorithm is used along with the simulation model...

  7. Design Optimization of Irregular Cellular Structure for Additive Manufacturing

    Science.gov (United States)

    Song, Guo-Hua; Jing, Shi-Kai; Zhao, Fang-Lei; Wang, Ye-Dong; Xing, Hao; Zhou, Jing-Tao

    2017-09-01

    Irregular cellular structures have great potential in the light-weight design field. However, research on optimizing irregular cellular structures has not yet been reported due to the difficulties in their modeling technology. Based on variable density topology optimization theory, an efficient method for optimizing the topology of irregular cellular structures fabricated through additive manufacturing processes is proposed. The proposed method utilizes tangent circles to automatically generate the main outline of the irregular cellular structure. The topological layout of each cell structure is optimized using the relative density information obtained from the proposed modified SIMP method. A mapping relationship between cell structure and relative density element is built to determine the diameter of each cell structure. The results show that the irregular cellular structure can be optimized with the proposed method. The results of simulation and experimental tests are similar for the irregular cellular structure, indicating that the maximum deformation value obtained using the modified Solid Isotropic Microstructures with Penalization (SIMP) approach is 5.4×10⁻⁵ mm lower than that obtained using the standard SIMP approach under the same external load. The proposed research provides guidance for the design of other irregular cellular structures.
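
    For context, the sketch below gives the standard (unmodified) SIMP interpolation that relates element stiffness to relative density in variable density topology optimization; the authors' modified variant is not reproduced here, and the symbols are the usual textbook ones rather than the paper's.

```latex
% Textbook SIMP interpolation (not the authors' modified variant):
% element Young's modulus as a function of relative density rho_e,
% with penalization exponent p (typically p = 3).
\[
  E(\rho_e) \;=\; E_{\min} + \rho_e^{\,p}\,\bigl(E_0 - E_{\min}\bigr),
  \qquad 0 < \rho_{\min} \le \rho_e \le 1 .
\]
```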

  8. Optimized Reaction Conditions for Removal of Cellular Organic Matter of Microcystis aeruginosa During the Destabilization and Aggregation Process Using Ferric Sulfate in Water Purification

    Czech Academy of Sciences Publication Activity Database

    Pivokonský, Martin; Polášek, Pavel; Pivokonská, Lenka; Tomášková, Hana

    2009-01-01

    Vol. 81, No. 5 (2009), pp. 514-522 ISSN 1061-4303 R&D Projects: GA ČR GA103/07/0295 Institutional research plan: CEZ:AV0Z20600510 Keywords: Microcystis aeruginosa * cellular organic matter * destabilization * aggregation * optimized reaction conditions * water purification Subject RIV: BK - Fluid Dynamics Impact factor: 0.965, year: 2009

  9. Efficiency of cellular information processing

    International Nuclear Information System (INIS)

    Barato, Andre C; Hartich, David; Seifert, Udo

    2014-01-01

    We show that a rate of conditional Shannon entropy reduction, characterizing the learning of an internal process about an external process, is bounded by the thermodynamic entropy production. This approach allows for the definition of an informational efficiency that can be used to study cellular information processing. We analyze three models of increasing complexity inspired by the Escherichia coli sensory network, where the external process is an external ligand concentration jumping between two values. We start with a simple model for which ATP must be consumed so that a protein inside the cell can learn about the external concentration. With a second model for a single receptor we show that the rate at which the receptor learns about the external environment can be nonzero even without any dissipation inside the cell since chemical work done by the external process compensates for this learning rate. The third model is more complete, also containing adaptation. For this model we show inter alia that a bacterium in an environment that changes at a very slow time-scale is quite inefficient, dissipating much more than it learns. Using the concept of a coarse-grained learning rate, we show for the model with adaptation that while the activity learns about the external signal the option of changing the methylation level increases the concentration range for which the learning rate is substantial. (paper)
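
    Written out, the efficiency described in this abstract compares the learning rate (the rate of conditional Shannon entropy reduction) with the thermodynamic entropy production; the symbols below are an illustrative paraphrase, not the authors' exact notation.

```latex
% Hedged paraphrase of the bound stated in the abstract: the rate l at
% which the internal process learns about the external process is bounded
% by the thermodynamic entropy production sigma, motivating an
% informational efficiency between 0 and 1.
\[
  l \;\le\; \sigma
  \qquad\Longrightarrow\qquad
  \eta \;\equiv\; \frac{l}{\sigma} \;\le\; 1 .
\]
```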

  10. Honing process optimization algorithms

    Science.gov (United States)

    Kadyrov, Ramil R.; Charikov, Pavel N.; Pryanichnikova, Valeria V.

    2018-03-01

    This article considers the relevance of honing processes for creating high-quality mechanical engineering products. The features of the honing process are described, and key concepts are emphasized, such as the formulation of the optimization task for honing operations, the optimal structure of honing working cycles, stepped and stepless honing cycles, and the simulation of processing and its purpose. It is noted that the reliability of the mathematical model determines the quality parameters of honing process control. An algorithm for continuous control of the honing process is proposed. The process model reliably describes the machining of a workpiece over a sufficiently wide region and can be used to operate the CNC machine CC743.

  11. Estimating cellular parameters through optimization procedures: elementary principles and applications

    Directory of Open Access Journals (Sweden)

    Akatsuki Kimura

    2015-03-01

    Construction of quantitative models is a primary goal of quantitative biology, which aims to understand cellular and organismal phenomena in a quantitative manner. In this article, we introduce optimization procedures to search for parameters in a quantitative model that can reproduce experimental data. The aim of optimization is to minimize the sum of squared errors (SSE) in a prediction or to maximize likelihood. A (local) maximum of likelihood or (local) minimum of the SSE can efficiently be identified using gradient approaches. Addition of a stochastic process enables us to identify the global maximum/minimum without becoming trapped in local maxima/minima. Sampling approaches take advantage of increasing computational power to test numerous sets of parameters in order to determine the optimum set. By combining Bayesian inference with gradient or sampling approaches, we can estimate both the optimum parameters and the form of the likelihood function related to the parameters. Finally, we introduce four examples of research that utilize parameter optimization to obtain biological insights from quantified data: transcriptional regulation, bacterial chemotaxis, morphogenesis, and cell cycle regulation. With practical knowledge of parameter optimization, cell and developmental biologists can develop realistic models that reproduce their observations and thus obtain mechanistic insights into phenomena of interest.
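
    As a concrete illustration of the gradient-based SSE minimization described above, the sketch below fits two parameters of a toy exponential-decay model to made-up data with SciPy; the model, data and parameter names are placeholders, not the models discussed in the article.

```python
# Minimal sketch of SSE-based parameter estimation (illustrative toy model).
import numpy as np
from scipy.optimize import minimize

# Hypothetical quantified data: time points and measured values.
t_data = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y_data = np.array([1.00, 0.61, 0.36, 0.23, 0.13])

def model(params, t):
    """Toy model y = a * exp(-k * t) with parameters a and k."""
    a, k = params
    return a * np.exp(-k * t)

def sse(params):
    """Sum of squared errors between model prediction and data."""
    residuals = model(params, t_data) - y_data
    return np.sum(residuals ** 2)

# Gradient-based local search from an initial guess; restarting from several
# guesses (or adding a stochastic step) helps avoid local minima.
result = minimize(sse, x0=[0.5, 0.5], method="L-BFGS-B")
print(result.x)  # estimated (a, k)
```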

  12. Cellular Automata in Topology Optimization of Continuum Structures ...

    African Journals Online (AJOL)

    In this paper, an optimization algorithm based on cellular automata (CA) is developed for topology optimization of continuum structures with shear and flexural behavior. The design domain is divided into small triangle elements and each cell is considered as a finite element. The stress analysis is performed by the Constant ...

  13. Optimizing Cellular Networks Enabled with Renewable Energy via Strategic Learning.

    Science.gov (United States)

    Sohn, Insoo; Liu, Huaping; Ansari, Nirwan

    2015-01-01

    An important issue in the cellular industry is the rising energy cost and carbon footprint due to the rapid expansion of the cellular infrastructure. Greening cellular networks has thus attracted attention. Among the promising green cellular network techniques, the renewable energy-powered cellular network has drawn increasing attention as a critical element towards reducing carbon emissions due to massive energy consumption in the base stations deployed in cellular networks. Game theory is a branch of mathematics that is used to evaluate and optimize systems with multiple players with conflicting objectives and has been successfully used to solve various problems in cellular networks. In this paper, we model the green energy utilization and power consumption optimization problem of a green cellular network as a pilot power selection strategic game and propose a novel distributed algorithm based on a strategic learning method. The simulation results indicate that the proposed algorithm achieves correlated equilibrium of the pilot power selection game, resulting in optimum green energy utilization and power consumption reduction.

  14. Cellular automata in image processing and geometry

    CERN Document Server

    Adamatzky, Andrew; Sun, Xianfang

    2014-01-01

    The book presents findings, views and ideas on what exact problems of image processing, pattern recognition and generation can be efficiently solved by cellular automata architectures. This volume provides a convenient collection in this area, in which publications are otherwise widely scattered throughout the literature. The topics covered include image compression and resizing; skeletonization, erosion and dilation; convex hull computation, edge detection and segmentation; forgery detection and content based retrieval; and pattern generation. The book advances the theory of image processing, pattern recognition and generation as well as the design of efficient algorithms and hardware for parallel image processing and analysis. It is aimed at computer scientists, software programmers, electronic engineers, mathematicians and physicists, and at everyone who studies or develops cellular automaton algorithms and tools for image processing and analysis, or develops novel architectures and implementations of mass...
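
    As a small, generic illustration of the kind of local-rule image operation the book covers (erosion and dilation), the sketch below implements binary erosion as a synchronous cellular-automaton update on a 2D grid; it is not code taken from the book.

```python
# Binary erosion as a synchronous cellular-automaton update: a cell stays
# "on" only if every cell in its 3x3 neighbourhood is "on". Generic example.
import numpy as np

def erode(image):
    padded = np.pad(image, 1, mode="constant", constant_values=0)
    out = np.ones_like(image)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            out &= padded[1 + di: 1 + di + image.shape[0],
                          1 + dj: 1 + dj + image.shape[1]]
    return out

img = np.zeros((7, 7), dtype=np.uint8)
img[2:5, 2:5] = 1     # a 3x3 square of "on" pixels
print(erode(img))     # only the centre pixel survives one erosion step
```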

  15. Image processing with a cellular nonlinear network

    International Nuclear Information System (INIS)

    Morfu, S.

    2005-01-01

    A cellular nonlinear network (CNN) based on uncoupled nonlinear oscillators is proposed for image processing purposes. It is shown theoretically and numerically that the contrast of an image loaded at the nodes of the CNN is strongly enhanced, even if it is initially weak. An image inversion can also be obtained without reconfiguration of the network, whereas gray-level extraction can be performed with an additional threshold filtering. Lastly, an electronic implementation of this CNN is presented.

  16. On Optimal Geographical Caching in Heterogeneous Cellular Networks

    NARCIS (Netherlands)

    Serbetci, Berksan; Goseling, Jasper

    2017-01-01

    In this work we investigate optimal geographical caching in heterogeneous cellular networks where different types of base stations (BSs) have different cache capacities. Users request files from a content library according to a known probability distribution. The performance metric is the total hit

  17. Optimal Design of Gravitational Sewer Networks with General Cellular Automata

    Directory of Open Access Journals (Sweden)

    Mohammad Hadi Afshar

    2014-05-01

    In this paper, a Cellular Automata (CA) method is applied to the optimal design of sewer networks. The solution of sewer network optimization problems requires the determination of pipe diameters and average pipe cover depths that minimize the total cost of the sewer network subject to operational constraints. In this paper, the network nodes and the upstream and downstream pipe cover depths are considered as CA cells and cell states, respectively, and the links around each cell are taken into account as its neighborhood. The proposed method is a general and flexible method for the optimization of sewer networks, as it can be used to optimally design both gravity and pumped networks due to the use of pipe nodal cover depths as the decision variables. The proposed method is tested on two gravitational sewer networks, and the comparison of results with other methods such as Genetic Algorithm, Cellular Automata, Ant Colony Optimization and Particle Swarm Optimization shows the efficiency and effectiveness of the proposed method.

  18. Molecular processes in cellular arsenic metabolism

    International Nuclear Information System (INIS)

    Thomas, David J.

    2007-01-01

    Elucidating molecular processes that underlie accumulation, metabolism and binding of iAs and its methylated metabolites provides a basis for understanding the modes of action by which iAs acts as a toxin and a carcinogen. One approach to this problem is to construct a conceptual model that incorporates available information on molecular processes involved in the influx, metabolism, binding and efflux of arsenicals in cells. This conceptual model is initially conceived as a non-quantitative representation of critical molecular processes that can be used as a framework for experimental design and prediction. However, with refinement and incorporation of additional data, the conceptual model can be expressed in mathematical terms and should be useful for quantitative estimates of the kinetic and dynamic behavior of iAs and its methylated metabolites in cells. Development of a quantitative model will be facilitated by the availability of tools and techniques to manipulate molecular processes underlying transport of arsenicals across cell membranes or expression and activity of enzymes involved in methylation of arsenicals. This model of cellular metabolism might be integrated into more complex pharmacokinetic models for systemic metabolism of iAs and its methylated metabolites. It may also be useful in development of biologically based dose-response models describing the toxic and carcinogenic actions of arsenicals

  19. Cellular Neural Networks for NP-Hard Optimization

    Directory of Open Access Journals (Sweden)

    Mária Ercsey-Ravasz

    2009-02-01

    A cellular neural/nonlinear network (CNN) is used for NP-hard optimization. We prove that a CNN in which the parameters of all cells can be separately controlled is the analog correspondent of a two-dimensional Ising-type (Edwards-Anderson) spin-glass system. Using the properties of the CNN, we show that one single operation (template) always yields a local minimum of the spin-glass energy function. This way, a very fast optimization method, similar to simulated annealing, can be built. Estimating the simulation time needed on CNN-based computers and comparing it with the time needed on conventional digital computers using the simulated annealing algorithm, the results are astonishing: CNN computers could be faster than digital computers already at 10×10 lattice sizes. The local control of the template parameters has already been partially realized on some of the existing hardware, and we think this study could further motivate development in this direction.
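
    The spin-glass correspondence mentioned in this abstract refers to the Edwards-Anderson energy function; the expression below is the standard two-dimensional nearest-neighbour form, with the couplings loosely playing the role of the locally controlled template parameters.

```latex
% Standard Edwards-Anderson spin-glass energy on a 2D lattice
% (nearest-neighbour couplings J_ij, spins s_i = +/-1).
\[
  E(\{s\}) \;=\; -\sum_{\langle i,j \rangle} J_{ij}\, s_i s_j ,
  \qquad s_i \in \{-1, +1\}.
\]
```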

  20. Surface Dynamic Process Simulation with the Use of Cellular Automata

    International Nuclear Information System (INIS)

    Adamska-Szatko, M.; Bala, J.

    2010-01-01

    Cellular automata are known for many applications, especially for physical and biological simulations. Universal cellular automata can be used for modelling complex natural phenomena. The paper presents a simulation of a surface dynamic process. The simulation uses a 2-dimensional cellular automata algorithm. Modelling and visualisation were created with in-house developed software using the standard OpenGL graphics library. (authors)

  1. A process insight repository supporting process optimization

    OpenAIRE

    Vetlugin, Andrey

    2012-01-01

    Existing solutions for the analysis and optimization of manufacturing processes, such as online analytical processing or statistical calculations, have shortcomings that limit continuous process improvement. In particular, they lack means of storing and integrating the results of analysis. As a result, valuable information that could be used for process optimization is used only once and then discarded. The goal of the Advanced Manufacturing Analytics (AdMA) research project is to design an integrate...

  2. Optimizing towing processes at airports

    OpenAIRE

    Du, Jia Yan

    2015-01-01

    This work addresses the optimization of push-back and towing processes at airports, as an important part of the turnaround process. A vehicle routing based scheduling model is introduced to find a cost optimal assignment of jobs to towing tractors in daily operations. A second model derives an investment strategy to optimize tractor fleet size and mix in the long-run. Column generation heuristics are proposed as solution procedures. The thesis concludes with a case study of a major European ...

  3. Rejuvenating cellular respiration for optimizing respiratory function: targeting mitochondria.

    Science.gov (United States)

    Agrawal, Anurag; Mabalirajan, Ulaganathan

    2016-01-15

    Altered bioenergetics with increased mitochondrial reactive oxygen species production and degradation of epithelial function are key aspects of pathogenesis in asthma and chronic obstructive pulmonary disease (COPD). This motif is not unique to obstructive airway disease, reported in related airway diseases such as bronchopulmonary dysplasia and parenchymal diseases such as pulmonary fibrosis. Similarly, mitochondrial dysfunction in vascular endothelium or skeletal muscles contributes to the development of pulmonary hypertension and systemic manifestations of lung disease. In experimental models of COPD or asthma, the use of mitochondria-targeted antioxidants, such as MitoQ, has substantially improved mitochondrial health and restored respiratory function. Modulation of noncoding RNA or protein regulators of mitochondrial biogenesis, dynamics, or degradation has been found to be effective in models of fibrosis, emphysema, asthma, and pulmonary hypertension. Transfer of healthy mitochondria to epithelial cells has been associated with remarkable therapeutic efficacy in models of acute lung injury and asthma. Together, these form a 3R model--repair, reprogramming, and replacement--for mitochondria-targeted therapies in lung disease. This review highlights the key role of mitochondrial function in lung health and disease, with a focus on asthma and COPD, and provides an overview of mitochondria-targeted strategies for rejuvenating cellular respiration and optimizing respiratory function in lung diseases. Copyright © 2016 the American Physiological Society.

  4. Manufacturing processes of cellular metals. Part I. Liquid route processes

    International Nuclear Information System (INIS)

    Fernandez, P.; Cruz, L. J.; Coleto, J.

    2008-01-01

    With their interesting and particular characteristics, cellular metals are becoming part of the great family of new materials. They can have open or closed porosity. At present, the major challenge for materials researchers is to improve the manufacturing techniques in order to obtain reproducible, reliable, high-quality cellular metals. In the present paper, the different production methods for manufacturing cellular metals by the liquid route are reviewed, with a short description of the main parameters involved and the advantages and drawbacks of each of them. (Author) 106 refs

  5. Triple Bioluminescence Imaging for In Vivo Monitoring of Cellular Processes

    Directory of Open Access Journals (Sweden)

    Casey A Maguire

    2013-01-01

    Bioluminescence imaging (BLI) has been shown to be crucial for monitoring in vivo biological processes. So far, only dual bioluminescence imaging using firefly (Fluc) and Renilla or Gaussia (Gluc) luciferase has been achieved, due to the lack of availability of other efficiently expressed luciferases using different substrates. Here, we characterized a codon-optimized luciferase from Vargula hilgendorfii (Vluc) as a reporter for mammalian gene expression. We showed that Vluc can be multiplexed with Gluc and Fluc for sequential imaging of three distinct cellular phenomena in the same biological system using vargulin, coelenterazine, and D-luciferin substrates, respectively. We applied this triple imaging system to monitor the effect of soluble tumor necrosis factor-related apoptosis-inducing ligand (sTRAIL) delivered using an adeno-associated viral vector (AAV) on brain tumors in mice. Vluc imaging showed efficient sTRAIL gene delivery to the brain, while Fluc imaging revealed a robust antiglioma therapy. Further, nuclear factor-κB (NF-κB) activation in response to sTRAIL binding to glioma cell death receptors was monitored by Gluc imaging. This work is the first demonstration of trimodal in vivo bioluminescence imaging and will have broad applicability in many different fields, including immunology, oncology, virology, and neuroscience.

  6. Cellular Particle Dynamics simulation of biomechanical relaxation processes of multi-cellular systems

    Science.gov (United States)

    McCune, Matthew; Kosztin, Ioan

    2013-03-01

    Cellular Particle Dynamics (CPD) is a theoretical-computational-experimental framework for describing and predicting the time evolution of biomechanical relaxation processes of multi-cellular systems, such as fusion, sorting and compression. In CPD, cells are modeled as an ensemble of cellular particles (CPs) that interact via short range contact interactions, characterized by an attractive (adhesive interaction) and a repulsive (excluded volume interaction) component. The time evolution of the spatial conformation of the multicellular system is determined by following the trajectories of all CPs through numerical integration of their equations of motion. Here we present CPD simulation results for the fusion of both spherical and cylindrical multi-cellular aggregates. First, we calibrate the relevant CPD model parameters for a given cell type by comparing the CPD simulation results for the fusion of two spherical aggregates to the corresponding experimental results. Next, CPD simulations are used to predict the time evolution of the fusion of cylindrical aggregates. The latter is relevant for the formation of tubular multi-cellular structures (i.e., primitive blood vessels) created by the novel bioprinting technology. Work supported by NSF [PHY-0957914]. Computer time provided by the University of Missouri Bioinformatics Consortium.
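
    To make the description of CP interactions and equations of motion more tangible, a plausible schematic form is sketched below, assuming overdamped Langevin-type dynamics with a pairwise short-range contact potential; the exact formulation used in CPD should be taken from the original papers.

```latex
% Schematic (assumed) equation of motion for cellular particle i: friction
% coefficient mu, pairwise forces from a short-range attractive/repulsive
% contact potential V, and a random thermal force xi_i(t).
\[
  \mu \,\frac{d\mathbf{r}_i}{dt}
  \;=\; -\sum_{j \ne i} \nabla_{\mathbf{r}_i}
        V\!\left(|\mathbf{r}_i-\mathbf{r}_j|\right)
  \;+\; \boldsymbol{\xi}_i(t).
\]
```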

  7. The Algorithm of Continuous Optimization Based on the Modified Cellular Automaton

    Directory of Open Access Journals (Sweden)

    Oleg Evsutin

    2016-08-01

    This article is devoted to the application of the cellular automata mathematical apparatus to the problem of continuous optimization. The cellular automaton with an objective function is introduced as a new modification of the classic cellular automaton. An algorithm of continuous optimization, based on the dynamics of a cellular automaton having the property of geometric symmetry, is obtained. The results of simulation experiments with the obtained algorithm on standard test functions are provided, and a comparison with analogous methods is shown.

  8. Optimization of lime treatment processes

    International Nuclear Information System (INIS)

    Zinck, J. M.; Aube, B. C.

    2000-01-01

    Lime neutralization technology used in the treatment of acid mine drainage and other acidic effluents is discussed. Theoretical studies and laboratory experiments designed to optimize the technology of lime neutralization processes and to improve the cost efficiency of the treatment process are described. Effluent quality, slaking temperature, aeration, solid-liquid separation, sludge production and geochemical stability have been studied experimentally and on site. Results show that through minor modification of the treatment process, costs, sludge volume generated, and metal released to the environment can be significantly reduced. 17 refs., 4 figs

  9. PM - processing for manufacturing of metals with cellular structures

    International Nuclear Information System (INIS)

    Strobl, S.; Danninger, H.

    2001-01-01

    In this review, the major processes for manufacturing metals with cellular structure are described, based on powder metallurgy, chemical deposition and some other methods (excluding melting techniques). It is shown that during the last decade many interesting innovations have led to new production methods for designing cellular materials, some of which are now used in industry. Characterization and property measurements have also become more important and have therefore been carried out carefully, because of their strong influence on the functions and applications of such materials. (author)

  10. Piezo proteins: regulators of mechanosensation and other cellular processes.

    Science.gov (United States)

    Bagriantsev, Sviatoslav N; Gracheva, Elena O; Gallagher, Patrick G

    2014-11-14

    Piezo proteins have recently been identified as ion channels mediating mechanosensory transduction in mammalian cells. Characterization of these channels has yielded important insights into mechanisms of somatosensation, as well as other mechano-associated biologic processes such as sensing of shear stress, particularly in the vasculature, and regulation of urine flow and bladder distention. Other roles for Piezo proteins have emerged, some unexpected, including participation in cellular development, volume regulation, cellular migration, proliferation, and elongation. Mutations in human Piezo proteins have been associated with a variety of disorders including hereditary xerocytosis and several syndromes with muscular contracture as a prominent feature. © 2014 by The American Society for Biochemistry and Molecular Biology, Inc.

  11. Piezo Proteins: Regulators of Mechanosensation and Other Cellular Processes*

    Science.gov (United States)

    Bagriantsev, Sviatoslav N.; Gracheva, Elena O.; Gallagher, Patrick G.

    2014-01-01

    Piezo proteins have recently been identified as ion channels mediating mechanosensory transduction in mammalian cells. Characterization of these channels has yielded important insights into mechanisms of somatosensation, as well as other mechano-associated biologic processes such as sensing of shear stress, particularly in the vasculature, and regulation of urine flow and bladder distention. Other roles for Piezo proteins have emerged, some unexpected, including participation in cellular development, volume regulation, cellular migration, proliferation, and elongation. Mutations in human Piezo proteins have been associated with a variety of disorders including hereditary xerocytosis and several syndromes with muscular contracture as a prominent feature. PMID:25305018

  12. MODERNIZATION OF TECHNOLOGICAL LINE FOR CELLULAR EXTRUSION PROCESS

    Directory of Open Access Journals (Sweden)

    Tomasz Garbacz

    2014-06-01

    As part of the modernization of the cellular extrusion technology, an extrusion head was designed and made. During the design and modeling of the head, the AutoCAD program was used. After prototyping, the extrusion head was tested. The article presents the specification of the cellular extrusion process of thermoplastics. In the research, endothermic chemical blowing agents in an amount of 1.0% by mass were used. The quantity of blowing agent used has a direct influence on the density and structure of the extruded product of the modified polymers, and these properties in turn influence porosity, impact strength, hardness, tensile strength and other properties.

  13. Simulation Based Optimization of Complex Monolithic Composite Structures Using Cellular Core Technology

    Science.gov (United States)

    Hickmott, Curtis W.

    Cellular core tooling is a new technology which has the capability to manufacture complex integrated monolithic composite structures. This novel tooling method utilizes thermoplastic cellular cores as inner tooling. The semi-rigid nature of the cellular cores makes them convenient for lay-up, and under autoclave temperature and pressure they soften and expand, providing uniform compaction on all surfaces including internal features such as ribs and spar tubes. This process has the capability of developing fully optimized aerospace structures by reducing or eliminating assembly using fasteners or bonded joints. The technology is studied in the context of evaluating its capabilities, advantages, and limitations in developing high quality structures. The complex nature of these parts has led to the development of a model using the Finite Element Analysis (FEA) software Abaqus and the plug-in COMPRO Common Component Architecture (CCA) provided by Convergent Manufacturing Technologies. This model utilizes a "virtual autoclave" technique to simulate temperature profiles, resin flow paths, and ultimately deformation from residual stress. A model has been developed simulating the temperature profile during curing of composite parts made with the cellular core technology. While modeling of composites has been performed in the past, this project applies that existing knowledge to this new manufacturing method, which is capable of building more complex parts, and develops a model designed specifically for building large, complex components with a high degree of accuracy. The model development has been carried out in conjunction with experimental validation. A double box beam structure was chosen for analysis to determine the effects of the technology on internal ribs and joints. Double box beams were manufactured and sectioned into T-joints for characterization. The mechanical behavior of the T-joints was evaluated using the T-joint pull-off test and compared to traditional

  14. Optimization of airport security process

    Science.gov (United States)

    Wei, Jianan

    2017-05-01

    In order to facilitate passenger travel while ensuring public safety, the airport security process and its scheduling are optimized. A stochastic Petri net is used to simulate the single-channel security process, draw the reachability graph, and construct the homogeneous Markov chain, enabling performance analysis of the security process network and identification of the bottleneck that limits passenger throughput. The state with one open security channel is taken as the initial state as the passenger flow changes. When passengers arrive at a rate that exceeds the processing capacity of the security channels, they are queued. The moment at which passenger queuing time reaches an acceptable threshold is used as the time to open or close the next channel, and the number of open security channels is scheduled dynamically in simulation to reduce passenger queuing time.

  15. A Markov Process Inspired Cellular Automata Model of Road Traffic

    OpenAIRE

    Wang, Fa; Li, Li; Hu, Jianming; Ji, Yan; Yao, Danya; Zhang, Yi; Jin, Xuexiang; Su, Yuelong; Wei, Zheng

    2008-01-01

    To provide a more accurate description of the driving behaviors in vehicle queues, a model named the Markov-Gap cellular automata model is proposed in this paper. It views the variation of the gap between two consecutive vehicles as a Markov process whose stationary distribution corresponds to the observed distribution of practical gaps. The multiformity of this Markov process gives the model enough flexibility to describe various driving behaviors. Two examples are given to show how to specialize i...

  16. Energy Management Optimization for Cellular Networks under Renewable Energy Generation Uncertainty

    KAUST Repository

    Rached, Nadhir B.

    2017-03-28

    The integration of renewable energy (RE) as an alternative power source for cellular networks has been deeply investigated in literature. However, RE generation is often assumed to be deterministic; an impractical assumption for realistic scenarios. In this paper, an efficient energy procurement strategy for cellular networks powered simultaneously by the smart grid (SG) and locally deployed RE sources characterized by uncertain processes is proposed. For a one-day operation cycle, the mobile operator aims to reduce its total energy cost by optimizing the amounts of energy to be procured from the local RE sources and SG at each time period. Additionally, it aims to determine the amount of extra generated RE to be sold back to SG. A chance constrained optimization is first proposed to deal with the RE generation uncertainty. Then, two convex approximation approaches: Chernoff and Chebyshev methods, characterized by different levels of knowledge about the RE generation, are developed to determine the energy procurement strategy for different risk levels. In addition, their performances are analyzed for various daily scenarios through selected simulation results. It is shown that the higher complex Chernoff method outperforms the Chebyshev one for different risk levels set by the operator.
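
    The chance constraint mentioned above can be written generically as a probabilistic guarantee on the renewable supply, and the one-sided Chebyshev (Cantelli) inequality is one standard way to build a convex surrogate when only the mean and variance are known; the symbols below are illustrative and not the paper's notation.

```latex
% Generic chance constraint (illustrative notation): the procured RE
% amount q_t should be covered by the random generation R_t with
% probability at least 1 - epsilon.
\[
  \Pr\bigl(R_t \ge q_t\bigr) \;\ge\; 1 - \varepsilon .
\]
% One-sided Chebyshev (Cantelli) bound, usable as a convex approximation
% when only the mean mu_t and variance sigma_t^2 of R_t are known:
\[
  \Pr\bigl(R_t \le \mu_t - a\bigr) \;\le\;
  \frac{\sigma_t^2}{\sigma_t^2 + a^2}, \qquad a > 0 .
\]
```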

  17. Energy Management Optimization for Cellular Networks under Renewable Energy Generation Uncertainty

    KAUST Repository

    Rached, Nadhir B.; Ghazzai, Hakim; Kadri, Abdullah; Alouini, Mohamed-Slim

    2017-01-01

    The integration of renewable energy (RE) as an alternative power source for cellular networks has been deeply investigated in literature. However, RE generation is often assumed to be deterministic; an impractical assumption for realistic scenarios. In this paper, an efficient energy procurement strategy for cellular networks powered simultaneously by the smart grid (SG) and locally deployed RE sources characterized by uncertain processes is proposed. For a one-day operation cycle, the mobile operator aims to reduce its total energy cost by optimizing the amounts of energy to be procured from the local RE sources and SG at each time period. Additionally, it aims to determine the amount of extra generated RE to be sold back to SG. A chance constrained optimization is first proposed to deal with the RE generation uncertainty. Then, two convex approximation approaches: Chernoff and Chebyshev methods, characterized by different levels of knowledge about the RE generation, are developed to determine the energy procurement strategy for different risk levels. In addition, their performances are analyzed for various daily scenarios through selected simulation results. It is shown that the higher complex Chernoff method outperforms the Chebyshev one for different risk levels set by the operator.

  18. An Optimization Framework for Travel Pattern Interpretation of Cellular Data

    Directory of Open Access Journals (Sweden)

    Sarit Freund

    2013-09-01

    This paper explores methods for identifying travel patterns from cellular data. A primary challenge in this research is to provide an interpretation of the raw data that distinguishes between activity durations and travel durations. A novel framework is proposed for this purpose, based on a grading scheme for candidate interpretations of the raw data. A genetic algorithm is used to find interpretations with high grades, which are considered as the most reasonable ones. The proposed method is tested on a dataset of records covering 9454 cell-phone users over a period of one week. Preliminary evaluation of the resulting interpretations is presented.

  19. Optimal channel utilization and service protection in cellular communication systems

    DEFF Research Database (Denmark)

    Iversen, Villy Bæk

    1997-01-01

    In mobile communications an efficient utilization of the channels is of great importance. In this paper we consider the basic principles for obtaining the maximum utilization, and we study strategies for obtaining these limits. In general a high degree of sharing is efficient, but requires service protection mechanisms for protecting services and subscriber groups. We study cellular systems with overlaid cells, and the effect of overlapping cells, and we show that by dynamic channel allocation we obtain a high utilization. The models are generalizations of the Erlang-B formula, and can be evaluated...
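
    For reference, the classical Erlang-B blocking formula that these models generalize gives the blocking probability for Poisson-offered traffic on a full-availability group of channels; the notation below is the standard one, not the record's.

```latex
% Classical Erlang-B blocking probability for offered traffic A (in Erlang)
% and n channels; the models in this record are described as
% generalizations of this formula.
\[
  B(A, n) \;=\; \frac{A^{n}/n!}{\sum_{k=0}^{n} A^{k}/k!} .
\]
```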

  20. Optimization of spectral printer modeling based on a modified cellular Yule-Nielsen spectral Neugebauer model.

    Science.gov (United States)

    Liu, Qiang; Wan, Xiaoxia; Xie, Dehong

    2014-06-01

    The study presented here optimizes several steps in the spectral printer modeling workflow based on a cellular Yule-Nielsen spectral Neugebauer (CYNSN) model. First, a printer subdividing method was developed that reduces the number of sub-models while maintaining the maximum device gamut. Second, the forward spectral prediction accuracy of the CYNSN model for each subspace of the printer was improved using back propagation artificial neural network (BPANN) estimated n values. Third, a sequential gamut judging method, which clearly reduced the complexity of the optimal sub-model and cell searching process during printer backward modeling, was proposed. After that, we further modified the use of the modeling color metric and comprehensively improved the spectral and perceptual accuracy of the spectral printer model. The experimental results show that the proposed optimization approaches provide obvious improvements in aspects of the modeling accuracy or efficiency for each of the corresponding steps, and an overall improvement of the optimized spectral printer modeling workflow was also demonstrated.
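
    For reference, the (non-cellular) Yule-Nielsen spectral Neugebauer forward model predicts the printed spectrum from the Neugebauer primaries, their fractional area coverages and the Yule-Nielsen n value; the cellular variant applies the same relation within each subdivided cell of the colorant space. The formula below is the standard textbook form.

```latex
% Standard Yule-Nielsen spectral Neugebauer forward model: predicted
% reflectance R(lambda) from primary reflectances R_i(lambda), fractional
% area coverages w_i, and Yule-Nielsen factor n.
\[
  R(\lambda)^{1/n} \;=\; \sum_i w_i \, R_i(\lambda)^{1/n} .
\]
```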

  1. Improving processes through evolutionary optimization.

    Science.gov (United States)

    Clancy, Thomas R

    2011-09-01

    As systems evolve over time, their natural tendency is to become increasingly more complex. Studies on complex systems have generated new perspectives on management in social organizations such as hospitals. Much of this research appears as a natural extension of the cross-disciplinary field of systems theory. This is the 18th in a series of articles applying complex systems science to the traditional management concepts of planning, organizing, directing, coordinating, and controlling. In this article, I discuss methods to optimize complex healthcare processes through learning, adaptation, and evolutionary planning.

  2. The optimal density of cellular solids in axial tension.

    Science.gov (United States)

    Mihai, L Angela; Alayyash, Khulud; Wyatt, Hayley

    2017-05-01

    For cellular bodies with uniform cell size, wall thickness, and shape, an important question is whether the same volume of material has the same effect when arranged as many small cells or as fewer large cells. To answer this question, for finite element models of periodic structures of Mooney-type material with different structural geometry and subject to large strain deformations, we identify a nonlinear elastic modulus as the ratio between the mean effective stress and the mean effective strain in the solid cell walls, and show that this modulus increases when the thickness of the walls increases, as well as when the number of cells increases while the volume of solid material remains fixed. Since, under the specified conditions, this nonlinear elastic modulus increases also as the corresponding mean stress increases, either the mean modulus or the mean stress can be employed as indicator when the optimum wall thickness or number of cells is sought.
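
    In the notation suggested by the abstract, the comparison metric is simply the ratio of the two averaged quantities; the symbols below are chosen for illustration.

```latex
% Nonlinear elastic modulus as described in the abstract: the ratio of the
% mean effective stress to the mean effective strain in the solid cell walls.
\[
  \bar{E} \;=\;
  \frac{\langle \sigma_{\mathrm{eff}} \rangle}{\langle \varepsilon_{\mathrm{eff}} \rangle}.
\]
```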

  3. Topology optimization of adaptive fluid-actuated cellular structures with arbitrary polygonal motor cells

    International Nuclear Information System (INIS)

    Lv, Jun; Tang, Liang; Li, Wenbo; Liu, Lei; Zhang, Hongwu

    2016-01-01

    This paper mainly focuses on the fast and efficient design method for plant bioinspired fluidic cellular materials and structures composed of polygonal motor cells. Here we developed a novel structural optimization method with arbitrary polygonal coarse-grid elements based on multiscale finite element frameworks. The fluidic cellular structures are meshed with irregular polygonal coarse-grid elements according to their natural size and the shape of the imbedded motor cells. The multiscale base functions of solid displacement and hydraulic pressure are then constructed to bring the small-scale information of the irregular motor cells to the large-scale simulations on the polygonal coarse-grid elements. On this basis, a new topology optimization method based on the resulting polygonal coarse-grid elements is proposed to determine the optimal distributions or number of motor cells in the smart cellular structures. Three types of optimization problems are solved according to the usages of the fluidic cellular structures. Firstly, the proposed optimization method is utilized to minimize the system compliance of the load-bearing fluidic cellular structures. Second, the method is further extended to design biomimetic compliant actuators of the fluidic cellular materials due to the fact that non-uniform volume expansions of fluid in the cells can induce elastic action. Third, the optimization problem focuses on the weight minimization of the cellular structure under the constraints for the compliance of the whole system. Several representative examples are investigated to validate the effectiveness of the proposed polygon-based topology optimization method of the smart materials. (paper)

  4. Can complex cellular processes be governed by simple linear rules?

    Science.gov (United States)

    Selvarajoo, Kumar; Tomita, Masaru; Tsuchiya, Masa

    2009-02-01

    Complex living systems have shown remarkably well-orchestrated, self-organized, robust, and stable behavior under a wide range of perturbations. However, despite the recent generation of high-throughput experimental datasets, basic cellular processes such as division, differentiation, and apoptosis still remain elusive. One of the key reasons is the lack of understanding of the governing principles of complex living systems. Here, we have reviewed the success of perturbation-response approaches, where without the requirement of detailed in vivo physiological parameters, the analysis of temporal concentration or activation response unravels biological network features such as causal relationships of reactant species, regulatory motifs, etc. Our review shows that simple linear rules govern the response behavior of biological networks in an ensemble of cells. It is daunting to know why such simplicity could hold in a complex heterogeneous environment. Provided physical reasons can be explained for these phenomena, major advancement in the understanding of basic cellular processes could be achieved.

  5. Piezo Proteins: Regulators of Mechanosensation and Other Cellular Processes*

    OpenAIRE

    Bagriantsev, Sviatoslav N.; Gracheva, Elena O.; Gallagher, Patrick G.

    2014-01-01

    Piezo proteins have recently been identified as ion channels mediating mechanosensory transduction in mammalian cells. Characterization of these channels has yielded important insights into mechanisms of somatosensation, as well as other mechano-associated biologic processes such as sensing of shear stress, particularly in the vasculature, and regulation of urine flow and bladder distention. Other roles for Piezo proteins have emerged, some unexpected, including participation in cellular deve...

  6. Two-material optimization of plate armour for blast mitigation using hybrid cellular automata

    Science.gov (United States)

    Goetz, J.; Tan, H.; Renaud, J.; Tovar, A.

    2012-08-01

    With the increased use of improvised explosive devices in regions at war, the threat to military and civilian life has risen. Cabin penetration and gross acceleration are the primary threats in an explosive event. Cabin penetration crushes occupants, damaging the lower body. Acceleration causes death at high magnitudes. This investigation develops a process of designing armour that simultaneously mitigates cabin penetration and acceleration. The hybrid cellular automaton (HCA) method of topology optimization has proven efficient and robust in problems involving large, plastic deformations such as crash impact. Here HCA is extended to the design of armour under blast loading. The ability to distribute two metallic phases, as opposed to one material and void, is also added. The blast wave energy transforms on impact into internal energy (IE) inside the solid medium. Maximum attenuation occurs with maximized IE. The resulting structures show HCA's potential for designing blast mitigating armour structures.

  7. Dynamic Optimization of UV Flash Processes

    DEFF Research Database (Denmark)

    Ritschel, Tobias Kasper Skovborg; Capolei, Andrea; Jørgensen, John Bagterp

    2017-01-01

    UV flash processes, also referred to as isoenergetic-isochoric flash processes, occur in dynamic simulation and optimization of vapor-liquid equilibrium processes. Dynamic optimization and nonlinear model predictive control of distillation columns, certain two-phase flow problems, as well as oil reservoir ... that the optimization solver, the compiler, and high-performance linear algebra software are all important for efficient dynamic optimization of UV flash processes.

  8. Reducing residual stresses and deformations in selective laser melting through multi-level multi-scale optimization of cellular scanning strategy

    DEFF Research Database (Denmark)

    Mohanty, Sankhya; Hattel, Jesper Henri

    2016-01-01

    A multilevel optimization strategy is adopted using a customized genetic algorithm developed for optimizing cellular scanning strategy for selective laser melting, with an objective of reducing residual stresses and deformations. The resulting thermo-mechanically optimized cellular scanning strategies ... a calibrated, fast, multiscale thermal model coupled with a 3D finite element mechanical model is used to simulate residual stress formation and deformations during selective laser melting. The resulting reduction in thermal model computation time allows evolutionary algorithm-based optimization of the process...

  9. Smallest-Small-World Cellular Harmony Search for Optimization of Unconstrained Benchmark Problems

    Directory of Open Access Journals (Sweden)

    Sung Soo Im

    2013-01-01

    We present a new hybrid method that combines cellular harmony search algorithms with the Smallest-Small-World theory. The harmony search (HS) algorithm is based on the musical performance process that occurs when a musician searches for a better state of harmony. Harmony search has successfully been applied to a wide variety of practical optimization problems. Most previous research has sought to improve the performance of the HS algorithm by changing the pitch adjusting rate and the harmony memory considering rate. However, there has been a lack of studies that improve the performance of the algorithm through the formation of population structures. Therefore, we propose an improved HS algorithm that uses a cellular automata formation and the topological structure of a Smallest-Small-World network. The improved HS algorithm has a high clustering coefficient and a short characteristic path length, giving it good exploration and exploitation efficiencies. Nine benchmark functions were applied to evaluate the performance of the proposed algorithm. Unlike existing improved HS algorithms, the proposed algorithm is expected to have improved algorithmic efficiency resulting from the formation of the population structure.
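
    To make the HS terminology (harmony memory considering rate, pitch adjusting rate) concrete, the sketch below shows a bare-bones harmony search loop minimizing a placeholder benchmark function; the parameter values are arbitrary, and the cellular/Smallest-Small-World population structure proposed in the paper is not reproduced.

```python
# Bare-bones harmony search for minimizing a test function (illustrative
# only; does not include the cellular / Smallest-Small-World structure).
import random

def sphere(x):
    """Simple benchmark function to minimize."""
    return sum(v * v for v in x)

DIM, LOW, HIGH = 5, -5.0, 5.0
HMS = 20          # harmony memory size
HMCR = 0.9        # harmony memory considering rate
PAR = 0.3         # pitch adjusting rate
BW = 0.1          # pitch adjustment bandwidth
ITERATIONS = 5000

# Initialize harmony memory with random solutions.
memory = [[random.uniform(LOW, HIGH) for _ in range(DIM)] for _ in range(HMS)]

for _ in range(ITERATIONS):
    new = []
    for d in range(DIM):
        if random.random() < HMCR:
            # Take this decision variable from a harmony stored in memory ...
            value = random.choice(memory)[d]
            if random.random() < PAR:
                # ... and optionally adjust the pitch slightly.
                value += random.uniform(-BW, BW)
        else:
            # Otherwise pick a fresh random value.
            value = random.uniform(LOW, HIGH)
        new.append(min(max(value, LOW), HIGH))
    # Replace the worst harmony if the new one is better.
    worst = max(range(HMS), key=lambda i: sphere(memory[i]))
    if sphere(new) < sphere(memory[worst]):
        memory[worst] = new

best = min(memory, key=sphere)
print(best, sphere(best))
```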

  10. Cellular Neural Network for Real Time Image Processing

    International Nuclear Information System (INIS)

    Vagliasindi, G.; Arena, P.; Fortuna, L.; Mazzitelli, G.; Murari, A.

    2008-01-01

    Since their introduction in 1988, Cellular Nonlinear Networks (CNNs) have found a key role as image processing instruments. Thanks to their structure they are able to process individual pixels in a parallel way, providing fast image processing capabilities that have been applied to a wide range of fields, among them nuclear fusion. In recent years, indeed, visible and infrared video cameras have become more and more important in tokamak fusion experiments, with the twofold aim of understanding the physics and monitoring the safety of the operation. Examining the output of these cameras in real time can provide significant information for plasma control and the safety of the machines. The potential of CNNs can be exploited to this aim. To demonstrate the feasibility of the approach, CNN image processing has been applied to several tasks both at the Frascati Tokamak Upgrade (FTU) and the Joint European Torus (JET)

  11. Near-Optimal Resource Allocation in Cooperative Cellular Networks Using Genetic Algorithms

    OpenAIRE

    Luo, Zihan; Armour, Simon; McGeehan, Joe

    2015-01-01

    This paper shows how a genetic algorithm can be used as a method of obtaining the near-optimal solution of the resource block scheduling problem in a cooperative cellular network. An exhaustive search is initially implemented to guarantee that the optimal result, in terms of maximizing the bandwidth efficiency of the overall network, is found, and then the genetic algorithm with the properly selected termination conditions is used in the same network. The simulation results show that the genet...

  12. Experimental design for dynamics identification of cellular processes.

    Science.gov (United States)

    Dinh, Vu; Rundell, Ann E; Buzzard, Gregery T

    2014-03-01

    We address the problem of using nonlinear models to design experiments to characterize the dynamics of cellular processes by using the approach of the Maximally Informative Next Experiment (MINE), which was introduced in W. Dong et al. (PLoS ONE 3(8):e3105, 2008) and independently in M.M. Donahue et al. (IET Syst. Biol. 4:249-262, 2010). In this approach, existing data is used to define a probability distribution on the parameters; the next measurement point is the one that yields the largest model output variance with this distribution. Building upon this approach, we introduce the Expected Dynamics Estimator (EDE), which is the expected value using this distribution of the output as a function of time. We prove the consistency of this estimator (uniform convergence to true dynamics) even when the chosen experiments cluster in a finite set of points. We extend this proof of consistency to various practical assumptions on noisy data and moderate levels of model mismatch. Through the derivation and proof, we develop a relaxed version of MINE that is more computationally tractable and robust than the original formulation. The results are illustrated with numerical examples on two nonlinear ordinary differential equation models of biomolecular and cellular processes.
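
    A rough rendering of the MINE selection rule described above: given parameter samples consistent with existing data, evaluate the model output over candidate measurement times and choose the time with the largest predictive variance. The toy ODE model, parameter distribution and names below are placeholders.

```python
# Illustrative sketch of the MINE idea: choose the next measurement time
# where the model output variance over the parameter distribution is largest.
import numpy as np
from scipy.integrate import solve_ivp

def toy_model(t_eval, k):
    """Hypothetical one-state ODE dx/dt = -k * x with x(0) = 1."""
    sol = solve_ivp(lambda t, x: -k * x, (0.0, t_eval[-1]), [1.0], t_eval=t_eval)
    return sol.y[0]

# Parameter samples standing in for the distribution implied by existing data.
k_samples = np.random.default_rng(0).lognormal(mean=0.0, sigma=0.3, size=200)

candidate_times = np.linspace(0.1, 10.0, 50)
outputs = np.array([toy_model(candidate_times, k) for k in k_samples])

# Next experiment: the candidate time with maximal output variance.
next_time = candidate_times[np.argmax(outputs.var(axis=0))]
print(next_time)
```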

  13. Cellular Neural Networks: A genetic algorithm for parameters optimization in artificial vision applications

    Energy Technology Data Exchange (ETDEWEB)

    Taraglio, S. [ENEA, Centro Ricerche Casaccia, Rome (Italy). Dipt. Innovazione]; Zanela, A. [Rome Univ. 'La Sapienza' (Italy). Dipt. di Fisica]

    1997-03-01

    An optimization method for some of the CNN's (Cellular Neural Network) parameters, based on evolutionary strategies, is proposed. The new class of feedback templates found is more effective at extracting features from the images that an autonomous vehicle acquires than those reported in the previous CNN literature.

  14. Cellular Neural Networks: A genetic algorithm for parameters optimization in artificial vision applications

    International Nuclear Information System (INIS)

    Taraglio, S.; Zanela, A.

    1997-03-01

    An optimization method for some of the CNN's (Cellular Neural Network) parameters, based on evolutionary strategies, is proposed. The new class of feedback templates found is more effective at extracting features from the images that an autonomous vehicle acquires than those reported in the previous CNN literature.

  15. A material optimization model to approximate energy bounds for cellular materials under multiload conditions

    DEFF Research Database (Denmark)

    Guedes, J.M.; Rodrigues, H.C.; Bendsøe, Martin P.

    2003-01-01

    This paper describes a computational model, based on inverse homogenization and topology design, for approximating energy bounds for two-phase composites under multiple load cases. The approach allows for the identification of possible single-scale cellular materials that give rise to the optimal...

  16. Optimal operation of batch membrane processes

    CERN Document Server

    Paulen, Radoslav

    2016-01-01

    This study concentrates on a general optimization of a particular class of membrane separation processes: those involving batch diafiltration. Existing practices are explained and operational improvements based on optimal control theory are suggested. The first part of the book introduces the theory of membrane processes, optimal control and dynamic optimization. Separation problems are defined and mathematical models of batch membrane processes derived. The control theory focuses on problems of dynamic optimization from a chemical-engineering point of view. Analytical and numerical methods that can be exploited to treat problems of optimal control for membrane processes are described. The second part of the text builds on this theoretical basis to establish solutions for membrane models of increasing complexity. Each chapter starts with a derivation of optimal operation and continues with case studies exemplifying various aspects of the control problems under consideration. The authors work their way from th...

  17. Optimization and control of metal forming processes

    NARCIS (Netherlands)

    Havinga, Gosse Tjipke

    2016-01-01

    Inevitable variations in process and material properties limit the accuracy of metal forming processes. Robust optimization methods or control systems can be used to improve the production accuracy. Robust optimization methods are used to design production processes with low sensitivity to the

  18. Using Electromagnetic Algorithm for Total Costs of Sub-contractor Optimization in the Cellular Manufacturing Problem

    Directory of Open Access Journals (Sweden)

    Mohammad Reza Shahriari

    2016-12-01

    Full Text Available In this paper, we present a non-linear binary programming model for optimizing a specific cost in a cellular manufacturing system under controlled production conditions. The system parameters are determined by continuous distribution functions. The aim of the presented model is to optimize the total cost imposed by sub-contractors on the manufacturing system by determining how to allocate the machines and parts to each seller. In this system, the decision maker can control the occupation level of each machine in the system. For solving the presented model, we used the electromagnetic meta-heuristic algorithm, with the Taguchi method used to determine the optimal algorithm parameters.

  19. A new optimization method based on cellular automata for VVER-1000 nuclear reactor loading pattern

    International Nuclear Information System (INIS)

    Fadaei, Amir Hosein; Setayeshi, Saeed

    2009-01-01

    This paper presents a new and innovative optimization technique, which uses cellular automata for solving multi-objective optimization problems. Due to its ability in simulating the local information while taking neighboring effects into account, the cellular automata technique is a powerful tool for optimization. The fuel-loading pattern in nuclear reactor cores is a major optimization problem. Due to the immensity of the search space in fuel management optimization problems, finding the optimum solution requires a huge amount of calculations in the classical method. The cellular automata models, based on local information, can reduce the computations significantly. In this study, reducing the power peaking factor, while increasing the initial excess reactivity inside the reactor core of VVER-1000, which are two apparently contradictory objectives, are considered as the objective functions. The result is an optimum configuration, which is in agreement with the pattern proposed by the designer. In order to gain confidence in the reliability of this method, the aforementioned problem was also solved using neural network and simulated annealing, and the results and procedures were compared.

  20. Optimization of the diabetic nephropathy treatment with attention to the special features of cellular inflammation mechanisms

    Directory of Open Access Journals (Sweden)

    Тетяна Дмитрівна Щербань

    2016-02-01

    Full Text Available Aim. Optimization of the treatment of diabetic nephropathy (DN) associated with hypertonic disease (HD), based on the study of the neutrophil chain of the pathogenic cellular mechanisms underlying the development of these diseases and the special features of their clinical course. Materials and methods. 86 patients with HD associated with DN and 30 patients with isolated HD were examined; the control group comprised 30 practically healthy persons. The activity of NO-synthases in neutrophils was detected by a colorimetric method using Griess reagent. The expression of ICAM-1 (CD54), CD11b-integrin and inducible NO-synthase on neutrophils was detected by an indirect immunocytochemical method. Oxygen-dependent activity of neutrophils was assessed in the NBT test. Results. Expression of the adhesion molecules CD54 and CD11b-integrin on peripheral blood neutrophils increases substantially (p < 0.001) in patients with DN associated with HD compared with isolated HD and the control group. In the associated pathology, against the background of high oxygen-dependent activity of neutrophils, their functional reserve decreases, which results in intensification of inflammatory processes in the kidneys (p < 0.001). In comorbid patients, chronization of the pathological process results in an imbalance of the NO-synthase system in neutrophils: against a background of decreased activity of the constitutive NO-synthases, the expression and activity of inducible NO-synthase increase (p < 0.001). The use of L-arginine hydrochloride in the complex therapy of patients with DN associated with HD intensifies the organoprotective effect of basal therapy, facilitates the clinical course, decreases albuminuria, corrects the functional indices of neutrophils and diminishes the imbalance in the NO-synthase system. Conclusions. In patients with DN associated with HD the neutrophil chain of cellular inflammation mechanisms is activated: expression of adhesion molecules grows, oxygen-dependent metabolism is

  1. Cellular processing and destinies of artificial DNA nanostructures.

    Science.gov (United States)

    Lee, Di Sheng; Qian, Hang; Tay, Chor Yong; Leong, David Tai

    2016-08-07

    Since many bionanotechnologies are targeted at cells, understanding how and where their interactions occur and the subsequent results of these interactions is important. Changing the intrinsic properties of DNA nanostructures and linking them with interactions presents a holistic and powerful strategy for understanding dual nanostructure-biological systems. With the recent advances in DNA nanotechnology, DNA nanostructures present a great opportunity to understand the often convoluted mass of information pertaining to nanoparticle-biological interactions due to the more precise control over their chemistry, sizes, and shapes. Coupling just some of these designs with an understanding of biological processes is both a challenge and a source of opportunities. Despite continuous advances in the field of DNA nanotechnology, the intracellular fate of DNA nanostructures has remained unclear and controversial. Because understanding its cellular processing and destiny is a necessary prelude to any rational design of exciting and innovative bionanotechnology, in this review, we will discuss and provide a comprehensive picture relevant to the intracellular processing and the fate of various DNA nanostructures, which have remained elusive for some time. We will also link the unique capabilities of DNA to some novel ideas for developing next-generation bionanotechnologies.

  2. Optimized Energy Procurement for Cellular Networks with Uncertain Renewable Energy Generation

    KAUST Repository

    Rached, Nadhir B.

    2017-02-07

    Renewable energy (RE) is an emerging solution for reducing carbon dioxide (CO2) emissions from cellular networks. One of the challenges of using RE sources is handling their inherent uncertainty. In this paper, an RE-powered cellular network is investigated. For a one-day operation cycle, the cellular network aims to reduce energy procurement costs from the smart grid by optimizing the amounts of energy procured from its locally deployed RE sources as well as from the smart grid. In addition, it aims to determine the extra amount of energy to be sold to the electrical grid at each time period. Chance-constrained optimization is first proposed to deal with the randomness in the RE generation. Then, to make the optimization problem tractable, two well-known convex approximation methods, namely the Chernoff- and Chebyshev-based approaches, are analyzed in detail. Numerical results investigate the optimized energy procurement for various daily scenarios and compare the performances of the employed convex approximation approaches.
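
    To illustrate the Chebyshev-style convex approximation mentioned, a single linear chance constraint can be replaced by a deterministic one via Cantelli's inequality; the notation below is generic and only assumes that the mean and variance of the renewable generation are known, so it is a sketch of the idea rather than the paper's formulation.

```latex
% Chance constraint on procured grid energy x_t when the RE generation \xi_t is
% random with known mean \mu_t and variance \sigma_t^2 (distribution otherwise unknown):
%   P( x_t + \xi_t \ge d_t ) \ge 1 - \epsilon   (demand d_t met with high probability).
% Cantelli's one-sided Chebyshev inequality yields the tractable sufficient condition
\[
  x_t \;\ge\; d_t - \mu_t + \sigma_t \sqrt{\frac{1-\epsilon}{\epsilon}},
\]
% a deterministic (convex) constraint that can replace the chance constraint.
```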

  3. Food processing optimization using evolutionary algorithms | Enitan ...

    African Journals Online (AJOL)

    Evolutionary algorithms are widely used in single and multi-objective optimization. They are easy to use and provide solution(s) in one simulation run. They are used in food processing industries for decision making. Food processing presents constrained and unconstrained optimization problems. This paper reviews the ...

  4. Optimized Energy Efficiency and Spectral Efficiency Resource Allocation Strategies for Phantom Cellular Networks

    KAUST Repository

    Abdelhady, Amr, M.; Amin, Osama; Alouini, Mohamed-Slim

    2016-01-01

    Multi-tier heterogeneous networks have become an essential constituent of next-generation cellular networks. Meanwhile, energy efficiency (EE) has been considered a critical design criterion along with the traditional spectral efficiency (SE) metric. In this context, we study power and spectrum allocation for the recently proposed two-tier architecture known as Phantom cellular networks. The optimization framework includes both EE and SE, where we propose an algorithm that computes the SE and EE resource allocations for Phantom cellular networks. Then, we compare the performance of both design strategies versus the number of users and the ratio of Phantom cell resource blocks to the total number of resource blocks. We aim to investigate the effect of some system parameters in order to achieve improved SE or EE performance at a non-significant loss in EE or SE performance, respectively. It was found that the system parameters can be tuned so that the EE solution does not yield a significant loss in the SE performance.

  5. Optimized Energy Efficiency and Spectral Efficiency Resource Allocation Strategies for Phantom Cellular Networks

    KAUST Repository

    Abdelhady, Amr, M.

    2016-01-06

    Multi-tier heterogeneous networks have become an essential constituent of next-generation cellular networks. Meanwhile, energy efficiency (EE) has been considered a critical design criterion along with the traditional spectral efficiency (SE) metric. In this context, we study power and spectrum allocation for the recently proposed two-tier architecture known as Phantom cellular networks. The optimization framework includes both EE and SE, where we propose an algorithm that computes the SE and EE resource allocations for Phantom cellular networks. Then, we compare the performance of both design strategies versus the number of users and the ratio of Phantom cell resource blocks to the total number of resource blocks. We aim to investigate the effect of some system parameters in order to achieve improved SE or EE performance at a non-significant loss in EE or SE performance, respectively. It was found that the system parameters can be tuned so that the EE solution does not yield a significant loss in the SE performance.

  6. Tube formation by complex cellular processes in Ciona intestinalis notochord.

    Science.gov (United States)

    Dong, Bo; Horie, Takeo; Denker, Elsa; Kusakabe, Takehiro; Tsuda, Motoyuki; Smith, William C; Jiang, Di

    2009-06-15

    In the course of embryogenesis multicellular structures and organs are assembled from constituent cells. One structural component common to many organs is the tube, which consists most simply of a luminal space surrounded by a single layer of epithelial cells. The notochord of ascidian Ciona forms a tube consisting of only 40 cells, and serves as a hydrostatic "skeleton" essential for swimming. While the early processes of convergent extension in ascidian notochord development have been extensively studied, the later phases of development, which include lumen formation, have not been well characterized. Here we used molecular markers and confocal imaging to describe tubulogenesis in the developing Ciona notochord. We found that during tubulogenesis each notochord cell established de novo apical domains, and underwent a mesenchymal-epithelial transition to become an unusual epithelial cell with two opposing apical domains. Concomitantly, extracellular luminal matrix was produced and deposited between notochord cells. Subsequently, each notochord cell simultaneously executed two types of crawling movements bi-directionally along the anterior/posterior axis on the inner surface of notochordal sheath. Lamellipodia-like protrusions resulted in cell lengthening along the anterior/posterior axis, while the retraction of trailing edges of the same cell led to the merging of the two apical domains. As a result, the notochord cells acquired endothelial-like shape and formed the wall of the central lumen. Inhibition of actin polymerization prevented the cell movement and tube formation. Ciona notochord tube formation utilized an assortment of common and fundamental cellular processes including cell shape change, apical membrane biogenesis, cell/cell adhesion remodeling, dynamic cell crawling, and lumen matrix secretion.

  7. Study on Parameter Optimization Design of Drum Brake Based on Hybrid Cellular Multiobjective Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Yi Zhang

    2012-01-01

    Full Text Available In consideration of the significant role the brake plays in ensuring the fast and safe running of vehicles, and since present parameter optimization design models of the brake are far from practical application, this paper proposes a multiobjective optimization model of the drum brake, aiming at maximizing the braking efficiency and minimizing the volume and temperature rise of the drum brake. As the commonly used optimization algorithms have some deficiencies for this problem, we present a differential evolution cellular multiobjective genetic algorithm (DECell) by introducing a differential evolution strategy into the canonical cellular genetic algorithm. With DECell, the obtained Pareto front can be as close as possible to the exact Pareto front, and the diversity of nondominated individuals is better maintained. Experiments on test functions reveal that DECell performs well in solving high-dimensional nonlinear multiobjective problems. The results of optimizing the new brake model indicate that DECell clearly outperforms the popular algorithm NSGA-II in terms of the number of obtained brake design parameter sets and the speed and stability of finding them.
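
    As a rough illustration of the differential-evolution ingredient that DECell injects into the cellular genetic algorithm, the sketch below runs plain DE/rand/1/bin on a toy single-objective problem; the cellular neighbourhood structure, the Pareto-based selection, and the drum-brake model itself are not reproduced, and all parameter values are illustrative assumptions.

```python
import random

def de_step(pop, objective, f=0.5, cr=0.9, bounds=(-5.0, 5.0)):
    """One generation of DE/rand/1/bin with greedy single-objective selection
    (DECell instead uses Pareto-based selection over several objectives)."""
    lo, hi = bounds
    dim = len(pop[0])
    new_pop = []
    for i, target in enumerate(pop):
        a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
        j_rand = random.randrange(dim)
        trial = []
        for j in range(dim):
            if random.random() < cr or j == j_rand:        # binomial crossover
                v = a[j] + f * (b[j] - c[j])                # differential mutation
            else:
                v = target[j]
            trial.append(min(max(v, lo), hi))
        new_pop.append(trial if objective(trial) <= objective(target) else target)
    return new_pop

if __name__ == "__main__":
    sphere = lambda x: sum(v * v for v in x)
    pop = [[random.uniform(-5, 5) for _ in range(4)] for _ in range(20)]
    for _ in range(100):
        pop = de_step(pop, sphere)
    print(min(map(sphere, pop)))
```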

  8. Optimizing Processes to Minimize Risk

    Science.gov (United States)

    Loyd, David

    2017-01-01

    NASA, like the other hazardous industries, has suffered very catastrophic losses. Human error will likely never be completely eliminated as a factor in our failures. When you can't eliminate risk, focus on mitigating the worst consequences and recovering operations. Bolstering processes to emphasize the role of integration and problem solving is key to success. Building an effective Safety Culture bolsters skill-based performance that minimizes risk and encourages successful engagement.

  9. Iterative feedback bio-printing-derived cell-laden hydrogel scaffolds with optimal geometrical fidelity and cellular controllability.

    Science.gov (United States)

    Wang, Ling; Xu, Ming-En; Luo, Li; Zhou, Yongyong; Si, Peijian

    2018-02-12

    For three-dimensional bio-printed cell-laden hydrogel tissue constructs, the well-designed internal porous geometry is tailored to obtain the desired structural and cellular properties. However, significant differences often exist between the designed and as-printed scaffolds because of the inherent characteristics of hydrogels and cells. In this study, an iterative feedback bio-printing (IFBP) approach based on optical coherence tomography (OCT) for the fabrication of cell-laden hydrogel scaffolds with optimal geometrical fidelity and cellular controllability was proposed. A custom-made swept-source OCT (SS-OCT) system was applied to characterize the printed scaffolds quantitatively. Based on the obtained empirical linear formula from the first experimental feedback loop, we defined the most appropriate design constraints and optimized the printing process to improve the geometrical fidelity. The effectiveness of IFBP was verified from the second run using gelatin/alginate hydrogel scaffolds laden with C3A cells. The mismatch of the morphological parameters greatly decreased from 40% to within 7%, which significantly optimized the cell viability, proliferation, and morphology, as well as the representative expression of hepatocyte markers, including CYP3A4 and albumin, of the printed cell-laden hydrogel scaffolds. The demonstrated protocol paves the way for the mass fabrication of cell-laden hydrogel scaffolds, engineered tissues, and scaled-up applications of the 3D bio-printing technique.

  10. STATISTICAL OPTIMIZATION OF PROCESS VARIABLES FOR ...

    African Journals Online (AJOL)

    2012-11-03

    The osmotic dehydration process was optimized for water loss and solutes gain. ... basis) with safe moisture content for storage (10% wet basis) [3]. Due to ... sucrose, glucose, fructose, corn syrup and sodium chloride have ...

  11. Optimization and standardization of pavement management processes.

    Science.gov (United States)

    2004-08-01

    This report addresses issues related to optimization and standardization of current pavement management processes in Kentucky. Historical pavement management records were analyzed, which indicates that standardization is necessary in future pavement ...

  12. Design and optimization of food processing conditions

    OpenAIRE

    Silva, C. L. M.

    1996-01-01

    The main research objectives of the group are the design and optimization of food processing conditions. Most of the work already developed is on the use of mathematical modeling of transport phenomena and quantification of degradation kinetics as two tools to optimize the final quality of thermally processed food products. Recently, we initiated a project with the main goal of studying the effects of freezing and frozen storage on orange and melon juice pectinesterase activity and q...

  13. Power Consumption Optimization in Tooth Gears Processing

    Science.gov (United States)

    Kanatnikov, N.; Harlamov, G.; Kanatnikova, P.; Pashmentova, A.

    2018-01-01

    The paper reviews the issue of optimizing the technological process of tooth gear production with respect to power consumption criteria. The authors dwell on the indices used to estimate the cutting process by consumed-energy criteria and their applicability in the analysis of the toothed wheel production process. The authors propose a method for optimizing power consumption based on spatial modeling of the cutting pattern. The article is aimed at solving the problem of effective resource management in order to achieve an economical and ecological effect during the mechanical processing of toothed gears. The research was supported by the Russian Science Foundation (project No. 17-79-10316).

  14. A sentinel protein assay for simultaneously quantifying cellular processes

    Czech Academy of Sciences Publication Activity Database

    Soste, M.; Hrabáková, Rita; Wanka, S.; Melnik, A.; Boersema, P.; Maiolica, A.; Wernas, T.; Tognetti, M.; von Mering, Ch.; Picotti, P.

    2014-01-01

    Roč. 11, č. 10 (2014), s. 1045-1048 ISSN 1548-7091 R&D Projects: GA MŠk ED2.1.00/03.0124 Institutional support: RVO:67985904 Keywords : targeted proteomics * selected reaction monitoring * cellular signaling Subject RIV: CE - Biochemistry Impact factor: 32.072, year: 2014

  15. Gaussian process regression for geometry optimization

    Science.gov (United States)

    Denzel, Alexander; Kästner, Johannes

    2018-03-01

    We implemented a geometry optimizer based on Gaussian process regression (GPR) to find minimum structures on potential energy surfaces. We tested both a twice-differentiable form of the Matérn kernel and the squared exponential kernel. The Matérn kernel performs much better. We give a detailed description of the optimization procedures. These include overshooting the step resulting from GPR in order to obtain a higher degree of interpolation vs. extrapolation. In a benchmark against the Limited-memory Broyden-Fletcher-Goldfarb-Shanno optimizer of the DL-FIND library on 26 test systems, we found the new optimizer to generally reduce the number of required optimization steps.
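
    A much-reduced sketch of surrogate-driven optimization with Gaussian process regression is given below: a GP with a squared exponential kernel is fit to the evaluated points and the next point is taken at the minimum of the posterior mean. The 1-D toy "potential energy surface", the kernel settings, and the omission of the Matérn kernel, overshooting, and gradient information are simplifications, not the authors' implementation.

```python
import numpy as np

def sq_exp_kernel(a, b, length=1.0, amp=1.0):
    """Squared exponential covariance between two sets of 1-D points."""
    d = a[:, None] - b[None, :]
    return amp * np.exp(-0.5 * (d / length) ** 2)

def gpr_posterior_mean(x_train, y_train, x_query, noise=1e-6):
    K = sq_exp_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_star = sq_exp_kernel(x_query, x_train)
    return K_star @ np.linalg.solve(K, y_train)

def surrogate_minimize(f, bounds, n_iter=15, seed=0):
    """Fit a GP to evaluated points, then move to the minimizer of its
    posterior mean on a grid (no uncertainty or gradients used here)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=3)          # small initial design
    y = np.array([f(v) for v in x])
    grid = np.linspace(lo, hi, 400)
    for _ in range(n_iter):
        mean = gpr_posterior_mean(x, y, grid)
        x_new = grid[np.argmin(mean)]
        if np.any(np.isclose(x, x_new)):      # avoid duplicate training points
            x_new = rng.uniform(lo, hi)
        x = np.append(x, x_new)
        y = np.append(y, f(x_new))
    best = np.argmin(y)
    return x[best], y[best]

if __name__ == "__main__":
    f = lambda v: (v - 1.3) ** 2 + 0.5 * np.sin(3 * v)   # toy 1-D energy surface
    print(surrogate_minimize(f, bounds=(-2.0, 4.0)))
```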

  16. Tank Waste Remediation System optimized processing strategy

    International Nuclear Information System (INIS)

    Slaathaug, E.J.; Boldt, A.L.; Boomer, K.D.; Galbraith, J.D.; Leach, C.E.; Waldo, T.L.

    1996-03-01

    This report provides an alternative strategy evolved from the current Hanford Site Tank Waste Remediation System (TWRS) programmatic baseline for accomplishing the treatment and disposal of the Hanford Site tank wastes. This optimized processing strategy performs the major elements of the TWRS Program, but modifies the deployment of selected treatment technologies to reduce the program cost. The present program for development of waste retrieval, pretreatment, and vitrification technologies continues, but the optimized processing strategy reuses a single facility to accomplish the separations/low-activity waste (LAW) vitrification and the high-level waste (HLW) vitrification processes sequentially, thereby eliminating the need for a separate HLW vitrification facility

  17. On the theory of optimal processes

    International Nuclear Information System (INIS)

    Goldenberg, P.; Provenzano, V.

    1975-01-01

    The theory of optimal processes is a recent mathematical formalism that is used to solve an important class of problems in science and in technology that cannot be solved by classical variational techniques. An example of such processes would be the control of a nuclear reactor. Certain features of the theory of optimal processes are discussed, emphasizing the central contribution of Pontryagin with his formulation of the maximum principle. An application of the theory of optimum control is presented. The example is a time-optimum problem applied to a simplified model of a nuclear reactor. It deals with the question of changing the equilibrium power level of the reactor in an optimum time.

  18. Heterogeneous architecture to process swarm optimization algorithms

    Directory of Open Access Journals (Sweden)

    Maria A. Dávila-Guzmán

    2014-01-01

    Full Text Available For several years now, parallel processing has been embedded in personal computers through co-processing units such as graphics processing units, resulting in a heterogeneous platform. This paper presents the implementation of swarm algorithms on this platform to solve several functions from optimization problems, highlighting their inherent parallel processing and distributed control features. In the swarm algorithms, each individual and each problem dimension are parallelized according to the granularity of the processing system, which also offers low communication latency between individuals through the embedded processing. To evaluate the potential of swarm algorithms on graphics processing units we have implemented two of them: the particle swarm optimization algorithm and the bacterial foraging optimization algorithm. The algorithms' performance is measured as the speedup of the NVIDIA GeForce GTX480 heterogeneous platform over a typical sequential processing platform; the results show that the particle swarm algorithm obtained up to 36.82x and the bacterial foraging optimization algorithm up to 9.26x. Finally, the effect of increasing the population size is evaluated, where we show that both the dispersion and the quality of the solutions decrease despite the high acceleration performance, since the initial distribution of the individuals can converge to a local optimal solution.
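
    A plain, sequential particle swarm optimization loop is sketched below for reference; the GPU mapping evaluated in the paper is not shown, and the Rastrigin benchmark and hyperparameters are illustrative assumptions.

```python
import numpy as np

def pso(objective, dim, bounds, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Sequential PSO; on a GPU the same per-particle updates can run in parallel."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pos = rng.uniform(lo, hi, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.apply_along_axis(objective, 1, pos)
    gbest = pbest[np.argmin(pbest_val)].copy()

    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.apply_along_axis(objective, 1, pos)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

if __name__ == "__main__":
    rastrigin = lambda x: 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))
    print(pso(rastrigin, dim=5, bounds=(-5.12, 5.12)))
```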

  19. Theoretical aspects of cellular decision-making and information-processing.

    Science.gov (United States)

    Kobayashi, Tetsuya J; Kamimura, Atsushi

    2012-01-01

    Microscopic biological processes have extraordinary complexity and variety at the sub-cellular, intra-cellular, and multi-cellular levels. In dealing with such complex phenomena, conceptual and theoretical frameworks are crucial, which enable us to understand seemingly different intra- and inter-cellular phenomena from unified viewpoints. Decision-making is one such concept that has attracted much attention recently. Since a number of cellular behavior can be regarded as processes to make specific actions in response to external stimuli, decision-making can cover and has been used to explain a broad range of different cellular phenomena [Balázsi et al. (Cell 144(6):910, 2011), Zeng et al. (Cell 141(4):682, 2010)]. Decision-making is also closely related to cellular information-processing because appropriate decisions cannot be made without exploiting the information that the external stimuli contain. Efficiency of information transduction and processing by intra-cellular networks determines the amount of information obtained, which in turn limits the efficiency of subsequent decision-making. Furthermore, information-processing itself can serve as another concept that is crucial for understanding of other biological processes than decision-making. In this work, we review recent theoretical developments on cellular decision-making and information-processing by focusing on the relation between these two concepts.

  20. Optimization of thermal processing of canned mussels.

    Science.gov (United States)

    Ansorena, M R; Salvadori, V O

    2011-10-01

    The design and optimization of thermal processing of solid-liquid food mixtures, such as canned mussels, requires the knowledge of the thermal history at the slowest heating point. In general, this point does not coincide with the geometrical center of the can, and the results show that it is located along the axial axis at a height that depends on the brine content. In this study, a mathematical model for the prediction of the temperature at this point was developed using the discrete transfer function approach. Transfer function coefficients were experimentally obtained, and prediction equations fitted to consider other can dimensions and sampling interval. This model was coupled with an optimization routine in order to search for different retort temperature profiles to maximize a quality index. Both constant retort temperature (CRT) and variable retort temperature (VRT; discrete step-wise and exponential) were considered. In the CRT process, the optimal retort temperature was always between 134 °C and 137 °C, and high values of thiamine retention were achieved. A significant improvement in surface quality index was obtained for optimal VRT profiles compared to optimal CRT. The optimization procedure shown in this study produces results that justify its utilization in the industry.

  1. [Imaging center - optimization of the imaging process].

    Science.gov (United States)

    Busch, H-P

    2013-04-01

    Hospitals around the world are under increasing pressure to optimize the economic efficiency of treatment processes. Imaging is responsible for a great part of the success but also of the costs of treatment. In routine work an excessive supply of imaging methods leads to an "as well as" strategy up to the limit of the capacity without critical reflection. Exams that have no predictable influence on the clinical outcome are an unjustified burden for the patient. They are useless and threaten the financial situation and existence of the hospital. In recent years the focus of process optimization was exclusively on the quality and efficiency of performed single examinations. In the future critical discussion of the effectiveness of single exams in relation to the clinical outcome will be more important. Unnecessary exams can be avoided, only if in addition to the optimization of single exams (efficiency) there is an optimization strategy for the total imaging process (efficiency and effectiveness). This requires a new definition of processes (Imaging Pathway), new structures for organization (Imaging Center) and a new kind of thinking on the part of the medical staff. Motivation has to be changed from gratification of performed exams to gratification of process quality (medical quality, service quality, economics), including the avoidance of additional (unnecessary) exams. © Georg Thieme Verlag KG Stuttgart · New York.

  2. Bidirectional optimization of the melting spinning process.

    Science.gov (United States)

    Liang, Xiao; Ding, Yongsheng; Wang, Zidong; Hao, Kuangrong; Hone, Kate; Wang, Huaping

    2014-02-01

    A bidirectional optimizing approach for the melting spinning process based on an immune-enhanced neural network is proposed. The proposed bidirectional model can not only reveal the internal nonlinear relationship between the process configuration and the quality indices of the fibers as final product, but also provide a tool for engineers to develop new fiber products with expected quality specifications. A neural network is taken as the basis for the bidirectional model, and an immune component is introduced to enlarge the searching scope of the solution field so that the neural network has a larger possibility to find the appropriate and reasonable solution, and the error of prediction can therefore be eliminated. The proposed intelligent model can also help to determine what kind of process configuration should be made in order to produce satisfactory fiber products. To make the proposed model practical to the manufacturing, a software platform is developed. Simulation results show that the proposed model can eliminate the approximation error raised by the neural network-based optimizing model, which is due to the extension of focusing scope by the artificial immune mechanism. Meanwhile, the proposed model with the corresponding software can conduct optimization in two directions, namely, the process optimization and category development, and the corresponding results outperform those with an ordinary neural network-based intelligent model. It is also proved that the proposed model has the potential to act as a valuable tool from which the engineers and decision makers of the spinning process could benefit.

  3. Receptor Oligomerization as a Process Modulating Cellular Semiotics

    DEFF Research Database (Denmark)

    Giorgi, Franco; Bruni, Luis Emilio; Maggio, Roberto

    2010-01-01

    be another level of quality control that may help maintaining GPCRs rather stable throughout evolution. We propose here receptor oligomerization to be a basic molecular mechanism controlling GPCRs redundancy in many different cell types, and the plasma membrane as the first hierarchical cell structure...... at which selective categorical sensing may occur. Categorical sensing can be seen as the cellular capacity for identifying and ordering complex patterns of mixed signals out of a contextual matrix, i.e., the recognition of meaningful patterns out of ubiquitous signals. In this context, redundancy...

  4. Discovery of Transition Rules for Cellular Automata Using Artificial Bee Colony and Particle Swarm Optimization Algorithms in Urban Growth Modeling

    Directory of Open Access Journals (Sweden)

    Fereydoun Naghibi

    2016-12-01

    Full Text Available This paper presents an advanced method in urban growth modeling to discover transition rules of cellular automata (CA) using the artificial bee colony (ABC) optimization algorithm. Comparisons between the simulation results of CA models optimized by the ABC algorithm and by the particle swarm optimization (PSO) algorithm as intelligent approaches were also performed to evaluate the potential of the proposed methods. According to previous studies, swarm intelligence algorithms for solving optimization problems such as discovering transition rules of CA in land use change/urban growth modeling can produce reasonable results. Modeling urban growth as a dynamic process is not straightforward because of the nonlinearity and heterogeneity among the effective variables involved, which can cause a number of challenges for traditional CA. The ABC algorithm, a new and powerful swarm-based optimization algorithm, can be used to capture optimized transition rules of CA. This paper proposes a methodology based on remote sensing data for modeling urban growth with CA calibrated by the ABC algorithm. The performance of the ABC-CA, PSO-CA, and CA-logistic models in land use change detection is tested for the city of Urmia, Iran, between 2004 and 2014. Validations of the models were made based on statistical measures such as overall accuracy, figure of merit, and total operating characteristic. We show that the overall accuracy of the ABC-CA model was 89%, which was 1.5% and 6.2% higher than those of the PSO-CA and CA-logistic models, respectively. Moreover, the allocation disagreement (simulation error) of the simulation results for the ABC-CA, PSO-CA, and CA-logistic models is 11%, 12.5%, and 17.2%, respectively. Finally, for all evaluation indices, including running time, convergence capability, flexibility, statistical measurements, and the produced spatial patterns, the ABC-CA model's performance showed relative improvement and therefore its superiority was

  5. Simulation and Optimization of Foam EOR Processes

    NARCIS (Netherlands)

    Namdar Zanganeh, M.

    2011-01-01

    Chemical enhanced oil recovery (EOR) is relatively expensive due to the high cost of the injected chemicals such as surfactants. Excessive use of these chemicals leads to processes that are not economically feasible. Therefore, optimizing the volume of these injected chemicals is of extreme

  6. Synthesis and Optimization of a Methanol Process

    DEFF Research Database (Denmark)

    Grue, J.; Bendtsen, Jan Dimon

    2003-01-01

    of reaction. The resulting model consists of a system of DAEs. The model is compared with rigorous simulation results from Pro/II and good agreement is found. The process is optimized followed by heat integration and large differences in the operating economy of the plant can be observed as a result hereof...

  7. On the optimization of endoreversible processes

    Science.gov (United States)

    Pescetti, D.

    2014-03-01

    This paper is intended for undergraduates and specialists in thermodynamics and related areas. We consider and discuss the optimization of endoreversible thermodynamic processes under the condition of maximum work production. Explicit thermodynamic analyses of the solutions are carried out for the Novikov and Agrawal processes. It is shown that the efficiencies at maximum work production and maximum power output are not necessarily equal. They are for the Novikov process but not for the Agrawal process. The role of the constraints is put into evidence. The physical aspects are enhanced by the simplicity of the involved mathematics.

  8. Optimal control of a CSTR process

    Directory of Open Access Journals (Sweden)

    A. Soukkou

    2008-12-01

    Full Text Available Designing an effective criterion and learning algorithm for finding the best structure is a major problem in the control design process. In this paper, a fuzzy optimal control methodology is applied to the design of the feedback loops of an Exothermic Continuous Stirred Tank Reactor system. The objective of the design process is to find optimal structure/gains of the Robust and Optimal Takagi-Sugeno Fuzzy Controller (ROFLC). The control signal thus obtained minimizes a performance index, which is a function of the tracking/regulating errors, the quantity of the energy of the control signal applied to the system, and the number of fuzzy rules. Genetic learning is proposed for constructing the ROFLC. The chromosome genes are arranged in two parts: the binary-coded part contains the control genes and the real-coded part contains the gene parameters representing the fuzzy knowledge base. The effectiveness of this chromosome formulation enables the fuzzy sets and rules to be optimally reduced. The performance of the ROFLC is compared to that found by the traditional PD controller with genetic optimization (PD_GO). Simulations demonstrate that the proposed ROFLC and PD_GO have successfully met the design specifications.

  9. (Sub-)Optimality of Treating Interference as Noise in the Cellular Uplink With Weak Interference

    KAUST Repository

    Gherekhloo, Soheil; Chaaban, Anas; Di, Chen; Sezgin, Aydin

    2015-01-01

    Despite the simplicity of the scheme of treating interference as noise (TIN), it was shown to be sum-capacity optimal in the Gaussian interference channel (IC) with very-weak (noisy) interference. In this paper, the two-user IC is altered by introducing an additional transmitter that wants to communicate with one of the receivers of the IC. The resulting network thus consists of a point-to-point channel interfering with a multiple access channel (MAC) and is denoted by PIMAC. The sum-capacity of the PIMAC is studied with main focus on the optimality of TIN. It turns out that TIN in its naive variant, where all transmitters are active and both receivers use TIN for decoding, is not the best choice for the PIMAC. In fact, a scheme that combines both time division multiple access and TIN (TDMA-TIN) strictly outperforms the naive-TIN scheme. Furthermore, it is shown that in some regimes, TDMA-TIN achieves the sum-capacity for the deterministic PIMAC and the sum-capacity within a constant gap for the Gaussian PIMAC. In addition, it is shown that, even for very-weak interference, there are some regimes where a combination of interference alignment with power control and TIN at the receiver side outperforms TDMA-TIN. As a consequence, on the one hand, TIN in a cellular uplink is approximately optimal in certain regimes. On the other hand, those regimes cannot be simply described by the strength of interference.
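
    For orientation, the baseline TIN rate the abstract refers to can be written out for the standard two-user Gaussian interference channel; the notation below is generic textbook form and is not taken from the paper.

```latex
% Two-user Gaussian IC, transmit powers P_1, P_2, cross-gains a, b, unit noise variance.
% Treating interference as noise at both receivers achieves the sum rate
\[
  R_{\mathrm{TIN}}
  = \log_2\!\Bigl(1 + \frac{P_1}{1 + a P_2}\Bigr)
  + \log_2\!\Bigl(1 + \frac{P_2}{1 + b P_1}\Bigr),
\]
% which is sum-capacity optimal only in the noisy (very-weak) interference regime;
% the PIMAC adds a third transmitter, and TDMA-TIN time-shares such phases.
```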

  10. (Sub-)Optimality of Treating Interference as Noise in the Cellular Uplink With Weak Interference

    KAUST Repository

    Gherekhloo, Soheil

    2015-11-09

    Despite the simplicity of the scheme of treating interference as noise (TIN), it was shown to be sum-capacity optimal in the Gaussian interference channel (IC) with very-weak (noisy) interference. In this paper, the two-user IC is altered by introducing an additional transmitter that wants to communicate with one of the receivers of the IC. The resulting network thus consists of a point-to-point channel interfering with a multiple access channel (MAC) and is denoted by PIMAC. The sum-capacity of the PIMAC is studied with main focus on the optimality of TIN. It turns out that TIN in its naive variant, where all transmitters are active and both receivers use TIN for decoding, is not the best choice for the PIMAC. In fact, a scheme that combines both time division multiple access and TIN (TDMA-TIN) strictly outperforms the naive-TIN scheme. Furthermore, it is shown that in some regimes, TDMA-TIN achieves the sum-capacity for the deterministic PIMAC and the sum-capacity within a constant gap for the Gaussian PIMAC. In addition, it is shown that, even for very-weak interference, there are some regimes where a combination of interference alignment with power control and TIN at the receiver side outperforms TDMA-TIN. As a consequence, on the one hand, TIN in a cellular uplink is approximately optimal in certain regimes. On the other hand, those regimes cannot be simply described by the strength of interference.

  11. Differential sensitivity of cellular membranes to peroxidative processes

    International Nuclear Information System (INIS)

    Huijbers, W.A.R.

    1976-01-01

    A description is given of a morphological and cytochemical investigation into the effects of both vitamin E deficiency and X-irradiation on the ultrastructure and enzyme activities of several cellular membranes, particularly the plasma membrane and the membranes of lysosomes, mitochondria and endoplasmic reticulum. In the vitamin E deficient situation, the radicals and peroxides only originate near mitochondria and endoplasmic reticulum, so that these membrane systems suffer from changes. After irradiation of the liver of both the control duckling and the deficient duckling, radicals originate in all parts of the cell. Due to their high content of lipids and cholesterols, peroxides will occur mainly in plasma membranes and lysosomal membranes. Moreover, in these membranes there is hardly any protection by vitamin E

  12. Computer performance optimization systems, applications, processes

    CERN Document Server

    Osterhage, Wolfgang W

    2013-01-01

    Computing power performance was important at times when hardware was still expensive, because hardware had to be put to the best use. Later on this criterion was no longer critical, since hardware had become inexpensive. Meanwhile, however, people have realized that performance again plays a significant role, because of the major drain on system resources involved in developing complex applications. This book distinguishes between three levels of performance optimization: the system level, application level and business processes level. On each, optimizations can be achieved and cost-cutting p

  13. PROPOSAL OF SPATIAL OPTIMIZATION OF PRODUCTION PROCESS IN PROCESS DESIGNER

    Directory of Open Access Journals (Sweden)

    Peter Malega

    2015-03-01

    Full Text Available This contribution is focused on optimizing the use of space in the production process using the software Process Designer. The aim of this contribution is to suggest possible improvements to the existing layout of the selected production process. The production process was analysed in terms of inputs, outputs and course of actions. Nowadays there are many software solutions aimed at optimizing the use of space. One of these software products is Process Designer, which belongs to the Tecnomatix product line. This software is primarily aimed at production planning. With Process Designer it is possible to design the production layout and subsequently to analyse the production or change it according to the current needs of the company.

  14. An Optimized Three-Level Design of Decoder Based on Nanoscale Quantum-Dot Cellular Automata

    Science.gov (United States)

    Seyedi, Saeid; Navimipour, Nima Jafari

    2018-03-01

    Quantum-dot Cellular Automata (QCA) has been considered a potential successor to Complementary Metal-Oxide-Semiconductor (CMOS) technology because of its inherent advantages. Many QCA-based logic circuits with smaller feature size, improved operating frequency, and lower power consumption than CMOS have been offered. This technology works based on electron interactions inside quantum dots. Given the importance of designing an optimized decoder for any digital circuit, in this paper we design, implement and simulate a new 2-to-4 decoder based on QCA with low delay, area, and complexity. The logic functionality of the 2-to-4 decoder is verified using the QCADesigner tool. The results show that the proposed QCA-based decoder has high performance in terms of cell count, covered area, and time delay. Due to the lower clock pulse frequency, the proposed 2-to-4 decoder is helpful for building high-performance QCA-based sequential digital circuits.
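
    The Boolean behaviour such a layout implements can be stated compactly; the following reference snippet (plain Python with hypothetical names, not a QCADesigner model) simply enumerates a 2-to-4 decoder truth table, assuming an enable input for generality.

```python
def decoder_2to4(a, b, enable=1):
    """One-hot 2-to-4 decoder: output line i is high iff (b, a) encodes i and enable is set."""
    index = (b << 1) | a
    return [int(enable and i == index) for i in range(4)]

# Truth-table check
for b in (0, 1):
    for a in (0, 1):
        print(b, a, decoder_2to4(a, b))
```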

  15. Simulation and optimization of fractional crystallization processes

    DEFF Research Database (Denmark)

    Thomsen, Kaj; Rasmussen, Peter; Gani, Rafiqul

    1998-01-01

    A general method for the calculation of various types of phase diagrams for aqueous electrolyte mixtures is outlined. It is shown how the thermodynamic equilibrium precipitation process can be used to satisfy the operational needs of industrial crystallizer/centrifuge units. Examples of simulation...... and optimization of fractional crystallization processes are shown. In one of these examples, a process with multiple steady states is analyzed. The thermodynamic model applied for describing the highly non-ideal aqueous electrolyte systems is the Extended UNIQUAC model. (C) 1998 Published by Elsevier Science Ltd...

  16. Mobile Phone Service Process Hiccups at Cellular Inc.

    Science.gov (United States)

    Edgington, Theresa M.

    2010-01-01

    This teaching case documents an actual case of process execution and failure. The case is useful in MIS introductory courses seeking to demonstrate the interdependencies within a business process, and the concept of cascading failure at the process level. This case demonstrates benefits and potential problems with information technology systems,…

  17. Ring rolling process simulation for geometry optimization

    Science.gov (United States)

    Franchi, Rodolfo; Del Prete, Antonio; Donatiello, Iolanda; Calabrese, Maurizio

    2017-10-01

    Ring rolling is a complex hot forming process where different rolls are involved in the production of seamless rings. Since each roll must be independently controlled, different speed laws must be set; usually, in the industrial environment, a milling curve is introduced to monitor the shape of the workpiece during deformation in order to ensure correct ring production. In the present paper a ring rolling process has been studied and optimized in order to obtain annular components to be used in aerospace applications. In particular, the influence of process input parameters (feed rate of the mandrel and angular speed of the main roll) on geometrical features of the final ring has been evaluated. For this purpose, a three-dimensional finite element model for HRR (Hot Ring Rolling) has been implemented in SFTC DEFORM V11. The FEM model has been used to formulate a proper optimization problem. The optimization procedure has been implemented in the commercial software DS ISight in order to find the combination of process parameters that minimizes the percentage error of each obtained dimension with respect to its nominal value. The software finds the relationship between input and output parameters by applying Response Surface Methodology (RSM), using the exact values of the output parameters at the control points of the design space explored through FEM simulation. Once this relationship is known, the values of the output parameters can be calculated for each combination of the input parameters. After the calculation of the response surfaces for the selected output parameters, an optimization procedure based on Genetic Algorithms has been applied. In the end, the error between each obtained dimension and its nominal value has been minimized. The constraints imposed were the maximum values of the standard deviations of the dimensions obtained for the final ring.

  18. Optimization Of A Mass Spectrometry Process

    International Nuclear Information System (INIS)

    Lopes, Jose; Alegria, F. Correa; Redondo, Luis; Barradas, N. P.; Alves, E.; Rocha, Jorge

    2011-01-01

    In this paper we present and discuss a system developed to optimize the mass spectrometry process of an ion implanter. The system uses a PC to control and display the mass spectrum. The operator interacts through an I/O board that interfaces the computer with the ion implanter via LabVIEW code. Experimental results are shown and the capabilities of the system are discussed.

  19. On process optimization considering LCA methodology.

    Science.gov (United States)

    Pieragostini, Carla; Mussati, Miguel C; Aguirre, Pío

    2012-04-15

    The goal of this work is to research the state of the art in process optimization techniques and tools based on LCA, focused on the process engineering field. A collection of methods, approaches, applications, specific software packages, and insights regarding experiences and progress made in applying the LCA methodology coupled to optimization frameworks is provided, and general trends are identified. The "cradle-to-gate" concept to define the system boundaries is the most used approach in practice, instead of the "cradle-to-grave" approach. Normally, the relationship between inventory data and impact category indicators is linearly expressed by the characterization factors; consequently, synergistic effects of the contaminants are neglected. Among the LCIA methods, the eco-indicator 99, which is based on the endpoint category and the panel method, is the most used in practice. A single environmental impact function, resulting from the aggregation of environmental impacts, is formulated as the environmental objective in most analyzed cases. SimaPro is the most used software for LCA applications in the literature analyzed. Multi-objective optimization is the most used approach for dealing with this kind of problem, where the ε-constraint method for generating the Pareto set is the most applied technique. However, a renewed interest in formulating a single economic objective function in optimization frameworks can be observed, favored by the development of life cycle cost software and progress made in assessing costs of environmental externalities. Finally, a trend toward dealing with multi-period scenarios in integrated LCA-optimization frameworks can be distinguished, providing more accurate results when data are available. Copyright © 2011 Elsevier Ltd. All rights reserved.

  20. Simulation of electrochemical processes in cardiac tissue based on cellular automaton

    International Nuclear Information System (INIS)

    Avdeev, S A; Bogatov, N M

    2014-01-01

    A new class of cellular automata using a special accumulative function for the nonuniformity distribution is presented. The use of this type of automaton for the simulation of excitable media, applied to electrochemical processes in human cardiac tissue, is shown.

  1. Multi-objective optimization of cellular scanning strategy in selective laser melting

    DEFF Research Database (Denmark)

    Ahrari, Ali; Deb, Kalyanmoy; Mohanty, Sankhya

    2017-01-01

    The scanning strategy for selective laser melting - an additive manufacturing process - determines the temperature fields during the manufacturing process, which in turn affect residual stresses and distortions, two of the main sources of process-induced defects. The goal of this study ... the problem is a combination of combinatorial and choice optimization, which makes the problem difficult to solve. On a process simulation domain consisting of 32 cells, our multi-objective evolutionary method is able to find a set of trade-off solutions for the defined conflicting objectives, which cannot

  2. THE OPTIMIZATION OF PLUSH YARNS BULKING PROCESS

    Directory of Open Access Journals (Sweden)

    VINEREANU Adam

    2014-05-01

    Full Text Available This paper presents the experiments that were conducted on the installation for continuous bulking and thermofixing “SUPERBA” type TVP-2S for optimization of the plush yarn bulking process. Plush yarns Nm 6.5/2, made of a fibrous blend of 50% indigenous wool sort 41 and 50% PES, were considered. In the first stage, a thermal treatment is performed with a turbo-prevaporizer at a temperature lower than the thermofixing temperature, at atmospheric pressure, such that the plush yarns - deposited freely on a belt conveyor - bulk and contract uniformly. A mathematical modeling procedure was followed, working with a factorial program of rotatable central composite type and two independent variables. After analyzing the parameters that have a direct influence on the bulking degree, the pre-vaporization temperature (coded x1, °C) and the velocity of the belt inside the pre-vaporizer (coded x2, m/min) were selected. As the dependent variable, the plush yarn diameter (coded y, mm) was chosen. The coordinates of the optimal point were found and this pair of values was then verified in practice: x1,optim = 90 °C and x2,optim = 6.5 m/min. The conclusion is that the goal was accomplished: a good cover degree for double-plush carpets was obtained by reducing the number of tufts per unit surface.

  3. Sequential metabolic phases as a means to optimize cellular output in a constant environment.

    Science.gov (United States)

    Palinkas, Aljoscha; Bulik, Sascha; Bockmayr, Alexander; Holzhütter, Hermann-Georg

    2015-01-01

    Temporal changes of gene expression are a well-known regulatory feature of all cells, which is commonly perceived as a strategy to adapt the proteome to varying external conditions. However, temporal (rhythmic and non-rhythmic) changes of gene expression are also observed under virtually constant external conditions. Here we hypothesize that such changes are a means to render the synthesis of the metabolic output more efficient than under conditions of constant gene activities. In order to substantiate this hypothesis, we used a flux-balance model of the cellular metabolism. The total time span spent on the production of a given set of target metabolites was split into a series of shorter time intervals (metabolic phases) during which only selected groups of metabolic genes are active. The related flux distributions were calculated under the constraint that genes can be either active or inactive whereby the amount of protein related to an active gene is only controlled by the number of active genes: the lower the number of active genes the more protein can be allocated to the enzymes carrying non-zero fluxes. This concept of a predominantly protein-limited efficiency of gene expression clearly differs from other concepts resting on the assumption of an optimal gene regulation capable of allocating to all enzymes and transporters just that fraction of protein necessary to prevent rate limitation. Applying this concept to a simplified metabolic network of the central carbon metabolism with glucose or lactate as alternative substrates, we demonstrate that switching between optimally chosen stationary flux modes comprising different sets of active genes allows producing a demanded amount of target metabolites in a significantly shorter time than by a single optimal flux mode at fixed gene activities. Our model-based findings suggest that temporal expression of metabolic genes can be advantageous even under conditions of constant external substrate supply.
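
    As a rough illustration of the kind of flux-balance computation the abstract describes, the sketch below maximizes a target flux on an invented three-reaction toy network subject to steady-state mass balance and capacity bounds; the stoichiometry, the bounds standing in for finite enzyme protein, and the use of scipy's linprog are assumptions for illustration, not the authors' model.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: uptake -> A, A -> B, B -> target product (all unit stoichiometry).
# Rows = internal metabolites (A, B); columns = reactions (v_up, v1, v2).
S = np.array([[1, -1,  0],
              [0,  1, -1]])
bounds = [(0, 10), (0, 5), (0, 10)]       # capacity limits mimic finite enzyme protein
c = [0, 0, -1]                             # linprog minimizes, so maximize v2 via -v2

res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
print("optimal target flux:", -res.fun, "fluxes:", res.x)
```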

  4. PROCESS TIME OPTIMIZATION IN DEPOSITOR AND FILLER

    Directory of Open Access Journals (Sweden)

    Jesús Iván Ruíz-Ibarra

    2017-07-01

    Full Text Available As in any industry, in soft drink manufacturing the demands of customer service and production make it essential to keep the equipment and production machines in optimal condition so that the product reaches the consumer without delays. It is therefore important to establish the times of each process, from the moment the syrup is prepared, through packaging and distribution, until it is purchased by the consumer. After a stopwatch time study, the most common faults were detected in each analyzed process. In the filler machine the most frequent faults are accumulations of bottles in the processes before and after filling, which in general are caused by failures in the other equipment of the production line. In the unloading process the most common faults are boxes jammed in the bump and pusher (box-pushing) units and boxes fallen on the rollers and conveyor platforms. According to the observations made on each machine, the actions to be followed to solve the problems are presented. The methodology used to obtain results, analyze data and make decisions is also described. First, an operations analysis is performed to become familiar with each machine, supported by the machine manuals and the operators themselves; a stopwatch time study is then carried out to determine the standard time of each process and to record the most common faults; finally, observations are made on the machines according to the determined sample size, thereby obtaining the information needed to take measurements and carry out the study for optimizing the production processes. The predetermined process times are also analyzed with the MTM and MOST methods. The results for operators with MTM are: Fault Filler = 0.846 minutes, Faultless Filler = 0.61 minutes, Fault Breaker = 0.74 minutes and Fault Flasher = 0.45 minutes. The results with MOST are: Fault Filler = 2.58 minutes, Filler Fails

  5. Optimal Information Processing in Biochemical Networks

    Science.gov (United States)

    Wiggins, Chris

    2012-02-01

    A variety of experimental results over the past decades provide examples of near-optimal information processing in biological networks, including in biochemical and transcriptional regulatory networks. Computing information-theoretic quantities requires first choosing or computing the joint probability distribution describing multiple nodes in such a network --- for example, representing the probability distribution of finding an integer copy number of each of two interacting reactants or gene products while respecting the `intrinsic' small copy number noise constraining information transmission at the scale of the cell. I'll give an overview of some recent analytic and numerical work facilitating calculation of such joint distributions and the associated information, which in turn makes possible numerical optimization of information flow in models of noisy regulatory and biochemical networks. Illustrating cases include quantification of form-function relations, ideal design of regulatory cascades, and response to oscillatory driving.
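
    A minimal sketch of the underlying calculation, assuming a small discrete joint copy-number distribution: mutual information between two network nodes computed directly from the joint and marginal probabilities. The distribution below is made up for illustration.

    ```python
    # Mutual information of a small joint distribution (hypothetical values),
    # the kind of quantity optimized in noisy regulatory networks.
    import numpy as np

    p_xy = np.array([[0.30, 0.05],
                     [0.10, 0.55]])        # P(X=i, Y=j), must sum to 1
    p_x = p_xy.sum(axis=1, keepdims=True)  # marginal of X
    p_y = p_xy.sum(axis=0, keepdims=True)  # marginal of Y

    mask = p_xy > 0
    mi = np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x @ p_y)[mask]))
    print(f"I(X;Y) = {mi:.3f} bits")
    ```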

  6. Optimized Energy Procurement for Cellular Networks with Uncertain Renewable Energy Generation

    KAUST Repository

    Rached, Nadhir B.; Ghazzai, Hakim; Kadri, Abdullah; Alouini, Mohamed-Slim

    2017-01-01

    Renewable energy (RE) is an emerging solution for reducing carbon dioxide (CO2) emissions from cellular networks. One of the challenges of using RE sources is to handle its inherent uncertainty. In this paper, a RE powered cellular network

  7. Filtering and spectral processing of 1-D signals using cellular neural networks

    NARCIS (Netherlands)

    Moreira-Tamayo, O.; Pineda de Gyvez, J.

    1996-01-01

    This paper presents cellular neural networks (CNN) for one-dimensional discrete signal processing. Although CNN has been extensively used in image processing applications, little has been done for 1-dimensional signal processing. We propose a novel CNN architecture to carry out these tasks. This

  8. Design and optimization of sustainable process technologies

    DEFF Research Database (Denmark)

    Mussatto, Solange I.; Qin, Fen; Yamakawa, Celina Kiyomi

    has then been considered a key point to achieve such purposes, being also able to result in potential environmental, economic, and social benefits. In this sense, the Biomass Conversion and Bioprocess Technology Group (BCBT) has been working on the development of new strategies for the use of biomass......, minimizing the costs and maximizing the efficiency and productivity. Once the optimal conditions are identified, the process scale-up can then be evaluated. This could translate into a faster time to market for new process technologies.

  9. Discrete stochastic processes and optimal filtering

    CERN Document Server

    Bertein, Jean-Claude

    2012-01-01

    Optimal filtering applied to stationary and non-stationary signals provides the most efficient means of dealing with problems arising from the extraction of noise signals. Moreover, it is a fundamental feature in a range of applications, such as in navigation in aerospace and aeronautics, filter processing in the telecommunications industry, etc. This book provides a comprehensive overview of this area, discussing random and Gaussian vectors, outlining the results necessary for the creation of Wiener and adaptive filters used for stationary signals, as well as examining Kalman filters which ar

  10. Vascular Ageing and Exercise: Focus on Cellular Reparative Processes

    Directory of Open Access Journals (Sweden)

    Mark D. Ross

    2016-01-01

    Full Text Available Ageing is associated with an increased risk of developing noncommunicable diseases (NCDs), such as diabetes and cardiovascular disease (CVD). The increased risk can be attributable to increased prolonged exposure to oxidative stress. Often, CVD is preceded by endothelial dysfunction, which carries with it a proatherothrombotic phenotype. Endothelial senescence and reduced production and release of nitric oxide (NO) are associated with “vascular ageing” and are often accompanied by a reduced ability for the body to repair vascular damage, termed “reendothelialization.” Exercise has been repeatedly shown to confer protection against CVD and diabetes risk and incidence. Regular exercise promotes endothelial function and can prevent endothelial senescence, often through a reduction in oxidative stress. Recently, endothelial precursors, endothelial progenitor cells (EPC), have been shown to repair damaged endothelium, and reduced circulating number and/or function of these cells is associated with ageing. Exercise can modulate both number and function of these cells to promote endothelial homeostasis. In this review we look at the effects of advancing age on the endothelium and these endothelial precursors and how exercise appears to offset this “vascular ageing” process.

  11. Vascular Ageing and Exercise: Focus on Cellular Reparative Processes.

    Science.gov (United States)

    Ross, Mark D; Malone, Eva; Florida-James, Geraint

    2016-01-01

    Ageing is associated with an increased risk of developing noncommunicable diseases (NCDs), such as diabetes and cardiovascular disease (CVD). The increased risk can be attributable to increased prolonged exposure to oxidative stress. Often, CVD is preceded by endothelial dysfunction, which carries with it a proatherothrombotic phenotype. Endothelial senescence and reduced production and release of nitric oxide (NO) are associated with "vascular ageing" and are often accompanied by a reduced ability for the body to repair vascular damage, termed "reendothelialization." Exercise has been repeatedly shown to confer protection against CVD and diabetes risk and incidence. Regular exercise promotes endothelial function and can prevent endothelial senescence, often through a reduction in oxidative stress. Recently, endothelial precursors, endothelial progenitor cells (EPC), have been shown to repair damaged endothelium, and reduced circulating number and/or function of these cells is associated with ageing. Exercise can modulate both number and function of these cells to promote endothelial homeostasis. In this review we look at the effects of advancing age on the endothelium and these endothelial precursors and how exercise appears to offset this "vascular ageing" process.

  12. Using exploratory regression to identify optimal driving factors for cellular automaton modeling of land use change.

    Science.gov (United States)

    Feng, Yongjiu; Tong, Xiaohua

    2017-09-22

    Defining transition rules is an important issue in cellular automaton (CA)-based land use modeling because these models incorporate highly correlated driving factors. Multicollinearity among correlated driving factors may produce negative effects that must be eliminated from the modeling. Using exploratory regression under pre-defined criteria, we identified all possible combinations of factors from the candidate factors affecting land use change. Three combinations that incorporate five driving factors meeting the pre-defined criteria were assessed. With the selected combinations of factors, three logistic regression-based CA models were built to simulate dynamic land use change in Shanghai, China, from 2000 to 2015. For comparative purposes, a CA model with all candidate factors was also applied to simulate the land use change. Simulations using the three CA models with multicollinearity eliminated performed better (with accuracy improvements of about 3.6%) than the model incorporating all candidate factors. Our results showed that not all candidate factors are necessary for accurate CA modeling and that the simulations were not sensitive to changes in statistically non-significant driving factors. We conclude that exploratory regression is an effective method to search for the optimal combinations of driving factors, leading to better land use change models that are devoid of multicollinearity. We suggest identification of dominant factors and elimination of multicollinearity before building land change models, making it possible to simulate more realistic outcomes.
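
    A simplified sketch of a logistic-regression-based CA transition rule (neighbourhood effects and constraint layers omitted): fit a development probability from a few driving factors, then convert non-urban cells stochastically. The factors, coefficients and data are hypothetical, not the Shanghai dataset.

    ```python
    # Sketch of a logistic-regression CA transition rule on synthetic data.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 1000
    # Hypothetical driving factors after multicollinearity screening
    X = np.column_stack([
        rng.uniform(0, 5, n),    # distance to road (km)
        rng.uniform(0, 10, n),   # distance to city centre (km)
        rng.uniform(0, 30, n),   # slope (degrees)
    ])
    # Synthetic historical conversions for calibration
    y = (rng.random(n) < 1 / (1 + np.exp(0.5*X[:, 0] + 0.2*X[:, 1]
                                         + 0.1*X[:, 2] - 2))).astype(int)

    model = LogisticRegression(max_iter=1000).fit(X, y)  # calibrate rule
    p_dev = model.predict_proba(X)[:, 1]                 # development probability

    # One CA step: a non-urban cell converts if p_dev exceeds a random draw
    state = np.zeros(n, dtype=int)
    state[(state == 0) & (rng.random(n) < p_dev)] = 1
    print("cells converted in one step:", state.sum())
    ```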

  13. Optimal cellular mobility for synchronization arising from the gradual recovery of intercellular interactions

    International Nuclear Information System (INIS)

    Uriu, Koichiro; Ares, Saúl; Oates, Andrew C; Morelli, Luis G

    2012-01-01

    Cell movement and intercellular signaling occur simultaneously during the development of tissues, but little is known about how movement affects signaling. Previous theoretical studies have shown that faster moving cells favor synchronization across a population of locally coupled genetic oscillators. An important assumption in these studies is that cells can immediately interact with their new neighbors after arriving at a new location. However, intercellular interactions in cellular systems may need some time to become fully established. How movement affects synchronization in this situation has not been examined. Here, we develop a coupled phase oscillator model in which we consider cell movement and the gradual recovery of intercellular coupling experienced by a cell after movement, characterized by a moving rate and a coupling recovery rate, respectively. We find (1) an optimal moving rate for synchronization and (2) a critical moving rate above which achieving synchronization is not possible. These results indicate that the extent to which movement enhances synchrony is limited by a gradual recovery of coupling. These findings suggest that the ratio of time scales of movement and signaling recovery is critical for information transfer between moving cells. (paper)
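
    The following is a minimal sketch, with illustrative parameter values rather than the paper's, of a ring of coupled phase oscillators in which random cell exchanges reset a per-cell coupling weight that then recovers gradually; the Kuramoto order parameter measures the resulting synchrony.

    ```python
    # Coupled phase oscillators with cell swaps and gradual coupling recovery.
    import numpy as np

    rng = np.random.default_rng(1)
    N, steps, dt = 50, 2000, 0.01
    omega, K = 1.0, 1.0          # intrinsic frequency and coupling strength
    move_rate, recovery_rate = 0.5, 2.0   # illustrative values

    theta = rng.uniform(0, 2*np.pi, N)
    w = np.ones(N)               # coupling weight per cell (1 = fully recovered)

    for _ in range(steps):
        # nearest-neighbour coupling on a ring, scaled by recovery weights
        coupling = w * (np.sin(np.roll(theta, 1) - theta)
                        + np.sin(np.roll(theta, -1) - theta))
        theta += dt * (omega + K * coupling)
        w += dt * recovery_rate * (1 - w)        # gradual recovery of coupling
        # random exchange of two cells mimics movement; reset their coupling
        if rng.random() < move_rate * N * dt:
            i, j = rng.choice(N, 2, replace=False)
            theta[i], theta[j] = theta[j], theta[i]
            w[i] = w[j] = 0.0

    order = abs(np.exp(1j * theta).mean())       # Kuramoto order parameter
    print(f"synchronization index: {order:.2f}")
    ```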

  14. Big Data Components for Business Process Optimization

    Directory of Open Access Journals (Sweden)

    Mircea Raducu TRIFU

    2016-01-01

    Full Text Available In these days, more and more people talk about Big Data, Hadoop, noSQL and so on, but very few technical people have the necessary expertise and knowledge to work with those concepts and technologies. This paper explains one of the concepts that stands behind two of those keywords: the MapReduce model. The MapReduce model is what makes Big Data and Hadoop so powerful, fast, and diverse for business process optimization. MapReduce is a programming model with an implementation built to process and generate large data sets. In addition, the benefits of integrating Hadoop in the context of Business Intelligence and Data Warehousing applications are presented. The concepts and technologies behind big data let organizations reach a variety of objectives. Like other new information technologies, the most important objective of big data technology is to bring dramatic cost reduction.
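
    A toy word-count example makes the MapReduce idea concrete: a map step emits key/value pairs, a shuffle groups them by key, and a reduce step aggregates each group. This is plain Python for illustration only; a real deployment would run the same logic on a Hadoop or Spark cluster.

    ```python
    # Minimal MapReduce-style word count in plain Python.
    from collections import defaultdict

    def map_phase(document):
        for word in document.split():
            yield word.lower(), 1          # emit (key, value) pairs

    def reduce_phase(word, counts):
        return word, sum(counts)           # aggregate values per key

    documents = ["big data needs big tools", "map reduce processes big data"]

    # Shuffle: group all emitted values by key
    groups = defaultdict(list)
    for doc in documents:
        for key, value in map_phase(doc):
            groups[key].append(value)

    result = dict(reduce_phase(w, c) for w, c in groups.items())
    print(result)   # e.g. {'big': 3, 'data': 2, ...}
    ```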

  15. The Brewing Process: Optimizing the Fermentation

    Directory of Open Access Journals (Sweden)

    Teodora Coldea

    2014-11-01

    Full Text Available Beer is a carbonated alcoholic beverage obtained by alcoholic fermentation of malt wort boiled with hops. The brown beer obtained at the Beer Pilot Station of the University of Agricultural Sciences and Veterinary Medicine Cluj-Napoca was the result of a recipe based on blond, caramel and black malt in different proportions, water, hops and yeast. This study aimed to monitor the evolution of the wort during primary and secondary alcoholic fermentation in order to optimize the process. Two wort batches were assembled in order to increase the fermentation performance of the brewing yeast. The primary fermentation lasted 14 days, followed by another 14 days of secondary fermentation (maturation). Wort fermentation was monitored with the automatic FermentoStar analyzer. The whole fermentation process was monitored (temperature, pH, alcohol concentration, apparent and total wort extract).

  16. Anode baking process optimization through computer modelling

    Energy Technology Data Exchange (ETDEWEB)

    Wilburn, D.; Lancaster, D.; Crowell, B. [Noranda Aluminum, New Madrid, MO (United States); Ouellet, R.; Jiao, Q. [Noranda Technology Centre, Pointe Claire, PQ (Canada)

    1998-12-31

    Carbon anodes used in aluminum electrolysis are produced in vertical or horizontal type anode baking furnaces. The carbon blocks are formed from petroleum coke aggregate mixed with a coal tar pitch binder. Before the carbon block can be used in a reduction cell it must be heated to pyrolysis. The baking process represents a large portion of the aluminum production cost, and also has a significant effect on anode quality. To ensure that the baking of the anode is complete, it must be heated to about 1100 degrees C. To improve the understanding of the anode baking process and to improve its efficiency, a menu-driven heat, mass and fluid flow simulation tool, called NABSIM (Noranda Anode Baking SIMulation), was developed and calibrated in 1993 and 1994. It has been used since then to evaluate and screen firing practices, and to determine which firing procedure will produce the optimum heat-up rate, final temperature, and soak time, without allowing unburned tar to escape. NABSIM is used as a furnace simulation tool on a daily basis by Noranda plant process engineers and much effort is expended in improving its utility by creating new versions, and the addition of new modules. In the immediate future, efforts will be directed towards optimizing the anode baking process to improve temperature uniformity from pit to pit. 3 refs., 4 figs.

  17. Mitochondrial correlates of signaling processes involved with the cellular response to eimeria infection in broiler chickens

    Science.gov (United States)

    Host cellular responses to coccidiosis infection are consistent with elements of apoptosis, autophagy, and necrosis. These processes are enhanced in the cell through cell-directed signaling or repressed through parasite-derived inhibitors of these processes favoring the survival of the parasite. Acr...

  18. Laser Processing of Multilayered Thermal Spray Coatings: Optimal Processing Parameters

    Science.gov (United States)

    Tewolde, Mahder; Zhang, Tao; Lee, Hwasoo; Sampath, Sanjay; Hwang, David; Longtin, Jon

    2017-12-01

    Laser processing offers an innovative approach for the fabrication and transformation of a wide range of materials. As a rapid, non-contact, and precision material removal technology, lasers are natural tools to process thermal spray coatings. Recently, a thermoelectric generator (TEG) was fabricated using thermal spray and laser processing. The TEG device represents a multilayer, multimaterial functional thermal spray structure, with laser processing serving an essential role in its fabrication. Several unique challenges are presented when processing such multilayer coatings, and the focus of this work is on the selection of laser processing parameters for optimal feature quality and device performance. A parametric study is carried out using three short-pulse lasers, where laser power, repetition rate and processing speed are varied to determine the laser parameters that result in high-quality features. The resulting laser patterns are characterized using optical and scanning electron microscopy, energy-dispersive x-ray spectroscopy, and electrical isolation tests between patterned regions. The underlying laser interaction and material removal mechanisms that affect the feature quality are discussed. Feature quality was found to improve both by using a multiscanning approach and an optional assist gas of air or nitrogen. Electrically isolated regions were also patterned in a cylindrical test specimen.

  19. Kanban simulation model for production process optimization

    Directory of Open Access Journals (Sweden)

    Golchev Riste

    2015-01-01

    Full Text Available A long time has passed since the KANBAN system was established as an efficient method for coping with excessive inventory. Still, the possibilities for its improvement through integration with other approaches should be investigated further. The basic research challenge of this paper is to present the benefits of KANBAN implementation supported by Discrete Event Simulation (DES). In that direction, the basics of the KANBAN system are first presented, with emphasis on the information and material flow, together with a methodology for implementation of the KANBAN system. An analysis of combining simulation with this methodology is then presented. The paper concludes with a practical example which shows that, through understanding the philosophy of the KANBAN implementation methodology and the simulation methodology, a simulation model can be created which can serve as a basis for a variety of experiments that can be conducted within a short period of time, resulting in production process optimization.

  20. Optimal control of raw timber production processes

    Science.gov (United States)

    Ivan Kolenka

    1978-01-01

    This paper demonstrates the possibility of optimal planning and control of timber harvesting activities with mathematical optimization models. The separate phases of timber harvesting are represented by coordinated models which can be used to select the optimal decision for the execution of any given phase. The models form a system whose components are connected and...

  1. Modeling and optimization of wet sizing process

    International Nuclear Information System (INIS)

    Thai Ba Cau; Vu Thanh Quang and Nguyen Ba Tien

    2004-01-01

    Mathematical simulation based on Stokes' law has been carried out for the wet sizing process on cylindrical equipment of laboratory and semi-industrial scale. The model consists of mathematical equations describing relations between variables, such as: - the residence time distribution function of emulsion particles in the separating zone of the equipment, depending on flow-rate, height, diameter and structure of the equipment; - the size-distribution function of the fine and coarse fractions, depending on the residence time distribution function of emulsion particles, the characteristics of the material being processed (such as specific density and shape), and the characteristics of the classification medium (such as specific density and viscosity). - An experimental model was developed from data collected on an experimental cylindrical unit with a sedimentation chamber of diameter × height equal to 50 × 40 cm, for an emulsion of zirconium silicate in water. - Using this experimental model allows determination of the optimal flow-rate in order to obtain a product with the desired grain size, in terms of average size or size distribution function. (author)
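
    The settling relation behind such a wet-sizing model can be sketched from Stokes' law: terminal velocity sets the time a particle needs to cross the separation zone, which, compared with the residence time fixed by the flow-rate, decides whether it reports to the fine or the coarse fraction. The material properties and chamber height below are illustrative values only.

    ```python
    # Stokes'-law settling sketch for a wet-sizing (sedimentation) separator.
    g = 9.81            # m/s^2
    rho_p = 4560.0      # particle density, kg/m^3 (e.g. zirconium silicate)
    rho_f = 1000.0      # fluid density, kg/m^3 (water)
    mu = 1.0e-3         # dynamic viscosity, Pa.s
    d = 20e-6           # particle diameter, m

    v_t = g * (rho_p - rho_f) * d**2 / (18 * mu)     # Stokes terminal velocity
    h = 0.40                                         # settling-chamber height, m
    print(f"settling velocity: {v_t*1000:.2f} mm/s, "
          f"time to settle {h} m: {h/v_t:.0f} s")
    # Particles whose settling time exceeds the residence time set by the
    # flow-rate report to the fine fraction; the rest report to the coarse one.
    ```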

  2. Optimization of process variables for the microbial degradation of ...

    African Journals Online (AJOL)

    STORAGESEVER

    2008-07-18

    Jul 18, 2008 ... The optimum process conditions for maximizing phenol degradation (removal) ... cellular maintenance requirements on temperature makes it an important ..... the International Foundation for Science (IFS) for the financial ...

  3. Intracellular response to process optimization and impact on productivity and product aggregates for a high-titer CHO cell process.

    Science.gov (United States)

    Handlogten, Michael W; Lee-O'Brien, Allison; Roy, Gargi; Levitskaya, Sophia V; Venkat, Raghavan; Singh, Shailendra; Ahuja, Sanjeev

    2018-01-01

    A key goal in process development for antibodies is to increase productivity while maintaining or improving product quality. During process development of an antibody, titers were increased from 4 to 10 g/L while simultaneously decreasing aggregates. Process development involved optimization of media and feed formulations, feed strategy, and process parameters including pH and temperature. To better understand how CHO cells respond to process changes, the changes were implemented in a stepwise manner. The first change was an optimization of the feed formulation, the second was an optimization of the medium, and the third was an optimization of process parameters. Multiple process outputs were evaluated including cell growth, osmolality, lactate production, ammonium concentration, antibody production, and aggregate levels. Additionally, detailed assessment of oxygen uptake, nutrient and amino acid consumption, extracellular and intracellular redox environment, oxidative stress, activation of the unfolded protein response (UPR) pathway, protein disulfide isomerase (PDI) expression, and heavy and light chain mRNA expression provided an in-depth understanding of the cellular response to process changes. The results demonstrate that mRNA expression and UPR activation were unaffected by process changes, and that increased PDI expression and optimized nutrient supplementation are required for higher productivity processes. Furthermore, our findings demonstrate the role of extra- and intracellular redox environment on productivity and antibody aggregation. Processes using the optimized medium, with increased concentrations of redox modifying agents, had the highest overall specific productivity, reduced aggregate levels, and helped cells better withstand the high levels of oxidative stress associated with increased productivity. Specific productivities of different processes positively correlated to average intracellular values of total glutathione. Additionally

  4. Using Primary Literature in an Undergraduate Assignment: Demonstrating Connections among Cellular Processes

    Science.gov (United States)

    Yeong, Foong May

    2015-01-01

    Learning basic cell biology in an essential module can be daunting to second-year undergraduates, given the depth of information that is provided in major molecular and cell biology textbooks. Moreover, lectures on cellular pathways are organised into sections, such that at the end of lectures, students might not see how various processes are…

  5. Magnetohydrodynamics cellular automata

    International Nuclear Information System (INIS)

    Hatori, Tadatsugu.

    1990-02-01

    There has been a renewal of interest in cellular automata, partly because they give an architecture for a special purpose computer with parallel processing optimized to solve a particular problem. The lattice gas cellular automata are briefly surveyed, which are recently developed to solve partial differential equations such as hydrodynamics or magnetohydrodynamics. A new model is given in the present paper to implement the magnetic Lorentz force in a more deterministic and local procedure than the previous one. (author)

  6. Magnetohydrodynamic cellular automata

    Energy Technology Data Exchange (ETDEWEB)

    Hatori, Tadatsugu [National Inst. for Fusion Science, Nagoya (Japan)

    1990-03-01

    There has been a renewal of interest in cellular automata, partly because they give an architecture for a special purpose computer with parallel processing optimized to solve a particular problem. The lattice gas cellular automata are briefly surveyed, which are recently developed to solve partial differential equations such as hydrodynamics or magnetohydrodynamics. A new model is given in the present paper to implement the magnetic Lorentz force in a more deterministic and local procedure than the previous one. (author).

  7. Magnetohydrodynamic cellular automata

    International Nuclear Information System (INIS)

    Hatori, Tadatsugu

    1990-01-01

    There has been a renewal of interest in cellular automata, partly because they give an architecture for a special purpose computer with parallel processing optimized to solve a particular problem. The lattice gas cellular automata are briefly surveyed, which are recently developed to solve partial differential equations such as hydrodynamics or magnetohydrodynamics. A new model is given in the present paper to implement the magnetic Lorentz force in a more deterministic and local procedure than the previous one. (author)

  8. Transition from a planar interface to cellular and dendritic structures during rapid solidification processing

    Science.gov (United States)

    Laxmanan, V.

    1986-01-01

    The development of theoretical models which characterize the planar-cellular and cell-dendrite transitions is described. The transitions are analyzed in terms of the Chalmers number, the solute Peclet number, and the tip stability parameter, which correlate microstructural features and processing conditions. The planar-cellular transition is examined using the constitutional supercooling theory of Chalmers et al. (1953), and it is observed that the Chalmers number is between 0 and 1 during dendritic and cellular growth. Analysis of cell-dendrite transition data reveals that the transition occurs when the solute Peclet number goes through a minimum, the primary arm spacings go through a maximum, and the Chalmers number is equal to 1/2. The relation between the tip stability parameter and the solute Peclet number is investigated and it is noted that the tip stability parameter is useful for studying dendritic growth in alloys.
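
    A brief numerical sketch of the dimensionless groups mentioned above, with illustrative alloy values (not taken from the paper): the constitutional-supercooling criterion for planar-front stability and the solute Peclet number at the tip.

    ```python
    # Constitutional supercooling criterion and solute Peclet number
    # (illustrative alloy and processing values).
    G = 5e3        # thermal gradient at the interface, K/m
    V = 1e-4       # solidification (growth) velocity, m/s
    D = 3e-9       # solute diffusivity in the liquid, m^2/s
    m = -2.6       # liquidus slope, K/wt%
    c0, k = 2.0, 0.14   # alloy composition (wt%) and partition coefficient
    R_tip = 5e-6   # cell/dendrite tip radius, m

    # Planar front is stable if G/V exceeds the constitutional-supercooling limit
    cs_limit = abs(m) * c0 * (1 - k) / (k * D)
    print("planar interface stable:", G / V > cs_limit)

    # Solute Peclet number at the tip (transition marker discussed above)
    Pe = V * R_tip / (2 * D)
    print(f"solute Peclet number: {Pe:.3f}")
    ```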

  9. GA based CNC turning center exploitation process parameters optimization

    Directory of Open Access Journals (Sweden)

    Z. Car

    2009-01-01

    Full Text Available This paper presents machining (turning process) parameter optimization based on the use of artificial intelligence. To obtain greater efficiency and productivity of the machine tool, optimal cutting parameters have to be obtained. In order to find optimal cutting parameters, the genetic algorithm (GA) has been used as an optimal solution finder. Optimization has to yield minimum machining time and minimum production cost, while considering technological and material constraints.
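
    A minimal genetic-algorithm sketch in the same spirit: search cutting speed and feed that minimize a combined time/cost objective under a constraint penalty. The objective function, limits and constraint are hypothetical stand-ins, not the paper's machining model.

    ```python
    # Simple GA over two cutting parameters with a penalized objective.
    import numpy as np

    rng = np.random.default_rng(2)
    bounds = np.array([[50.0, 300.0],    # cutting speed v, m/min
                       [0.05, 0.50]])    # feed f, mm/rev

    def cost(p):
        v, f = p
        time = 1000.0 / (v * f)                 # proxy for machining time
        tool_wear = 1e-4 * v**1.8 * f**0.6      # proxy for tool-life cost
        penalty = 1e3 if v * f > 60 else 0.0    # hypothetical power constraint
        return time + tool_wear + penalty

    pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(40, 2))
    for _ in range(100):
        fitness = np.array([cost(p) for p in pop])
        parents = pop[np.argsort(fitness)[:20]]            # truncation selection
        children = (parents[rng.integers(0, 20, 20)] +
                    parents[rng.integers(0, 20, 20)]) / 2  # arithmetic crossover
        children += rng.normal(0, 0.02, children.shape) * (bounds[:, 1] - bounds[:, 0])
        pop = np.clip(np.vstack([parents, children]), bounds[:, 0], bounds[:, 1])

    best = pop[np.argmin([cost(p) for p in pop])]
    print(f"best speed: {best[0]:.0f} m/min, best feed: {best[1]:.2f} mm/rev")
    ```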

  10. Optimal processing of reversible quantum channels

    Energy Technology Data Exchange (ETDEWEB)

    Bisio, Alessandro, E-mail: alessandro.bisio@unipv.it [QUIT Group, Dipartimento di Fisica, INFN Sezione di Pavia, via Bassi 6, 27100 Pavia (Italy); D' Ariano, Giacomo Mauro; Perinotti, Paolo [QUIT Group, Dipartimento di Fisica, INFN Sezione di Pavia, via Bassi 6, 27100 Pavia (Italy); Sedlák, Michal [Department of Optics, Palacký University, 17. Listopadu 1192/12, CZ-771 46 Olomouc (Czech Republic); Institute of Physics, Slovak Academy of Sciences, Dúbravská Cesta 9, 845 11 Bratislava (Slovakia)

    2014-05-01

    We consider the general problem of the optimal transformation of N uses of (possibly different) unitary channels to a single use of another unitary channel in any finite dimension. We show how the optimal transformation can be fully parallelized, consisting in a preprocessing channel followed by a parallel action of all the N unitaries and a final postprocessing channel. Our techniques allow us to achieve an exponential reduction in the number of free parameters of the optimization problem, making it amenable to an efficient numerical treatment. Finally, we apply our general results to find the analytical solution for special cases of interest like the cloning of qubit phase gates.

  11. Optimal tightening process of bolted joints

    Directory of Open Access Journals (Sweden)

    Monville Jean-Michel

    2016-01-01

    tensioner to show how the tightening load can be obtained from the applied tension load and to propose a way to optimize and secure the tightening process. However, for the reasons mentioned above it appears necessary to first give a general description of the technical aspects of bolted joints. What really happens when tightening with a torque wrench or with a bolt tensioner is explained.

  12. Stochastic processes, multiscale modeling, and numerical methods for computational cellular biology

    CERN Document Server

    2017-01-01

    This book focuses on the modeling and mathematical analysis of stochastic dynamical systems along with their simulations. The collected chapters will review fundamental and current topics and approaches to dynamical systems in cellular biology. This text aims to develop improved mathematical and computational methods with which to study biological processes. At the scale of a single cell, stochasticity becomes important due to low copy numbers of biological molecules, such as mRNA and proteins that take part in biochemical reactions driving cellular processes. When trying to describe such biological processes, the traditional deterministic models are often inadequate, precisely because of these low copy numbers. This book presents stochastic models, which are necessary to account for small particle numbers and extrinsic noise sources. The complexity of these models depends upon whether the biochemical reactions are diffusion-limited or reaction-limited. In the former case, one needs to adopt the framework of s...
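
    The canonical example of such stochastic, low-copy-number modeling is Gillespie's stochastic simulation algorithm; the sketch below applies it to a minimal birth-death model of mRNA production and degradation with illustrative rate constants.

    ```python
    # Gillespie stochastic simulation of a birth-death gene expression model.
    import numpy as np

    rng = np.random.default_rng(3)
    k_prod, k_deg = 2.0, 0.1     # production and degradation rates (illustrative)
    t, t_end, x = 0.0, 200.0, 0  # time and mRNA copy number
    trajectory = [(t, x)]

    while t < t_end:
        rates = np.array([k_prod, k_deg * x])      # reaction propensities
        total = rates.sum()
        t += rng.exponential(1.0 / total)          # time to next reaction
        if rng.random() < rates[0] / total:
            x += 1                                 # production event
        else:
            x -= 1                                 # degradation event
        trajectory.append((t, x))

    print("final copy number:", x,
          "(steady-state mean should be near", k_prod / k_deg, ")")
    ```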

  13. Process optimization of friction stir welding based on thermal models

    DEFF Research Database (Denmark)

    Larsen, Anders Astrup

    2010-01-01

    This thesis investigates how to apply optimization methods to numerical models of a friction stir welding process. The work is intended as a proof-of-concept using different methods that are applicable to models of high complexity, possibly with high computational cost, and without the possibility...... information of the high-fidelity model. The optimization schemes are applied to stationary thermal models of differing complexity of the friction stir welding process. The optimization problems considered are based on optimizing the temperature field in the workpiece by finding optimal translational speed....... Also an optimization problem based on a microstructure model is solved, allowing the hardness distribution in the plate to be optimized. The use of purely thermal models represents a simplification of the real process; nonetheless, it shows the applicability of the optimization methods considered...

  14. Point process models for localization and interdependence of punctate cellular structures.

    Science.gov (United States)

    Li, Ying; Majarian, Timothy D; Naik, Armaghan W; Johnson, Gregory R; Murphy, Robert F

    2016-07-01

    Accurate representations of cellular organization for multiple eukaryotic cell types are required for creating predictive models of dynamic cellular function. To this end, we have previously developed the CellOrganizer platform, an open source system for generative modeling of cellular components from microscopy images. CellOrganizer models capture the inherent heterogeneity in the spatial distribution, size, and quantity of different components among a cell population. Furthermore, CellOrganizer can generate quantitatively realistic synthetic images that reflect the underlying cell population. A current focus of the project is to model the complex, interdependent nature of organelle localization. We built upon previous work on developing multiple non-parametric models of organelles or structures that show punctate patterns. The previous models described the relationships between the subcellular localization of puncta and the positions of cell and nuclear membranes and microtubules. We extend these models to consider the relationship to the endoplasmic reticulum (ER), and to consider the relationship between the positions of different puncta of the same type. Our results do not suggest that the punctate patterns we examined are dependent on ER position or inter- and intra-class proximity. With these results, we built classifiers to update previous assignments of proteins to one of 11 patterns in three distinct cell lines. Our generative models demonstrate the ability to construct statistically accurate representations of puncta localization from simple cellular markers in distinct cell types, capturing the complex phenomena of cellular structure interaction with little human input. This protocol represents a novel approach to vesicular protein annotation, a field that is often neglected in high-throughput microscopy. These results suggest that spatial point process models provide useful insight with respect to the spatial dependence between cellular structures.
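
    As a simplified illustration of the spatial point-process idea (not CellOrganizer code), the sketch below samples puncta from an inhomogeneous Poisson process whose intensity decays with distance from the nucleus, using thinning; the geometry and decay rate are invented.

    ```python
    # Inhomogeneous Poisson point process for punctate structures (toy model).
    import numpy as np

    rng = np.random.default_rng(4)
    cell_radius, nuc_radius, lam_max = 1.0, 0.3, 500.0

    # Thinning: sample candidates uniformly over the cell, keep each with
    # probability proportional to an intensity decaying away from the nucleus.
    n_cand = rng.poisson(lam_max * np.pi * cell_radius**2)
    r = cell_radius * np.sqrt(rng.random(n_cand))
    phi = 2 * np.pi * rng.random(n_cand)
    dist_to_nucleus = np.maximum(r - nuc_radius, 0.0)
    keep = rng.random(n_cand) < np.exp(-3.0 * dist_to_nucleus)
    x, y = r[keep] * np.cos(phi[keep]), r[keep] * np.sin(phi[keep])
    print("number of simulated puncta:", keep.sum())
    ```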

  15. Spectral and Energy Efficiencies in mmWave Cellular Networks for Optimal Utilization

    Directory of Open Access Journals (Sweden)

    Abdulbaset M. Hamed

    2018-01-01

    Full Text Available Millimeter wave (mmWave) spectrum has been proposed for use in commercial cellular networks to relieve the already severely congested microwave spectrum. Thus, the design of an efficient mmWave cellular network has gained considerable importance and has to take into account regulations imposed by government agencies with regard to global warming and sustainable development. In this paper, a dense mmWave hexagonal cellular network, with each cell consisting of a number of smaller cells with their own Base Stations (BSs), is presented as a solution to meet the increasing demand for a variety of high data rate services and the growing number of users of cellular networks. Since spectrum and power are critical resources in the design of such a network, a framework is presented that addresses efficient utilization of these resources in mmWave cellular networks in the 28 and 73 GHz bands. These bands are already an integral part of well-known standards such as IEEE 802.15.3c, IEEE 802.11ad, and IEEE 802.16.1. In the analysis, a well-known accurate mmWave channel model for Line of Sight (LOS) and Non-Line of Sight (NLOS) links is used. The cellular network is analyzed in terms of spectral efficiency (bit/s), energy efficiency (bit/J), area spectral efficiency (bit/s/m2), area energy efficiency (bit/J/m2), and network latency (s/bit). These efficiency metrics are illustrated, using Monte Carlo simulation, as a function of Signal-to-Noise Ratio (SNR), channel model parameters, user distance from the BS, and BS transmission power. The efficiency metrics for optimum deployment of cellular networks in the 28 and 73 GHz bands are identified. Results show that the 73 GHz band achieves better spectral efficiency and the 28 GHz band is superior in terms of energy efficiency. It is observed that while the latter band is expedient for indoor networks, the former band is appropriate for outdoor networks.
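
    The basic link-level metrics compared in such an analysis can be sketched as follows: Shannon spectral efficiency and an energy-efficiency figure for a single mmWave link. The path-loss exponent, intercept and power figures below are illustrative assumptions, not the paper's channel model.

    ```python
    # Link-level spectral and energy efficiency sketch for a mmWave link.
    import numpy as np

    bandwidth = 500e6            # Hz
    tx_power = 1.0               # W
    circuit_power = 5.0          # W, static BS consumption (hypothetical)
    noise_psd = 10**((-174 - 30) / 10)   # W/Hz from -174 dBm/Hz

    def path_loss_db(d_m, alpha=61.4, beta=2.0):   # close-in model form (assumed)
        return alpha + 10 * beta * np.log10(d_m)

    d = 100.0                                      # user distance, m
    rx_power = tx_power * 10**(-path_loss_db(d) / 10)
    snr = rx_power / (noise_psd * bandwidth)

    spectral_eff = np.log2(1 + snr)                # bit/s/Hz
    throughput = bandwidth * spectral_eff          # bit/s
    energy_eff = throughput / (tx_power + circuit_power)   # bit/J
    print(f"SE = {spectral_eff:.2f} bit/s/Hz, EE = {energy_eff/1e6:.1f} Mbit/J")
    ```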

  16. English Law Terms: Optimizing Education Process

    Directory of Open Access Journals (Sweden)

    Alexandra G. Anisimova

    2014-01-01

    systemic nature of terminology is the existence of antonymous relations between terminological units. Undoubtedly, a systemic approach to terminological studies allows optimizing the learning process.

  17. Optimization of processing technology of Rhizoma Pinelliae ...

    African Journals Online (AJOL)

    soaking time and processing temperature on processing technology of Rhizoma ... Results: During the processing of Rhizoma Pinelliae Praeparatum, the size of influence of licorice .... Table 1: Factors and levels of orthogonal experiment.

  18. Physical bases for diffusion welding processes optimization

    International Nuclear Information System (INIS)

    Bulygina, S.M.; Berber, N.N.; Mukhambetov, D.G.

    1999-01-01

    One of the most widespread methods for joining different materials is diffusion welding. It is achieved through the mutual diffusion of atoms across the contacting surfaces during long-duration holding under heating and compression. The welding regime, which depends on the properties of the parts being welded, is defined by three parameters: temperature, pressure and time. The problem of diffusion welding optimization consists in determining the lowest values of these parameters that still comply with the quality requirements for the welded joint. In this work, diffusion welding experiments were carried out at the calculated temperature and for a given surface roughness. Tests were conducted on samples of iron and an iron-nickel alloy of size 1·1·1 cm³. The optimal regime for diffusion welding of the examined samples in vacuum is defined: it includes compression of the welded samples, heating, and isothermal holding at 650 deg C for 0.5 h, and it affords the required homogeneity of the joint.

  19. Optimizing Governed Blockchains for Financial Process Authentications

    OpenAIRE

    Lundbaek, Leif-Nissen; D'Iddio, Andrea Callia; Huth, Michael

    2016-01-01

    We propose the formal study of governed blockchains that are owned and controlled by organizations and that neither create cryptocurrencies nor provide any incentives to solvers of cryptographic puzzles. We view such approaches as frameworks in which system parts, such as the cryptographic puzzle, may be instantiated with different technology. Owners of such a blockchain procure puzzle solvers as resources they control, and use a mathematical model to compute optimal parameters for the crypto...

  20. Entanglement and optimal quantum information processing

    International Nuclear Information System (INIS)

    Siomau, Michael

    2011-01-01

    Today we are standing on the verge of a new enigmatic era of quantum technologies. In spite of the significant progress that has been achieved over the last three decades in the experimental generation and manipulation as well as in the theoretical description of the evolution of single quantum systems, there are many open problems in understanding the behavior and properties of complex multiparticle quantum systems. In this thesis, we investigate theoretically a number of problems related to the description of entanglement - the nonlocal feature of complex quantum systems - of multiparticle states of finite-dimensional quantum systems. We also consider the optimal ways of manipulating such systems. The focus is placed especially on optimal quantum transformations that provide a desired operation independently of the initial state of the given system. The first part of this thesis, in particular, is devoted to the detailed analysis of the evolution of entanglement of complex quantum systems subjected to general non-unitary dynamics. In the second part of the thesis we construct several optimal state-independent transformations, analyze their properties and suggest their applications in quantum communication and quantum computing. (orig.)

  1. Optimization of forging processes using finite element simulations

    NARCIS (Netherlands)

    Bonte, M.H.A.; Fourment, Lionel; Do, Tien-tho; van den Boogaard, Antonius H.; Huetink, Han

    2010-01-01

    During the last decades, simulation software based on the Finite Element Method (FEM) has significantly contributed to the design of feasible forming processes. Coupling FEM to mathematical optimization algorithms offers a promising opportunity to design optimal metal forming processes rather than

  2. Use of Experimental Design for Peuhl Cheese Process Optimization ...

    African Journals Online (AJOL)

    Use of Experimental Design for Peuhl Cheese Process Optimization. ... Journal of Applied Sciences and Environmental Management ... This work, consisting in the use of a central composite design, enables the determination of optimal process conditions concerning: leaf extract volume added (7 mL), heating temperature ...

  3. An optimization framework for process discovery algorithms

    NARCIS (Netherlands)

    Weijters, A.J.M.M.; Stahlbock, R.

    2011-01-01

    Today there are many process mining techniques that, based on an event log, allow for the automatic induction of a process model. The process mining algorithms that are able to deal with incomplete event logs, exceptions, and noise typically have many parameters to tune the algorithm. Therefore, the

  4. Evolutionary optimization of production materials workflow processes

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    We present an evolutionary optimisation technique for stochastic production processes, which is able to find improved production materials workflow processes with respect to arbitrary combinations of numerical quantities associated with the production process. Working from a core fragment...... of the BPMN language, we employ an evolutionary algorithm where stochastic model checking is used as a fitness function to determine the degree of improvement of candidate processes derived from the original process through mutation and cross-over operations. We illustrate this technique using a case study...

  5. Optimization of equipment for electron radiation processing

    Science.gov (United States)

    Tartz, M.; Hartmann, E.; Lenk, M.; Mehnert, R.

    1999-05-01

    In the course of the last decade, IOM Leipzig has developed low-energy electron accelerators for electron beam curing of polymer coatings and printing inks. In order to optimize the electron irradiation field, electron optical calculations have been carried out using the commercially available EGUN code. The present study outlines the design of the diode-type low-energy electron accelerators LEA and EBOGEN, taking into account the electron optical effects of secondary components such as the retaining rods installed in the cathode assembly.

  6. Optimization of equipment for electron radiation processing

    International Nuclear Information System (INIS)

    Tartz, M.; Hartmann, E.; Lenk, M.; Mehnert, R.

    1999-01-01

    In the course of the last decade, IOM Leipzig has developed low-energy electron accelerators for electron beam curing of polymer coatings and printing inks. In order to optimize the electron irradiation field, electron optical calculations have been carried out using the commercially available EGUN code. The present study outlines the design of the diode-type low-energy electron accelerators LEA and EBOGEN, taking into account the electron optical effects of secondary components such as the retaining rods installed in the cathode assembly

  7. OPTIMAL PROCESSES IN IRREVERSIBLE THERMODYNAMICS AND MICROECONOMICS

    Directory of Open Access Journals (Sweden)

    Vladimir A. Kazakov

    2004-06-01

    Full Text Available This paper describes a general methodology that allows one to extend the Carnot efficiency of classical thermodynamics for zero-rate processes to thermodynamic systems with finite rates. We define the class of minimal dissipation processes and show that it represents a generalization of reversible processes and determines the limiting possibilities of finite-rate systems. The described methodology is then applied to microeconomic exchange systems, yielding novel estimates of limiting efficiencies for such systems.

  8. Optimization of the investment casting process

    Directory of Open Access Journals (Sweden)

    M. Martinez-Hernandez

    2012-04-01

    Full Text Available Rapid prototyping is an important technique for manufacturing. This work refers to the manufacture of hollow patterns made of polymeric materials by rapid prototyping technologies for use in the preparation of ceramic molds in the investment casting process. The work is focused on the development of a process for manufacturing patterns different from those that currently exist, owing to their hollow interior design, which allows their direct use in the fabrication of ceramic molds and avoids cracking and fracture during the investment casting process, an important process for the foundry industry.

  9. Optimization of processing technology of Rhizoma Pinelliae ...

    African Journals Online (AJOL)

    Methods: Orthogonal design method was applied to analyze the effects of factors such as licorice concentration volume, soaking time and processing temperature on processing technology of Rhizoma Pinelliae Praeparatum; MTT assay and flow cytometry were used to determine the inhibitory effect of Rhizoma Pinelliae ...

  10. Optimal control of switched systems arising in fermentation processes

    CERN Document Server

    Liu, Chongyang

    2014-01-01

    The book presents, in a systematic manner, the optimal controls under different mathematical models in fermentation processes. Variant mathematical models – i.e., those for multistage systems; switched autonomous systems; time-dependent and state-dependent switched systems; multistage time-delay systems and switched time-delay systems – for fed-batch fermentation processes are proposed and the theories and algorithms of their optimal control problems are studied and discussed. By putting forward novel methods and innovative tools, the book provides a state-of-the-art and comprehensive systematic treatment of optimal control problems arising in fermentation processes. It not only develops nonlinear dynamical system, optimal control theory and optimization algorithms, but can also help to increase productivity and provide valuable reference material on commercial fermentation processes.

  11. Thermodynamic Aspects and Reprogramming Cellular Energy Metabolism during the Fibrosis Process

    Directory of Open Access Journals (Sweden)

    Alexandre Vallée

    2017-11-01

    Full Text Available Fibrosis is characterized by fibroblast proliferation and fibroblast differentiation into myofibroblasts, which generate a relaxation-free contraction mechanism associated with excessive collagen synthesis in the extracellular matrix, which promotes irreversible tissue retraction evolving towards fibrosis. From a thermodynamic point of view, the mechanisms leading to fibrosis are irreversible processes that can occur through changing the entropy production rate. The thermodynamic behaviors of metabolic enzymes involved in fibrosis are modified by the dysregulation of both transforming growth factor β (TGF-β signaling and the canonical WNT/β-catenin pathway, leading to aerobic glycolysis, called the Warburg effect. Molecular signaling pathways leading to fibrosis are considered dissipative structures that exchange energy or matter with their environment far from the thermodynamic equilibrium. The myofibroblastic cells arise from exergonic processes by switching the core metabolism from oxidative phosphorylation to glycolysis, which generates energy and reprograms cellular energy metabolism to induce the process of myofibroblast differentiation. Circadian rhythms are far-from-equilibrium thermodynamic processes. They directly participate in regulating the TGF-β and WNT/β-catenin pathways involved in energetic dysregulation and enabling fibrosis. The present review focusses on the thermodynamic implications of the reprogramming of cellular energy metabolism, leading to fibroblast differentiation into myofibroblasts through the positive interplay between TGF-β and WNT/β-catenin pathways underlying in fibrosis.

  12. Processed fruit juice ready to drink: screening acute toxicity at the cellular level

    Directory of Open Access Journals (Sweden)

    Erick Leal da Silva

    2017-06-01

    Full Text Available The present study evaluated the acute toxicity, at the cellular level, of processed, ready-to-drink Orange- and Grape-flavored juices produced by five companies with significant influence on the food market of South American countries, especially Brazil. This evaluation was performed in root meristem cells of Allium cepa L., at exposure times of 24 and 48 hours, using the marketed liquid preparations directly. Based on the results, it was found that the fruit juices of all companies considered promoted a significant antiproliferative effect on the root meristems at the 24-hour exposure time and resulted, at both exposure times, in a statistically significant number of mitotic spindle changes and chromosomal breaks. Therefore, under the study conditions, all juice samples analyzed were cytotoxic, genotoxic and mutagenic to root meristem cells. These results indicate that such beverages have a relevant potential to cause cellular disorders and thus need to be evaluated more fully in more complex test systems, such as those in rodents, in order to establish the specific toxicity of these juices at the cellular level and ensure the well-being of those who consume them.

  13. Cooking and drying processes optimization of Pentadesma ...

    African Journals Online (AJOL)

    Aghomotsegin

    2015-09-30

    Sep 30, 2015 ... This work determined the optimum conditions of cooking and drying processes. ... Key words: Forest galeries, Pentadesma butyraceae, cosmetic industry, ..... butyracea kernels can lead to the production of butter of.

  14. Nanohydroxyapatite synthesis using optimized process parameters

    Indian Academy of Sciences (India)

    Nanohydroxyapatite; ultrasonication; response surface methodology; calcination; ... Three independent process parameters: temperature () (70, 80 and 90°C), ... Bangi, Selangor, Malaysia; Energy Research Group, School of Engineering, ...

  15. Simulation-based optimization for product and process design

    NARCIS (Netherlands)

    Driessen, L.

    2006-01-01

    The design of products and processes has gradually shifted from a purely physical process towards a process that heavily relies on computer simulations (virtual prototyping). To optimize this virtual design process in terms of speed and final product quality, statistical methods and mathematical

  16. Energy optimization of integrated process plants

    Energy Technology Data Exchange (ETDEWEB)

    Sandvig Nielsen, J

    1996-10-01

    A general approach for viewing process synthesis as an evolutionary process is proposed. Each step is taken according to the present level of information and knowledge. This is formulated in a Process Synthesis Cycle. Initially the synthesis is conducted at a high abstraction level, maximizing the use of heuristics (prior experience, rules of thumb, etc.). When further knowledge and information become available, heuristics are gradually replaced by exact problem formulations. The principles of the Process Synthesis Cycle are used to develop a general procedure for energy synthesis, based on available tools. The procedure is based on efficient use of process simulators with integrated Pinch capabilities (energy targeting). The proposed general procedure is tailored to three specific problems (Humid Air Turbine power plant synthesis, Nitric Acid process synthesis and Sulphuric Acid synthesis). Using the procedure reduces the problem dimension considerably and thus allows for faster evaluation of more alternatives. At a more detailed level, a new framework for the Heat Exchanger Network synthesis problem is proposed. The new framework is object-oriented, based on a general functional description of all elements potentially present in the heat exchanger network (streams, exchangers, pumps, furnaces etc.). (LN) 116 refs.

  17. Optimizing ISOCAM data processing using spatial redundancy

    Science.gov (United States)

    Miville-Deschênes, M.-A.; Boulanger, F.; Abergel, A.; Bernard, J.-P.

    2000-11-01

    Several instrumental effects of the Long Wavelength channel of ISOCAM, the camera on board the Infrared Space Observatory, degrade the processed images. We present new data-processing techniques that correct these defects, taking advantage of the fact that a position in the sky has been observed by several pixels at different times. We use this redundant information (1) to correct the long-term variation of the detector response, (2) to correct memory effects after glitches and point sources, and (3) to refine the deglitching process. As an example we have applied our processing to the gamma-ray burst observation GRB 970402. Our new data-processing techniques allow the detection of faint extended emission with contrast smaller than 1% of the zodiacal background. The data reduction corrects instrumental effects to the point where the noise in the final map is dominated by the readout and the photon noises. All raster ISOCAM observations can benefit from the data processing described here. This includes mapping of solar system extended objects (comet dust trails), nearby clouds and star forming regions, images from diffuse emission in the Galactic plane and external galaxies. These techniques could also be applied to other raster type observations (e.g. ISOPHOT). Based on observations with ISO, an ESA project with instruments funded by ESA Member States (especially the PI countries: France, Germany, The Netherlands and the UK) and with the participation of ISAS and NASA.

  18. Energy optimization of bread baking process undergoing quality constraints

    International Nuclear Information System (INIS)

    Papasidero, Davide; Pierucci, Sauro; Manenti, Flavio

    2016-01-01

    International home energy rating regulations are forcing the use of efficient cooking equipment and processes towards energy saving and sustainability. For this reason gas ovens are being replaced by electric ones, to obtain the highest energy rating. Consequently, the study of technologies related to energy efficiency in cooking is developing rapidly. Indeed, big industries have been working on the energy optimization of their processes for decades, while there is still much room for energy optimization of single household appliances. Achieving higher efficiency can have a big impact on society only if the use of modern equipment becomes widespread. The combination of several energy sources (e.g. forced convection, irradiation, microwave, etc.) and their optimization is an emerging target for oven manufacturers towards optimal oven design. In this work, an energy consumption analysis and optimization is applied to the case of bread baking. Each energy source is given due importance and the process conditions are compared. A basic quality standard is guaranteed by taking into account some quality markers that are relevant from a consumer viewpoint. - Highlights: • Energy optimization is based on a validated finite-element model for bread baking. • Quality parameters for the product acceptability are introduced as constraints. • Dynamic optimization leads to 20% energy saving compared to non-optimized case. • The approach is applicable to many products, quality parameters, thermal processes. • Other heating processes can be easily integrated in the presented model.

  19. Alternative oxidase pathway optimizes photosynthesis during osmotic and temperature stress by regulating cellular ROS, malate valve and antioxidative systems

    Directory of Open Access Journals (Sweden)

    DINAKAR eCHALLABATHULA

    2016-02-01

    Full Text Available The present study reveals the importance of the alternative oxidase (AOX) pathway in optimizing photosynthesis under osmotic and temperature stress conditions in mesophyll protoplasts of Pisum sativum. The responses of photosynthesis and respiration were monitored at a saturating light intensity of 1000 µmol m-2 s-1 at 25 °C, under a range of sorbitol concentrations from 0.4 M to 1.0 M to induce hyper-osmotic stress, and by varying the temperature of the thermo-jacketed pre-incubation chamber from 25 °C to 10 °C to impose sub-optimal temperature stress. Compared to controls (0.4 M sorbitol and 25 °C), the mesophyll protoplasts showed a remarkable decrease in NaHCO3-dependent O2 evolution (an indicator of photosynthetic carbon assimilation) under both hyper-osmotic (1.0 M sorbitol) and sub-optimal temperature (10 °C) stress conditions, while the decrease in the rates of respiratory O2 uptake was marginal. The capacity of the AOX pathway increased significantly in parallel with the increase in intracellular pyruvate and reactive oxygen species (ROS) levels under both hyper-osmotic stress and sub-optimal temperature stress in the background of saturating light. The ratio of the redox couple (Malate/OAA) related to the malate valve increased, in contrast to the ratio of the redox couple (GSH/GSSG) related to the antioxidative system, during hyper-osmotic stress. Nevertheless, the ratio of GSH/GSSG decreased at sub-optimal temperature, while the ratio of Malate/OAA showed no visible changes. Also, the redox ratios of pyridine nucleotides increased under hyper-osmotic (NADH/NAD) and sub-optimal temperature (NADPH/NADP) stresses, respectively. However, upon restriction of the AOX pathway by using salicylhydroxamic acid (SHAM), the observed changes in NaHCO3-dependent O2 evolution, cellular ROS, and the redox ratios of Malate/OAA, NAD(P)H/NAD(P) and GSH/GSSG were further aggravated under stress conditions, with concomitant modulations in NADP-MDH and antioxidant enzymes. Taken together, the

  20. Cellular compartments cause multistability and allow cells to process more information

    DEFF Research Database (Denmark)

    Harrington, Heather A; Feliu, Elisenda; Wiuf, Carsten

    2013-01-01

    recent developments from dynamical systems and chemical reaction network theory to identify and characterize the key role of the spatial organization of eukaryotic cells in cellular information processing. In particular, the existence of distinct compartments plays a pivotal role in whether a system...... is capable of multistationarity (multiple response states), and is thus directly linked to the amount of information that the signaling molecules can represent in the nucleus. Multistationarity provides a mechanism for switching between different response states in cell signaling systems and enables multiple...

  1. Process control and optimization with simple interval calculation method

    DEFF Research Database (Denmark)

    Pomerantsev, A.; Rodionova, O.; Høskuldsson, Agnar

    2006-01-01

    for the quality improvement in the course of production. The latter is an active quality optimization, which takes into account the actual history of the process. The advocated approach is allied to the conventional method of multivariate statistical process control (MSPC) as it also employs the historical process......Methods of process control and optimization are presented and illustrated with a real-world example. The optimization methods are based on the PLS block modeling as well as on the simple interval calculation methods of interval prediction and object status classification. It is proposed to employ...... the series of expanding PLS/SIC models in order to support the on-line process improvements. This method helps to predict the effect of planned actions on the product quality and thus enables passive quality control. We have also considered an optimization approach that proposes the correcting actions...

  2. Optimization of industrial processes using radiation sources

    International Nuclear Information System (INIS)

    Salles, Claudio G.; Silva Filho, Edmundo D. da; Toribio, Norberto M.; Gandara, Leonardo A.

    1996-01-01

    Aiming to enhance staff protection against radiation in operational areas, SAMARCO Mineracao S.A. carried out a reevaluation and analysis of the real necessity of the densimeters/radioactive sources in the operational area, developed an alternative control process for measuring the ore pulp, and introduced advanced equipment for sample chemical analysis

  3. Nanohydroxyapatite synthesis using optimized process parameters ...

    Indian Academy of Sciences (India)

    Energy Research Group, School of Engineering, Taylor's University, 47500 ... influence of different ultrasonication parameters on the properties ... to evaluate multiple process parameters and their interaction ... independent and dependent variables by a 3-D representation of ... The intensities of O–H functional groups are seen to ...

  4. Impact of light on Hypocrea jecorina and the multiple cellular roles of ENVOY in this process

    Directory of Open Access Journals (Sweden)

    Druzhinina Irina S

    2007-12-01

    Full Text Available Abstract Background In fungi, light is primarily known to influence general morphogenesis and both sexual and asexual sporulation. In order to expand the knowledge on the effect of light in fungi and to determine the role of the light regulatory protein ENVOY in the implementation of this effect, we performed a global screen for genes which are specifically affected by light in the fungus Hypocrea jecorina (anamorph Trichoderma reesei) using Rapid Subtraction Hybridization (RaSH). Based on these data, we analyzed whether these genes are influenced by ENVOY and whether overexpression of ENVOY in darkness would be sufficient to execute its function. Results The cellular functions of the detected light-responsive genes comprised a variety of roles in transcription, translation, signal transduction, metabolism, and transport. Their response to light with respect to the involvement of ENVOY could be classified as follows: (i) ENVOY-mediated upregulation by light; (ii) ENVOY-independent upregulation by light; (iii) ENVOY-antagonized upregulation by light; (iv) ENVOY-dependent repression by light; (v) ENVOY-independent repression by light; and (vi) both positive and negative regulation by ENVOY of genes not responsive to light in the wild-type. ENVOY was found to be crucial for normal growth in light on various carbon sources and is not able to execute its regulatory function if overexpressed in darkness. Conclusion The different responses indicate that light impacts fungi like H. jecorina at several cellular processes, and that it has both positive and negative effects. The data also emphasize that ENVOY has an apparently more widespread cellular role in this process than only modulating the response to light.

  5. Optimization of process parameters for synthesis of silica–Ni ...

    Indian Academy of Sciences (India)

    Optimization of process parameters for synthesis of silica–Ni nanocomposite by design of experiment ... Sol–gel; Ni; design of experiments; nanocomposites. ... Kolkata 700 032, India; Rustech Products Pvt. Ltd., Kolkata 700 045, India ...

  6. Advanced Process Control Application and Optimization in Industrial Facilities

    Directory of Open Access Journals (Sweden)

    Howes S.

    2015-01-01

    Full Text Available This paper describes the application of a new method and tool for system identification and PID tuning/advanced process control (APC) optimization using the new 3G (geometric, gradient, gravity) optimization method. It helps to design and implement control schemes directly inside the distributed control system (DCS) or programmable logic controller (PLC). The algorithm also helps to identify process dynamics in closed-loop mode, optimizes controller parameters, and supports the development of adaptive control and model-based control (MBC). Application of the new 3G algorithm for designing and implementing APC schemes is presented. Optimization of primary and advanced control schemes stabilizes the process and allows the plant to run closer to process, equipment and economic constraints. This increases production rates, minimizes operating costs and improves product quality.

  7. Optimization of process and solution parameters in electrospinning polyethylene oxide

    CSIR Research Space (South Africa)

    Jacobs, V

    2011-11-01

    Full Text Available This paper reports the optimization of electrospinning process and solution parameters using factorial design approach to obtain uniform polyethylene oxide (PEO) nanofibers. The parameters studied were distance between nozzle and collector screen...

  8. Process optimization and insecticidal activity of alkaloids from the ...

    African Journals Online (AJOL)

    Process optimization and insecticidal activity of alkaloids from the root bark of Catalpa ovata G. Don by response surface methodology. ... Tropical Journal of Pharmaceutical Research. Journal Home · ABOUT THIS JOURNAL · Advanced ...

  9. OPTIMAL SIGNAL PROCESSING METHODS IN GPR

    Directory of Open Access Journals (Sweden)

    Saeid Karamzadeh

    2014-01-01

    Full Text Available In the past three decades, a wide variety of applications of Ground Penetrating Radar (GPR) have appeared in real life. This radar faces important challenges in both civil and military applications. In this paper, the fundamentals of GPR systems are covered, and three important signal processing methods (Wavelet Transform, Matched Filter and Hilbert-Huang Transform) are compared to each other in order to obtain the most accurate information about objects located in the subsurface or behind a wall.
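
    As a rough, self-contained illustration of one of the methods named above, the sketch below applies a matched filter (cross-correlation with the known transmitted wavelet) to a synthetic single GPR trace; the wavelet shape, delay and noise level are invented and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Known transmitted wavelet (Ricker-type pulse, assumed)
n_pulse = 32
tau = np.arange(n_pulse) - n_pulse / 2
pulse = (1 - (tau / 4.0) ** 2) * np.exp(-(tau / 4.0) ** 2 / 2)

# Synthetic received trace: one reflection from a buried object plus noise
n_trace = 512
trace = np.zeros(n_trace)
true_delay = 200                                  # samples to the reflector
trace[true_delay:true_delay + n_pulse] += 0.3 * pulse
trace += 0.05 * rng.standard_normal(n_trace)

# Matched filtering = cross-correlation with the known wavelet;
# the lag of maximum output estimates the reflection delay.
mf_output = np.correlate(trace, pulse, mode="valid")
est_delay = int(np.argmax(mf_output))
print("estimated reflection delay:", est_delay, "samples (true:", true_delay, ")")
```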

  10. Proteomic characterization of cellular and molecular processes that enable the Nanoarchaeum equitans--Ignicoccus hospitalis relationship.

    Directory of Open Access Journals (Sweden)

    Richard J Giannone

    Full Text Available Nanoarchaeum equitans, the only cultured representative of the Nanoarchaeota, is dependent on direct physical contact with its host, the hyperthermophile Ignicoccus hospitalis. The molecular mechanisms that enable this relationship are unknown. Using whole-cell proteomics, differences in the relative abundance of >75% of predicted protein-coding genes from both Archaea were measured to identify the specific response of I. hospitalis to the presence of N. equitans on its surface. A purified N. equitans sample was also analyzed for evidence of interspecies protein transfer. The depth of cellular proteome coverage achieved here is amongst the highest reported for any organism. Based on changes in the proteome under the specific conditions of this study, I. hospitalis reacts to N. equitans by curtailing genetic information processing (replication, transcription) in lieu of intensifying its energetic, protein processing and cellular membrane functions. We found no evidence of significant Ignicoccus biosynthetic enzymes being transported to N. equitans. These results suggest that, under laboratory conditions, N. equitans diverts some of its host's metabolism and cell cycle control to compensate for its own metabolic shortcomings, thus appearing to be entirely dependent on small, transferable metabolites and energetic precursors from I. hospitalis.

  11. [Optimization of the pertussis vaccine production process].

    Science.gov (United States)

    Germán Santiago, J; Zamora, N; de la Rosa, E; Alba Carrión, C; Padrón, P; Hernández, M; Betancourt, M; Moretti, N

    1995-01-01

    The production of Pertussis Vaccine was reevaluated at the Instituto Nacional de Higiene "Rafael Rangel" in order to optimize it in terms of vaccine yield, potency, specific toxicity and efficiency (cost per dose). Four different processes, using two culture media (Cohen-Wheeler and Fermentación Glutamato Prolina-1) and two types of bioreactors (25 L Fermentador Caracas and a 450 L industrial fermentor), were compared. Runs were started from freeze-dried strains (134 or 509) and continued until the maximal yield was obtained. It was found that the combination Fermentación Glutamato Prolina-1/industrial fermentor shortened the process to 40 hours while consistently yielding a vaccine of higher potency (7.91 +/- 2.56 IU/human dose) and lower specific toxicity in a mouse bioassay. In addition, the physical aspect of the preparation was rather homogeneous and free of dark aggregates. Most importantly, the biomass yield more than doubled that of the Fermentador Caracas with either medium and that of the industrial fermentor with the Cohen-Wheeler medium. Therefore, the cost per dose was substantially decreased.

  12. A hybrid gene selection approach for microarray data classification using cellular learning automata and ant colony optimization.

    Science.gov (United States)

    Vafaee Sharbaf, Fatemeh; Mosafer, Sara; Moattar, Mohammad Hossein

    2016-06-01

    This paper proposes an approach for gene selection in microarray data. The proposed approach consists of a primary filter stage using the Fisher criterion, which reduces the initial set of genes and hence the search space and time complexity. Then, a wrapper approach based on cellular learning automata (CLA) optimized with the ant colony method (ACO) is used to find the set of features that improve the classification accuracy. CLA is applied due to its capability to learn and model complicated relationships. The features selected in the last phase are evaluated using the ROC curve, and the smallest and most effective feature subset is determined. The classifiers evaluated in the proposed framework are K-nearest neighbor, support vector machine and naïve Bayes. The proposed approach is evaluated on 4 microarray datasets. The evaluations confirm that the proposed approach can find the smallest subset of genes while approaching the maximum accuracy. Copyright © 2016 Elsevier Inc. All rights reserved.
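
    A minimal sketch of the filter stage described above: genes are ranked by a two-class Fisher score (between-class separation over within-class spread) and only the top candidates are kept for the wrapper stage, which is not shown. The data, labels and cut-off are synthetic assumptions, not the datasets used in the study.

```python
import numpy as np

def fisher_scores(X, y):
    """Two-class Fisher criterion per gene: (mu1 - mu2)^2 / (var1 + var2)."""
    X1, X2 = X[y == 0], X[y == 1]
    num = (X1.mean(axis=0) - X2.mean(axis=0)) ** 2
    den = X1.var(axis=0) + X2.var(axis=0) + 1e-12   # guard against zero variance
    return num / den

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 2000))          # 60 samples x 2000 genes, synthetic
y = np.repeat([0, 1], 30)
X[y == 1, :10] += 1.5                    # make the first 10 genes informative

scores = fisher_scores(X, y)
top = np.argsort(scores)[::-1][:50]      # keep the 50 highest-scoring genes
print("top-ranked genes:", sorted(top[:10]))
```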

  13. A Generic Methodology for Superstructure Optimization of Different Processing Networks

    DEFF Research Database (Denmark)

    Bertran, Maria-Ona; Frauzem, Rebecca; Zhang, Lei

    2016-01-01

    In this paper, we propose a generic computer-aided methodology for synthesis of different processing networks using superstructure optimization. The methodology can handle different network optimization problems of various application fields. It integrates databases with a common data architecture......, a generic model to represent the processing steps, and appropriate optimization tools. A special software interface has been created to automate the steps in the methodology workflow, allow the transfer of data between tools and obtain the mathematical representation of the problem as required...

  14. Signal processing for molecular and cellular biological physics: an emerging field.

    Science.gov (United States)

    Little, Max A; Jones, Nick S

    2013-02-13

    Recent advances in our ability to watch the molecular and cellular processes of life in action, such as atomic force microscopy, optical tweezers and Förster fluorescence resonance energy transfer, raise challenges for digital signal processing (DSP) of the resulting experimental data. This article explores the unique properties of such biophysical time series that set them apart from other signals, such as the prevalence of abrupt jumps and steps, multi-modal distributions and autocorrelated noise. It exposes the problems with classical linear DSP algorithms applied to this kind of data, and describes new nonlinear and non-Gaussian algorithms that are able to extract information that is of direct relevance to biological physicists. It is argued that these new methods applied in this context typify the nascent field of biophysical DSP. Practical experimental examples are supplied.
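
    One simple way to see why the nonlinear methods mentioned above matter: on a step-like trace, a running median preserves abrupt jumps that a linear moving average smears. The sketch below compares the two on synthetic data; it is only an illustration and not an algorithm from the article.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic single-molecule-style trace: piecewise-constant steps plus noise
levels = np.repeat([0.0, 1.0, 0.4, 1.6], 250)
trace = levels + 0.2 * rng.standard_normal(levels.size)

def moving_average(x, w):
    return np.convolve(x, np.ones(w) / w, mode="same")

def running_median(x, w):
    half = w // 2
    padded = np.pad(x, half, mode="edge")
    return np.array([np.median(padded[i:i + w]) for i in range(x.size)])

linear = moving_average(trace, 31)        # linear filter: smears the jumps
robust = running_median(trace, 31)        # nonlinear filter: keeps jumps sharp

print("RMSE moving average:", np.sqrt(np.mean((linear - levels) ** 2)))
print("RMSE running median:", np.sqrt(np.mean((robust - levels) ** 2)))
```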

  15. Optimality of Poisson Processes Intensity Learning with Gaussian Processes

    NARCIS (Netherlands)

    Kirichenko, A.; van Zanten, H.

    2015-01-01

    In this paper we provide theoretical support for the so-called "Sigmoidal Gaussian Cox Process" approach to learning the intensity of an inhomogeneous Poisson process on a d-dimensional domain. This method was proposed by Adams, Murray and MacKay (ICML, 2009), who developed a tractable computational

  16. Optimal protocol for teleconsultation with a cellular phone for dentoalveolar trauma: an in-vitro study

    International Nuclear Information System (INIS)

    Park, Won Se; Lee, Hae Na; Jeong, Jin Sun; Kwon, Jung Hoon; Lee, Grace H; Kim, Kee Dong

    2012-01-01

    Dental trauma is frequently unpredictable. The initial assessment and urgent treatment are essential for dentists to save the patient's teeth. Mobile-phone-assisted teleconsultation and telediagnosis for dental trauma could be an aid when a dentist is not available. In the present in-vitro study, we evaluated the success rate and time to transfer images under various conditions. We analyzed the image quality of cameras built into mobile phones based on their resolution, autofocus, white-balance, and anti-movement functions. The image quality of most built-in cameras was acceptable to perform the initial assessment, with the autofocus function being essential to obtain high-quality images. The transmission failure rate increased markedly when the image size exceeded 500 kB and the additional text messaging did not improve the success rate or the transmission time. Our optimal protocol could be useful for emergency programs running on the mobile phones.

  17. Optimal protocol for teleconsultation with a cellular phone for dentoalveolar trauma: an in-vitro study

    Energy Technology Data Exchange (ETDEWEB)

    Park, Won Se; Lee, Hae Na; Jeong, Jin Sun; Kwon, Jung Hoon; Lee, Grace H; Kim, Kee Dong [College of Dentistry, Yonsei University, Seoul (Korea, Republic of)

    2012-06-15

    Dental trauma is frequently unpredictable. The initial assessment and urgent treatment are essential for dentists to save the patient's teeth. Mobile-phone-assisted teleconsultation and telediagnosis for dental trauma could be an aid when a dentist is not available. In the present in-vitro study, we evaluated the success rate and time to transfer images under various conditions. We analyzed the image quality of cameras built into mobile phones based on their resolution, autofocus, white-balance, and anti-movement functions. The image quality of most built-in cameras was acceptable to perform the initial assessment, with the autofocus function being essential to obtain high-quality images. The transmission failure rate increased markedly when the image size exceeded 500 kB and the additional text messaging did not improve the success rate or the transmission time. Our optimal protocol could be useful for emergency programs running on the mobile phones.

  18. Nitric Oxide and ERK mediates regulation of cellular processes by Ecdysterone

    Energy Technology Data Exchange (ETDEWEB)

    Omanakuttan, Athira; Bose, Chinchu; Pandurangan, Nanjan; Kumar, Geetha B.; Banerji, Asoke; Nair, Bipin G., E-mail: bipin@amrita.edu

    2016-08-15

    The complex process of wound healing is a major problem associated with diabetes, venous or arterial disease, old age and infection. A wide range of pharmacological effects including anabolic, anti-diabetic and hepato-protective activities have been attributed to Ecdysterone. In earlier studies, Ecdysterone has been shown to modulate eNOS and iNOS expression in diabetic animals and activate osteogenic differentiation through the Extracellular-signal-Regulated Kinase (ERK) pathway in periodontal ligament stem cells. However, in the wound healing process, Ecdysterone has only been shown to enhance granulation tissue formation in rabbits. There have been no studies to date that elucidate the molecular mechanism underlying the complex cellular process involved in wound healing. The present study demonstrates a novel interaction between the phytosteroid Ecdysterone and Nitric Oxide Synthase (NOS), in an Epidermal Growth Factor Receptor (EGFR)-dependent manner, thereby promoting cell proliferation, cell spreading and cell migration. These observations were further supported by the 4-amino-5-methylamino-2′,7′-difluorofluorescein diacetate (DAF-FM) fluorescence assay, which indicated that Ecdysterone activates NOS, resulting in increased Nitric Oxide (NO) production. Additionally, studies with inhibitors of both EGFR and ERK demonstrated that Ecdysterone activates NOS through modulation of EGFR and ERK. These results clearly demonstrate, for the first time, that Ecdysterone enhances Nitric Oxide production and modulates complex cellular processes by activating ERK1/2 through the EGF pathway. - Highlights: • Ecdysterone significantly enhances cell migration in a dose-dependent manner. • Ecdysterone augments cell spreading during the initial phase of cell migration through actin cytoskeletal rearrangement. • Ecdysterone enhances cell proliferation in a nitric oxide-dependent manner. • Ecdysterone enhances nitric oxide production via activation of EGFR

  19. Nitric Oxide and ERK mediates regulation of cellular processes by Ecdysterone

    International Nuclear Information System (INIS)

    Omanakuttan, Athira; Bose, Chinchu; Pandurangan, Nanjan; Kumar, Geetha B.; Banerji, Asoke; Nair, Bipin G.

    2016-01-01

    The complex process of wound healing is a major problem associated with diabetes, venous or arterial disease, old age and infection. A wide range of pharmacological effects including anabolic, anti-diabetic and hepato-protective activities have been attributed to Ecdysterone. In earlier studies, Ecdysterone has been shown to modulate eNOS and iNOS expression in diabetic animals and activate osteogenic differentiation through the Extracellular-signal-Regulated Kinase (ERK) pathway in periodontal ligament stem cells. However, in the wound healing process, Ecdysterone has only been shown to enhance granulation tissue formation in rabbits. There have been no studies to date that elucidate the molecular mechanism underlying the complex cellular process involved in wound healing. The present study demonstrates a novel interaction between the phytosteroid Ecdysterone and Nitric Oxide Synthase (NOS), in an Epidermal Growth Factor Receptor (EGFR)-dependent manner, thereby promoting cell proliferation, cell spreading and cell migration. These observations were further supported by the 4-amino-5-methylamino-2′,7′-difluorofluorescein diacetate (DAF-FM) fluorescence assay, which indicated that Ecdysterone activates NOS, resulting in increased Nitric Oxide (NO) production. Additionally, studies with inhibitors of both EGFR and ERK demonstrated that Ecdysterone activates NOS through modulation of EGFR and ERK. These results clearly demonstrate, for the first time, that Ecdysterone enhances Nitric Oxide production and modulates complex cellular processes by activating ERK1/2 through the EGF pathway. - Highlights: • Ecdysterone significantly enhances cell migration in a dose-dependent manner. • Ecdysterone augments cell spreading during the initial phase of cell migration through actin cytoskeletal rearrangement. • Ecdysterone enhances cell proliferation in a nitric oxide-dependent manner. • Ecdysterone enhances nitric oxide production via activation of EGFR

  20. Image processing to optimize wave energy converters

    Science.gov (United States)

    Bailey, Kyle Marc-Anthony

    The world is turning to renewable energies as a means of ensuring the planet's future and well-being. There have been a few attempts in the past to utilize wave power as a means of generating electricity through the use of Wave Energy Converters (WEC), but only recently are they becoming a focal point in the renewable energy field. Over the past few years there has been a global drive to advance the efficiency of WEC. Wave power is produced by placing a device onshore or offshore that captures the energy within ocean surface waves and uses it to drive a mechanical generator. This paper seeks to provide a novel and innovative way to estimate ocean wave frequency through the use of image processing. This will be achieved by applying a complex modulated lapped orthogonal transform filter bank to satellite images of ocean waves. The complex modulated lapped orthogonal transform filter bank provides an equal subband decomposition of the Nyquist-bounded discrete time Fourier Transform spectrum. The maximum energy of the 2D complex modulated lapped transform subband is used to determine the horizontal and vertical frequency, which subsequently can be used to determine the wave frequency in the direction of the WEC by a simple trigonometric scaling. The robustness of the proposed method is demonstrated by applications to simulated and real satellite images where the frequency is known.
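
    A rough sketch of the underlying idea, using a plain 2D FFT as a stand-in for the complex modulated lapped transform filter bank used in the paper: the peak of the 2D spectrum gives the horizontal and vertical spatial frequencies, which are then projected onto an assumed WEC heading. The wave field, grid spacing and heading are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "satellite image" of a plane wave field with known wavenumbers
n, dx = 256, 10.0                       # pixels and grid spacing in metres (assumed)
x = np.arange(n) * dx
X, Y = np.meshgrid(x, x, indexing="xy")
kx_true, ky_true = 2 * np.pi / 150.0, 2 * np.pi / 300.0   # 150 m and 300 m components
img = np.cos(kx_true * X + ky_true * Y) + 0.2 * rng.standard_normal((n, n))

# Dominant horizontal/vertical spatial frequency from the 2D spectrum
spec = np.abs(np.fft.fftshift(np.fft.fft2(img - img.mean()))) ** 2
freqs = np.fft.fftshift(np.fft.fftfreq(n, d=dx))            # cycles per metre
iy, ix = np.unravel_index(np.argmax(spec), spec.shape)
fx, fy = abs(freqs[ix]), abs(freqs[iy])

# Project onto the direction of the WEC (heading measured from the x axis)
heading = np.deg2rad(30.0)                                   # hypothetical WEC direction
f_along_wec = fx * np.cos(heading) + fy * np.sin(heading)    # simple trigonometric scaling
print("fx, fy (cycles/m):", fx, fy, "-> along WEC:", f_along_wec)
```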

  1. When teams shift among processes: insights from simulation and optimization.

    Science.gov (United States)

    Kennedy, Deanna M; McComb, Sara A

    2014-09-01

    This article introduces process shifts to study the temporal interplay among transition and action processes espoused in the recurring phase model proposed by Marks, Mathieu, and Zacarro (2001). Process shifts are those points in time when teams complete a focal process and change to another process. By using team communication patterns to measure process shifts, this research explores (a) when teams shift among different transition processes and initiate action processes and (b) the potential of different interventions, such as communication directives, to manipulate process shift timing and order and, ultimately, team performance. Virtual experiments are employed to compare data from observed laboratory teams not receiving interventions, simulated teams receiving interventions, and optimal simulated teams generated using genetic algorithm procedures. Our results offer insights about the potential for different interventions to affect team performance. Moreover, certain interventions may promote discussions about key issues (e.g., tactical strategies) and facilitate shifting among transition processes in a manner that emulates optimal simulated teams' communication patterns. Thus, we contribute to theory regarding team processes in 2 important ways. First, we present process shifts as a way to explore the timing of when teams shift from transition to action processes. Second, we use virtual experimentation to identify those interventions with the greatest potential to affect performance by changing when teams shift among processes. Additionally, we employ computational methods including neural networks, simulation, and optimization, thereby demonstrating their applicability in conducting team research. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  2. Optimization of frying process in food safety

    Directory of Open Access Journals (Sweden)

    Quaglia, G.

    1998-08-01

    Full Text Available The mechanics of frying are fairly simple. Hot oil serves as a heat exchange medium in which heat is transferred to the food being fried. As a result, the heat converts water within the food to steam and melts the fat within the food. The steam and fat then migrate from the interior of the food through the exterior and into the oil. Conversely, some of the frying oil is absorbed into the food being fried. The chemistry occurring in the frying oil and in the food being fried includes a myriad of thermal and oxidative reactions involving lipids, proteins, carbohydrates and minor food constituents. Decomposition products from autoxidation above 100°C, polymerization without oxygen between 200-300°C and thermal oxidation at 200°C can be produced in frying oil, and their amounts are related to different chemical and physical parameters such as temperature, heating time, type of oil used and food being fried, oil turnover rate, management of the oil and, finally, type of equipment used. Different studies have remarked that the toxicity of these by-products is due to their chemistry and concentration. Since the prime requirement in food quality is the safety of the products, attainable through preventive analysis of the risks and total control through all frying processes, in this work the critical points of particular importance are identified and discussed: oil composition, and in particular its antioxidant capacity; proper fryer design; food/oil ratio; good manufacturing practice. Besides, the quality screening has to be directed towards the chemical quality evaluation by easy and rapid analysis of the oil (colour, polar compounds, free fatty acids and antioxidant capacity) and the fried food (panel test and/or consumer test). In conclusion, to maintain high quality in the frying medium, choose efficient equipment, select a fat with desirable flavour and good antioxidant capacity, eliminate crackling as soon and often as possible, choose better components with minimal but

  3. A Thermodynamic Library for Simulation and Optimization of Dynamic Processes

    DEFF Research Database (Denmark)

    Ritschel, Tobias Kasper Skovborg; Gaspar, Jozsef; Jørgensen, John Bagterp

    2017-01-01

    Process system tools, such as simulation and optimization of dynamic systems, are widely used in the process industries for development of operational strategies and control for process systems. These tools rely on thermodynamic models and many thermodynamic models have been developed for different...... compounds and mixtures. However, rigorous thermodynamic models are generally computationally intensive and not available as open-source libraries for process simulation and optimization. In this paper, we describe the application of a novel open-source rigorous thermodynamic library, ThermoLib, which...... is designed for dynamic simulation and optimization of vapor-liquid processes. ThermoLib is implemented in Matlab and C and uses cubic equations of state to compute vapor and liquid phase thermodynamic properties. The novelty of ThermoLib is that it provides analytical first and second order derivatives...
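
    For orientation, the sketch below evaluates a cubic equation of state (Peng-Robinson) for the compressibility factor of a pure compound; it is a generic illustration of the kind of calculation such a library performs and does not use ThermoLib's actual API.

```python
import numpy as np

R = 8.314  # J/(mol K)

def peng_robinson_Z(T, P, Tc, Pc, omega):
    """Compressibility factors (vapor and liquid roots) from the Peng-Robinson EOS."""
    Tr = T / Tc
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega ** 2
    alpha = (1 + kappa * (1 - np.sqrt(Tr))) ** 2
    a = 0.45724 * R ** 2 * Tc ** 2 / Pc * alpha
    b = 0.07780 * R * Tc / Pc
    A = a * P / (R * T) ** 2
    B = b * P / (R * T)
    # Cubic in Z: Z^3 - (1 - B) Z^2 + (A - 3B^2 - 2B) Z - (AB - B^2 - B^3) = 0
    coeffs = [1.0, -(1.0 - B), A - 3.0 * B ** 2 - 2.0 * B, -(A * B - B ** 2 - B ** 3)]
    roots = np.roots(coeffs)
    real = np.sort(roots[np.abs(roots.imag) < 1e-10].real)
    return real[-1], real[0]   # largest real root -> vapor; smallest -> liquid (if 3 roots)

# Example: propane at 300 K and 10 bar (critical constants from standard tables)
Zv, Zl = peng_robinson_Z(T=300.0, P=10e5, Tc=369.8, Pc=42.48e5, omega=0.152)
print("vapor Z =", round(Zv, 4), " liquid Z =", round(Zl, 4))
```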

  4. The epidermis of grhl3-null mice displays altered lipid processing and cellular hyperproliferation.

    Science.gov (United States)

    Ting, Stephen B; Caddy, Jacinta; Wilanowski, Tomasz; Auden, Alana; Cunningham, John M; Elias, Peter M; Holleran, Walter M; Jane, Stephen M

    2005-04-01

    The presence of an impermeable surface barrier is an essential homeostatic mechanism in almost all living organisms. We have recently described a novel gene that is critical for the developmental instruction and repair of the integument in mammals. This gene, Grainy head-like 3 (Grhl3) is a member of a large family of transcription factors that are homologs of the Drosophila developmental gene grainy head (grh). Mice lacking Grhl3 fail to form an adequate skin barrier, and die at birth due to dehydration. These animals are also unable to repair the epidermis, exhibiting failed wound healing in both fetal and adult stages of development. These defects are due, in part, to diminished expression of a Grhl3 target gene, Transglutaminase 1 (TGase 1), which encodes a key enzyme involved in cross-linking of epidermal structural proteins and lipids into the cornified envelope (CE). Remarkably, the Drosophila grh gene plays an analogous role, regulating enzymes involved in the generation of quinones, which are essential for cross-linking structural components of the fly epidermis. In an extension of our initial analyses, we focus this report on additional defects observed in the Grhl3-null epidermis, namely defective extra-cellular lipid processing, altered lamellar lipid architecture and cellular hyperproliferation. These abnormalities suggest that Grhl3 plays diverse mechanistic roles in maintaining homeostasis in the skin.

  5. Systemic evaluation of cellular reprogramming processes exploiting a novel R-tool: eegc.

    Science.gov (United States)

    Zhou, Xiaoyuan; Meng, Guofeng; Nardini, Christine; Mei, Hongkang

    2017-08-15

    Cells derived by cellular engineering, i.e. differentiation of induced pluripotent stem cells and direct lineage reprogramming, carry a tremendous potential for medical applications and in particular for regenerative therapies. These approaches consist in the definition of lineage-specific experimental protocols that, by manipulation of a limited number of biological cues (niche-mimicking factors or (in)activation of transcription factors, to name a few), enforce the final expression of cell-specific (marker) molecules. To date, given the intricate complexity of biological pathways, these approaches still present imperfect reprogramming fidelity, with uncertain consequences on the functional properties of the resulting cells. We propose a novel tool, eegc, to evaluate cellular engineering processes in a systemic rather than marker-based fashion, by integrating transcriptome profiling and functional analysis. Our method clusters genes into categories representing different states of (trans)differentiation and further performs functional and gene regulatory network analyses for each of the categories of the engineered cells, thus offering practical indications on potential shortcomings of the reprogramming protocol. The eegc R package is released under the GNU General Public License within the Bioconductor project, freely available at https://bioconductor.org/packages/eegc/. christine.nardini.rsrc@gmail.com or hongkang.k.mei@gsk.com. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  6. Sequential optimization and reliability assessment method for metal forming processes

    International Nuclear Information System (INIS)

    Sahai, Atul; Schramm, Uwe; Buranathiti, Thaweepat; Chen Wei; Cao Jian; Xia, Cedric Z.

    2004-01-01

    Uncertainty is inevitable in any design process. The uncertainty could be due to variations in the geometry of the part, material properties, or a lack of knowledge about the phenomenon being modeled. Deterministic design optimization does not take uncertainty into account, and worst-case-scenario assumptions lead to vastly overconservative designs. Probabilistic design, such as reliability-based design and robust design, offers tools for making robust and reliable decisions under the presence of uncertainty in the design process. Probabilistic design optimization often involves a double-loop procedure for optimization and iterative probabilistic assessment. This results in high computational demand, which can be reduced by replacing computationally intensive simulation models with less costly surrogate models and by employing the sequential optimization and reliability assessment (SORA) method. The SORA method uses a single-loop strategy with a series of cycles of deterministic optimization and reliability assessment. The deterministic optimization and reliability assessment are decoupled in each cycle. This leads to quick improvement of the design from one cycle to the next and an increase in computational efficiency. This paper demonstrates the effectiveness of the Sequential Optimization and Reliability Assessment (SORA) method when applied to designing a sheet metal flanging process. Surrogate models are used as less costly approximations to the computationally expensive Finite Element simulations
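
    A toy sketch of the decoupled cycle described above, with a hypothetical quadratic cost, a single normally distributed load and an analytic inverse most-probable-point; it is not the sheet metal flanging model from the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Toy SORA cycle: design d = (d1, d2), random load x ~ N(mu, sigma),
# limit state g(d, x) = d1 + d2 - x >= 0 required at target reliability index beta_t.
cost = lambda d: (d[0] - 1.0) ** 2 + (d[1] - 1.0) ** 2          # hypothetical cost
g = lambda d, x: d[0] + d[1] - x

mu_x, sigma_x, beta_t = 3.5, 0.3, 3.0
shift = 0.0                                                     # mean-to-MPP shift

for cycle in range(10):
    # 1) deterministic optimization, constraint evaluated at the shifted point
    cons = {"type": "ineq", "fun": lambda d: g(d, mu_x + shift)}
    d_opt = minimize(cost, x0=[0.0, 0.0], constraints=[cons]).x
    # 2) reliability assessment: for one normal variable the inverse MPP is analytic
    x_mpp = mu_x + beta_t * sigma_x
    new_shift = x_mpp - mu_x
    if abs(new_shift - shift) < 1e-9:
        break
    shift = new_shift

print("design:", np.round(d_opt, 3), "g at MPP:", round(g(d_opt, x_mpp), 3))
```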

  7. Cellular Decision Making by Non-Integrative Processing of TLR Inputs

    Directory of Open Access Journals (Sweden)

    Ryan A. Kellogg

    2017-04-01

    Full Text Available Cells receive a multitude of signals from the environment, but how they process simultaneous signaling inputs is not well understood. Response to infection, for example, involves parallel activation of multiple Toll-like receptors (TLRs that converge on the nuclear factor κB (NF-κB pathway. Although we increasingly understand inflammatory responses for isolated signals, it is not clear how cells process multiple signals that co-occur in physiological settings. We therefore examined a bacterial infection scenario involving co-stimulation of TLR4 and TLR2. Independent stimulation of these receptors induced distinct NF-κB dynamic profiles, although surprisingly, under co-stimulation, single cells continued to show ligand-specific dynamic responses characteristic of TLR2 or TLR4 signaling rather than a mixed response, comprising a cellular decision that we term “non-integrative” processing. Iterating modeling and microfluidic experiments revealed that non-integrative processing occurred through interaction of switch-like NF-κB activation, receptor-specific processing timescales, cell-to-cell variability, and TLR cross-tolerance mediated by multilayer negative feedback.

  8. Optimization of process parameters for friction stir processing (FSP ...

    Indian Academy of Sciences (India)

    Administrator

    al 2005; Yadav and Bauri 2011) as the thermo-mechanical aspect of the process provides enough driving force for occurrence of dynamic recovery (DRV) that precedes DRX leading to an equi-axed fine grain structure. The microstructure evolution is further discussed below with the aid of transmission electron microscopy.

  9. Towards the prediction of essential genes by integration of network topology, cellular localization and biological process information

    Directory of Open Access Journals (Sweden)

    Lemke Ney

    2009-09-01

    Full Text Available Abstract Background The identification of essential genes is important for the understanding of the minimal requirements for cellular life and for practical purposes, such as drug design. However, the experimental techniques for essential gene discovery are labor-intensive and time-consuming. Considering these experimental constraints, a computational approach capable of accurately predicting essential genes would be of great value. We therefore present here a machine learning-based computational approach relying on network topological features, cellular localization and biological process information for prediction of essential genes. Results We constructed a decision tree-based meta-classifier and trained it on datasets with individual and grouped attributes (network topological features, cellular compartments and biological processes) to generate various predictors of essential genes. We showed that the predictors with better performances are those generated by datasets with integrated attributes. Using the predictor with all attributes, i.e., network topological features, cellular compartments and biological processes, we obtained the best predictor of essential genes that was then used to classify yeast genes with unknown essentiality status. Finally, we generated decision trees by training the J48 algorithm on datasets with all network topological features, cellular localization and biological process information to discover cellular rules for essentiality. We found that the number of protein physical interactions, the nuclear localization of proteins and the number of regulating transcription factors are the most important factors determining gene essentiality. Conclusion We were able to demonstrate that network topological features, cellular localization and biological process information are reliable predictors of essential genes. Moreover, by constructing decision trees based on these data, we could discover cellular rules governing
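
    A minimal sketch of the classification step, using scikit-learn's DecisionTreeClassifier with the entropy criterion as an approximate stand-in for the J48 (C4.5) trees used in the study; the gene table (network degree, nuclear localization, number of regulating transcription factors) and the essentiality labels are synthetic.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)

# Synthetic gene table: [degree in PPI network, nuclear localization flag,
# number of regulating transcription factors]; essentiality label is invented.
n_genes = 500
degree = rng.poisson(8, n_genes)
nuclear = rng.integers(0, 2, n_genes)
n_tfs = rng.poisson(3, n_genes)
X = np.column_stack([degree, nuclear, n_tfs])
# Toy rule: highly connected nuclear genes tend to be essential (plus label noise)
y = ((degree > 10) & (nuclear == 1)).astype(int)
y ^= (rng.random(n_genes) < 0.05).astype(int)

clf = DecisionTreeClassifier(criterion="entropy", max_depth=3, random_state=0)
print("5-fold accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(3))
clf.fit(X, y)
print(export_text(clf, feature_names=["degree", "nuclear", "n_TFs"]))
```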

  10. Experimental Study On The Optimization Of Extraction Process Of ...

    African Journals Online (AJOL)

    The objective is to study the extraction process of garlic oil and its antibacterial effects. Materials and Methods: CO2 Supercritical extraction was used to investigate the optimal processing conditions for garlic oil extraction; filter paper test and suspension dilution test were applied to determine the bacteriostatic action of ...

  11. Optimal Control of Beer Fermentation Process Using Differential ...

    African Journals Online (AJOL)

    Optimal Control of Beer Fermentation Process Using Differential Transform Method. ... Journal of Applied Sciences and Environmental Management ... The method of differential transform was used to obtain the solution governing the fermentation process; the system of equations was transformed using the differential ...

  12. Hierarchical random cellular neural networks for system-level brain-like signal processing.

    Science.gov (United States)

    Kozma, Robert; Puljic, Marko

    2013-09-01

    Sensory information processing and cognition in brains are modeled using dynamic systems theory. The brain's dynamic state is described by a trajectory evolving in a high-dimensional state space. We introduce a hierarchy of random cellular automata as the mathematical tools to describe the spatio-temporal dynamics of the cortex. The corresponding brain model is called neuropercolation which has distinct advantages compared to traditional models using differential equations, especially in describing spatio-temporal discontinuities in the form of phase transitions. Phase transitions demarcate singularities in brain operations at critical conditions, which are viewed as hallmarks of higher cognition and awareness experience. The introduced Monte-Carlo simulations obtained by parallel computing point to the importance of computer implementations using very large-scale integration (VLSI) and analog platforms. Copyright © 2013 Elsevier Ltd. All rights reserved.

  13. Cellular Automata Modelling of Photo-Induced Oxidation Processes in Molecularly Doped Polymers

    Directory of Open Access Journals (Sweden)

    David M. Goldie

    2016-11-01

    Full Text Available The possibility of employing cellular automata (CA) to model photo-induced oxidation processes in molecularly doped polymers is explored. It is demonstrated that the oxidation dynamics generated using CA models exhibit stretched-exponential behavior. This dynamical characteristic is in general agreement with an alternative analysis conducted using standard rate equations, provided the molecular doping levels are sufficiently low to prohibit the presence of safe-sites which are impenetrable to dissolved oxygen. The CA models therefore offer the advantage of exploring the effect of dopant agglomeration, which is difficult to assess from standard rate equation solutions. The influence of UV-induced bleaching or darkening upon the resulting oxidation dynamics may also be easily incorporated into the CA models, and these optical effects are investigated for various photo-oxidation product scenarios. Output from the CA models is evaluated against experimental photo-oxidation data obtained from a series of hydrazone-doped polymers.
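
    A toy cellular-automaton sketch in the spirit of the models described above: dopant molecules on a lattice oxidize with a probability reduced by occupied neighbours (a crude stand-in for agglomeration shielding), and the surviving fraction is checked for stretched-exponential behaviour. All parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(5)

# Lattice of dopant molecules (1 = unoxidized dopant, 0 = empty host polymer)
n, fill = 200, 0.3
dopant = (rng.random((n, n)) < fill).astype(int)
alive = dopant.copy()

p0, shield = 0.08, 0.5          # base oxidation probability, neighbour shielding factor
survival = []
for step in range(200):
    # number of occupied nearest neighbours (crude agglomeration measure)
    nb = (np.roll(dopant, 1, 0) + np.roll(dopant, -1, 0) +
          np.roll(dopant, 1, 1) + np.roll(dopant, -1, 1))
    p_ox = p0 * (1.0 - shield) ** nb          # shielded sites oxidize more slowly
    oxidize = (rng.random((n, n)) < p_ox) & (alive == 1)
    alive[oxidize] = 0
    survival.append(alive.sum() / dopant.sum())

# Stretched exponential S(t) = exp(-(t/tau)^beta)  =>  ln(-ln S) is linear in ln t
s = np.array(survival)
t = np.arange(1, s.size + 1)
mask = (s > 1e-3) & (s < 0.999)
beta, _ = np.polyfit(np.log(t[mask]), np.log(-np.log(s[mask])), 1)
print("fitted stretching exponent beta ~", round(beta, 2))
```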

  14. Cellular processes involved in human epidermal cells exposed to extremely low frequency electric fields.

    Science.gov (United States)

    Collard, J-F; Hinsenkamp, M

    2015-05-01

    We observed, in different tissues and organisms, a biological response after exposure to pulsed low-frequency and low-amplitude electric or electromagnetic fields, but the precise mechanism of the cell response remains unknown. The aim of this publication is to understand, using bioinformatics, the biological relevance of the processes involved in the modification of gene expression. The list of genes analyzed was obtained after a microarray protocol performed on cultures of human epidermal explants growing on de-epidermized human skin exposed to a pulsed low-frequency electric field. The directed acyclic graph from the WebGestalt Gene Ontology module shows six categories under the biological process root: "biological regulation", "cellular process", "cell proliferation", "death", "metabolic process" and "response to stimulus". Enriched derived categories are coherent with the type of in vitro culture, the stimulation protocol and with previous results showing a decrease of cell proliferation and an increase of differentiation. The KEGG module on WebGestalt highlighted "cell cycle" and "p53 signaling pathway" as significantly involved. The KEGG website brings out interactions between FoxO, MAPK, JNK, p53, p38, PI3K/Akt, Wnt, mTOR and NF-kappaB. Some genes expressed upon stimulation are known to have an exclusive function in these pathways. Analyses performed with Pathway Studio linked cell proliferation, cell differentiation, apoptosis, cell cycle, mitosis, cell death, etc. with our microarray results. Medline citations generated by the software and the fold-change variation confirm a decrease in proliferation, activation of differentiation and a less well-defined role of apoptosis or wound healing. The Wnt and DKK functional classes and the DKK1, MACF1, ATF3, MME, TXNRD1 and BMP-2 genes proposed in previous publications after a manual analysis are also highlighted, together with other genes, by the automatic Pathway Studio procedure. Finally, an analysis conducted on a list of genes

  15. Pavement maintenance optimization model using Markov Decision Processes

    Science.gov (United States)

    Mandiartha, P.; Duffield, C. F.; Razelan, I. S. b. M.; Ismail, A. b. H.

    2017-09-01

    This paper presents an optimization model for the selection of pavement maintenance interventions using the theory of Markov Decision Processes (MDP). There are some particular characteristics of the MDP developed in this paper which distinguish it from other similar studies or optimization models intended for pavement maintenance policy development. These unique characteristics include the direct inclusion of constraints into the formulation of the MDP, the use of an average-cost MDP method, and a policy development process based on the dual linear programming solution. The limited information and discussion available on these matters, in terms of stochastic optimization models for road network management, motivates this study. This paper uses a data set acquired from the road authorities of the state of Victoria, Australia, to test the model, and recommends steps in the computation of the MDP-based stochastic optimization model, leading to the development of an optimum pavement maintenance policy.
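
    A compact sketch of an average-cost MDP solved as a linear program over state-action frequencies, in the spirit of the dual-LP policy development described above; the three pavement condition states, two actions, transition probabilities and costs are invented, not the Victorian data set.

```python
import numpy as np
from scipy.optimize import linprog

# States: pavement condition 0=good, 1=fair, 2=poor. Actions: 0=do nothing, 1=rehabilitate.
# P[a][s, s'] = transition probability, c[s, a] = annual cost (all numbers invented).
P = [np.array([[0.8, 0.2, 0.0],      # do nothing: pavement deteriorates
               [0.0, 0.7, 0.3],
               [0.0, 0.0, 1.0]]),
     np.array([[1.0, 0.0, 0.0],      # rehabilitate: condition restored towards good
               [0.9, 0.1, 0.0],
               [0.8, 0.2, 0.0]])]
c = np.array([[0.0, 5.0],
              [2.0, 6.0],
              [8.0, 7.0]])

nS, nA = 3, 2
# Variables x[s, a] = long-run fraction of time in state s taking action a
cost_vec = c.reshape(-1)
A_eq, b_eq = [], []
A_eq.append(np.ones(nS * nA)); b_eq.append(1.0)              # frequencies sum to one
for s2 in range(nS):                                          # stationarity (flow balance)
    row = np.zeros(nS * nA)
    for s in range(nS):
        for a in range(nA):
            row[s * nA + a] = (1.0 if s == s2 else 0.0) - P[a][s, s2]
    A_eq.append(row); b_eq.append(0.0)

res = linprog(cost_vec, A_eq=np.array(A_eq), b_eq=np.array(b_eq),
              bounds=[(0, None)] * (nS * nA), method="highs")
x = res.x.reshape(nS, nA)
policy = x.argmax(axis=1)            # deterministic policy from the occupation measure
print("average cost per year:", round(res.fun, 3), " policy (per state):", policy)
```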

  16. Hierarchical optimal control of large-scale nonlinear chemical processes.

    Science.gov (United States)

    Ramezani, Mohammad Hossein; Sadati, Nasser

    2009-01-01

    In this paper, a new approach is presented for the optimal control of large-scale chemical processes. In this approach, the chemical process is decomposed into smaller sub-systems at the first level, with a coordinator at the second level, for which a two-level hierarchical control strategy is designed. For this purpose, each sub-system in the first level can be solved separately, using any conventional optimization algorithm. At the second level, the solutions obtained from the first level are coordinated using a new gradient-type strategy, which is updated by the error of the coordination vector. The proposed algorithm is used to solve the optimal control problem of a complex nonlinear continuous stirred-tank reactor (CSTR), and its solution is compared with the one obtained using the centralized approach. The simulation results show the efficiency and capability of the proposed hierarchical approach in finding the optimal solution, compared with the centralized method.

  17. Magnetic Resonance Microscopy of Human and Porcine Neurons and Cellular Processes

    Science.gov (United States)

    Flint, Jeremy J.; Hansen, Brian; Portnoy, Sharon; Lee, Choong-Heon; King, Michael A.; Fey, Michael; Vincent, Franck; Stanisz, Greg J; Vestergaard-Poulsen, Peter; Blackband, Stephen J

    2012-01-01

    With its unparalleled ability to safely generate high-contrast images of soft tissues, magnetic resonance imaging (MRI) has remained at the forefront of diagnostic clinical medicine. Unfortunately, due to resolution limitations, clinical scans are most useful for detecting macroscopic structural changes associated with a small number of pathologies. Moreover, due to a longstanding inability to directly observe magnetic resonance (MR) signal behavior at the cellular level, such information is poorly characterized and generally must be inferred. With the advent of the MR microscope in 1986 came the ability to measure MR signal properties of theretofore unobservable tissue structures. Recently, further improvements in hardware technology have made it possible to visualize mammalian cellular structure. In the current study, we expand upon previous work by imaging the neuronal cell bodies and processes of human and porcine α-motor neurons. Complementary imaging studies are conducted in pig tissue in order to demonstrate qualitative similarities to human samples. Also, apparent diffusion coefficient (ADC) maps were generated inside porcine α-motor neuron cell bodies and portions of their largest processes (mean = 1.7±0.5 μm2/ms based on 53 pixels) as well as in areas containing a mixture of extracellular space, microvasculature, and neuropil (0.59±0.37 μm2/ms based on 33 pixels). Three-dimensional reconstruction of MR images containing α-motor neurons shows the spatial arrangement of neuronal projections between adjacent cells. Such advancements in imaging portend the ability to construct accurate models of MR signal behavior based on direct observation and measurement of the components which comprise functional tissues. These tools would not only be useful for improving our interpretation of macroscopic MRI performed in the clinic, but they could potentially be used to develop new methods of differential diagnosis to aid in the early detection of a

  18. Optimization of turning process through the analytic flank wear modelling

    Science.gov (United States)

    Del Prete, A.; Franchi, R.; De Lorenzis, D.

    2018-05-01

    In the present work, the approach used to optimize the process capabilities for the machining of Oil&Gas components is described. These components are machined by turning stainless steel cast workpieces. For this purpose, a proper Design Of Experiments (DOE) plan has been designed and executed: as output of the experimentation, data about tool wear have been collected. The DOE has been designed starting from the cutting speed and feed values recommended by the tool manufacturer; the depth of cut has been kept constant. Wear data have been obtained by observing the tool flank wear under an optical microscope; data acquisition has been carried out at regular intervals of working time. Through statistical and regression analysis, analytical models of the flank wear and the tool life have been obtained. The optimization approach used is a multi-objective optimization, which minimizes the production time and the number of cutting tools used, under a constraint on a defined flank wear level. The technique used to solve the optimization problem is Multi-Objective Particle Swarm Optimization (MOPSO). The optimization results, validated by the execution of a further experimental campaign, highlighted the reliability of the work and confirmed the usability of the optimized process parameters and the potential benefit for the company.
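
    A rough sketch of the regression step described above, fitting a Taylor-type tool-life model T = C·V^(-a)·f^(-b) by log-log least squares to hypothetical wear-test data; the numbers are invented and the actual model form used in the paper may differ.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical DOE results: cutting speed V (m/min), feed f (mm/rev) and measured
# tool life T (min) at a fixed flank wear limit. Taking logarithms makes the
# Taylor-type model linear: ln T = ln C - a ln V - b ln f.
V = np.array([180, 180, 220, 220, 260, 260, 300, 300], dtype=float)
f = np.array([0.15, 0.30, 0.15, 0.30, 0.15, 0.30, 0.15, 0.30])
T_true = 1e7 * V ** -2.5 * f ** -0.8
T = T_true * np.exp(0.05 * rng.standard_normal(V.size))     # measurement scatter

X = np.column_stack([np.ones(V.size), np.log(V), np.log(f)])
coef, *_ = np.linalg.lstsq(X, np.log(T), rcond=None)
lnC, a, b = coef[0], -coef[1], -coef[2]
print("C =", round(np.exp(lnC), 1), " a =", round(a, 2), " b =", round(b, 2))

# The fitted model can then feed a multi-objective search (e.g. MOPSO) over V and f
T_hat = lambda V_, f_: np.exp(lnC) * V_ ** -a * f_ ** -b
print("predicted tool life at V=240, f=0.2:", round(T_hat(240.0, 0.2), 1), "min")
```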

  19. Optimization of CernVM early boot process

    CERN Document Server

    Mazdin, Petra

    2015-01-01

    CernVM virtual machine is a Linux-based virtual appliance optimized for High Energy Physics experiments. It is used for cloud computing, volunteer computing, and software development by the four large LHC experiments. The goal of this project is profiling and optimizing the boot process of CernVM. A key part was the development of a performance profiler for shell scripts as an extension to the popular BusyBox open-source UNIX tool suite. Based on the measurements, costly shell code was replaced by more efficient, custom C programs. The results are compared to the original ones and successful optimization is proven.

  20. Topology Optimization for Reducing Additive Manufacturing Processing Distortions

    Science.gov (United States)

    2017-12-01

    ARL-TR-8242, DEC 2017, US Army Research Laboratory: Topology Optimization for Reducing Additive Manufacturing Processing Distortions, by Raymond A Wildman. Distribution is unlimited. 1. Introduction: Additive manufacturing (AM) is a production method that involves gradual, layer-by-layer building of material ... design space (allowing the production of previously unmanufacturable topologically optimized structures) constraints remain. One constraint, for ...

  1. Multi-Objective Optimization of Squeeze Casting Process using Genetic Algorithm and Particle Swarm Optimization

    Directory of Open Access Journals (Sweden)

    Patel G.C.M.

    2016-09-01

    Full Text Available The near-net-shape manufacturing ability of the squeeze casting process requires setting the process variable combinations at their optimal levels to obtain both aesthetic appearance and internal soundness of the cast parts. The aesthetic and internal soundness of cast parts relate to surface roughness and tensile strength, which can readily put the part in service without the requirement of costly secondary manufacturing processes (such as polishing, shot blasting, plating or heat treatment). It is difficult to determine the levels of the process variable combinations (that is, pressure duration, squeeze pressure, pouring temperature and die temperature) for extreme values of the responses (that is, surface roughness, yield strength and ultimate tensile strength) due to conflicting requirements. In the present manuscript, three population-based search and optimization methods, namely the genetic algorithm (GA), particle swarm optimization (PSO) and multi-objective particle swarm optimization based on crowding distance (MOPSO-CD), have been used to optimize multiple outputs simultaneously. Further, a validation test has been conducted for the optimal casting conditions suggested by GA, PSO and MOPSO-CD. The results showed that PSO outperformed GA with regard to computation time.
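
    A minimal particle swarm optimization sketch on a stand-in scalarized objective (a made-up trade-off between surface roughness and tensile strength), since the paper's regression models are not reproduced here; bounds, weights and coefficients are invented.

```python
import numpy as np

rng = np.random.default_rng(7)

# Decision vector x = (pressure duration, squeeze pressure, pouring temp, die temp),
# scaled to [0, 1]. The objective is a hypothetical scalarized trade-off between
# surface roughness (minimize) and tensile strength (maximize).
def objective(x):
    roughness = 2.0 + (x[:, 0] - 0.6) ** 2 + 0.5 * (x[:, 1] - 0.4) ** 2
    strength = 150 + 40 * x[:, 1] + 25 * x[:, 2] - 30 * (x[:, 3] - 0.5) ** 2
    return roughness - 0.01 * strength          # lower is better

n_particles, n_dim, n_iter = 30, 4, 100
w, c1, c2 = 0.7, 1.5, 1.5                       # inertia and acceleration coefficients

x = rng.random((n_particles, n_dim))
v = np.zeros_like(x)
pbest, pbest_f = x.copy(), objective(x)
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, n_dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, 0.0, 1.0)
    f = objective(x)
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print("best scaled settings:", np.round(gbest, 3),
      " objective:", round(objective(gbest[None])[0], 3))
```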

  2. Parameter optimization of electrochemical machining process using black hole algorithm

    Science.gov (United States)

    Singh, Dinesh; Shukla, Rajkamal

    2017-12-01

    Advanced machining processes are significant because higher accuracy is required in machined components in the manufacturing industries. Parameter optimization of machining processes gives optimum control to achieve the desired goals. In this paper, the electrochemical machining (ECM) process is considered to evaluate the performance of the black hole algorithm (BHA) on this process. BHA is based on the fundamental idea of black hole theory and has fewer operating parameters to tune. The two performance parameters, material removal rate (MRR) and overcut (OC), are considered separately to obtain optimum machining parameter settings using BHA. The variations of the process parameters with respect to the performance parameters are reported for a better and more effective understanding of the considered process, using a single objective at a time. The results obtained using BHA are found to be better when compared with the results of other metaheuristic algorithms, such as the genetic algorithm (GA), artificial bee colony (ABC) and biogeography-based optimization (BBO), attempted by previous researchers.
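
    A minimal sketch of the black hole algorithm itself: the best star acts as the black hole, the remaining stars move towards it, and stars that cross the event horizon are re-initialized. The objective below is a toy stand-in for an ECM response model, not the one used in the paper.

```python
import numpy as np

rng = np.random.default_rng(8)

# Toy objective standing in for an ECM response model (e.g. negative MRR as a
# function of scaled voltage, feed rate and electrolyte concentration);
# the function and bounds are invented for illustration only.
def f(x):
    return np.sum((x - np.array([0.3, 0.7, 0.5])) ** 2, axis=-1)

n_stars, n_dim, n_iter = 25, 3, 200
stars = rng.random((n_stars, n_dim))
black_hole = stars[f(stars).argmin()].copy()

for _ in range(n_iter):
    # Move every star towards the black hole
    stars += rng.random((n_stars, 1)) * (black_hole - stars)
    stars = np.clip(stars, 0.0, 1.0)

    # If a star becomes better than the black hole, they swap roles
    fitness = f(stars)
    if fitness.min() < f(black_hole[None])[0]:
        black_hole = stars[fitness.argmin()].copy()

    # Stars that cross the event horizon are swallowed and re-initialized
    radius = f(black_hole[None])[0] / (fitness.sum() + 1e-12)
    swallowed = np.linalg.norm(stars - black_hole, axis=1) < radius
    stars[swallowed] = rng.random((swallowed.sum(), n_dim))

print("best parameters:", np.round(black_hole, 3),
      " objective:", round(f(black_hole[None])[0], 4))
```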

  3. Optimization and Improvement of Test Processes on a Production Line

    Science.gov (United States)

    Sujová, Erika; Čierna, Helena

    2018-06-01

    The paper deals with increasing process efficiency on a production line for engine cylinder heads in a production company operating in the automotive industry. The goal is to achieve improvement and optimization of test processes on the production line. It analyzes options for improving the capacity, availability and productivity of the output test processes by using modern technology available on the market. We have focused on the analysis of operation times before and after optimization of the test processes at specific production sections. By analyzing the measured results we have determined the differences in time before and after improvement of the process. We have determined the overall equipment effectiveness (OEE) coefficient and, by comparing outputs, we have confirmed a real improvement of the output test process for cylinder heads.
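
    OEE is conventionally the product of availability, performance and quality rates; the short sketch below computes it for invented shift figures, since the paper's actual data are not reproduced in this record.

```python
# OEE = Availability x Performance x Quality (conventional definition).
# All shift figures below are invented for illustration.
planned_time_min = 450          # planned production time per shift
downtime_min = 45               # breakdowns + changeovers
ideal_cycle_time_s = 30         # ideal time to test one cylinder head
total_parts = 720               # parts processed during the shift
defective_parts = 12

run_time_min = planned_time_min - downtime_min
availability = run_time_min / planned_time_min
performance = (ideal_cycle_time_s * total_parts) / (run_time_min * 60)
quality = (total_parts - defective_parts) / total_parts

oee = availability * performance * quality
print(f"A={availability:.2%}  P={performance:.2%}  Q={quality:.2%}  OEE={oee:.2%}")
```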

  4. Optimal redundant systems for works with random processing time

    International Nuclear Information System (INIS)

    Chen, M.; Nakagawa, T.

    2013-01-01

    This paper studies the optimal redundant policies for a manufacturing system processing jobs with random working times. The redundant units of the parallel systems and standby systems are subject to stochastic failures during the continuous production process. First, a job consisting of only one work is considered for both redundant systems and the expected cost functions are obtained. Next, each redundant system with a random number of units is assumed for a single work. The expected cost functions and the optimal expected numbers of units are derived for redundant systems. Subsequently, the production processes of N tandem works are introduced for parallel and standby systems, and the expected cost functions are also summarized. Finally, the number of works is estimated by a Poisson distribution for the parallel and standby systems. Numerical examples are given to demonstrate the optimization problems of redundant systems

  5. USAF Logistics Process Optimization Study for the Aircraft Asset Sustainment Process. Volume 2

    National Research Council Canada - National Science Library

    Adamson, Anthony

    1998-01-01

    .... It is published as three separate volumes. Volume I, USAF Logistics Process Optimization Study for the Aircraft Asset Sustainment Process -- Phase II Report, discusses the result and cost/benefit analysis of testing three initiatives...

  6. USAF Logistics Process Optimization Study for the Aircraft Asset Sustainment Process. Volume 1

    National Research Council Canada - National Science Library

    Adamson, Anthony

    1998-01-01

    .... It is published as three separate volumes. Volume I, USAF Logistics Process Optimization Study for the Aircraft Asset Sustainment Process -- Phase II Report, discusses the result and cost/benefit analysis of testing three initiatives...

  7. Optimizing a Laser Process for Making Carbon Nanotubes

    Science.gov (United States)

    Arepalli, Sivaram; Nikolaev, Pavel; Holmes, William

    2010-01-01

    A systematic experimental study has been performed to determine the effects of each of the operating conditions in a double-pulse laser ablation process that is used to produce single-wall carbon nanotubes (SWCNTs). The comprehensive data compiled in this study have been analyzed to recommend conditions for optimizing the process and scaling up the process for mass production. The double-pulse laser ablation process for making SWCNTs was developed by Rice University researchers. Of all currently known nanotube-synthesizing processes (arc and chemical vapor deposition), this process yields the greatest proportion of SWCNTs in the product material. The aforementioned process conditions are important for optimizing the production of SWCNTs and scaling up production. Reports of previous research (mostly at Rice University) toward optimization of process conditions mention effects of oven temperature and briefly mention effects of flow conditions, but no systematic, comprehensive study of the effects of process conditions was done prior to the study described here. This was a parametric study, in which several production runs were carried out, changing one operating condition for each run. The study involved variation of a total of nine parameters: the sequence of the laser pulses, pulse-separation time, laser pulse energy density, buffer gas (helium or nitrogen instead of argon), oven temperature, pressure, flow speed, inner diameter of the flow tube, and flow-tube material.

  8. Method of optimization of the natural gas refining process

    Energy Technology Data Exchange (ETDEWEB)

    Sadykh-Zade, E.S.; Bagirov, A.A.; Mardakhayev, I.M.; Razamat, M.S.; Tagiyev, V.G.

    1980-01-01

    The SATUM (automatic control system of technical operations) system introduced at the Shatlyk field should assure good quality of gas refining. In order to optimize the natural gas refining processes, an experimental-analytical method is used in compiling the mathematical descriptions. The program, written in Fortran, gives, in addition to the parameters of optimal conditions, information on the yield of condensate and water, the concentration and consumption of DEG, and the composition and characteristics of the gas and condensate. The algorithm for calculating optimum engineering conditions of gas refining is proposed to be used in ''advice'' mode, and also for monitoring the progress of the gas refining process.

  9. Experimental reversion of the optimal quantum cloning and flipping processes

    International Nuclear Information System (INIS)

    Sciarrino, Fabio; Secondi, Veronica; De Martini, Francesco

    2006-01-01

    The quantum cloner machine maps an unknown arbitrary input qubit into two optimal clones and one optimal flipped qubit. By combining linear and nonlinear optical methods we experimentally implement a scheme that, after the cloning transformation, restores the original input qubit in one of the output channels, by using local measurements, classical communication, and feedforward. This nonlocal method demonstrates how the information on the input qubit can be restored after the cloning process. The realization of the reversion process is expected to find useful applications in the field of modern multipartite quantum cryptography

  10. High-resolution imaging of cellular processes across textured surfaces using an indexed-matched elastomer.

    Science.gov (United States)

    Ravasio, Andrea; Vaishnavi, Sree; Ladoux, Benoit; Viasnoff, Virgile

    2015-03-01

    Understanding and controlling how cells interact with the microenvironment has emerged as a prominent field in bioengineering, stem cell research and in the development of the next generation of in vitro assays as well as organs on a chip. Changing the local rheology or the nanotextured surface of substrates has proved an efficient approach to improve cell lineage differentiation, to control cell migration properties and to understand environmental sensing processes. However, introducing substrate surface textures often alters the ability to image cells with high precision, compromising our understanding of molecular mechanisms at stake in environmental sensing. In this paper, we demonstrate how nano/microstructured surfaces can be molded from an elastomeric material with a refractive index matched to the cell culture medium. Once made biocompatible, contrast imaging (differential interference contrast, phase contrast) and high-resolution fluorescence imaging of subcellular structures can be implemented through the textured surface using an inverted microscope. Simultaneous traction force measurements by micropost deflection were also performed, demonstrating the potential of our approach to study cell-environment interactions, sensing processes and cellular force generation with unprecedented resolution. Copyright © 2014 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  11. Beyond voltage-gated ion channels: Voltage-operated membrane proteins and cellular processes.

    Science.gov (United States)

    Zhang, Jianping; Chen, Xingjuan; Xue, Yucong; Gamper, Nikita; Zhang, Xuan

    2018-04-18

    Voltage-gated ion channels were believed to be the only voltage-sensitive proteins in excitable (and some non-excitable) cells for a long time. Emerging evidence indicates that the voltage-operated model is shared by some other transmembrane proteins expressed in both excitable and non-excitable cells. In this review, we summarize current knowledge about voltage-operated proteins, which are not classic voltage-gated ion channels, as well as the voltage-dependent processes in cells for which single voltage-sensitive proteins have yet to be identified. Particularly, we will focus on the following. (1) Voltage-sensitive phosphoinositide phosphatases (VSP) with four transmembrane segments homologous to the voltage sensor domain (VSD) of voltage-gated ion channels; VSPs are the first family of proteins, other than the voltage-gated ion channels, for which there is sufficient evidence for the existence of the VSD domain; (2) Voltage-gated proton channels comprising a single voltage-sensing domain and lacking an identified pore domain; (3) G protein coupled receptors (GPCRs) that mediate the depolarization-evoked potentiation of Ca2+ mobilization; (4) Plasma membrane (PM) depolarization-induced but Ca2+-independent exocytosis in neurons; (5) Voltage-dependent metabolism of phosphatidylinositol 4,5-bisphosphate (PtdIns[4,5]P2, PIP2) in the PM. These recent discoveries expand our understanding of voltage-operated processes within cellular membranes. © 2018 Wiley Periodicals, Inc.

  12. The Process of Optimizing Mechanical Sound Quality in Product Design

    DEFF Research Database (Denmark)

    Eriksen, Kaare; Holst, Thomas

    2011-01-01

    The research field concerning optimizing product sound quality is a relatively unexplored area, and may become difficult for designers to operate in. To some degree, sound is a highly subjective parameter, which is normally targeted at sound specialists. This paper describes the theoretical...... and practical background for managing a process of optimizing the mechanical sound quality in a product design by using simple tools and workshops systematically. The procedure is illustrated by a case study of a computer navigation tool (a computer mouse). The process is divided into 4 phases, which...... clarify the importance of product sound, define perceptive demands identified by users, and, finally, suggest mechanical principles for modification of an existing sound design. The optimized mechanical sound design is followed by tests on users of the product in its use context. The result

  13. OPTIMIZATION OF FLOCCULATION PROCESS BY MICROBIAL COAGULANT IN RIVER WATER

    Directory of Open Access Journals (Sweden)

    Fatin Nabilah Murad

    2017-12-01

    Full Text Available The existing processes of coagulation and flocculation use chemicals known as cationic coagulants, such as alum, ferric sulfate, calcium oxide, and organic polymers. Thus, this study concentrates on optimizing the flocculation process by a microbial coagulant in river water. Turbidity and suspended solids are the main constraints on river water quality in Malaysia. Hence, a study is proposed to produce locally isolated microbial coagulants for river water treatment. The microbe chosen as the bioflocculant producer is Aspergillus niger. The parameters to optimize in the flocculation process were pH, bioflocculant dosage and effluent concentration. The research was done in the jar test process and the process parameters for maximum turbidity removal were validated. The highest flocculating activity was obtained on day seven of cultivation in the supernatant. The optimum pH and bioflocculant dosage for an optimized sedimentation process were 4-5 and 2-3 mL, respectively, for 0.3 g/L of effluent concentration. The model was validated using a river water sample from Sg. Pusu, and the result showed that the model was acceptable for evaluating the bioflocculation process.

  14. Statistical optimization of process parameters for the production of ...

    African Journals Online (AJOL)

    In this study, optimization of process parameters such as moisture content, incubation temperature and initial pH (fixed) for the improvement of citric acid production from oil palm empty fruit bunches through solid state bioconversion was carried out using the traditional one-factor-at-a-time (OFAT) method and response surface ...

  15. Rate-optimal Bayesian intensity smoothing for inhomogeneous Poisson processes

    NARCIS (Netherlands)

    Belitser, E.; Andrade Serra, De P.J.; Zanten, van J.H.

    2013-01-01

    We apply nonparametric Bayesian methods to study the problem of estimating the intensity function of an inhomogeneous Poisson process. We exhibit a prior on intensities which both leads to a computationally feasible method and enjoys desirable theoretical optimality properties. The prior we use is
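
    The following sketch illustrates the general idea of Bayesian intensity estimation with a simple conjugate, piecewise-constant (histogram-type) prior; the intensity function, the Gamma hyperparameters and the binning are assumptions for illustration and are not the prior construction studied in the paper.

```python
# Sketch: conjugate histogram-type Bayesian estimate of a Poisson intensity.
# The true intensity, the Gamma(a0, b0) prior and the number of bins are
# illustrative choices, not the prior studied in the paper.
import numpy as np

rng = np.random.default_rng(0)
T, n_bins = 10.0, 25
true_intensity = lambda t: 5.0 + 4.0 * np.sin(2 * np.pi * t / T) ** 2

# Simulate an inhomogeneous Poisson process on [0, T] by thinning.
lam_max = 9.0
n_prop = rng.poisson(lam_max * T)
props = rng.uniform(0.0, T, n_prop)
events = props[rng.uniform(0, lam_max, n_prop) < true_intensity(props)]

# Conjugate update per bin: counts ~ Poisson(lambda_j * width),
# lambda_j ~ Gamma(a0, b0)  =>  posterior Gamma(a0 + n_j, b0 + width).
edges = np.linspace(0.0, T, n_bins + 1)
width = T / n_bins
counts, _ = np.histogram(events, bins=edges)
a0, b0 = 1.0, 0.1
post_mean = (a0 + counts) / (b0 + width)

centers = 0.5 * (edges[:-1] + edges[1:])
for c, m in zip(centers, post_mean):
    print(f"t = {c:5.2f}   posterior mean intensity = {m:6.2f} "
          f"(true {true_intensity(c):5.2f})")
```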

  16. Optimal design of nuclear mechanical dampers with analytical hierarchy process

    International Nuclear Information System (INIS)

    Zou Yuehua; Wen Bo; Xu Hongxiang; Qin Yonglie

    2000-01-01

    An optimal design using the analytic hierarchy process was described for nuclear mechanical dampers manufactured at the authors' university. By using a fuzzy judgement matrix, consistency is satisfied automatically without the need for a consistency test. The results obtained by this method have been put into production practice.
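
    For reference, the sketch below shows the classic (crisp) AHP weight derivation that such fuzzy variants build on: priority weights from the principal eigenvector of a reciprocal pairwise-comparison matrix, together with Saaty's consistency ratio. The judgement matrix is hypothetical, and the fuzzy formulation of the paper is not reproduced.

```python
# Sketch of the classic (crisp) AHP weight derivation: priority weights from
# the principal eigenvector of a reciprocal pairwise-comparison matrix, plus
# Saaty's consistency ratio. The judgement matrix below is hypothetical.
import numpy as np

A = np.array([                      # pairwise comparisons of 4 design criteria
    [1.0, 3.0, 5.0, 1.0],
    [1/3, 1.0, 3.0, 1/3],
    [1/5, 1/3, 1.0, 1/5],
    [1.0, 3.0, 5.0, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                    # principal eigenvalue index
lam_max = eigvals.real[k]
w = np.abs(eigvecs[:, k].real)
weights = w / w.sum()                          # normalized priority vector

n = A.shape[0]
CI = (lam_max - n) / (n - 1)                   # consistency index
RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]   # Saaty's random index
CR = CI / RI                                   # consistency ratio (< 0.1 is acceptable)

print("priority weights:", np.round(weights, 3))
print(f"lambda_max = {lam_max:.3f}, CI = {CI:.3f}, CR = {CR:.3f}")
```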

  17. Evaluation of the effect of advanced coagulation process to optimize ...

    African Journals Online (AJOL)

    Evaluation of the effect of advanced coagulation process to optimize the removal of natural organic matter in water (Case study: drinking water of Mashhad's ... and in addition to giving taste, color and odor to the water, they can intervene in the oxidization and removal of heavy metals such as arsenic, iron and manganese.

  18. Optimization of aqueous extraction process to enhance the ...

    African Journals Online (AJOL)

    Kumar Sudhir

    2014-02-12

    Feb 12, 2014 ... Aqueous extraction process was optimized to reduce endotoxins from mixed ... management and minimizes the initial capital costs for ... of about 40%, was suggested to be an economic ... industry and as the nutraceutical food for human due to ... economical production of industrial enzymes and as feed.

  19. Variance-optimal hedging for processes with stationary independent increments

    DEFF Research Database (Denmark)

    Hubalek, Friedrich; Kallsen, J.; Krawczyk, L.

    We determine the variance-optimal hedge when the logarithm of the underlying price follows a process with stationary independent increments in discrete or continuous time. Although the general solution to this problem is known in the form of a backward recursion or a backward stochastic differential equation, we...
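
    A one-period illustration of the quadratic-hedging criterion is sketched below: the initial capital and hedge ratio minimizing the expected squared hedging error are obtained by simple regression on simulated increments. The payoff, increment distribution and parameters are assumptions; the paper's dynamic backward-recursion solution is not reproduced.

```python
# One-period sketch of the variance-optimal (quadratic) hedging criterion:
# choose capital c and hedge ratio theta minimizing E[(H - c - theta*dS)^2].
# The increment distribution, payoff and parameters are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
n_paths, S0, K = 200_000, 100.0, 105.0

# Stationary independent log-increments (here: normal; a Levy increment
# could be plugged in instead).
log_ret = rng.normal(loc=0.0, scale=0.2, size=n_paths)
ST = S0 * np.exp(log_ret)
dS = ST - S0
H = np.maximum(ST - K, 0.0)                     # European call payoff to hedge

theta = np.cov(H, dS, ddof=0)[0, 1] / dS.var()  # variance-optimal hedge ratio
c = H.mean() - theta * dS.mean()                # variance-optimal initial capital

resid = H - c - theta * dS
print(f"hedge ratio theta = {theta:.4f}, initial capital c = {c:.4f}")
print(f"unhedged payoff variance = {H.var():.3f}, "
      f"hedged residual variance = {resid.var():.3f}")
```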

  20. Benchmarking of radiological departments. Starting point for successful process optimization

    International Nuclear Information System (INIS)

    Busch, Hans-Peter

    2010-01-01

    Continuous optimization of the process of organization and medical treatment is part of the successful management of radiological departments. The focus of this optimization can be cost units such as CT and MRI or the radiological parts of total patient treatment. Key performance indicators for process optimization are cost-effectiveness, service quality and quality of medical treatment. The potential for improvements can be seen by comparison (benchmark) with other hospitals and radiological departments. Clear definitions of key data and criteria are absolutely necessary for comparability. There is currently little information in the literature regarding the methodology and application of benchmarks especially from the perspective of radiological departments and case-based lump sums, even though benchmarking has frequently been applied to radiological departments by hospital management. The aim of this article is to describe and discuss systematic benchmarking as an effective starting point for successful process optimization. This includes the description of the methodology, recommendation of key parameters and discussion of the potential for cost-effectiveness analysis. The main focus of this article is cost-effectiveness (efficiency and effectiveness) with respect to cost units and treatment processes. (orig.)

  1. Optimization of CNC end milling process parameters using PCA ...

    African Journals Online (AJOL)

    Optimization of CNC end milling process parameters using PCA-based Taguchi method. ... International Journal of Engineering, Science and Technology ... To meet the basic assumption of Taguchi method; in the present work, individual response correlations have been eliminated first by means of Principal Component ...

  2. Aspects Concerning the Optimization of Authentication Process for Distributed Applications

    Directory of Open Access Journals (Sweden)

    Ion Ivan

    2008-06-01

    Full Text Available The types of distributed applications are presented. The quality characteristics of distributed applications are analyzed. The ways to assign access rights are established. The authentication category is analyzed. We propose an algorithm for optimizing the authentication process. The proposed algorithm is applied to the application "Evaluation of TIC projects".

  3. Process and Energy Optimization Assessment, Tobyhanna Army Depot, PA

    Science.gov (United States)

    2006-04-17

    assembly of electronic-communication components, different welding processes are performed at TYAD. It uses shielded arc, metal inert gas (MIG)..., tungsten inert gas (TIG), and silver brazing, oxygen/acetylene cutting and plasma arc methods to complete mission requirements. Major welding jobs are... [ERDC/CERL TR-06-11, Process and Energy Optimization Assessment, Tobyhanna Army Depot, PA; Mike C.J. Lin, Alexander M. Zhivov]

  4. Graphene transfer process and optimization of graphene coverage

    OpenAIRE

    Sabki Syarifah Norfaezah; Shamsuri Shafiq Hafly; Fauzi Siti Fazlina; Chon-Ki Meghashama Lim; Othman Noraini

    2017-01-01

    Graphene grown on transition metal is known to be high in quality due to its controlled amount of defects and potentially used for many electronic applications. The transfer process of graphene grown on transition metal to a new substrate requires optimization in order to ensure that high graphene coverage can be obtained. In this work, an improvement in the graphene transfer process is performed from graphene grown on copper foil. It has been observed that the graphene coverage is affected b...

  5. Systems analysis as a tool for optimal process strategy

    International Nuclear Information System (INIS)

    Ditterich, K.; Schneider, J.

    1975-09-01

    For the description and the optimal treatment of complex processes, the methods of Systems Analysis are used as the most promising approach in recent times. In general every process should be optimised with respect to reliability, safety, economy and environmental pollution. In this paper the complex relations between these general optimisation postulates are established in qualitative form. These general trend relations have to be quantified for every particular system studied in practice

  6. Focused Metabolite Profiling for Dissecting Cellular and Molecular Processes of Living Organisms in Space Environments

    Science.gov (United States)

    2008-01-01

    Regulatory control in biological systems is exerted at all levels within the central dogma of biology. Metabolites are the end products of all cellular regulatory processes and reflect the ultimate outcome of potential changes suggested by genomics and proteomics caused by an environmental stimulus or genetic modification. Following on the heels of genomics, transcriptomics, and proteomics, metabolomics has become an inevitable part of complete-system biology because none of the lower "-omics" alone provide direct information about how changes in mRNA or protein are coupled to changes in biological function. The challenges are much greater than those encountered in genomics because of the greater number of metabolites and the greater diversity of their chemical structures and properties. To meet these challenges, much developmental work is needed, including (1) methodologies for unbiased extraction of metabolites and subsequent quantification, (2) algorithms for systematic identification of metabolites, (3) expertise and competency in handling a large amount of information (data set), and (4) integration of metabolomics with other "omics" and data mining (implication of the information). This article reviews the project accomplishments.

  7. Increased cellular levels of spermidine or spermine are required for optimal DNA synthesis in lymphocytes activated by concanavalin A.

    Science.gov (United States)

    Fillingame, R H; Jorstad, C M; Morris, D R

    1975-01-01

    There are large increases in cellular levels of the polyamines spermidine and spermine in lymphocytes induced to transform by concanavalin A. The anti-leukemic agent methylglyoxal bis(guanylhydrazone) (MGBG) blocks synthesis of these polyamines by inhibiting S-adenosylmethionine decarboxylase. Previous results showed that when cells are activated in the presence of MGBG the synthesis and processing of RNA, as well as protein synthesis, proceed as in the absence of the drug. In contrast, the incorporation of [methyl-3H]thymidine into DNA and the rate of entry of the cells into mitosis are inhibited by 60% in the presence of MGBG. Several experiments suggest that MGBG inhibits cell proliferation by directly blocking polyamine synthesis and not by an unrelated pharmacological effect: (1) the inhibitory action of MGBG is reversed by exogenously added spermidine or spermine; (2) inhibition of DNA synthesis by MGBG shows the same dose-response curve as does inhibition of spermidine and spermine synthesis; and (3) if MGBG is added to cells which have been allowed to accumulate their maximum complement of polyamines, there is no inhibition of thymidine incorporation. MGBG-treated and control cultures initiate DNA synthesis at the same time and show the same percentage of labeled cells by autoradiography. Therefore, it appears that in the absence of increased cellular levels of polyamines, lymphocytes progress normally from G0 through G1 and into S-phase. Furthermore, these experiments suggest that the increased levels of spermidine and spermine generally seen in rapidly proliferating eukaryotic systems are necessary for enhanced rates of DNA replication. PMID:1060087

  8. Model Development and Process Analysis for Lean Cellular Design Planning in Aerospace Assembly and Manufacturing

    Science.gov (United States)

    Hilburn, Monty D.

    Successful lean manufacturing and cellular manufacturing execution relies upon a foundation of leadership commitment and strategic planning built upon solid data and robust analysis. The problem for this study was to create and employ a simple lean transformation planning model and review process that could be used to identify the functional support staff resources required to plan and execute lean manufacturing cells within aerospace assembly and manufacturing sites. The lean planning model was developed using available literature for lean manufacturing kaizen best practices and validated through a Delphi panel of lean experts. The resulting model and a standardized review process were used to assess the state of lean transformation planning at five sites of an international aerospace manufacturing and assembly company. The results of the three-day, on-site review were compared with baseline plans collected from each of the five sites to determine if there were differences. The plans were analyzed with focus on three critical areas of lean planning: the number and type of manufacturing cells identified; the number, type, and duration of planned lean and continuous kaizen events; and the quantity and type of functional staffing resources planned to support the kaizen schedule. Summarized data of the baseline and on-site reviews were analyzed with descriptive statistics. ANOVAs and paired t-tests at the 95% confidence level were conducted on the means of the data sets to determine if null hypotheses related to cell, kaizen event, and support resources could be rejected. The research found significant differences between lean transformation plans developed by site leadership and plans developed utilizing the structured, on-site review process and lean transformation planning model. The null hypothesis that there was no difference between the means of pre-review and on-site cell counts was rejected, as was the null hypothesis that there was no significant difference in kaizen event plans. These

  9. Selected papers from the Fourth Annual q-bio Conference on Cellular Information Processing.

    Science.gov (United States)

    Nemenman, Ilya; Faeder, James R; Hlavacek, William S; Jiang, Yi; Wall, Michael E; Zilman, Anton

    2011-10-01

    This special issue consists of 11 original papers that elaborate on work presented at the Fourth Annual q-bio Conference on Cellular Information Processing, which was held on the campus of St John's College in Santa Fe, New Mexico, USA, 11-14 August 2010. Now in its fourth year, the q-bio conference has changed considerably over time. It is now well established and a major event in systems biology. The 2010 conference saw attendees from all continents (except Antarctica!) sharing novel results and participating in lively discussions at both the oral and poster sessions. The conference was oversubscribed and grew to 27 contributed talks, 16 poster spotlights and 137 contributed posters. We deliberately decreased the number of invited speakers to 21 to leave more space for contributed presentations, and the attendee feedback confirmed that the choice was a success. Although the q-bio conference has grown and matured, it has remained true to the original goal of being an intimate and dynamic event that brings together modeling, theory and quantitative experimentation for the study of cell regulation and information processing. Funded in part by a grant from NIGMS and by DOE funds through the Los Alamos National Laboratory Directed Research and Development program, the conference has continued to exhibit youth and vigor by attracting (and partially supporting) over 100 undergraduate, graduate and postdoctoral researchers. The associated q-bio summer school, which precedes the conference each year, further emphasizes the development of junior scientists and makes q-bio a singular event in its impact on the future of quantitative biology. In addition to an increased international presence, the conference has notably diversified its demographic representation within the USA, including increased participation from the southeastern corner of the country. One big change in the conference this year is our new publication partner, Physical Biology. Although we are very

  10. USAF Logistics Process Optimization Study for the Aircraft Asset Sustainment Process. Volume 3. Future to be Asset Sustainment Process Model

    National Research Council Canada - National Science Library

    Adamson, Anthony

    1998-01-01

    .... It is published as three separate volumes. Volume I, USAF Logistics Process Optimization Study for the Aircraft Asset Sustainment Process -- Phase II Report, discusses the result and cost/benefit analysis of testing three initiatives...

  11. On-line optimal control improves gas processing

    International Nuclear Information System (INIS)

    Berkowitz, P.N.; Papadopoulos, M.N.

    1992-01-01

    This paper reports that the authors' companies jointly funded the first phase of a gas processing liquids optimization project with the specific purposes to: improve the return from processing natural gas liquids, develop sets of control algorithms, make available a low-cost solution suitable for small to medium-sized gas processing plants, and test and demonstrate the feasibility of on-line control. The ARCO Willard CO2 gas recovery processing plant was chosen as the initial test site to demonstrate the application of multivariable on-line optimal control. One objective of this project is to support an R&D effort to provide a standardized solution to the various types of gas processing plants in the U.S. Processes involved in these gas plants include cryogenic separations, demethanization, lean oil absorption, fractionation and gas treating. Next, the proposed solutions had to be simple yet comprehensive enough to allow an operator to maintain product specifications while operating over a wide range of gas input flow and composition. This had to be a supervisory system that remained on-line more than 95% of the time, and achieved reduced plant operating variability and improved variable cost control. It took more than a year to study various gas processes and to develop a control approach before a real application was finally exercised. An initial process for C2 and CO2 recoveries was chosen

  12. Equipment reliability process improvement and preventive maintenance optimization

    International Nuclear Information System (INIS)

    Darragi, M.; Georges, A.; Vaillancourt, R.; Komljenovic, D.; Croteau, M.

    2004-01-01

    The Gentilly-2 Nuclear Power Plant wants to optimize its preventive maintenance program through an Integrated Equipment Reliability Process. All equipment reliability related activities should be reviewed and optimized in a systematic approach, especially for aging plants such as G2. This new approach has to be founded on best-practice methods, with the purpose of rationalizing the preventive maintenance program and monitoring the performance of on-site systems, structures and components (SSC). A rational preventive maintenance strategy is based on optimized task scopes and frequencies depending on their applicability, critical effects on system safety and plant availability, as well as cost-effectiveness. The efficiency of the preventive maintenance strategy is systematically monitored through degradation indicators. (author)

  13. Decomposition based parallel processing technique for efficient collaborative optimization

    International Nuclear Information System (INIS)

    Park, Hyung Wook; Kim, Sung Chan; Kim, Min Soo; Choi, Dong Hoon

    2000-01-01

    In practical design studies, most designers solve multidisciplinary problems with a complex design structure. These multidisciplinary problems have hundreds of analyses and thousands of variables. The sequence of processes used to solve these problems affects the speed of the total design cycle. Thus it is very important for designers to reorder the original design processes to minimize total cost and time. This is accomplished by decomposing a large multidisciplinary problem into several MultiDisciplinary Analysis SubSystems (MDASS) and processing them in parallel. This paper proposes a new strategy for parallel decomposition of multidisciplinary problems to raise design efficiency by using a genetic algorithm, and shows the relationship between decomposition and Multidisciplinary Design Optimization (MDO) methodology
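
    As a generic illustration of using a genetic algorithm for sequencing design processes, the sketch below reorders a small hypothetical design structure matrix so as to minimize feedback couplings; the matrix, GA settings and fitness definition are illustrative and do not reproduce the paper's MDASS decomposition scheme.

```python
# Sketch: a small permutation GA that re-sequences design tasks to minimize
# feedback couplings in a design structure matrix (DSM). The DSM, GA settings
# and fitness definition are illustrative only.
import random

random.seed(0)

# DSM[i][j] = 1 means task i needs the output of task j.
DSM = [
    [0, 1, 0, 0, 0, 1],
    [0, 0, 0, 1, 0, 0],
    [1, 0, 0, 0, 0, 0],
    [0, 0, 1, 0, 0, 0],
    [0, 1, 0, 1, 0, 0],
    [0, 0, 0, 0, 1, 0],
]
N = len(DSM)

def feedbacks(order):
    """Count dependencies pointing at tasks scheduled later (feedback loops)."""
    pos = {task: p for p, task in enumerate(order)}
    return sum(1 for i in range(N) for j in range(N)
               if DSM[i][j] and pos[j] > pos[i])

def order_crossover(p1, p2):
    """Order crossover (OX): keep a slice of p1, fill the rest from p2."""
    a, b = sorted(random.sample(range(N), 2))
    child = [None] * N
    child[a:b] = p1[a:b]
    fill = [g for g in p2 if g not in child]
    for idx in list(range(0, a)) + list(range(b, N)):
        child[idx] = fill.pop(0)
    return child

def mutate(order, rate=0.2):
    if random.random() < rate:
        i, j = random.sample(range(N), 2)
        order[i], order[j] = order[j], order[i]
    return order

pop = [random.sample(range(N), N) for _ in range(40)]
for gen in range(100):
    pop.sort(key=feedbacks)
    next_pop = pop[:4]                        # elitism
    while len(next_pop) < len(pop):
        p1, p2 = random.sample(pop[:20], 2)   # truncation selection
        next_pop.append(mutate(order_crossover(p1, p2)))
    pop = next_pop

best = min(pop, key=feedbacks)
print("best sequence:", best, "feedback couplings:", feedbacks(best))
```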

  14. Expert system and process optimization techniques for real-time monitoring and control of plasma processes

    Science.gov (United States)

    Cheng, Jie; Qian, Zhaogang; Irani, Keki B.; Etemad, Hossein; Elta, Michael E.

    1991-03-01

    To meet the ever-increasing demand of the rapidly growing semiconductor manufacturing industry, it is critical to have a comprehensive methodology integrating techniques for process optimization, real-time monitoring, and adaptive process control. To this end we have accomplished an integrated knowledge-based approach combining the latest expert system technology, machine learning methods, and traditional statistical process control (SPC) techniques. This knowledge-based approach is advantageous in that it makes it possible for the task of process optimization and adaptive control to be performed consistently and predictably. Furthermore, this approach can be used to construct high-level and qualitative descriptions of processes and thus make the process behavior easy to monitor, predict and control. Two software packages, RIST (Rule Induction and Statistical Testing) and KARSM (Knowledge Acquisition from Response Surface Methodology), have been developed and incorporated with two commercially available packages, G2 (a real-time expert system) and ULTRAMAX (a tool for sequential process optimization).

  15. A Technical Survey on Optimization of Processing Geo Distributed Data

    Science.gov (United States)

    Naga Malleswari, T. Y. J.; Ushasukhanya, S.; Nithyakalyani, A.; Girija, S.

    2018-04-01

    With growing cloud services and technology, there is growth in geographically distributed data centers that store large amounts of data. Analysis of geo-distributed data is required in various services for data processing, storage of essential information, etc.; processing this geo-distributed data and performing analytics on it is a challenging task. The distributed data processing is accompanied by issues in storage, computation and communication. The key issues to be dealt with are time efficiency, cost minimization, and utility maximization. This paper describes various optimization methods like end-to-end multiphase, G-MR, etc., using techniques like Map-Reduce, CDS (Community Detection based Scheduling), ROUT, Workload-Aware Scheduling, SAGE, and AMP (Ant Colony Optimization) to handle these issues. In this paper the various optimization methods and techniques used are analyzed. It has been observed that end-to-end multiphase achieves time efficiency; cost minimization concentrates on achieving quality of service and reducing computation and communication costs. SAGE achieves performance improvement in processing geo-distributed data sets.

  16. Optimal integration of organic Rankine cycles with industrial processes

    International Nuclear Information System (INIS)

    Hipólito-Valencia, Brígido J.; Rubio-Castro, Eusiel; Ponce-Ortega, José M.; Serna-González, Medardo; Nápoles-Rivera, Fabricio; El-Halwagi, Mahmoud M.

    2013-01-01

    Highlights: • An optimization approach for heat integration is proposed. • A new general superstructure for heat integration is proposed. • Heat process streams are simultaneously integrated with an organic Rankine cycle. • Better results can be obtained respect to other previously reported methodologies. - Abstract: This paper presents a procedure for simultaneously handling the problem of optimal integration of regenerative organic Rankine cycles (ORCs) with overall processes. ORCs may allow the recovery of an important fraction of the low-temperature process excess heat (i.e., waste heat from industrial processes) in the form of mechanical energy. An integrated stagewise superstructure is proposed for representing the interconnections and interactions between the HEN and ORC for fixed data of process streams. Based on the integrated superstructure, the optimization problem is formulated as a mixed integer nonlinear programming problem to simultaneously account for the capital and operating costs including the revenue from the sale of the shaft power produced by the integrated system. The application of this method is illustrated with three example problems. Results show that the proposed procedure provides significantly better results than an earlier developed method for discovering optimal integrated systems using a sequential approach, due to the fact that it accounts simultaneously for the tradeoffs between the capital and operating costs as well as the sale of the produced energy. Also, the proposed method is an improvement over the previously reported methods for solving the synthesis problem of heat exchanger networks without the option of integration with an ORC (i.e., stand-alone heat exchanger networks)

  17. Importance of design optimization of gamma processing plants

    International Nuclear Information System (INIS)

    George, Jain Reji

    2014-01-01

    Radiation processing of food commodities using ionizing radiation is well established worldwide. In India too, novel designs are coming up for food irradiation as well as for multiproduct irradiation. It has been observed that even though the designs of the product movement systems excel, some fail in the actual purpose for which they were made. In such situations it is difficult to achieve effective dose delivery by controlling the process parameters, or even by modifying the source activity distribution, without compromising other aspects such as throughput. It is therefore essential to arrive at an optimization of all components of an irradiator system, such as the radiation source geometry, the source-product geometry and the protective barriers. Optimization of the various parameters can be done by modeling and analysis of the design

  18. Optimization of cutting parameters for machining time in turning process

    Science.gov (United States)

    Mavliutov, A. R.; Zlotnikov, E. G.

    2018-03-01

    This paper describes the most effective methods for nonlinear constrained optimization of cutting parameters in the turning process. Among them are the Linearization Programming Method with the Dual-Simplex algorithm, the Interior Point method, and the Augmented Lagrangian Genetic Algorithm (ALGA). Each of them is tested on an actual example – the minimization of machining time in the turning process. The computation was conducted in the MATLAB environment. The comparative results obtained from the application of these methods show that the optimal values of the linearized objective and the original function are the same. ALGA gives sufficiently accurate values; however, when the algorithm uses the Hybrid function with the Interior Point algorithm, the resulting values have the maximal accuracy.
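
    A minimal sketch of this kind of constrained cutting-parameter optimization is given below, using a textbook turning-time model with illustrative surface-roughness and cutting-power constraints solved by SciPy's SLSQP; the constants, bounds and limits are assumptions, not the paper's data.

```python
# Sketch: nonlinear constrained minimization of turning machining time,
# T = pi*D*L / (1000*v*f) [min], over cutting speed v [m/min] and feed f [mm/rev].
# The roughness and power constraints and all constants are textbook-style
# illustrative values, not the paper's formulation.
import numpy as np
from scipy.optimize import minimize

D, L = 60.0, 200.0           # workpiece diameter and length [mm]
a_p, r_nose = 2.0, 0.8       # depth of cut [mm], tool nose radius [mm]
k_c = 2000.0                 # specific cutting force [N/mm^2]
Ra_max, P_max = 3.2, 5.0     # roughness limit [um], spindle power limit [kW]

def machining_time(x):
    v, f = x
    return np.pi * D * L / (1000.0 * v * f)

constraints = [
    # Ra ~ 1000*f^2/(32*r_nose) [um] must stay below Ra_max
    {"type": "ineq", "fun": lambda x: Ra_max - 1000.0 * x[1] ** 2 / (32.0 * r_nose)},
    # cutting power k_c*a_p*f*v/60000 [kW] must stay below P_max
    {"type": "ineq", "fun": lambda x: P_max - k_c * a_p * x[1] * x[0] / 60000.0},
]
bounds = [(50.0, 400.0), (0.05, 0.5)]        # allowable v and f ranges

res = minimize(machining_time, x0=[100.0, 0.1], method="SLSQP",
               bounds=bounds, constraints=constraints)
v_opt, f_opt = res.x
print(f"v = {v_opt:.1f} m/min, f = {f_opt:.3f} mm/rev, "
      f"T = {machining_time(res.x):.2f} min")
```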

  19. Intelligent Optimization of a Mixed Culture Cultivation Process

    Directory of Open Access Journals (Sweden)

    Petia Koprinkova-Hristova

    2015-04-01

    Full Text Available In the present paper a neural network approach called "Adaptive Critic Design" (ACD) was applied to the optimal tuning of the set point controllers of the three main substrates (sugar, nitrogen source and dissolved oxygen) for the PHB production process. For approximation of the critic and the controllers, a special kind of recurrent neural network called an Echo State Network (ESN) was used. Its structure allows fast training, which is of crucial importance in on-line applications. The critic network is trained to minimize the temporal difference error using the Recursive Least Squares method. Two approaches - gradient and heuristic - were exploited for training of the controllers. The comparison is made with respect to the achieved improvement of the utility function subject to optimization, as well as with a known expert strategy for controlling the PHB production process.

  20. Simulation of Corrosion Process for Structure with the Cellular Automata Method

    Science.gov (United States)

    Chen, M. C.; Wen, Q. Q.

    2017-06-01

    In this paper, from the mesoscopic point of view, under the assumption that metal corrosion damage evolution is a diffusive process, the cellular automata (CA) method was proposed to simulate numerically the uniform corrosion damage evolution of the outer steel tube of concrete filled steel tubular columns subjected to a corrosive environment, and the effects of corrosive agent concentration, dissolution probability and elapsed etching time on the corrosion damage evolution were also investigated. It was shown that corrosion damage increases nonlinearly with increasing elapsed etching time, and the longer the etching time, the more serious the corrosion damage; different concentrations of corrosive agents had different impacts on the corrosion damage degree of the outer steel tube, but the difference between the impacts was very small; the higher the concentration, the more serious the influence. The greater the dissolution probability, the more serious the corrosion damage of the outer steel tube, but with the increase of dissolution probability, the difference between its impacts on the corrosion damage became smaller and smaller. To validate the present method, corrosion damage measurements were conducted on concrete filled square steel tubular columns (CFSSTCs) sealed at both ends and fully immersed in a simulated acid rain solution, and Faraday’s law was used to predict their theoretical values. Meanwhile, the proposed CA model was applied to the simulation of corrosion damage evolution of the CFSSTCs. The comparison of results from the three methods showed that they were in good agreement, implying that the proposed method for simulating the corrosion damage evolution of concrete filled steel tubular columns is feasible and effective. It opens a new approach to further study and evaluate the corrosion damage, loading capacity and lifetime prediction of concrete filled steel tubular structures.
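
    A minimal sketch of a probabilistic cellular automaton for surface corrosion is given below; the grid, dissolution rule and probability are illustrative assumptions and do not reproduce the authors' rules or their comparison with Faraday's law.

```python
# Sketch: a minimal 2D cellular automaton for uniform surface corrosion.
# A metal cell dissolves with probability p_diss if it touches the solution;
# the grid size, probability and number of steps are illustrative only.
import numpy as np

rng = np.random.default_rng(42)
SOLUTION, METAL = 0, 1
rows, cols, p_diss, n_steps = 60, 120, 0.15, 200

grid = np.full((rows, cols), METAL, dtype=np.int8)
grid[0, :] = SOLUTION            # top row exposed to the corrosive agent

for step in range(1, n_steps + 1):
    # A metal cell is "exposed" if any von Neumann neighbour is solution.
    sol = grid == SOLUTION
    exposed = np.zeros_like(sol)
    exposed[1:, :] |= sol[:-1, :]
    exposed[:-1, :] |= sol[1:, :]
    exposed[:, 1:] |= sol[:, :-1]
    exposed[:, :-1] |= sol[:, 1:]
    exposed &= grid == METAL
    dissolve = exposed & (rng.random(grid.shape) < p_diss)
    grid[dissolve] = SOLUTION
    if step % 50 == 0:
        corroded = (grid[1:, :] == SOLUTION).mean()
        print(f"step {step:3d}: corroded fraction of the metal region = {corroded:.3f}")
```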

  1. Optimizing The DSSC Fabrication Process Using Lean Six Sigma

    Science.gov (United States)

    Fauss, Brian

    Alternative energy technologies must become more cost effective to achieve grid parity with fossil fuels. Dye sensitized solar cells (DSSCs) are an innovative third generation photovoltaic technology, which is demonstrating tremendous potential to become a revolutionary technology due to recent breakthroughs in cost of fabrication. The study here focused on quality improvement measures undertaken to improve fabrication of DSSCs and enhance process efficiency and effectiveness. Several quality improvement methods were implemented to optimize the seven individual DSSC fabrication process steps. Lean Manufacturing's 5S method successfully increased efficiency in all of the processes. Six Sigma's DMAIC methodology was used to identify and eliminate each of the root causes of defects in the critical titanium dioxide deposition process. These optimizations resulted in the following significant improvements in the production process: 1. fabrication time of the DSSCs was reduced by 54%; 2. fabrication procedures were improved to the extent that all critical defects in the process were eliminated; 3. the quantity of functioning DSSCs fabricated was increased from 17% to 90%.

  2. Research in Mobile Database Query Optimization and Processing

    Directory of Open Access Journals (Sweden)

    Agustinus Borgy Waluyo

    2005-01-01

    Full Text Available The emergence of mobile computing provides the ability to access information at any time and place. However, as mobile computing environments have inherent factors like power, storage, asymmetric communication cost, and bandwidth limitations, efficient query processing and minimum query response time are definitely of great interest. This survey groups a variety of query optimization and processing mechanisms in mobile databases into two main categories, namely: (i) query processing strategy, and (ii) caching management strategy. Query processing includes both pull and push operations (broadcast mechanisms). We further classify push operation into on-demand broadcast and periodic broadcast. Push operation (on-demand broadcast) relates to designing techniques that enable the server to accommodate multiple requests so that the request can be processed efficiently. Push operation (periodic broadcast) corresponds to data dissemination strategies. In this scheme, several techniques to improve the query performance by broadcasting data to a population of mobile users are described. A caching management strategy defines a number of methods for maintaining cached data items in clients' local storage. This strategy considers critical caching issues such as caching granularity, caching coherence strategy and caching replacement policy. Finally, this survey concludes with several open issues relating to mobile query optimization and processing strategy.

  3. Simulation based optimization on automated fibre placement process

    Science.gov (United States)

    Lei, Shi

    2018-02-01

    In this paper, a software simulation (Autodesk TruPlan & TruFiber) based method is proposed to optimize the automated fibre placement (AFP) process. Different types of manufacturability analysis are introduced to predict potential defects. Advanced fibre path generation algorithms are compared with respect to geometrically different parts. Major manufacturing data have been taken into consideration prior to tool path generation to achieve a high manufacturing success rate.

  4. A model for optimization of process integration investments under uncertainty

    International Nuclear Information System (INIS)

    Svensson, Elin; Stroemberg, Ann-Brith; Patriksson, Michael

    2011-01-01

    The long-term economic outcome of energy-related industrial investment projects is difficult to evaluate because of uncertain energy market conditions. In this article, a general, multistage, stochastic programming model for the optimization of investments in process integration and industrial energy technologies is proposed. The problem is formulated as a mixed-binary linear programming model where uncertainties are modelled using a scenario-based approach. The objective is to maximize the expected net present value of the investments, which enable heat savings and decreased energy imports or increased energy exports at an industrial plant. The proposed modelling approach enables long-term planning of industrial, energy-related investments through the simultaneous optimization of immediate and later decisions. The stochastic programming approach is also suitable for modelling possibly complex process integration constraints. The general model formulation presented here is a suitable basis for more specialized case studies dealing with optimization of investments in energy efficiency. -- Highlights: → Stochastic programming approach to long-term planning of process integration investments. → Extensive mathematical model formulation. → Multi-stage investment decisions and scenario-based modelling of uncertain energy prices. → Results illustrate how investments made now affect later investment and operation opportunities. → Approach for evaluation of robustness with respect to variations in probability distribution.

  5. On the optimal design of the disassembly and recovery processes.

    Science.gov (United States)

    Xanthopoulos, A; Iakovou, E

    2009-05-01

    This paper tackles the problem of the optimal design of the recovery processes of the end-of-life (EOL) electric and electronic products, with a special focus on the disassembly issues. The objective is to recover as much ecological and economic value as possible, and to reduce the overall produced quantities of waste. In this context, a medium-range tactical problem is defined and a novel two-phased algorithm is presented for a remanufacturing-driven reverse supply chain. In the first phase, we propose a multicriteria/goal-programming analysis for the identification and the optimal selection of the most 'desirable' subassemblies and components to be disassembled for recovery, from a set of different types of EOL products. In the second phase, a multi-product, multi-period mixed-integer linear programming (MILP) model is presented, which addresses the optimization of the recovery processes, while taking into account explicitly the lead times of the disassembly and recovery processes. Moreover, a simulation-based solution approach is proposed for capturing the uncertainties in reverse logistics. The overall approach leads to an easy-to-use methodology that could support effectively middle level management decisions. Finally, the applicability of the developed methodology is illustrated by its application on a specific case study.

  6. Magnetic manipulation device for the optimization of cell processing conditions.

    Science.gov (United States)

    Ito, Hiroshi; Kato, Ryuji; Ino, Kosuke; Honda, Hiroyuki

    2010-02-01

    Variability in human cell phenotypes makes advances in optimized cell processing necessary for personalized cell therapy. Here we propose a palm-top-sized device to assist in physically manipulating cells for optimizing cell preparations. For the design of such a device, we combined two conventional approaches: multi-well plate formatting and magnetic cell handling using magnetite cationic liposomes (MCLs). In our previous work, we showed labeling applications of MCLs on adhesive cells for various tissue engineering approaches. To feasibly transfer cells in multi-well plates, we here evaluated the magnetic response of MCL-labeled suspension-type cells. The handling of Jurkat cells proved to be faster and more robust compared to MACS (Magnetic Cell Sorting) bead methods. To further confirm our strategy, a prototype palm-top-sized device, the "magnetic manipulation device" (MMD), was designed. In the device, the actual cell transportation efficacy for Jurkat cells was satisfactory. Moreover, as a model of the most widespread clinical cell processing, primary peripheral blood mononuclear cells (PBMCs) from different volunteers were evaluated. With the MMD, individual PBMCs were shown to have different optimum interleukin-2 (IL-2) concentrations for expansion. Such large differences between individual cells indicated that the MMD, the efficient and self-contained support tool we propose, could assist feasible and cost-effective optimization of cell processing in clinical facilities. Copyright (c) 2009 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.

  7. On the optimal design of the disassembly and recovery processes

    International Nuclear Information System (INIS)

    Xanthopoulos, A.; Iakovou, E.

    2009-01-01

    This paper tackles the problem of the optimal design of the recovery processes of the end-of-life (EOL) electric and electronic products, with a special focus on the disassembly issues. The objective is to recover as much ecological and economic value as possible, and to reduce the overall produced quantities of waste. In this context, a medium-range tactical problem is defined and a novel two-phased algorithm is presented for a remanufacturing-driven reverse supply chain. In the first phase, we propose a multicriteria/goal-programming analysis for the identification and the optimal selection of the most 'desirable' subassemblies and components to be disassembled for recovery, from a set of different types of EOL products. In the second phase, a multi-product, multi-period mixed-integer linear programming (MILP) model is presented, which addresses the optimization of the recovery processes, while taking into account explicitly the lead times of the disassembly and recovery processes. Moreover, a simulation-based solution approach is proposed for capturing the uncertainties in reverse logistics. The overall approach leads to an easy-to-use methodology that could support effectively middle level management decisions. Finally, the applicability of the developed methodology is illustrated by its application on a specific case study

  8. Process optimization and evaluation of novel baicalin solid nanocrystals

    Directory of Open Access Journals (Sweden)

    Yue PF

    2013-08-01

    Full Text Available The objective of this study was to prepare baicalin solid nanocrystals (BCN-SNS) to enhance the oral bioavailability of baicalin. A Box–Behnken design approach was used for process optimization. The physicochemical properties and pharmacokinetics of the optimal BCN-SNS were investigated. Multiple linear regression analysis for process optimization revealed that fine BCN-SNS were obtained wherein the optimal values of homogenization pressure (bar), homogenization cycles (cycles), amount of TPGS to drug (w/w), and amount of MCCS to drug (w/w) were 850 bar, 25 cycles, 10%, and 10%, respectively. Transmission electron microscopy and scanning electron microscopy results indicated that no significant aggregation or crystal growth could be observed in the redispersed freeze-dried BCN-SNS. Differential scanning calorimetry and X-ray diffraction results showed that BCN remained in a crystalline state. Dissolution velocity of the freeze-dried BCN-SNS powder was distinctly superior compared to those of the crude powder and physical mixture. The bioavailability of BCN in rats was increased remarkably after oral administration of BCN-SNS (P < 0.05), compared with those of BCN or the physical mixture. The SNS might be a good choice for oral administration of poorly soluble BCN, due to an improvement of the bioavailability and dissolution velocity of BCN-SNS. Keywords: baicalin, solid nanocrystals, optimization, in vivo/vitro evaluation
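
    For readers unfamiliar with the workflow, the sketch below fits the usual second-order response-surface model associated with a Box–Behnken design and picks the best predicted setting; the coded design points and responses are hypothetical, not the study's measurements.

```python
# Sketch: fitting the second-order response-surface model commonly paired with
# a Box-Behnken design and locating the best predicted setting. The coded
# design points and "particle size" responses below are hypothetical.
import numpy as np
from itertools import product

# 3-factor Box-Behnken design in coded units (pressure, cycles, stabilizer ratio)
X = np.array([[-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
              [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
              [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
              [0, 0, 0], [0, 0, 0], [0, 0, 0]], dtype=float)
y = np.array([420, 380, 410, 360, 450, 400, 395, 350,
              440, 415, 390, 370, 330, 335, 328.0])   # hypothetical responses

def quad_terms(x):
    a, b, c = x
    return [1, a, b, c, a * a, b * b, c * c, a * b, a * c, b * c]

M = np.array([quad_terms(x) for x in X])
coef, *_ = np.linalg.lstsq(M, y, rcond=None)          # least-squares quadratic fit

# Search the coded cube for the smallest predicted response.
grid = np.linspace(-1, 1, 21)
best = min(product(grid, repeat=3), key=lambda x: np.dot(quad_terms(x), coef))
print("model coefficients:", np.round(coef, 2))
print("predicted optimum (coded units):", np.round(best, 2),
      "predicted response:", round(float(np.dot(quad_terms(best), coef)), 1))
```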

  9. Urban Growth Modeling Using Cellular Automata with Multi-Temporal Remote Sensing Images Calibrated by the Artificial Bee Colony Optimization Algorithm.

    Science.gov (United States)

    Naghibi, Fereydoun; Delavar, Mahmoud Reza; Pijanowski, Bryan

    2016-12-14

    Cellular Automata (CA) is one of the most common techniques used to simulate the urbanization process. CA-based urban models use transition rules to deliver spatial patterns of urban growth and urban dynamics over time. Determining the optimum transition rules of the CA is a critical step because of the heterogeneity and nonlinearities existing among urban growth driving forces. Recently, new CA models integrated with optimization methods based on swarm intelligence algorithms were proposed to overcome this drawback. The Artificial Bee Colony (ABC) algorithm is an advanced meta-heuristic swarm intelligence-based algorithm. Here, we propose a novel CA-based urban change model that uses the ABC algorithm to extract optimum transition rules. We applied the proposed ABC-CA model to simulate future urban growth in Urmia (Iran) with multi-temporal Landsat images from 1997, 2006 and 2015. Validation of the simulation results was made through statistical methods such as overall accuracy, the figure of merit and total operating characteristics (TOC). Additionally, we calibrated the CA model by ant colony optimization (ACO) to assess the performance of our proposed model versus similar swarm intelligence algorithm methods. We showed that the overall accuracy and the figure of merit of the ABC-CA model are 90.1% and 51.7%, which are 2.9% and 8.8% higher than those of the ACO-CA model, respectively. Moreover, the allocation disagreement of the simulation results for the ABC-CA model is 9.9%, which is 2.9% less than that of the ACO-CA model. Finally, the ABC-CA model also outperforms the ACO-CA model with fewer quantity and allocation errors and slightly more hits.
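
    The core ABC search loop used for this kind of calibration can be sketched as below; the objective is a placeholder calibration error over assumed "transition-rule weights", and the coupling to a CA model, the remote-sensing data and the paper's fitness measure are not reproduced.

```python
# Sketch of the core Artificial Bee Colony (ABC) loop: employed, onlooker and
# scout phases over continuous candidate "transition-rule weights". The
# objective is a placeholder calibration error, not the paper's CA fitness.
import numpy as np

rng = np.random.default_rng(7)
dim, n_sources, limit, n_cycles = 5, 20, 30, 200
lower, upper = -1.0, 1.0
target = np.array([0.3, -0.2, 0.5, 0.1, -0.4])    # assumed "true" weights

def objective(w):                                  # placeholder calibration error
    return float(np.sum((w - target) ** 2))

def fitness(f):                                    # standard ABC fitness transform
    return 1.0 / (1.0 + f)

sources = rng.uniform(lower, upper, (n_sources, dim))
costs = np.array([objective(s) for s in sources])
trials = np.zeros(n_sources, dtype=int)

def try_neighbour(i):
    """Perturb one dimension of source i toward/away from a random partner."""
    k = rng.choice([j for j in range(n_sources) if j != i])
    d = rng.integers(dim)
    cand = sources[i].copy()
    cand[d] += rng.uniform(-1, 1) * (sources[i, d] - sources[k, d])
    cand[d] = np.clip(cand[d], lower, upper)
    c = objective(cand)
    if c < costs[i]:                               # greedy selection
        sources[i], costs[i], trials[i] = cand, c, 0
    else:
        trials[i] += 1

for cycle in range(n_cycles):
    for i in range(n_sources):                     # employed bee phase
        try_neighbour(i)
    probs = fitness(costs) / fitness(costs).sum()  # onlooker phase
    for _ in range(n_sources):
        try_neighbour(rng.choice(n_sources, p=probs))
    for i in np.where(trials > limit)[0]:          # scout phase
        sources[i] = rng.uniform(lower, upper, dim)
        costs[i], trials[i] = objective(sources[i]), 0

best = np.argmin(costs)
print("best weights:", np.round(sources[best], 3), "error:", round(costs[best], 6))
```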

  10. Data Mining Process Optimization in Computational Multi-agent Systems

    OpenAIRE

    Kazík, O.; Neruda, R. (Roman)

    2015-01-01

    In this paper, we present an agent-based solution to the meta-learning problem, which focuses on the optimization of data mining processes. We exploit the framework of computational multi-agent systems, in which various meta-learning problems have already been studied, e.g. parameter-space search or simple method recommendation. In this paper, we examine the effect of data preprocessing for machine learning problems. We perform a set of experiments in the search-space of data mining processes which is...

  11. Hydrologic Process-oriented Optimization of Electrical Resistivity Tomography

    Science.gov (United States)

    Hinnell, A.; Bechtold, M.; Ferre, T. A.; van der Kruk, J.

    2010-12-01

    Electrical resistivity tomography (ERT) is commonly used in hydrologic investigations. Advances in joint and coupled hydrogeophysical inversion have enhanced the quantitative use of ERT to construct and condition hydrologic models (i.e. identify hydrologic structure and estimate hydrologic parameters). However the selection of which electrical resistivity data to collect and use is often determined by a combination of data requirements for geophysical analysis, intuition on the part of the hydrogeophysicist and logistical constraints of the laboratory or field site. One of the advantages of coupled hydrogeophysical inversion is the direct link between the hydrologic model and the individual geophysical data used to condition the model. That is, there is no requirement to collect geophysical data suitable for independent geophysical inversion. The geophysical measurements collected can be optimized for estimation of hydrologic model parameters rather than to develop a geophysical model. Using a synthetic model of drip irrigation we evaluate the value of individual resistivity measurements to describe the soil hydraulic properties and then use this information to build a data set optimized for characterizing hydrologic processes. We then compare the information content in the optimized data set with the information content in a data set optimized using a Jacobian sensitivity analysis.

  12. Thermodynamic and economic optimization of LNG mixed refrigerant processes

    International Nuclear Information System (INIS)

    Wang, Mengyu; Khalilpour, Rajab; Abbas, Ali

    2014-01-01

    Highlights: • We study performance and cost optimization of C3MR and DMR processes. • A new economic objective function is proposed to reduce both compression work and equipment size. • The comparison of C3MR and DMR processes is based on process configuration, performance, and cost. - Abstract: Natural gas liquefaction processes are energy and cost intensive. This paper performs thermodynamic and economic optimization of the mid-scale mixed refrigerant cycles, including the propane precooled mixed refrigerant (C3MR) and dual mixed refrigerant (DMR) processes. Four different objective functions are selected in this study: total shaft work consumption, total cost investment (TCI), total annualized cost (TAC), and total capital cost of compressors and main cryogenic heat exchangers (MCHEs). Total cost investment (TCI) is a function of two key variables: shaft work (W) and the overall heat transfer coefficient and area (UA) of the MCHEs. It is proposed for reducing energy consumption while simultaneously minimizing total capital expenditure (CAPEX) and operating expenditure (OPEX). The total shaft work objective function can result in a 44.5% reduction of shaft work for C3MR and a 48.6% reduction for DMR compared to their baseline values, but an infinitely high UA of the MCHEs. Optimal results show that the total capital cost of compressors and MCHEs is more suitable than the other objective functions for the objective of reducing both shaft work and UA. It reduces specific power by 14.5% for C3MR and 26.7% for DMR while achieving lower UA values than the baselines. In addition, TCI and TAC can also reduce a certain amount of total shaft work at a finitely increased UA

  13. Optimization of Sunflower Oil Transesterification Process Using Sodium Methoxide

    Directory of Open Access Journals (Sweden)

    Sara KoohiKamali

    2012-01-01

    Full Text Available In this study, the methanolysis process of sunflower oil was investigated to obtain a high methyl ester (biodiesel) content using sodium methoxide. To reach the best process conditions, a central composite design (CCD) through response surface methodology (RSM) was employed. The optimal conditions predicted were a reaction time of 60 min, an excess stoichiometric amount of alcohol to oil ratio of 25% w/w and a catalyst content of 0.5% w/w, which lead to the highest methyl ester content (100% w/w). The methyl ester content of the mixture from gas chromatography (GC) analysis was compared to that of the optimum point. Results confirmed that there was no significant difference between the fatty acid methyl ester content of sunflower oil produced under the optimized conditions and the experimental value (P≥0.05). Furthermore, some fuel specifications of the resultant biodiesel were tested according to American Society for Testing and Materials (ASTM) methods. The outcome showed that the methyl ester mixture produced under the optimized conditions met most of the important biodiesel specifications recommended in the ASTM D 6751 requirements. Thus, the sunflower oil methyl esters resulting from this study could be a suitable alternative to petroleum diesel.

  14. A Taguchi approach on optimal process control parameters for HDPE pipe extrusion process

    Science.gov (United States)

    Sharma, G. V. S. S.; Rao, R. Umamaheswara; Rao, P. Srinivasa

    2017-06-01

    High-density polyethylene (HDPE) pipes find versatile applicability for the transportation of water, sewage and slurry from one place to another. Hence, these pipes must withstand considerable pressure from the fluid they carry. The present work entails the optimization of the withstand pressure of HDPE pipes using the Taguchi technique. The traditional heuristic methodology stresses a trial-and-error approach and relies heavily upon the accumulated experience of the process engineers for determining the optimal process control parameters. This results in less-than-optimal settings. Hence, there arises a need to determine optimal process control parameters for the pipe extrusion process, which can ensure robust pipe quality and process reliability. In the proposed optimization strategy, design of experiments (DoE) is conducted wherein different control parameter combinations are analyzed by considering multiple setting levels of each control parameter. The concept of the signal-to-noise (S/N) ratio is applied and ultimately optimum values of the process control parameters are obtained: a pushing zone temperature of 166 °C, a dimmer speed of 8 rpm, and a die head temperature of 192 °C. A confirmation experimental run was also conducted to verify the analysis, and the results agreed with the main experimental findings; the withstand pressure showed a significant improvement from 0.60 to 1.004 MPa.
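
    A compact sketch of the Taguchi signal-to-noise calculation behind such an analysis, using the larger-the-better form appropriate for withstand pressure; the factor levels and replicate values here are made up for illustration:

    ```python
    import numpy as np

    def sn_larger_is_better(y):
        """Taguchi larger-the-better S/N ratio: -10*log10(mean(1/y^2))."""
        y = np.asarray(y, dtype=float)
        return -10.0 * np.log10(np.mean(1.0 / y**2))

    # Hypothetical withstand-pressure replicates (MPa) for three levels of one factor
    levels = {"160 C": [0.61, 0.65], "166 C": [0.98, 1.00], "172 C": [0.80, 0.82]}
    sn = {lvl: sn_larger_is_better(vals) for lvl, vals in levels.items()}
    best = max(sn, key=sn.get)        # level with the highest S/N ratio wins
    print(sn, "-> best level:", best)
    ```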

  15. Training practices of cell processing laboratory staff: analysis of a survey by the Alliance for Harmonization of Cellular Therapy Accreditation.

    Science.gov (United States)

    Keever-Taylor, Carolyn A; Slaper-Cortenbach, Ineke; Celluzzi, Christina; Loper, Kathy; Aljurf, Mahmoud; Schwartz, Joseph; Mcgrath, Eoin; Eldridge, Paul

    2015-12-01

    Methods for processing products used for hematopoietic progenitor cell (HPC) transplantation must ensure their safety and efficacy. Personnel training and ongoing competency assessment are critical to this goal. Here we present results from a global survey of methods used by a diverse array of cell processing facilities for the initial training and ongoing competency assessment of key personnel. The Alliance for Harmonisation of Cellular Therapy Accreditation (AHCTA) created a survey to identify facility type, location, activity, personnel, and methods used for training and competency. A survey link was disseminated through organizations represented in AHCTA to processing facilities worldwide. Responses were tabulated and analyzed as a percentage of total responses and as a percentage of responses by region group. Most facilities were based at academic medical centers or hospitals. Facilities with a broad range of activities, product sources and processing procedures were represented. Facilities reported using a combination of training and competency methods, although some methods predominated. The cellular sources used differed between training and competency assessment and also differed based on the frequency of procedures performed. Most facilities had responsibilities for procedures in addition to processing, for which training and competency methods differed. Although regional variation was observed, training and competency requirements were generally consistent. Survey data showed the use of a variety of training and competency methods, but some methods predominated, suggesting their utility. These results could help new and established facilities in making decisions for their own training and competency programs. Copyright © 2015 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.

  16. Metamodeling and optimization of the THF process with pulsating pressure

    Science.gov (United States)

    Bucconi, Marco; Strano, Matteo

    2018-05-01

    Tube hydroforming is a process used in various applications to form a tube into a desired complex shape by combining internal pressure, which provides the stress required to yield the material, and axial feeding, which helps the material flow towards the bulging zone. Many studies have demonstrated how wrinkling and bursting defects can be severely reduced by means of a pulsating pressure, and how the so-called hammering hydroforming enhances the formability of the material. The definition of the optimum pressure and axial feeding profiles represents a daunting challenge in the design phase of the hydroforming operation for a new part. The quality of the formed part is highly dependent on the amplitude and the peak value of the pulsating pressure, along with the axial stroke. This paper reports research, conducted by means of explicit finite element simulations of a hammering THF operation and metamodeling techniques, aimed at optimizing the process parameters for the production of a complex part. The improved formability is explored for different factors and an optimization strategy is used to determine the most convenient pressure and axial feed profile curves for the hammering THF process of the examined part. It is shown how the pulsating pressure allows the energy input of the process to be minimized while still respecting the final quality requirements.
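
    A schematic of a metamodel-based workflow of this kind, assuming a handful of finite element runs has already been collected; the polynomial surrogate, the grid search and all numbers below are stand-ins for whatever metamodel, optimizer and data the authors actually used:

    ```python
    import numpy as np

    # Hypothetical FE results: (pressure amplitude [MPa], axial feed [mm]) -> wall thinning [%]
    samples = np.array([[10, 2], [10, 6], [20, 2], [20, 6], [15, 4],
                        [15, 2], [15, 6], [10, 4], [20, 4]], dtype=float)
    thinning = np.array([28.0, 22.0, 25.0, 18.0, 20.5, 26.0, 19.5, 24.0, 21.0])

    def design_matrix(X):
        """Full quadratic surrogate in the two process parameters."""
        p, f = X.T
        return np.column_stack([np.ones(len(X)), p, f, p*f, p**2, f**2])

    coef, *_ = np.linalg.lstsq(design_matrix(samples), thinning, rcond=None)

    # Search the surrogate on a fine grid for the parameter pair minimizing thinning
    P, F = np.meshgrid(np.linspace(10, 20, 51), np.linspace(2, 6, 41))
    grid = np.column_stack([P.ravel(), F.ravel()])
    pred = design_matrix(grid) @ coef
    best = grid[np.argmin(pred)]
    print("Surrogate optimum (pressure amplitude, axial feed):", best)
    ```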

  17. Optimal and adaptive methods of processing hydroacoustic signals (review)

    Science.gov (United States)

    Malyshkin, G. S.; Sidel'nikov, G. B.

    2014-09-01

    Different methods of optimal and adaptive processing of hydroacoustic signals for multipath propagation and scattering are considered. Advantages and drawbacks of the classical adaptive (Capon, MUSIC, and Johnson) algorithms and "fast" projection algorithms are analyzed for the case of multipath propagation and scattering of strong signals. The classical optimal approaches to detecting multipath signals are presented. A mechanism of controlled normalization of strong signals is proposed to automatically detect weak signals. The results of simulating the operation of different detection algorithms for a linear equidistant array under multipath propagation and scattering are presented. An automatic detector is analyzed, which is based on classical or fast projection algorithms, which estimates the background proceeding from median filtering or the method of bilateral spatial contrast.
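
    For readers unfamiliar with the Capon method mentioned above, a bare-bones spatial spectrum estimate for a linear equidistant array looks roughly like this; the array geometry, snapshots and diagonal-loading value are synthetic, not taken from the review:

    ```python
    import numpy as np

    def capon_spectrum(snapshots, n_sensors, d_over_lambda, angles_deg):
        """Capon (MVDR) spatial spectrum P(theta) = 1 / (a^H R^-1 a)."""
        R = snapshots @ snapshots.conj().T / snapshots.shape[1]
        R_inv = np.linalg.inv(R + 1e-3 * np.eye(n_sensors))   # diagonal loading
        spectrum = []
        for theta in np.deg2rad(angles_deg):
            a = np.exp(-2j * np.pi * d_over_lambda * np.arange(n_sensors) * np.sin(theta))
            spectrum.append(1.0 / np.real(a.conj() @ R_inv @ a))
        return np.array(spectrum)

    # Synthetic example: one plane wave from 20 degrees plus noise, 8-sensor array
    rng = np.random.default_rng(0)
    n, snaps = 8, 200
    steer = np.exp(-2j * np.pi * 0.5 * np.arange(n) * np.sin(np.deg2rad(20)))
    X = np.outer(steer, rng.normal(size=snaps)) \
        + 0.1 * (rng.normal(size=(n, snaps)) + 1j * rng.normal(size=(n, snaps)))
    angles = np.arange(-90, 91)
    P = capon_spectrum(X, n, 0.5, angles)
    print("Estimated arrival angle:", angles[np.argmax(P)])
    ```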

  18. Where should I send it? Optimizing the submission decision process.

    Directory of Open Access Journals (Sweden)

    Santiago Salinas

    How do scientists decide where to submit manuscripts? Many factors influence this decision, including prestige, acceptance probability, turnaround time, target audience, fit, and impact factor. Here, we present a framework for evaluating where to submit a manuscript based on the theory of Markov decision processes. We derive two models, one in which an author seeks to maximize citations and another in which that goal is balanced by either minimizing the number of resubmissions or the total time in review. We parameterize the models with data on acceptance probability, submission-to-decision times, and impact factors for 61 ecology journals. We find that submission sequences beginning with Ecology Letters, Ecological Monographs, or PLOS ONE could be optimal depending on the importance given to time to acceptance or number of resubmissions. This analysis provides some guidance on where to submit a manuscript given the individual-specific values assigned to these disparate objectives.
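
    A toy version of the trade-off the authors model: for a fixed submission sequence, accumulate the expected citations (here proxied by impact factor) and the expected time in review. The acceptance probabilities, impact factors and review times below are invented and are not the paper's parameterization:

    ```python
    def evaluate_sequence(journals):
        """journals: list of (accept_prob, impact_factor, review_months).
        Returns (expected_impact, expected_months_in_review) when submitting in the
        given order until acceptance or until the list is exhausted."""
        p_reach = 1.0                  # probability the manuscript is still unaccepted
        exp_impact = exp_time = 0.0
        for p_acc, impact, months in journals:
            exp_time += p_reach * months            # every visited journal costs review time
            exp_impact += p_reach * p_acc * impact  # credit if accepted here
            p_reach *= (1.0 - p_acc)
        return exp_impact, exp_time

    seq_a = [(0.1, 18.0, 2.5), (0.3, 6.0, 3.0), (0.7, 3.0, 2.0)]   # "aim high first"
    seq_b = [(0.7, 3.0, 2.0), (0.3, 6.0, 3.0)]                      # "accept fast"
    print(evaluate_sequence(seq_a), evaluate_sequence(seq_b))
    ```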

  19. Algorithm for cellular reprogramming.

    Science.gov (United States)

    Ronquist, Scott; Patterson, Geoff; Muir, Lindsey A; Lindsly, Stephen; Chen, Haiming; Brown, Markus; Wicha, Max S; Bloch, Anthony; Brockett, Roger; Rajapakse, Indika

    2017-11-07

    The day we understand the time evolution of subcellular events at a level of detail comparable to physical systems governed by Newton's laws of motion seems far away. Even so, quantitative approaches to cellular dynamics add to our understanding of cell biology. With data-guided frameworks we can develop better predictions about, and methods for, control over specific biological processes and system-wide cell behavior. Here we describe an approach for optimizing the use of transcription factors (TFs) in cellular reprogramming, based on a device commonly used in optimal control. We construct an approximate model for the natural evolution of a cell-cycle-synchronized population of human fibroblasts, based on data obtained by sampling the expression of 22,083 genes at several time points during the cell cycle. To arrive at a model of moderate complexity, we cluster gene expression based on division of the genome into topologically associating domains (TADs) and then model the dynamics of TAD expression levels. Based on this dynamical model and additional data, such as known TF binding sites and activity, we develop a methodology for identifying the top TF candidates for a specific cellular reprogramming task. Our data-guided methodology identifies a number of TFs previously validated for reprogramming and/or natural differentiation and predicts some potentially useful combinations of TFs. Our findings highlight the immense potential of dynamical models, mathematics, and data-guided methodologies for improving strategies for control over biological processes. Copyright © 2017 the Author(s). Published by PNAS.
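
    A highly simplified sketch of the control framing described above: model TAD-level expression with linear dynamics x_{k+1} = A x_k + B u_k and score each candidate TF by how close its constant input drives the state to a target cell type. The matrices and dimensions below are random placeholders, not the authors' data-derived model:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_tads, n_tfs = 50, 10
    A = 0.95 * np.eye(n_tads) + 0.01 * rng.normal(size=(n_tads, n_tads))  # natural dynamics
    B = rng.normal(size=(n_tads, n_tfs))        # column j = effect of TF j on TAD expression
    x0 = rng.normal(size=n_tads)                # fibroblast-like starting state
    x_target = rng.normal(size=n_tads)          # target cell-type state

    def score_tf(j, steps=5):
        """Distance to the target state after `steps` steps with TF j held constantly on."""
        x = x0.copy()
        u = np.zeros(n_tfs)
        u[j] = 1.0
        for _ in range(steps):
            x = A @ x + B @ u
        return np.linalg.norm(x - x_target)

    ranking = sorted(range(n_tfs), key=score_tf)
    print("Top TF candidates (best first):", ranking[:3])
    ```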

  20. Parallel processing based decomposition technique for efficient collaborative optimization

    International Nuclear Information System (INIS)

    Park, Hyung Wook; Kim, Sung Chan; Kim, Min Soo; Choi, Dong Hoon

    2001-01-01

    In practical design studies, most designers solve multidisciplinary problems involving large and complex design systems. These multidisciplinary problems involve hundreds of analyses and thousands of variables. The sequence of processes used to solve these problems affects the speed of the total design cycle. Thus it is very important for designers to reorder the original design processes to minimize the total computational cost. This is accomplished by decomposing the large multidisciplinary problem into several multidisciplinary analysis subsystems (MDASS) and processing them in parallel. This paper proposes a new strategy for the parallel decomposition of multidisciplinary problems to raise design efficiency by using a genetic algorithm, and shows the relationship between decomposition and Multidisciplinary Design Optimization (MDO) methodology

  1. Process Optimization for Valuable Metal Recovery from Dental Amalgam Residues

    Directory of Open Access Journals (Sweden)

    C.M. Parra–Mesa

    2009-07-01

    In this paper, the methodology used for optimizing leaching in a semi-pilot plant is presented. This leaching process was applied to recover valuable metals from dental amalgam residues. A 2³ factorial design was used to characterize the process during the first stage, and in the second stage a central composite rotational design was used to model the percentage of copper dissolved as a function of the nitric acid concentration, leaching time and temperature. This model explained 81% of the response variability, which is considered satisfactory given the complexity of the process kinetics; furthermore, it allowed the definition of the operating conditions for the best copper recovery, which was 99.15% at a temperature of 55 °C, a nitric acid concentration of 30% by weight and a time of 26 hours.

  2. Adjusting process count on demand for petascale global optimization

    KAUST Repository

    Sosonkina, Masha; Watson, Layne T.; Radcliffe, Nicholas R.; Haftka, Rafael T.; Trosset, Michael W.

    2013-01-01

    There are many challenges that need to be met before efficient and reliable computation at the petascale is possible. Many scientific and engineering codes running at the petascale are likely to be memory intensive, which makes thrashing a serious problem for many petascale applications. One way to overcome this challenge is to use a dynamic number of processes, so that the total amount of memory available for the computation can be increased on demand. This paper describes modifications made to the massively parallel global optimization code pVTdirect in order to allow for a dynamic number of processes. In particular, the modified version of the code monitors memory use and spawns new processes if the amount of available memory is determined to be insufficient. The primary design challenges are discussed, and performance results are presented and analyzed.

  3. A novel approach in optimization problem for research reactors fuel plate using a synergy between cellular automata and quasi-simulated annealing methods

    International Nuclear Information System (INIS)

    Barati, Ramin

    2014-01-01

    Highlights: • An innovative optimization technique for multi-objective optimization is presented. • The technique utilizes a combination of CA and quasi-simulated annealing. • Mass and deformation of the fuel plate are considered as objective functions. • The computational burden is significantly reduced compared to classic tools. - Abstract: This paper presents a new and innovative optimization technique utilizing a combination of cellular automata (CA) and quasi-simulated annealing (QSA) as the solver for conceptual design optimization, which is indeed a multi-objective optimization problem. Integrating CA and QSA into a unified optimizer tool has great potential for solving multi-objective optimization problems. Simulating neighborhood effects while taking local information into account (from CA), and accepting transitions based on the decrease of the objective function and the Boltzmann distribution (from QSA) as the transition rule, make this tool effective in multi-objective optimization. Optimization of the fuel plate safety design, while taking into account major goals of conceptual design such as improving reliability and lifetime – which are the most significant elements during shutdown – is a major multi-objective optimization problem. Due to the huge search space of the fuel plate optimization problem, finding the optimum solution with classical methods requires a huge amount of calculation and CPU time. CA models, utilizing local information, require considerably less computation. In this study, minimizing both the mass and the deformation of the fuel plate of a multipurpose research reactor (MPRR) are considered as objective functions. The results, speed, and quality of the proposed method are comparable with those of the genetic algorithm and neural network methods applied to this problem before
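
    A compact sketch of the hybrid update rule described above: each cell is perturbed using neighborhood (CA-style) information, and the move is kept either when the objective decreases or with a Boltzmann probability, as in simulated annealing. The toy plate model, objective and cooling schedule are stand-ins, not the paper's model:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    thickness = np.full((20, 20), 1.0)            # toy fuel-plate thickness field

    def objective(t):
        """Stand-in for mass + deformation: a mass term plus a smoothness penalty."""
        mass = t.sum()
        deform = np.sum(np.abs(np.diff(t, axis=0))) + np.sum(np.abs(np.diff(t, axis=1)))
        return mass + 5.0 * deform

    T = 1.0
    for step in range(2000):
        i, j = rng.integers(0, 20, size=2)
        # CA flavour: propose a value close to the mean of the von Neumann neighborhood
        nbrs = [thickness[(i + di) % 20, (j + dj) % 20]
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]
        proposal = thickness.copy()
        proposal[i, j] = 0.5 * (np.mean(nbrs) + rng.uniform(0.5, 1.0))
        dE = objective(proposal) - objective(thickness)
        if dE < 0 or rng.random() < np.exp(-dE / T):   # quasi-SA acceptance rule
            thickness = proposal
        T *= 0.999                                      # cooling schedule
    print("Final objective:", round(objective(thickness), 2))
    ```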

  4. The Enterprise Derivative Application: Flexible Software for Optimizing Manufacturing Processes

    Energy Technology Data Exchange (ETDEWEB)

    Ward, Richard C [ORNL; Allgood, Glenn O [ORNL; Knox, John R [ORNL

    2008-11-01

    The Enterprise Derivative Application (EDA) implements the enterprise-derivative analysis for optimization of an industrial process (Allgood and Manges, 2001). It is a tool to help industry planners choose the most productive way of manufacturing their products while minimizing their cost. Developed in MS Access, the application allows users to input initial data ranging from raw material to variable costs and enables the tracking of specific information as material is passed from one process to another. Energy-derivative analysis is based on calculation of sensitivity parameters. For the specific application to a steel production process these include: the cost to product sensitivity, the product to energy sensitivity, the energy to efficiency sensitivity, and the efficiency to cost sensitivity. Using the EDA, for all processes the user can display a particular sensitivity or all sensitivities can be compared for all processes. Although energy-derivative analysis was originally designed for use by the steel industry, it is flexible enough to be applied to many other industrial processes. Examples of processes where energy-derivative analysis would prove useful are wireless monitoring of processes in the petroleum cracking industry and wireless monitoring of motor failure for determining the optimum time to replace motor parts. One advantage of the MS Access-based application is its flexibility in defining the process flow and establishing the relationships between parent and child process and products resulting from a process. Due to the general design of the program, a process can be anything that occurs over time with resulting output (products). So the application can be easily modified to many different industrial and organizational environments. Another advantage is the flexibility of defining sensitivity parameters. Sensitivities can be determined between all possible variables in the process flow as a function of time. Thus the dynamic development of the

  5. Optimal design issues of a gas-to-liquid process

    Energy Technology Data Exchange (ETDEWEB)

    Rafiee, Ahmad

    2012-07-01

    Interest in Fischer-Tropsch (FT) synthesis is increasing rapidly due to recent improvements of the technology, the clean-burning fuels (low sulphur, low aromatics) derived from the FT process and the realization that the process can be used to monetize stranded natural gas resources. The economics of GTL plants depend very much on the natural gas price; there is a strong incentive to reduce the investment cost, and in addition there is a need to improve energy efficiency and carbon efficiency. A model is constructed based on the information available in the open literature. This model is used to simulate the GTL process with the UNISIM DESIGN process simulator. In the FT reactor with a cobalt-based catalyst, CO2 is inert and will accumulate in the system. Five placements of the CO2 removal unit in the GTL process are evaluated from an economic point of view. For each alternative, the process is optimized with respect to steam to carbon ratio, purge ratio of light ends, amount of tail gas recycled to the syngas and FT units, reactor volume, and CO2 recovery. The results show that the carbon and energy efficiencies and the annual net cash flow of the process with or without a CO2 removal unit are not significantly different, and there is not much to gain by removing CO2 from the process. It is optimal to recycle about 97% of the light ends to the process (mainly to the FT unit) to obtain higher conversion of CO and H2 in the reactor. Different syngas configurations in a gas-to-liquid (GTL) plant are studied, including auto-thermal reformer (ATR), combined reformer, and a series arrangement of Gas Heated Reformer (GHR) and ATR. The Fischer-Tropsch (FT) reactor is based on a cobalt catalyst and the degrees of freedom are: steam to carbon ratio, purge ratio of light ends, amount of tail gas recycled to the synthesis gas (syngas) and Fischer-Tropsch (FT) synthesis units, and reactor volume. The production rate of liquid hydrocarbons is maximized for each syngas configuration. Installing a steam

  6. Optimization of Processing Technology of Compound Dandelion Wine

    Directory of Open Access Journals (Sweden)

    Wu Jixuan

    2016-01-01

    Exploring dandelion foods has been a concern in the food processing and pharmaceutical industries because of dandelion's curative effect on high-fat-diet-induced hepatic steatosis and its diuretic activity. Few dandelion foods, including drinks and microencapsulated products, have been explored, and single-ingredient dandelion wine has rarely been developed because of its bitter flavour. In this paper, to optimize the processing technology of a fermented compound wine from dandelion root, the orthogonal experiment design method was used to combine dandelion root powder with glutinous rice and schisandra fruit and to optimize the fermenting parameters. Four factors were discussed: dandelion content, schisandra content, acidity and sugar content. The acidity was first confirmed as 7.0 g/L. The other three factors were confirmed by a series of experiments as dandelion 0.55%, schisandra 0.5% and sugar 22%. With a nine-step process of mixing the substrate, stirring with water, cooking the rice, amylase saccharification, pectinase hydrolysis, adjusting the juice, fermenting with yeast, filtering, aging and sterilization, a light yellow wine was obtained with the special flavour of dandelion, schisandra and rice and less bitterness; key indices were determined as 14.7% alcohol and 6.85 g/L acidity. A fermented compound dandelion wine with suitable flavour and health function was thus developed, enriching the range of dandelion foods.

  7. A Cloud Computing Model for Optimization of Transport Logistics Process

    Directory of Open Access Journals (Sweden)

    Benotmane Zineb

    2017-09-01

    In an increasingly competitive environment, companies must adopt a good logistics chain management policy whose main objective is to increase the overall gain by maximizing profits and minimizing costs, including manufacturing costs such as transaction, transport and storage costs. In this paper, we propose a cloud platform for this logistics chain to support decision making; in fact, decisions must be made to adopt new strategies for cost optimization, and the decision-maker must have knowledge of the consequences of each new strategy. Our proposed cloud computing platform has a multilayer structure; the latter is composed of a set of web services that provide a link between applications using different technologies and enable sending and receiving data through protocols that should be understandable by everyone. The logistics chain is a process-oriented business; the platform is used to evaluate logistics process costs, to propose optimal solutions and to evaluate these solutions before their application. As a scenario, we have formulated the problem for the delivery process and proposed a modified bin-packing algorithm to improve vehicle loading.
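
    The delivery scenario above rests on a bin-packing formulation; a plain first-fit-decreasing baseline (before any of the authors' modifications) can be sketched as follows, with made-up parcel volumes and vehicle capacity:

    ```python
    def first_fit_decreasing(volumes, capacity):
        """Assign parcel volumes to the fewest vehicles a greedy FFD heuristic finds."""
        vehicles = []          # each entry is the remaining capacity of one vehicle
        loads = []             # parallel list of parcel lists
        for v in sorted(volumes, reverse=True):
            for k, free in enumerate(vehicles):
                if v <= free:               # fits in an already-opened vehicle
                    vehicles[k] -= v
                    loads[k].append(v)
                    break
            else:                           # open a new vehicle
                vehicles.append(capacity - v)
                loads.append([v])
        return loads

    parcels = [4.0, 8.0, 1.5, 4.2, 2.1, 7.0, 3.3, 5.0]   # m^3, hypothetical
    print(first_fit_decreasing(parcels, capacity=10.0))
    ```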

  8. Optimization of vibratory welding process parameters using response surface methodology

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Pravin Kumar; Kumar, S. Deepak; Patel, D.; Prasad, S. B. [National Institute of Technology Jamshedpur, Jharkhand (India)

    2017-05-15

    The current investigation was carried out to study the effect of a vibratory welding technique on the mechanical properties of 6 mm thick butt-welded mild steel plates. A new vibratory welding technique has been designed and developed which is capable of transferring vibrations, with a resonance frequency of 300 Hz, into the molten weld pool before it solidifies during the Shielded metal arc welding (SMAW) process. The important process parameters of the vibratory welding technique, namely welding current, welding speed and the frequency of the vibrations induced in the molten weld pool, were optimized using Taguchi’s analysis and Response surface methodology (RSM). The effect of the process parameters on tensile strength and hardness was evaluated using optimization techniques. Applying RSM, the effect of the vibratory welding parameters on tensile strength and hardness was obtained through two separate regression equations. Results showed that the most influential factor for the desired tensile strength and hardness is the frequency at its resonance value, i.e. 300 Hz. The micro-hardness and microstructures of the vibratory welded joints were studied in detail and compared with those of conventional SMAW joints. A comparatively uniform and fine grain structure was found in the vibratory welded joints.

  9. Simulative design and process optimization of the two-stage stretch-blow molding process

    Energy Technology Data Exchange (ETDEWEB)

    Hopmann, Ch.; Rasche, S.; Windeck, C. [Institute of Plastics Processing at RWTH Aachen University (IKV) Pontstraße 49, 52062 Aachen (Germany)

    2015-05-22

    The total production costs of PET bottles are significantly affected by the costs of raw material. Approximately 70 % of the total costs are spent for the raw material. Therefore, stretch-blow molding industry intends to reduce the total production costs by an optimized material efficiency. However, there is often a trade-off between an optimized material efficiency and required product properties. Due to a multitude of complex boundary conditions, the design process of new stretch-blow molded products is still a challenging task and is often based on empirical knowledge. Application of current CAE-tools supports the design process by reducing development time and costs. This paper describes an approach to determine optimized preform geometry and corresponding process parameters iteratively. The wall thickness distribution and the local stretch ratios of the blown bottle are calculated in a three-dimensional process simulation. Thereby, the wall thickness distribution is correlated with an objective function and preform geometry as well as process parameters are varied by an optimization algorithm. Taking into account the correlation between material usage, process history and resulting product properties, integrative coupled simulation steps, e.g. structural analyses or barrier simulations, are performed. The approach is applied on a 0.5 liter PET bottle of Krones AG, Neutraubling, Germany. The investigations point out that the design process can be supported by applying this simulative optimization approach. In an optimization study the total bottle weight is reduced from 18.5 g to 15.5 g. The validation of the computed results is in progress.

  10. Simulative design and process optimization of the two-stage stretch-blow molding process

    International Nuclear Information System (INIS)

    Hopmann, Ch.; Rasche, S.; Windeck, C.

    2015-01-01

    The total production costs of PET bottles are significantly affected by the costs of raw material. Approximately 70 % of the total costs are spent for the raw material. Therefore, stretch-blow molding industry intends to reduce the total production costs by an optimized material efficiency. However, there is often a trade-off between an optimized material efficiency and required product properties. Due to a multitude of complex boundary conditions, the design process of new stretch-blow molded products is still a challenging task and is often based on empirical knowledge. Application of current CAE-tools supports the design process by reducing development time and costs. This paper describes an approach to determine optimized preform geometry and corresponding process parameters iteratively. The wall thickness distribution and the local stretch ratios of the blown bottle are calculated in a three-dimensional process simulation. Thereby, the wall thickness distribution is correlated with an objective function and preform geometry as well as process parameters are varied by an optimization algorithm. Taking into account the correlation between material usage, process history and resulting product properties, integrative coupled simulation steps, e.g. structural analyses or barrier simulations, are performed. The approach is applied on a 0.5 liter PET bottle of Krones AG, Neutraubling, Germany. The investigations point out that the design process can be supported by applying this simulative optimization approach. In an optimization study the total bottle weight is reduced from 18.5 g to 15.5 g. The validation of the computed results is in progress

  11. Modeling, estimation and optimal filtration in signal processing

    CERN Document Server

    Najim, Mohamed

    2010-01-01

    The purpose of this book is to provide graduate students and practitioners with traditional methods and more recent results for model-based approaches in signal processing.Firstly, discrete-time linear models such as AR, MA and ARMA models, their properties and their limitations are introduced. In addition, sinusoidal models are addressed.Secondly, estimation approaches based on least squares methods and instrumental variable techniques are presented.Finally, the book deals with optimal filters, i.e. Wiener and Kalman filtering, and adaptive filters such as the RLS, the LMS and the
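
    As a concrete example of the adaptive filters the book covers, a minimal LMS loop identifying an unknown FIR system is shown below; the plant coefficients, step size and signal length are arbitrary illustration values:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    plant = np.array([0.5, -0.3, 0.1])                 # unknown system to be identified
    x = rng.normal(size=2000)                          # input signal
    d = np.convolve(x, plant, mode="full")[:len(x)]    # desired signal (plant output)

    w = np.zeros(3)                                    # adaptive filter taps
    mu = 0.01                                          # LMS step size
    for n in range(2, len(x)):
        u = x[n-2:n+1][::-1]                           # [x[n], x[n-1], x[n-2]]
        e = d[n] - w @ u                               # error between desired and filter output
        w = w + mu * e * u                             # LMS coefficient update

    print("Estimated taps:", np.round(w, 3))           # should approach [0.5, -0.3, 0.1]
    ```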

  12. Optimization of Protein Hydrolysate Production Process from Jatropha curcas Cake

    OpenAIRE

    Waraporn Apiwatanapiwat; Pilanee Vaithanomsat; Phanu Somkliang; Taweesiri Malapant

    2009-01-01

    This is the first report of the optimization of protein hydrolysate production from J. curcas cake. Proximate analysis of the raw material showed 18.98% protein, 5.31% ash, 8.52% moisture and 12.18% lipid. The appropriate protein hydrolysate production process began with grinding the J. curcas cake into small pieces. It was then suspended in 2.5% sodium hydroxide solution with a solution to J. curcas cake ratio of 80:1 (v/w). The hydrolysis reactio...

  13. MO-B-BRB-00: Optimizing the Treatment Planning Process

    International Nuclear Information System (INIS)

    2015-01-01

    The radiotherapy treatment planning process has evolved over the years with innovations in treatment planning, treatment delivery and imaging systems. Treatment modality and simulation technologies are also rapidly improving and affecting the planning process. For example, Image-guided-radiation-therapy has been widely adopted for patient setup, leading to margin reduction and isocenter repositioning after simulation. Stereotactic Body radiation therapy (SBRT) and Radiosurgery (SRS) have gradually become the standard of care for many treatment sites, which demand a higher throughput for the treatment plans even if the number of treatments per day remains the same. Finally, simulation, planning and treatment are traditionally sequential events. However, with emerging adaptive radiotherapy, they are becoming more tightly intertwined, leading to iterative processes. Enhanced efficiency of planning is therefore becoming more critical and poses serious challenge to the treatment planning process; Lean Six Sigma approaches are being utilized increasingly to balance the competing needs for speed and quality. In this symposium we will discuss the treatment planning process and illustrate effective techniques for managing workflow. Topics will include: Planning techniques: (a) beam placement, (b) dose optimization, (c) plan evaluation (d) export to RVS. Planning workflow: (a) import images, (b) Image fusion, (c) contouring, (d) plan approval (e) plan check (f) chart check, (g) sequential and iterative process Influence of upstream and downstream operations: (a) simulation, (b) immobilization, (c) motion management, (d) QA, (e) IGRT, (f) Treatment delivery, (g) SBRT/SRS (h) adaptive planning Reduction of delay between planning steps with Lean systems due to (a) communication, (b) limited resource, (b) contour, (c) plan approval, (d) treatment. Optimizing planning processes: (a) contour validation (b) consistent planning protocol, (c) protocol/template sharing, (d) semi

  14. MO-B-BRB-00: Optimizing the Treatment Planning Process

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2015-06-15

    The radiotherapy treatment planning process has evolved over the years with innovations in treatment planning, treatment delivery and imaging systems. Treatment modality and simulation technologies are also rapidly improving and affecting the planning process. For example, Image-guided-radiation-therapy has been widely adopted for patient setup, leading to margin reduction and isocenter repositioning after simulation. Stereotactic Body radiation therapy (SBRT) and Radiosurgery (SRS) have gradually become the standard of care for many treatment sites, which demand a higher throughput for the treatment plans even if the number of treatments per day remains the same. Finally, simulation, planning and treatment are traditionally sequential events. However, with emerging adaptive radiotherapy, they are becoming more tightly intertwined, leading to iterative processes. Enhanced efficiency of planning is therefore becoming more critical and poses serious challenge to the treatment planning process; Lean Six Sigma approaches are being utilized increasingly to balance the competing needs for speed and quality. In this symposium we will discuss the treatment planning process and illustrate effective techniques for managing workflow. Topics will include: Planning techniques: (a) beam placement, (b) dose optimization, (c) plan evaluation (d) export to RVS. Planning workflow: (a) import images, (b) Image fusion, (c) contouring, (d) plan approval (e) plan check (f) chart check, (g) sequential and iterative process Influence of upstream and downstream operations: (a) simulation, (b) immobilization, (c) motion management, (d) QA, (e) IGRT, (f) Treatment delivery, (g) SBRT/SRS (h) adaptive planning Reduction of delay between planning steps with Lean systems due to (a) communication, (b) limited resource, (b) contour, (c) plan approval, (d) treatment. Optimizing planning processes: (a) contour validation (b) consistent planning protocol, (c) protocol/template sharing, (d) semi

  15. 3-D brain image registration using optimal morphological processing

    International Nuclear Information System (INIS)

    Loncaric, S.; Dhawan, A.P.

    1994-01-01

    The three-dimensional (3-D) registration of Magnetic Resonance (MR) and Positron Emission Tomographic (PET) images of the brain is important for analysis of the human brain and its diseases. A procedure for optimization of (3-D) morphological structuring elements, based on a genetic algorithm, is presented in the paper. The registration of the MR and PET images is done by means of a registration procedure in two major phases. In the first phase, the Iterative Principal Axis Transform (IPAR) is used for initial registration. In the second phase, the optimal shape description method based on the Morphological Signature Transform (MST) is used for final registration. The morphological processing is used to improve the accuracy of the basic IPAR method. The brain ventricle is used as a landmark for MST registration. A near-optimal structuring element obtained by means of a genetic algorithm is used in MST to describe the shape of the ventricle. The method has been tested on the set of brain images demonstrating the feasibility of approach. (author). 11 refs., 3 figs

  16. Energy consumption optimization of a continuous ice cream process

    International Nuclear Information System (INIS)

    González-Ramírez, J.E.; Leducq, D.; Arellano, M.; Alvarez, G.

    2013-01-01

    Highlights: • This work investigates potential energy savings of an ice cream freezer. • From a full-load compressor to a variable speed compressor in the freezer. • 30% less energy consumption. • It is possible to save between 11 and 14 MWh per year by optimizing freezers. - Abstract: This work investigates potential energy savings in an ice cream freezer by using a variable speed compressor and an optimization methodology for the operating conditions during the process. Two configurations to control the refrigeration capacity were analyzed: the first one modifies the pressure through the pilot control valve (conventional refrigeration system) and the second one uses a variable speed compressor, both with a float expansion valve. The variable speed compressor configuration showed the highest coefficient of performance and around 30% less energy consumption than the conventional one. The optimization of the operating conditions in order to minimize the energy consumption is also presented. It was calculated that, in France alone, for all ice cream and sorbet production, it is possible to save between 11 and 14 MWh of energy per year by optimizing the operation of the refrigeration system through a variable speed compressor configuration

  17. Effects of optimization and image processing in digital chest radiography

    International Nuclear Information System (INIS)

    Kheddache, S.; Maansson, L.G.; Angelhed, J.E.; Denbratt, L.; Gottfridsson, B.; Schlossman, D.

    1991-01-01

    A digital system for chest radiography based on a large image intensifier was compared to a conventional film-screen system. The digital system was optimized with regard to spatial and contrast resolution and dose. The images were digitally processed for contrast and edge enhancement. A simulated pneumothorax, together with two simulated nodules over the lungs and two over the mediastinum, was positioned on an anthropomorphic phantom. Observer performance was evaluated with Receiver Operating Characteristic (ROC) analysis. Five observers assessed the processed digital images and the conventional full-size radiographs. The time spent viewing the full-size radiographs and the digital images was recorded. For the simulated pneumothorax, the results showed perfect performance for the full-size radiographs and detectability was high also for the processed digital images. No significant differences in the detectability of the simulated nodules were seen between the two imaging systems. The results for the digital images showed a significantly improved detectability for the nodules in the mediastinum as compared to a previous ROC study where no optimization and image processing were available. No significant difference in detectability was seen between the former and the present ROC study for small nodules in the lung. No difference was seen in the time spent assessing the conventional full-size radiographs and the digital images. The study indicates that processed digital images produced by a large image intensifier are equal in image quality to conventional full-size radiographs for low-contrast objects such as nodules. (author). 38 refs.; 4 figs.; 1 tab

  18. A new model for anaerobic processes of up-flow anaerobic sludge blanket reactors based on cellular automata

    DEFF Research Database (Denmark)

    Skiadas, Ioannis V.; Ahring, Birgitte Kiær

    2002-01-01

    characteristics and lead to different reactor behaviour. A dynamic mathematical model has been developed for the anaerobic digestion of a glucose-based synthetic wastewater in UASB reactors. Cellular automata (CA) theory has been applied to simulate the granule development process. The model takes into consideration that granule diameter and granule microbial composition are functions of the reactor operational parameters and is capable of predicting the UASB performance and the layer structure of the granules.

  19. Laser cutting: industrial relevance, process optimization, and laser safety

    Science.gov (United States)

    Haferkamp, Heinz; Goede, Martin; von Busse, Alexander; Thuerk, Oliver

    1998-09-01

    Compared to other technological relevant laser machining processes, up to now laser cutting is the application most frequently used. With respect to the large amount of possible fields of application and the variety of different materials that can be machined, this technology has reached a stable position within the world market of material processing. Reachable machining quality for laser beam cutting is influenced by various laser and process parameters. Process integrated quality techniques have to be applied to ensure high-quality products and a cost effective use of the laser manufacturing plant. Therefore, rugged and versatile online process monitoring techniques at an affordable price would be desirable. Methods for the characterization of single plant components (e.g. laser source and optical path) have to be substituted by an omnivalent control system, capable of process data acquisition and analysis as well as the automatic adaptation of machining and laser parameters to changes in process and ambient conditions. At the Laser Zentrum Hannover eV, locally highly resolved thermographic measurements of the temperature distribution within the processing zone using cost effective measuring devices are performed. Characteristic values for cutting quality and plunge control as well as for the optimization of the surface roughness at the cutting edges can be deducted from the spatial distribution of the temperature field and the measured temperature gradients. Main influencing parameters on the temperature characteristic within the cutting zone are the laser beam intensity and pulse duration in pulse operation mode. For continuous operation mode, the temperature distribution is mainly determined by the laser output power related to the cutting velocity. With higher cutting velocities temperatures at the cutting front increase, reaching their maximum at the optimum cutting velocity. Here absorption of the incident laser radiation is drastically increased due to

  20. Optimization of joint recycling process of drilling sludge and phosphogypsum

    Directory of Open Access Journals (Sweden)

    I. Yu. Ablieieva

    2016-06-01

    Joint recycling of drilling sludge and phosphogypsum to obtain a building material is environmentally appropriate and cost-effective, as it helps not only to prevent environmental pollution, but also to solve the problem of rational nature management. Drilling sludge is a waste formed during the drilling of oil wells, and phosphogypsum is a waste of the chemical industry formed as a result of the production of concentrated phosphoric acid. However, these technogenic raw materials contain heavy metals that can be transferred into the finished product and leached out of it. The problem of minimizing the negative impact of pollutants is very important to reduce the risk to human health. The authors' idea is to optimize the ecological characteristics of the drilling waste and phosphogypsum recycling process. The concentration of heavy metals in the extract of the gypsum concrete was taken as the objective function, which depends primarily on structural and technological parameters. The purpose of the article is the solution of the mathematical programming task, i.e., finding optimal values of the factors of the drilling sludge and phosphogypsum recycling process. The mathematical programming solution of the optimization problem for the environmental characteristics of the gypsum concrete (minimizing the concentration of heavy metals in the extract) was performed by the method of simple random search in the Borland C++ programming environment using the C programming language. The following factor values minimize the concentration of heavy metals in the extract of the gypsum concrete: the mass ratio of gypsum binder and drilling sludge is 2.93, the mass ratio of quick lime and gypsum binder is 0.09, the age of the gypsum concrete is above 19 days, and the exposure time is 28 days.

  1. Identification of Optimal Policies in Markov Decision Processes

    Czech Academy of Sciences Publication Activity Database

    Sladký, Karel

    46 2010, č. 3 (2010), s. 558-570 ISSN 0023-5954. [International Conference on Mathematical Methods in Economy and Industry. České Budějovice, 15.06.2009-18.06.2009] R&D Projects: GA ČR(CZ) GA402/08/0107; GA ČR GA402/07/1113 Institutional research plan: CEZ:AV0Z10750506 Keywords : finite state Markov decision processes * discounted and average costs * elimination of suboptimal policies Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.461, year: 2010 http://library.utia.cas.cz/separaty/2010/E/sladky-identification of optimal policies in markov decision processes.pdf

  2. Ising Processing Units: Potential and Challenges for Discrete Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Coffrin, Carleton James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Nagarajan, Harsha [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Bent, Russell Whitford [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-07-05

    The recent emergence of novel computational devices, such as adiabatic quantum computers, CMOS annealers, and optical parametric oscillators, presents new opportunities for hybrid-optimization algorithms that leverage these kinds of specialized hardware. In this work, we propose the idea of an Ising processing unit as a computational abstraction for these emerging tools. Challenges involved in using and benchmarking these devices are presented, and open-source software tools are proposed to address some of these challenges. The proposed benchmarking tools and methodology are demonstrated by conducting a baseline study of established solution methods to a D-Wave 2X adiabatic quantum computer, one example of a commercially available Ising processing unit.
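
    A tiny classical baseline for the kind of Ising problems such devices target: evaluate the Ising energy and run single-spin-flip simulated annealing. The random couplings and the annealing schedule below stand in for a benchmark instance and are not the authors' tools:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 30
    J = np.triu(rng.normal(size=(n, n)), k=1)      # random couplings J_ij (i < j)
    h = rng.normal(size=n)                          # local fields
    s = rng.choice([-1, 1], size=n)                 # initial spin configuration

    def energy(s):
        """Ising energy E(s) = -sum_{i<j} J_ij s_i s_j - sum_i h_i s_i."""
        return -(s @ J @ s + h @ s)

    T = 2.0
    for sweep in range(3000):
        i = rng.integers(n)
        flipped = s.copy()
        flipped[i] *= -1
        dE = energy(flipped) - energy(s)
        if dE < 0 or rng.random() < np.exp(-dE / T):   # Metropolis acceptance
            s = flipped
        T = max(0.01, T * 0.999)                        # geometric cooling
    print("Final Ising energy:", round(energy(s), 3))
    ```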

  3. Information theoretic methods for image processing algorithm optimization

    Science.gov (United States)

    Prokushkin, Sergey F.; Galil, Erez

    2015-01-01

    Modern image processing pipelines (e.g., those used in digital cameras) are full of advanced, highly adaptive filters that often have a large number of tunable parameters (sometimes > 100). This makes the calibration procedure for these filters very complex, and the optimal results barely achievable in the manual calibration; thus an automated approach is a must. We will discuss an information theory based metric for evaluation of algorithm adaptive characteristics ("adaptivity criterion") using noise reduction algorithms as an example. The method allows finding an "orthogonal decomposition" of the filter parameter space into the "filter adaptivity" and "filter strength" directions. This metric can be used as a cost function in automatic filter optimization. Since it is a measure of a physical "information restoration" rather than perceived image quality, it helps to reduce the set of the filter parameters to a smaller subset that is easier for a human operator to tune and achieve a better subjective image quality. With appropriate adjustments, the criterion can be used for assessment of the whole imaging system (sensor plus post-processing).

  4. Die design and process optimization of plastic gear extrusion

    Science.gov (United States)

    Zhang, Lei; Fu, Zhihong; Yao, Chen; Zang, Gongzheng; Wan, Yue

    2018-01-01

    The flow velocity of the melt in the extruder was simulated using the software Polyflow, and the size of the die channel with the best flow uniformity was obtained. The die profile shape was obtained by reverse design. The length of the shaping section was determined by Ansys transient thermal analysis. According to the simulation results, the design and manufacture of the extrusion die for the plastic gear and of the vacuum cooling setup were carried out. The influence of five process parameters on the precision of the plastic gear was studied by the single-factor analysis method: the die temperature T, the screw speed R, the die spacing S, the vacuum degree M and the hauling speed V. The optimal combination of process parameters was obtained using a neural network particle swarm optimization algorithm (T = 197.05 °C, R = 9.04 rpm, S = 67 mm, M = -0.0194 MPa). The tooth profile deviation of the extruded plastic gear can reach grade 9 accuracy.
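
    A bare-bones particle swarm optimization loop of the kind that could be driven by such a fitted surrogate model; the objective below is a stand-in (a simple quadratic around a made-up target), and the bounds only loosely mimic the process window above:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    lo = np.array([180.0, 5.0, 60.0, -0.03])   # T [°C], R [rpm], S [mm], M [MPa] (illustrative)
    hi = np.array([210.0, 12.0, 70.0, -0.01])

    def deviation(x):
        """Stand-in for a surrogate 'tooth profile deviation' model."""
        target = np.array([197.0, 9.0, 67.0, -0.0194])
        return np.sum(((x - target) / (hi - lo)) ** 2)

    n_particles, n_iter = 20, 200
    pos = rng.uniform(lo, hi, size=(n_particles, 4))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([deviation(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)]

    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, 4))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)           # keep particles inside the bounds
        vals = np.array([deviation(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)]

    print("PSO optimum (T, R, S, M):", np.round(gbest, 4))
    ```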

  5. Three-dimensional cellular automaton-finite element modeling of solidification grain structures for arc-welding processes

    International Nuclear Information System (INIS)

    Chen, Shijia; Guillemot, Gildas; Gandin, Charles-André

    2016-01-01

    Solidification grain structure has significant impact on the final properties of welded parts using fusion welding processes. Direct simulation of grain structure at industrial scale is yet rarely reported in the literature and remains a challenge. A three-dimensional (3D) coupled Cellular Automaton (CA) – Finite Element (FE) model is presented that predicts the grain structure formation during multiple passes Gas Tungsten Arc Welding (GTAW) and Gas Metal Arc Welding (GMAW). The FE model is established in a level set (LS) approach that tracks the evolution of the metal-shielding gas interface due to the addition of metal. The FE method solves the mass, energy and momentum conservation equations for the metal plus shielding gas system based on an adaptive mesh (FE mesh). Fields are projected in a second FE mesh, named CA mesh. A CA grid made of a regular lattice of cubic cells is created to overlay the fixed CA mesh. The CA model based on the CA grid simulates the melting and growth of the grain boundaries in the liquid pool. In order to handle large computational domains while keeping reasonable computational costs, parallel computations and dynamic strategies for the allocation/deallocation of the CA grid are introduced. These strategies correspond to significant optimizations of the computer memories that are demonstrated. The 3D CAFE model is first applied to the simple configuration of single linear passes by GTAW of a duplex stainless steel URANUS 2202. It is then applied to a more persuasive example considering GMAW in spray transfer mode during multiple passes to fill a V-groove chamfer. Simulations reveal the possibility to handle domains with millions of grains in representative domain sizes while following the formation of textures that result from the growth competition among columnar grains. -- Graphical abstract: Simulated 3D grain structure (3D CAFE model) for GTAW multiple linear passes at the surface of a duplex stainless steel (URANUS 22002

  6. Cellular Automata as a learning process in Architecture and Urban design

    DEFF Research Database (Denmark)

    Jensen, Mads Brath; Foged, Isak Worre

    2014-01-01

    An architectural methodological response to this situation is presented through the development of a conceptual computational design system that allows these dynamics to unfold and to be observed for architectural design decision making. Reflecting on the development and implementation of a cellular automata based design approach in a master-level urban design studio, this paper discusses strategies for dealing with complexity at an urban scale as well as the pedagogical considerations behind applying computational tools and methods to urban design education.

  7. Optimizing Endoscope Reprocessing Resources Via Process Flow Queuing Analysis.

    Science.gov (United States)

    Seelen, Mark T; Friend, Tynan H; Levine, Wilton C

    2018-05-04

    The Massachusetts General Hospital (MGH) is merging its older endoscope processing facilities into a single new facility that will enable high-level disinfection of endoscopes for both the ORs and Endoscopy Suite, leveraging economies of scale for improved patient care and optimal use of resources. Finalized resource planning was necessary for the merging of facilities to optimize staffing and make final equipment selections to support the nearly 33,000 annual endoscopy cases. To accomplish this, we employed operations management methodologies, analyzing the physical process flow of scopes throughout the existing Endoscopy Suite and ORs and mapping the future state capacity of the new reprocessing facility. Further, our analysis required the incorporation of historical case and reprocessing volumes in a multi-server queuing model to identify any potential wait times as a result of the new reprocessing cycle. We also performed sensitivity analysis to understand the impact of future case volume growth. We found that our future-state reprocessing facility, given planned capital expenditures for automated endoscope reprocessors (AERs) and pre-processing sinks, could easily accommodate current scope volume well within the necessary pre-cleaning-to-sink reprocessing time limit recommended by manufacturers. Further, in its current planned state, our model suggested that the future endoscope reprocessing suite at MGH could support an increase in volume of at least 90% over the next several years. Our work suggests that with simple mathematical analysis of historic case data, significant changes to a complex perioperative environment can be made with ease while keeping patient safety as the top priority.
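
    The capacity question above is the classic multi-server queueing calculation; a small Erlang C sketch for reprocessor utilization is shown below. The arrival rate, cycle time and server count are illustrative values, not MGH figures:

    ```python
    import math

    def erlang_c_wait(arrival_per_hr, service_hr, servers):
        """Expected wait in queue (hours) for an M/M/c queue via the Erlang C formula."""
        a = arrival_per_hr * service_hr               # offered load (Erlangs)
        if a >= servers:
            return float("inf")                       # unstable: not enough servers
        summation = sum(a**k / math.factorial(k) for k in range(servers))
        tail = (a**servers / math.factorial(servers)) * servers / (servers - a)
        p_wait = tail / (summation + tail)            # probability an arriving scope waits
        return p_wait * service_hr / (servers - a)

    # e.g. 12 scopes/hour arriving, 25-minute AER cycle, 6 reprocessors
    print(round(erlang_c_wait(12, 25 / 60, 6), 3), "hours expected wait")
    ```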

  8. Optimizing enactment of nursing roles: redesigning care processes and structures

    Directory of Open Access Journals (Sweden)

    Jackson K

    2014-02-01

    Karen Jackson,1 Deborah E White,2 Jeanne Besner,1 Jill M Norris2 (1Health Systems and Workforce Research Unit, Alberta Health Services, Calgary, Alberta, Canada; 2Faculty of Nursing, University of Calgary, Calgary, Alberta, Canada). Background: Effective and efficient use of nursing human resources is critical. The Nursing Role Effectiveness Model conceptualizes nursing practice in terms of key clinical role accountabilities and has the potential to inform redesign efforts. The aims of this study were to develop, implement, and evaluate a job redesign intended to optimize the enactment of registered nurse (RN) clinical role accountabilities. Methods: A job redesign was developed and implemented in a single medical patient care unit, the redesign unit. A mixed-methods design was used to evaluate the job redesign; a second medical patient care unit served as a control unit. Data from administrative databases, observations, interviews, and demographic surveys were collected pre-redesign (November 2005) and post-redesign (October 2007). Results: Several existing unit structures and processes (eg, model of care delivery) influenced RNs' ability to optimally enact their role accountabilities. Redesign efforts were hampered by contextual issues, including organizational alignment, leadership, and timing. Overall, optimized enactment of RN role accountabilities and improvements to patient outcomes did not occur, yet this was predictable, given that the redesign was not successful. Although the results were disappointing, much was learned about job redesign. Conclusion: Potential exists to improve the utilization of nursing providers by situating nurses' work in a clinical role accountability framework and attending to a clear organizational vision and well-articulated strategic plan that is championed by leaders at all levels of the organization. Health care leaders require a clear understanding of nurses' role accountabilities, support in managing change, and

  9. Optimization and Control of Pressure Swing Adsorption Processes Under Uncertainty

    KAUST Repository

    Khajuria, Harish

    2012-03-21

    The real-time periodic performance of a pressure swing adsorption (PSA) system strongly depends on the choice of key decision variables and operational considerations such as processing steps and column pressure temporal profiles, making its design and operation a challenging task. This work presents a detailed optimization-based approach for simultaneously incorporating PSA design, operational, and control aspects under the effect of time variant and invariant disturbances. It is applied to a two-bed, six-step PSA system represented by a rigorous mathematical model, where the key optimization objective is to maximize the expected H2 recovery while achieving a closed loop product H2 purity of 99.99%, for separating 70% H2, 30% CH4 feed. The benefits over sequential design and control approach are shown in terms of closed-loop recovery improvement of more than 3%, while the incorporation of explicit/multiparametric model predictive controllers improves the closed loop performance. © 2012 American Institute of Chemical Engineers (AIChE).

  10. Optimal lot sizing in screening processes with returnable defective items

    Science.gov (United States)

    Vishkaei, Behzad Maleki; Niaki, S. T. A.; Farhangi, Milad; Rashti, Mehdi Ebrahimnezhad Moghadam

    2014-07-01

    This paper is an extension of Hsu and Hsu (Int J Ind Eng Comput 3(5):939-948, 2012) aiming to determine the optimal order quantity of product batches that contain defective items, with the percentage of nonconforming items following a known probability density function. The orders are subject to a 100% screening process at a rate higher than the demand rate. Shortage is backordered, and defective items in each ordering cycle are stored in a warehouse to be returned to the supplier when a new order is received. Although the retailer does not sell defective items at a lower price and only trades perfect items (to avoid loss), a higher holding cost is incurred to store the defective items. Using the renewal-reward theorem, the optimal order and shortage quantities are determined. Some numerical examples are solved at the end to clarify the applicability of the proposed model and to compare the new policy with an existing one. The results show that the new policy provides a better expected profit per unit time.
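
    A schematic of the renewal-reward evaluation described above: expected profit per cycle divided by expected cycle length, scanned over candidate order quantities. All costs, the defective-fraction distribution and the demand rate below are invented for illustration and the holding-cost accounting is deliberately simplified:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    demand = 1000.0                 # units per year
    price, cost = 50.0, 25.0        # selling price and purchase cost per unit
    K = 100.0                       # fixed ordering cost per cycle
    h_good, h_def = 4.0, 6.0        # holding cost per unit per year (defectives cost more)
    p_samples = rng.beta(2, 30, size=5000)   # sampled defective fractions per lot

    def profit_rate(Q):
        """Expected annual profit via renewal-reward: E[cycle profit] / E[cycle length]."""
        good = (1 - p_samples) * Q
        cycle_len = good / demand                      # years until the good stock is sold
        revenue = price * good
        holding = h_good * good / 2 * cycle_len + h_def * p_samples * Q * cycle_len
        cycle_profit = revenue - cost * Q - K - holding
        return np.mean(cycle_profit) / np.mean(cycle_len)

    Qs = np.arange(50, 1001, 10)
    best_Q = Qs[np.argmax([profit_rate(Q) for Q in Qs])]
    print("Order quantity maximizing expected profit per unit time:", best_Q)
    ```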

  11. Optimization of extrusion process for production of nutritious pellets

    Directory of Open Access Journals (Sweden)

    Ernesto Aguilar-Palazuelos

    2012-03-01

    Full Text Available A blend of 50% Potato Starch (PS), 35% Quality Protein Maize (QPM), and 15% Soybean Meal (SM) was used in the preparation of expanded pellets utilizing a laboratory extruder with a 1.5 × 20.0 × 100.0 mm die-nozzle. The independent variables analyzed were Barrel Temperature (BT; 75-140 °C) and Feed Moisture (FM; 16-30%). The effect of the extrusion variables was investigated in terms of Expansion Index (EI), apparent density (ApD), Penetration Force (PF), and Specific Mechanical Energy (SME), as well as viscosity profiles, DSC, crystallinity by X-ray diffraction, and Scanning Electron Microscopy (SEM). The PF decreased from 30 to 4 kgf with the increase of both independent variables (BT and FM). SME was affected only by FM, and decreased with the increase in this variable. The optimal region showed that the maximum EI was found for BT in the range of 123-140 °C and FM in the range of 27-31%. The extruded pellets obtained from the optimal processing region were probably not completely degraded, as shown in the structural characterization. Acceptable expanded pellets could be produced using a blend of PS, QPM, and SM by extrusion cooking.

  12. High Temperature Epoxy Foam: Optimization of Process Parameters

    Directory of Open Access Journals (Sweden)

    Samira El Gazzani

    2016-06-01

    Full Text Available For many years, reduction of fuel consumption has been a major aim in terms of both costs and environmental concerns. One option is to reduce the weight of fuel consumers. For this purpose, the use of a lightweight material based on rigid foams is a relevant choice. This paper deals with a new high-temperature epoxy expanded material as a substitute for phenolic resin, which is classified as potentially mutagenic by the European REACH directive. The optimization of a thermoset foam depends on two major parameters, the reticulation (curing) process and the expansion of the foaming agent. Controlling these two phenomena can lead to a fully expanded and cured material. The rheological behavior of the epoxy resin is studied and the gel time is determined at various temperatures. The expansion of the foaming agent is investigated by thermomechanical analysis. Results are correlated and compared with samples foamed under the same temperature conditions. The ideal foaming/gelation temperature is then determined. The second part of this research concerns the optimization of the curing cycle of a high-temperature trifunctional epoxy resin. A two-step curing cycle was defined by considering the influence of different curing schedules on the glass transition temperature of the material. The final foamed material has a glass transition temperature of 270 °C.

  13. Optimization of jenipapo in vitro seed germination process

    Directory of Open Access Journals (Sweden)

    Rafaela Ribeiro de Souza

    Full Text Available ABSTRACT In vitro seed germination is an effective alternative for quickly obtaining explants with sanitary quality. However, jenipapo seeds present slow and uneven germination. Therefore, factors internal and external to the seed that directly interfere in the process must be identified in order to adapt better techniques for obtaining seedlings. In this sense, this work aimed to optimize the in vitro germination of Genipa americana L. seeds by evaluating different factors (light quality, GA3 treatment, pre-soaking in distilled water, growing media, and stratification in the dark). It was found that seed germination of G. americana was indifferent to light; however, the best results were obtained under conditions of continuous darkness. There was no effect of the application of exogenous GA3. Pre-soaking in distilled water for 48 h contributed to obtaining better germination rates, and the reduction in MS medium salts together with the stratification pretreatment in the dark maximized the germination potential of the seeds. Therefore, the optimal conditions for in vitro germination of G. americana L. seeds require pre-soaking in distilled water for 48 hours and inoculation into culture media consisting of 1/2 MS + 15 g L-1 sucrose, with stratification in the dark for 16 days, followed by transfer to growth chambers with lighting provided by white fluorescent lamps.

  14. Hydrogen production by onboard gasoline processingProcess simulation and optimization

    Energy Technology Data Exchange (ETDEWEB)

    Bisaria, Vega; Smith, R.J. Byron

    2013-12-15

    Highlights: • Process flow sheet for an onboard fuel processor for 100 kW fuel cell output was simulated. • Gasoline fuel requirement was found to be 30.55 kg/hr. • The fuel processor efficiency was found to be 95.98%. • A heat-integrated optimum flow sheet was developed. - Abstract: Fuel cell vehicles have reached the commercialization stage and hybrid vehicles are already on the road. While hydrogen storage and infrastructure remain critical issues in stand-alone commercialization of the technology, researchers are developing onboard fuel processors, which can convert a variety of fuels into hydrogen to power these fuel cell vehicles. A feasibility study of a 100 kW onboard fuel processor based on gasoline fuel is carried out using process simulation. The steady-state model has been developed with the help of Aspen HYSYS to analyze the fuel processor and total system performance. The components of the fuel processor are the fuel reforming unit, the CO clean-up unit, and auxiliary units. Optimization studies were carried out by analyzing the influence of various operating parameters such as oxygen-to-carbon ratio, steam-to-carbon ratio, temperature, and pressure on the process equipment. From the steady-state model optimization using Aspen HYSYS, the optimized reaction composition in terms of hydrogen production and carbon monoxide concentration corresponds to an oxygen-to-carbon ratio of 0.5 and a steam-to-carbon ratio of 0.5. A fuel processor efficiency of 95.98% is obtained under these optimized conditions. The heat integration of the system was studied using the composite curve, grand composite curve, and utility composite curve. The most appropriate heat exchanger network was chosen from the generated ones and incorporated into the optimized flow sheet of the 100 kW fuel processor. A completely heat-integrated 100 kW fuel processor flow sheet using gasoline as fuel was thus successfully simulated and optimized.

  15. Hydrogen production by onboard gasoline processingProcess simulation and optimization

    International Nuclear Information System (INIS)

    Bisaria, Vega; Smith, R.J. Byron

    2013-01-01

    Highlights: • Process flow sheet for an onboard fuel processor for 100 kW fuel cell output was simulated. • Gasoline fuel requirement was found to be 30.55 kg/hr. • The fuel processor efficiency was found to be 95.98%. • A heat-integrated optimum flow sheet was developed. - Abstract: Fuel cell vehicles have reached the commercialization stage and hybrid vehicles are already on the road. While hydrogen storage and infrastructure remain critical issues in stand-alone commercialization of the technology, researchers are developing onboard fuel processors, which can convert a variety of fuels into hydrogen to power these fuel cell vehicles. A feasibility study of a 100 kW onboard fuel processor based on gasoline fuel is carried out using process simulation. The steady-state model has been developed with the help of Aspen HYSYS to analyze the fuel processor and total system performance. The components of the fuel processor are the fuel reforming unit, the CO clean-up unit, and auxiliary units. Optimization studies were carried out by analyzing the influence of various operating parameters such as oxygen-to-carbon ratio, steam-to-carbon ratio, temperature, and pressure on the process equipment. From the steady-state model optimization using Aspen HYSYS, the optimized reaction composition in terms of hydrogen production and carbon monoxide concentration corresponds to an oxygen-to-carbon ratio of 0.5 and a steam-to-carbon ratio of 0.5. A fuel processor efficiency of 95.98% is obtained under these optimized conditions. The heat integration of the system was studied using the composite curve, grand composite curve, and utility composite curve. The most appropriate heat exchanger network was chosen from the generated ones and incorporated into the optimized flow sheet of the 100 kW fuel processor. A completely heat-integrated 100 kW fuel processor flow sheet using gasoline as fuel was thus successfully simulated and optimized.

  16. Optimization of the caldasite processing conditions by alkaline melting

    International Nuclear Information System (INIS)

    Brown, A.E.P.

    1976-01-01

    A study has been carried out to economically recover the uranium and zirconium values of the ores from the Pocos de Caldas Plateau in the state of Minas Gerais, Brazil. In a preliminary study, the opening of the ore by alkaline fusion was investigated; fusions were carried out in a temperature-controlled furnace and the variables studied were time, temperature, and the NaOH/ore ratio. The optimization procedure was based on the Steepest Ascent method developed by Box and Wilson, utilizing a complete 2^3 factorial design. The analysis of the data indicated the optimum response for the process at: time 1.52 ± 0.1 hours; temperature 805 ± 15 °C; NaOH/ore ratio 1.7 ton/ton. Solubilizations higher than 97% ZrO2 and recoveries of nearly 100% U3O8 are obtained around this point.
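    The Box-Wilson steepest-ascent idea mentioned above can be sketched in a few lines: fit a first-order model to a 2^3 factorial design and then move along the fitted gradient. The responses, factor centers, ranges, and step sizes below are invented for illustration and do not reproduce the caldasite data.

```python
"""Minimal Box-Wilson steepest-ascent sketch on a 2^3 factorial design.
The responses and factor ranges are made-up illustrative numbers."""
import itertools
import numpy as np

# Coded design matrix for a full 2^3 factorial (-1 / +1 levels).
X_coded = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)

# Hypothetical measured responses (e.g., % solubilization) for the 8 runs.
y = np.array([82.0, 88.5, 84.0, 90.0, 85.5, 92.0, 87.0, 95.0])

# Fit the first-order model y = b0 + b1*x1 + b2*x2 + b3*x3 by least squares.
A = np.column_stack([np.ones(len(y)), X_coded])
b0, b1, b2, b3 = np.linalg.lstsq(A, y, rcond=None)[0]
gradient = np.array([b1, b2, b3])

# Steepest-ascent direction in coded units, scaled to the largest effect.
direction = gradient / np.abs(gradient).max()

# Translate coded steps back to natural units (assumed centers and half-ranges).
centers = np.array([1.5, 800.0, 1.5])      # time (h), temperature (degC), NaOH/ore ratio
half_ranges = np.array([0.5, 50.0, 0.3])
for step in range(1, 4):
    point = centers + step * direction * half_ranges
    print(f"step {step}: time={point[0]:.2f} h, T={point[1]:.0f} degC, ratio={point[2]:.2f}")
```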

  17. Optimizing an immersion ESL curriculum using analytic hierarchy process.

    Science.gov (United States)

    Tang, Hui-Wen Vivian

    2011-11-01

    The main purpose of this study is to fill a substantial knowledge gap regarding reaching a uniform group decision in English curriculum design and planning. A comprehensive content-based course criterion model extracted from existing literature and expert opinions was developed. Analytical hierarchy process (AHP) was used to identify the relative importance of course criteria for the purpose of tailoring an optimal one-week immersion English as a second language (ESL) curriculum for elementary school students in a suburban county of Taiwan. The hierarchy model and AHP analysis utilized in the present study will be useful for resolving several important multi-criteria decision-making issues in planning and evaluating ESL programs. This study also offers valuable insights and provides a basis for further research in customizing ESL curriculum models for different student populations with distinct learning needs, goals, and socioeconomic backgrounds. Copyright © 2011 Elsevier Ltd. All rights reserved.
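    As a concrete illustration of how AHP turns pairwise judgments into criterion weights, the sketch below computes the principal-eigenvector priorities and the consistency ratio for a small pairwise comparison matrix; the matrix itself is a made-up example, not data from this study.

```python
"""AHP priority weights from a pairwise comparison matrix (illustrative data)."""
import numpy as np

# Hypothetical Saaty-scale comparisons of three course criteria.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# The principal eigenvector gives the priority weights.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency ratio (RI = 0.58 is Saaty's random index for n = 3).
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58
print("priority weights:", np.round(w, 3))
print("consistency ratio:", round(cr, 3))
```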

  18. Optimization of signal processing algorithm for digital beam position monitor

    International Nuclear Information System (INIS)

    Lai Longwei; Yi Xing; Leng Yongbin; Yan Yingbing; Chen Zhichu

    2013-01-01

    Based on turn-by-turn (TBT) signal processing, the paper emphasizes the optimization of the system timing and the implementation of the digital automatic gain control and slow application (SA) modules. Beam position data including TBT, fast application (FA), and SA data can be acquired. On-line evaluation at the Shanghai Synchrotron Radiation Facility (SSRF) shows that the processor is able to acquire multi-rate position data which contain true beam movements. When the storage ring is filled with 500 bunches at 174 mA, the resolutions of the TBT, FA, and SA data reach 0.84, 0.44, and 0.23 μm, respectively. The above results prove that the design meets the performance requirements. (authors)

  19. Global optimization numerical strategies for rate-independent processes

    Czech Academy of Sciences Publication Activity Database

    Benešová, Barbora

    2011-01-01

    Roč. 50, č. 2 (2011), s. 197-220 ISSN 0925-5001 R&D Projects: GA ČR GAP201/10/0357 Grant - others:GA MŠk(CZ) LC06052 Program:LC Institutional research plan: CEZ:AV0Z20760514 Keywords : rate-independent processes * numerical global optimization * energy estimates based algorithm Subject RIV: BA - General Mathematics Impact factor: 1.196, year: 2011 http://math.hnue.edu.vn/portal/rss.viewpage.php?id=0000037780&ap=L3BvcnRhbC9ncmFiYmVyLnBocD9jYXRpZD0xMDEyJnBhZ2U9Mg==

  20. Process analysis and data driven optimization in the salmon industry

    DEFF Research Database (Denmark)

    Johansson, Gine Ørnholt

    Aquaculture supplies around 70% of the salmon in the world and the industry is thus an important player in meeting the increasing demand for salmon products. Such mass production calls for systems that can handle thousands of tonnes of salmon without compromising the welfare of the fish...... and the following product quality. Moreover, the requirement of increased profit performance for the industry should be met with sustainable production solutions. Optimization during the production of salmon fillets could be one feasible approach to increase the outcome from the same level of incoming raw material...... and analysis of data from the salmon industry could be utilized to extract information that will support the industry in their decision-making processes. Mapping of quality parameters, their fluctuations and influences on yield and texture has been investigated. Additionally, the ability to predict the texture

  1. Optimization of Production Processes Using the Yamazumi Method

    Directory of Open Access Journals (Sweden)

    Dušan Sabadka

    2017-12-01

    Full Text Available Manufacturing companies are now placing great emphasis on competitiveness and looking for ways to use their resources more efficiently. This paper presents an efficiency improvement of an automotive transmission assembly production line by means of line balancing. Three assembly stations were selected for optimization because they contain waste and do not meet the requirements for achieving the target production capacity. Several measures have been proposed for the assembly lines concerned: eliminating unnecessary activities in the assembly processes, reducing the cycle time, and balancing the manpower workload through line balancing using a Yamazumi chart and takt time. The results of the proposed measures were compared with the current situation in terms of the increase in the efficiency of the production line.
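    To make the line-balancing vocabulary above concrete, the sketch below compares per-station cycle times (the stacked bars of a Yamazumi chart) against the takt time and reports each station's utilization; all station times and the demand figure are invented for illustration and are unrelated to the studied line.

```python
"""Yamazumi-style line-balancing check: station cycle times vs. takt time.
Station task times and customer demand are illustrative assumptions."""

available_time_s = 8 * 3600 - 2 * 15 * 60   # one shift minus two 15-min breaks
daily_demand = 450                           # transmissions per shift (assumed)
takt_time_s = available_time_s / daily_demand

# Per-station task times in seconds (each list = one operator's stacked bar).
stations = {
    "OP10": [18, 14, 9, 12],
    "OP20": [22, 19, 17],
    "OP30": [11, 8, 10, 9, 7],
}

print(f"takt time: {takt_time_s:.1f} s")
for name, tasks in stations.items():
    cycle = sum(tasks)
    utilization = cycle / takt_time_s
    flag = "OVERLOADED" if cycle > takt_time_s else "ok"
    print(f"{name}: cycle {cycle:>3d} s, utilization {utilization:5.1%} {flag}")

# A simple rebalancing target: average workload if tasks could be freely moved.
total = sum(sum(t) for t in stations.values())
print(f"ideal balanced cycle per station: {total / len(stations):.1f} s")
```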

  2. Process optimization for obtaining nano cellulose from curaua fiber

    International Nuclear Information System (INIS)

    Lunz, Juliana do N.; Cordeiro, Suellem B.; Mota, Jose Carlos F.; Marques, Maria de Fatima V.

    2011-01-01

    This study focuses on a methodology for optimizing the production of nanocellulose from vegetal fibers. An experimental design was carried out for the treatment of curaua fibers, with the H2SO4 concentration, hydrolysis time, reaction temperature, and sonication time as independent variables, and the parameters were estimated for further statistical analysis. According to the estimated parameters, the statistically significant effects on the process of obtaining nanocellulose were determined. The thermogravimetric analysis (TGA) results showed that certain conditions led to cellulose with degradation temperatures near or even above that of untreated cellulose fibers. The crystallinity indices (IC) obtained after fiber treatment (by X-ray diffraction) were higher than that of the pure fiber. Treatments with high acid concentrations led to higher IC. (author)

  3. Thermosonication and optimization of stingless bee honey processing.

    Science.gov (United States)

    Chong, K Y; Chin, N L; Yusof, Y A

    2017-10-01

    The effects of thermosonication on the quality of a stingless bee honey, the Kelulut, were studied using processing temperature from 45 to 90 ℃ and processing time from 30 to 120 minutes. Physicochemical properties including water activity, moisture content, color intensity, viscosity, hydroxymethylfurfural content, total phenolic content, and radical scavenging activity were determined. Thermosonication reduced the water activity and moisture content by 7.9% and 16.6%, respectively, compared to 3.5% and 6.9% for conventional heating. For thermosonicated honey, color intensity increased by 68.2%, viscosity increased by 275.0%, total phenolic content increased by 58.1%, and radical scavenging activity increased by 63.0% when compared to its raw form. The increase of hydroxymethylfurfural to 62.46 mg/kg was still within the limits of international standards. Optimized thermosonication conditions using response surface methodology were predicted at 90 ℃ for 111 minutes. Thermosonication was revealed as an effective alternative technique for honey processing.

  4. Optimization of electrocoagulation process for the treatment of landfill leachate

    Science.gov (United States)

    Huda, N.; Raman, A. A.; Ramesh, S.

    2017-06-01

    The main problem with landfill leachate is its diverse composition, comprising persistent organic pollutants (POPs) which must be removed before discharge into the environment. In this study, the treatment of leachate using electrocoagulation (EC) was investigated. Iron was used as both the anode and the cathode. Response surface methodology was used for the experimental design and to study the effects of the operational parameters. A Central Composite Design was used to study the effects of initial pH, inter-electrode distance, and electrolyte concentration on color and COD removals. The process could remove up to 84% of color and 49.5% of COD. The experimental data were fitted to second-order polynomial equations. All three factors were found to significantly affect color removal. On the other hand, electrolyte concentration was the most significant parameter affecting COD removal. Numerical optimization was conducted to obtain the optimum process performance. Further work will be conducted towards integrating EC with other wastewater treatment processes such as electro-Fenton.

  5. Statistical process control using optimized neural networks: a case study.

    Science.gov (United States)

    Addeh, Jalil; Ebrahimzadeh, Ata; Azarbad, Milad; Ranaee, Vahid

    2014-09-01

    The most common statistical process control (SPC) tools employed for monitoring process changes are control charts. A control chart demonstrates that the process has altered by generating an out-of-control signal. This study investigates the design of an accurate system for control chart pattern (CCP) recognition in two respects. First, an efficient system is introduced that includes two main modules: a feature extraction module and a classifier module. In the feature extraction module, a proper set of shape features and statistical features is proposed as efficient characteristics of the patterns. In the classifier module, several neural networks, such as the multilayer perceptron, probabilistic neural network, and radial basis function network, are investigated. Based on an experimental study, the best classifier is chosen to recognize the CCPs. Second, a hybrid heuristic recognition system based on the cuckoo optimization algorithm (COA) is introduced to improve the generalization performance of the classifier. The simulation results show that the proposed algorithm has high recognition accuracy. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  6. Optimization of Robotic Spray Painting process Parameters using Taguchi Method

    Science.gov (United States)

    Chidhambara, K. V.; Latha Shankar, B.; Vijaykumar

    2018-02-01

    The automated spray painting process has recently been gaining interest in industry and research due to the extensive application of spray painting in automobile industries. Automating the spray painting process has the advantages of improved quality, productivity, reduced labor, a clean environment, and particularly cost effectiveness. This study investigates the performance characteristics of an industrial robot, the Fanuc 250ib, for an automated painting process using Taguchi's Design of Experiment technique. The experiment is designed using Taguchi's L25 orthogonal array by considering three factors and five levels for each factor. The objective of this work is to explore the major control parameters and to optimize them for improved quality of the paint coating, measured in terms of Dry Film Thickness (DFT), which also results in reduced rejection. Further, Analysis of Variance (ANOVA) is performed to determine the influence of individual factors on DFT. It is observed that shaping air and paint flow are the most influential parameters. A multiple regression model is formulated for estimating predicted values of DFT. A confirmation test is then conducted and the comparison results show that the error is within an acceptable level.
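    A minimal illustration of the kind of Taguchi analysis described above: compute a nominal-the-best signal-to-noise ratio for each run and average it per factor level to obtain main effects. The run fragment, factor names, and DFT readings below are invented; a real L25 study would use all 25 runs with five levels per factor.

```python
"""Taguchi-style S/N analysis sketch (nominal-the-best) for spray-paint DFT.
The runs, factor levels, and DFT readings are illustrative, not the L25 data."""
import numpy as np

# Each run: (shaping_air level, paint_flow level, repeated DFT readings in microns).
runs = [
    (1, 1, [38.2, 39.1, 37.8]),
    (1, 2, [41.0, 40.4, 41.5]),
    (2, 1, [39.8, 40.1, 39.5]),
    (2, 2, [42.3, 41.9, 42.8]),
    (3, 1, [40.6, 40.2, 41.0]),
]

def sn_nominal_the_best(y):
    """Taguchi type-I nominal-the-best S/N ratio: 10*log10(mean^2 / variance)."""
    y = np.asarray(y, dtype=float)
    return 10.0 * np.log10(y.mean() ** 2 / y.var(ddof=1))

sn = np.array([sn_nominal_the_best(y) for *_, y in runs])

# Main effect of each factor = average S/N ratio at each of its levels.
for idx, factor in [(0, "shaping air"), (1, "paint flow")]:
    levels = sorted({r[idx] for r in runs})
    means = {lv: sn[[r[idx] == lv for r in runs]].mean() for lv in levels}
    print(factor, {lv: round(m, 2) for lv, m in means.items()})
```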

  7. Optimization of image processing algorithms on mobile platforms

    Science.gov (United States)

    Poudel, Pramod; Shirvaikar, Mukul

    2011-03-01

    This work presents a technique to optimize popular image processing algorithms on mobile platforms such as cell phones, net-books and personal digital assistants (PDAs). The increasing demand for video applications like context-aware computing on mobile embedded systems requires the use of computationally intensive image processing algorithms. The system engineer has a mandate to optimize them so as to meet real-time deadlines. A methodology to take advantage of the asymmetric dual-core processor, which includes an ARM and a DSP core supported by shared memory, is presented with implementation details. The target platform chosen is the popular OMAP 3530 processor for embedded media systems. It has an asymmetric dual-core architecture with an ARM Cortex-A8 and a TMS320C64x Digital Signal Processor (DSP). The development platform was the BeagleBoard with 256 MB of NAND RAM and 256 MB SDRAM memory. The basic image correlation algorithm is chosen for benchmarking as it finds widespread application for various template matching tasks such as face-recognition. The basic algorithm prototypes conform to OpenCV, a popular computer vision library. OpenCV algorithms can be easily ported to the ARM core which runs a popular operating system such as Linux or Windows CE. However, the DSP is architecturally more efficient at handling DFT algorithms. The algorithms are tested on a variety of images and performance results are presented measuring the speedup obtained due to dual-core implementation. A major advantage of this approach is that it allows the ARM processor to perform important real-time tasks, while the DSP addresses performance-hungry algorithms.
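    The benchmark named above is basic image correlation for template matching. As a rough functional reference for what the ARM/DSP implementations compute, the sketch below performs normalized cross-correlation with NumPy; it ignores the fixed-point arithmetic, DFT-based acceleration, and dual-core partitioning discussed in the record.

```python
"""Normalized cross-correlation template matching (reference sketch).
This is a plain NumPy illustration, not the OMAP ARM/DSP code described above."""
import numpy as np

def match_template(image: np.ndarray, template: np.ndarray):
    """Return the (row, col) of the best match and the correlation map."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    scores = np.full((ih - th + 1, iw - tw + 1), -1.0)
    for r in range(scores.shape[0]):
        for c in range(scores.shape[1]):
            w = image[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz ** 2).sum()) * t_norm
            if denom > 0:
                scores[r, c] = (wz * t).sum() / denom
    best = np.unravel_index(np.argmax(scores), scores.shape)
    return best, scores

# Tiny synthetic test: embed the template in a noisy image and recover it.
rng = np.random.default_rng(1)
img = rng.normal(0, 0.1, (64, 64))
tpl = rng.normal(0, 1.0, (8, 8))
img[20:28, 30:38] += tpl
loc, _ = match_template(img, tpl)
print("best match at", loc)   # expected near (20, 30)
```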

  8. Optimization of the Enzymatic Saccharification Process of Milled Orange Wastes

    Directory of Open Access Journals (Sweden)

    Daniel Velasco

    2017-08-01

    Full Text Available Orange juice production generates a very high quantity of residues (Orange Peel Waste, or OPW; 50–60% of total weight) that can be used for cattle feed as well as feedstock for the extraction or production of essential oils, pectin, nutraceutics, and several monosaccharides by saccharification, inversion, and enzyme-aided extraction. As in all solid wastes, simple pretreatments can enhance these processes. In this study, hydrothermal pretreatments and knife milling have been analyzed with enzyme saccharification at different dry solid contents as the selection test: simple knife milling seemed more appropriate, as no added pretreatment resulted in better final glucose yields. A Taguchi optimization study on the dry solid to liquid content and the composition of the enzymatic cocktail was undertaken. The amounts of enzymatic preparations were set to reduce their impact on the economy of the process; however, as expected, the highest amounts resulted in the best yields to glucose and other monomers. Interestingly, the highest solid to liquid content (11.5% on a dry basis) rendered the best yields. Additionally, in search of process economy with high yields, operational conditions were set: medium amounts of hemicellulases, polygalacturonases, and β-glucosidases. Finally, a fractal kinetic modelling of the results for all products from the saccharification process indicated very high activities resulting in the liberation of glucose, fructose, and xylose, and very low activities towards arabinose and galactose. High activity on pectin was also observed but, for all monomers liberated initially at a fast rate, strong hindrances appeared during the saccharification process.

  9. A Methodology for Optimization in Multistage Industrial Processes: A Pilot Study

    Directory of Open Access Journals (Sweden)

    Piotr Jarosz

    2015-01-01

    Full Text Available The paper introduces a methodology for optimization in multistage industrial processes with multiple quality criteria. Two ways of formulating the optimization problem and four different approaches to solving it are considered. The proposed methodologies were tested first on a virtual process described by benchmark functions and next were applied in the optimization of a multistage lead refining process.

  10. Model reduction for dynamic real-time optimization of chemical processes

    NARCIS (Netherlands)

    Van den Berg, J.

    2005-01-01

    The value of models in process industries becomes apparent in practice and literature where numerous successful applications are reported. Process models are being used for optimal plant design, simulation studies, for off-line and online process optimization. For online optimization applications

  11. Modeling of the inhomogeneity of grain refinement during combined metal forming process by finite element and cellular automata methods

    Energy Technology Data Exchange (ETDEWEB)

    Majta, Janusz; Madej, Łukasz; Svyetlichnyy, Dmytro S.; Perzyński, Konrad; Kwiecień, Marcin, E-mail: mkwiecie@agh.edu.pl; Muszka, Krzysztof

    2016-08-01

    The potential of discrete cellular automata technique to predict the grain refinement in wires produced using combined metal forming process is presented and discussed within the paper. The developed combined metal forming process can be treated as one of the Severe Plastic Deformation (SPD) techniques that consists of three different modes of deformation: asymmetric drawing with bending, namely accumulated angular drawing (AAD), wire drawing (WD) and wire flattening (WF). To accurately replicate complex stress state both at macro and micro scales during subsequent deformations two stage modeling approach was used. First, the Finite Element Method (FEM), implemented in commercial ABAQUS software, was applied to simulate entire combined forming process at the macro scale level. Then, based on FEM results, the Cellular Automata (CA) method was applied for simulation of grain refinement at the microstructure level. Data transferred between FEM and CA methods included set of files with strain tensor components obtained from selected integration points in the macro scale model. As a result of CA simulation, detailed information on microstructure evolution during severe plastic deformation conditions was obtained, namely: changes of shape and sizes of modeled representative volume with imposed microstructure, changes of the number of grains, subgrains and dislocation cells, development of grain boundaries angle distribution as well as changes in the pole figures. To evaluate CA model predictive capabilities, results of computer simulation were compared with scanning electron microscopy and electron back scattered diffraction images (SEM/EBSD) studies of samples after AAD+WD+WF process.

  12. Laser dimpling process parameters selection and optimization using surrogate-driven process capability space

    Science.gov (United States)

    Ozkat, Erkan Caner; Franciosa, Pasquale; Ceglarek, Dariusz

    2017-08-01

    Remote laser welding technology offers opportunities for high production throughput at a competitive cost. However, the remote laser welding process of zinc-coated sheet metal parts in lap joint configuration poses a challenge due to the difference between the melting temperature of the steel (∼1500 °C) and the vapourizing temperature of the zinc (∼907 °C). In fact, the zinc layer at the faying surface is vapourized and the vapour might be trapped within the melting pool leading to weld defects. Various solutions have been proposed to overcome this problem over the years. Among them, laser dimpling has been adopted by manufacturers because of its flexibility and effectiveness along with its cost advantages. In essence, the dimple works as a spacer between the two sheets in lap joint and allows the zinc vapour escape during welding process, thereby preventing weld defects. However, there is a lack of comprehensive characterization of dimpling process for effective implementation in real manufacturing system taking into consideration inherent changes in variability of process parameters. This paper introduces a methodology to develop (i) surrogate model for dimpling process characterization considering multiple-inputs (i.e. key control characteristics) and multiple-outputs (i.e. key performance indicators) system by conducting physical experimentation and using multivariate adaptive regression splines; (ii) process capability space (Cp-Space) based on the developed surrogate model that allows the estimation of a desired process fallout rate in the case of violation of process requirements in the presence of stochastic variation; and, (iii) selection and optimization of the process parameters based on the process capability space. The proposed methodology provides a unique capability to: (i) simulate the effect of process variation as generated by manufacturing process; (ii) model quality requirements with multiple and coupled quality requirements; and (iii

  13. Optimizing Urine Processing Protocols for Protein and Metabolite Detection.

    Science.gov (United States)

    Siddiqui, Nazema Y; DuBois, Laura G; St John-Williams, Lisa; Will, Thompson J; Grenier, Carole; Burke, Emily; Fraser, Matthew O; Amundsen, Cindy L; Murphy, Susan K

    In urine, factors such as timing of voids and duration at room temperature (RT) may affect the quality of recovered protein and metabolite data. Additives may aid with detection, but can add more complexity to sample collection or analysis. We aimed to identify the optimal urine processing protocol for clinically obtained urine samples that allows for the highest protein and metabolite yields with minimal degradation. Healthy women provided multiple urine samples during the same day. Women collected their first morning (1st AM) void and another "random void". Random voids were aliquotted with: 1) no additive; 2) boric acid (BA); 3) protease inhibitor (PI); or 4) both BA + PI. Of these aliquots, some were immediately stored at 4°C, and some were left at RT for 4 hours. Proteins and individual metabolites were quantified, normalized to creatinine concentrations, and compared across processing conditions. Sample pools corresponding to each processing condition were analyzed using mass spectrometry to assess protein degradation. Ten Caucasian women between 35 and 65 years of age provided paired 1st morning and random voided urine samples. Normalized protein concentrations were slightly higher in 1st AM voids compared to random "spot" voids. The addition of BA did not significantly change protein yields, while PI significantly improved normalized protein concentrations, regardless of whether samples were immediately cooled or left at RT for 4 hours. In pooled samples, there were minimal differences in protein degradation under the various conditions we tested. In metabolite analyses, there were significant differences in individual amino acids based on the timing of the void. For comparative translational research using urine, information about void timing should be collected and standardized. For urine samples processed in the same day, BA does not appear to be necessary, while the addition of PI enhances protein yields, regardless of 4°C or RT storage temperature.

  14. XFEL diffraction: developing processing methods to optimize data quality

    Energy Technology Data Exchange (ETDEWEB)

    Sauter, Nicholas K., E-mail: nksauter@lbl.gov [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States)

    2015-01-29

    Bragg spots recorded from a still crystal necessarily give partial measurements of the structure factor intensity. Correction to the full-spot equivalent, relying on both a physical model for crystal disorder and postrefinement of the crystal orientation, improves the electron density map in serial crystallography. Serial crystallography, using either femtosecond X-ray pulses from free-electron laser sources or short synchrotron-radiation exposures, has the potential to reveal metalloprotein structural details while minimizing damage processes. However, deriving a self-consistent set of Bragg intensities from numerous still-crystal exposures remains a difficult problem, with optimal protocols likely to be quite different from those well established for rotation photography. Here several data processing issues unique to serial crystallography are examined. It is found that the limiting resolution differs for each shot, an effect that is likely to be due to both the sample heterogeneity and pulse-to-pulse variation in experimental conditions. Shots with lower resolution limits produce lower-quality models for predicting Bragg spot positions during the integration step. Also, still shots by their nature record only partial measurements of the Bragg intensity. An approximate model that corrects to the full-spot equivalent (with the simplifying assumption that the X-rays are monochromatic) brings the distribution of intensities closer to that expected from an ideal crystal, and improves the sharpness of anomalous difference Fourier peaks indicating metal positions.

  15. Biometric Attendance and Big Data Analysis for Optimizing Work Processes.

    Science.gov (United States)

    Verma, Neetu; Xavier, Teenu; Agrawal, Deepak

    2016-01-01

    Although biometric attendance management is available, large healthcare organizations have difficulty with big data analysis for the optimization of work processes. The aim of this project was to assess the implementation of a biometric attendance system and its utility following big data analysis. In this prospective study the implementation of the biometric system was evaluated over a 3-month period at our institution. Software integration with other existing systems for data analysis was also evaluated. Implementation of the biometric system could be successfully completed over a two-month period with enrollment of 10,000 employees into the system. However, generating reports and taking action for this large number of staff was a challenge. For this purpose, software was developed to capture the duty roster of each employee, integrate it with the biometric system, and add an SMS gateway. This helped in automating the process of sending SMSs to each employee who had not signed in. Standalone biometric systems have limited functionality in large organizations unless they are meshed with the employee duty roster.

  16. Magnetic MIMO Signal Processing and Optimization for Wireless Power Transfer

    Science.gov (United States)

    Yang, Gang; Moghadam, Mohammad R. Vedady; Zhang, Rui

    2017-06-01

    In magnetic resonant coupling (MRC) enabled multiple-input multiple-output (MIMO) wireless power transfer (WPT) systems, multiple transmitters (TXs) each with one single coil are used to enhance the efficiency of simultaneous power transfer to multiple single-coil receivers (RXs) by constructively combining their induced magnetic fields at the RXs, a technique termed "magnetic beamforming". In this paper, we study the optimal magnetic beamforming design in a multi-user MIMO MRC-WPT system. We introduce the multi-user power region that constitutes all the achievable power tuples for all RXs, subject to the given total power constraint over all TXs as well as their individual peak voltage and current constraints. We characterize each boundary point of the power region by maximizing the sum-power deliverable to all RXs subject to their minimum harvested power constraints. For the special case without the TX peak voltage and current constraints, we derive the optimal TX current allocation for the single-RX setup in closed-form as well as that for the multi-RX setup. In general, the problem is a non-convex quadratically constrained quadratic programming (QCQP), which is difficult to solve. For the case of one single RX, we show that the semidefinite relaxation (SDR) of the problem is tight. For the general case with multiple RXs, based on SDR we obtain two approximate solutions by applying time-sharing and randomization, respectively. Moreover, for practical implementation of magnetic beamforming, we propose a novel signal processing method to estimate the magnetic MIMO channel due to the mutual inductances between TXs and RXs. Numerical results show that our proposed magnetic channel estimation and adaptive beamforming schemes are practically effective, and can significantly improve the power transfer efficiency and multi-user performance trade-off in MIMO MRC-WPT systems.
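    For intuition about the "magnetic beamforming" described above, the sketch below solves a toy single-receiver allocation: if the delivered power scales with (sum_i M_i I_i)^2 and the TX currents share a total sum-of-squares budget, the Cauchy-Schwarz inequality gives the optimum I proportional to M. This simplified model ignores TX-TX coupling, loop resistances, and the peak voltage/current constraints treated in the paper, so it only illustrates the beamforming idea, not the paper's solution.

```python
"""Toy magnetic-beamforming allocation for one receiver (illustrative model).
Assumes delivered power ~ (sum_i M_i * I_i)^2 with a total current-budget
constraint; real MRC-WPT models include couplings and circuit constraints."""
import numpy as np

M = np.array([2.1e-6, 3.4e-6, 1.2e-6, 2.8e-6])   # TX-RX mutual inductances (H), assumed
budget = 4.0                                      # constraint: sum(I**2) <= budget (A^2)

# Cauchy-Schwarz optimum: currents proportional to the mutual inductances.
I_opt = np.sqrt(budget) * M / np.linalg.norm(M)

# Compare against a naive equal-current allocation with the same budget.
I_equal = np.full_like(M, np.sqrt(budget / M.size))
power = lambda I: (M @ I) ** 2                    # proportional to delivered power

print("optimal currents (A):", np.round(I_opt, 3))
print("relative gain over equal split:", power(I_opt) / power(I_equal))
```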

  17. A key to success: optimizing the planning process

    Science.gov (United States)

    Turk, Huseyin; Karakaya, Kamil

    2014-05-01

    The air operation planning process is analyzed according to a comprehensive approach. The difficulties of planning are identified. Consequently, for optimizing the decision-making process of an air operation, a planning process is identified within a virtual command and control structure.

  18. The Adjoint Method for Gradient-based Dynamic Optimization of UV Flash Processes

    DEFF Research Database (Denmark)

    Ritschel, Tobias Kasper Skovborg; Capolei, Andrea; Jørgensen, John Bagterp

    2017-01-01

    This paper presents a novel single-shooting algorithm for gradient-based solution of optimal control problems with vapor-liquid equilibrium constraints. Dynamic optimization of UV flash processes is relevant in nonlinear model predictive control of distillation columns, certain two-phase flow pro......-component flash process which demonstrate the importance of the optimization solver, the compiler, and the linear algebra software for the efficiency of dynamic optimization of UV flash processes....

  19. Recent Progress on Data-Based Optimization for Mineral Processing Plants

    Directory of Open Access Journals (Sweden)

    Jinliang Ding

    2017-04-01

    Full Text Available In the globalized market environment, increasingly significant economic and environmental factors within complex industrial plants impose importance on the optimization of global production indices; such optimization includes improvements in production efficiency, product quality, and yield, along with reductions of energy and resource usage. This paper briefly overviews recent progress in data-driven hybrid intelligence optimization methods and technologies in improving the performance of global production indices in mineral processing. First, we provide the problem description. Next, we summarize recent progress in data-based optimization for mineral processing plants. This optimization consists of four layers: optimization of the target values for monthly global production indices, optimization of the target values for daily global production indices, optimization of the target values for operational indices, and automation systems for unit processes. We briefly overview recent progress in each of the different layers. Finally, we point out opportunities for future works in data-based optimization for mineral processing plants.

  20. Effects of multiple enzyme–substrate interactions in basic units of cellular signal processing

    International Nuclear Information System (INIS)

    Seaton, D D; Krishnan, J

    2012-01-01

    Covalent modification cycles are a ubiquitous feature of cellular signalling networks. In these systems, the interaction of an active enzyme with the unmodified form of its substrate is essential for signalling to occur. However, this interaction is not necessarily the only enzyme–substrate interaction possible. In this paper, we analyse the behaviour of a basic model of signalling in which additional, non-essential enzyme–substrate interactions are possible. These interactions include those between the inactive form of an enzyme and its substrate, and between the active form of an enzyme and its product. We find that these additional interactions can result in increased sensitivity and biphasic responses, respectively. The dynamics of the responses are also significantly altered by the presence of additional interactions. Finally, we evaluate the consequences of these interactions in two variations of our basic model, involving double modification of substrate and scaffold-mediated signalling, respectively. We conclude that the molecular details of protein–protein interactions are important in determining the signalling properties of enzymatic signalling pathways. (paper)
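    The "basic model of signalling" referred to above builds on the standard covalent modification (phosphorylation-dephosphorylation) cycle. As background, the sketch below integrates a minimal Michaelis-Menten-based cycle of the Goldbeter-Koshland type; the rate constants are arbitrary illustrative values, and the additional non-essential enzyme-substrate interactions that are the subject of the paper are deliberately not included.

```python
"""Minimal covalent-modification (Goldbeter-Koshland) cycle, integrated with
a simple Euler loop. Parameters are arbitrary; the extra enzyme-substrate
interactions analysed in the paper are NOT included in this baseline sketch."""
import numpy as np

# Michaelis-Menten parameters for the modifying (kinase) and
# demodifying (phosphatase) enzymes (illustrative values).
V1, K1 = 1.0, 0.1    # active kinase: substrate -> modified substrate
V2, K2 = 0.8, 0.1    # phosphatase: modified substrate -> substrate
total = 1.0          # conserved total substrate concentration

def simulate(t_end=50.0, dt=1e-3, w0=0.0):
    """Return the time course of the modified-substrate concentration w."""
    w = w0
    traj = []
    for _ in range(int(t_end / dt)):
        u = total - w                                  # unmodified substrate
        dw = V1 * u / (K1 + u) - V2 * w / (K2 + w)     # modification - demodification
        w += dt * dw
        traj.append(w)
    return np.array(traj)

w = simulate()
print(f"steady-state modified fraction: {w[-1]:.3f}")
```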

  1. Energetics of cellular repair processes in a respiratory-deficient mutant of yeast

    International Nuclear Information System (INIS)

    Jain, V.K.; Gupta, I.; Lata, K.

    1982-01-01

    Repair of potentially lethal damage induced by cytotoxic agents such as UV irradiation (254 nm), psoralen-plus-UVA (365 nm), and methyl methanesulfonate has been studied in the presence of a glucose analog, 2-deoxy-D-glucose, in yeast cells. Simultaneously, the effects of 2-deoxy-D-glucose were also investigated on parameters of energy metabolism such as glucose utilization, rate of ATP production, and ATP content of cells. The following results were obtained. (i) 2-Deoxy-D-glucose is able to inhibit repair of potentially lethal damage induced by all the cytotoxic agents tested. The 2-deoxy-D-glucose-induced inhibition of repair depends upon the type of lesion and the pattern of cellular energy metabolism, the inhibition being greater in respiratory-deficient mutants than in the wild type. (ii) A continuous energy flow is necessary for repair of potentially lethal damage in yeast cells. Energy may be supplied by the glycolytic and/or the respiratory pathway; respiratory metabolism is not essential for this purpose. (iii) The magnitude of repair correlates with the rate of ATP production in a sigmoid manner.

  2. Controlling major cellular processes of human mesenchymal stem cells using microwell structures.

    Science.gov (United States)

    Xu, Xun; Wang, Weiwei; Kratz, Karl; Fang, Liang; Li, Zhengdong; Kurtz, Andreas; Ma, Nan; Lendlein, Andreas

    2014-12-01

    Directing stem cells towards a desired location and function by utilizing the structural cues of biomaterials is a promising approach for inducing effective tissue regeneration. Here, the cellular response of human adipose-derived mesenchymal stem cells (hADSCs) to structural signals from microstructured substrates comprising arrays of square-shaped or round-shaped microwells is explored as a transitional model between 2D and 3D systems. Microwells with a side length/diameter of 50 μm show advantages over 10 μm and 25 μm microwells for accommodating hADSCs within single microwells rather than in the inter-microwell area. The cell morphologies are three-dimensionally modulated by the microwell structure due to differences in focal adhesion and consequent alterations of the cytoskeleton. In contrast to the substrate with 50 μm round-shaped microwells, the substrate with 50 μm square-shaped microwells promotes the proliferation and osteogenic differentiation potential of hADSCs but reduces the cell migration velocity and distance. Such microwell shape-dependent modulatory effects are highly associated with Rho/ROCK signaling. Following ROCK inhibition, the differences in migration, proliferation, and osteogenesis between cells on different substrates are diminished. These results highlight the possibility to control stem cell functions through the use of structured microwells combined with the manipulation of Rho/ROCK signaling. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Forecasting optimal duration of a beer main fermentation process using the Kalman filter

    OpenAIRE

    Niyonsaba T.; Pavlov V.A.

    2016-01-01

    One of the most important processes in beer production is the main fermentation process, in which the wort is transformed into beer. The quality of the beer depends on the dynamics of the wort parameters. The main fermentation process lasts about 10 days and involves high costs. Therefore, the main purpose of this article is to forecast the optimal duration of the beer main fermentation process and to provide its optimal control. The Kalman filter can provide optimal control of the main ferment...
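    As a reminder of the mechanics behind the approach named above, the sketch below runs a scalar Kalman filter that tracks a slowly decaying fermentation variable (think of apparent extract) from noisy daily measurements; the exponential-decay process model, noise levels, and data are invented and are not the authors' model.

```python
"""Scalar Kalman filter sketch for tracking a fermentation state variable.
The exponential-decay process model and all numbers are illustrative only."""
import numpy as np

rng = np.random.default_rng(7)

# Simulated "true" daily apparent-extract values (exponential decay) plus noisy measurements.
days = np.arange(0, 11)
true_state = 12.0 * np.exp(-0.25 * days)
measurements = true_state + rng.normal(0.0, 0.3, size=days.size)

# Kalman filter for x_{k+1} = a * x_k + w, z_k = x_k + v.
a = np.exp(-0.25)     # assumed known decay factor per day
Q, R = 0.05, 0.3**2   # process and measurement noise variances (assumed)
x, P = 12.0, 1.0      # initial state estimate and its variance

estimates = []
for z in measurements:
    # Predict one day ahead.
    x, P = a * x, a * a * P + Q
    # Update with the day's measurement.
    K = P / (P + R)
    x = x + K * (z - x)
    P = (1.0 - K) * P
    estimates.append(x)

for d, z, e in zip(days, measurements, estimates):
    print(f"day {d:2d}: measured {z:5.2f}, filtered {e:5.2f}")
```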

  4. QUALITY OF ACCOUNTING INFORMATION TO OPTIMIZE THE DECISIONAL PROCESS

    Directory of Open Access Journals (Sweden)

    Miculescu Marius Nicolae

    2012-12-01

    Full Text Available This article addresses the need of managers to obtain accounting information that is relevant, reliable, clear, and accurate at the lowest cost, in order to optimize decision making. This need derives from the current economic environment. The survival of organizations in a competitive environment, to which they must adapt, is conditioned by obtaining qualitative, timely, and vital accounting information in a short time. This information relates to the patrimony, analytical results, the market (dynamics, dimensions, and structure), and relationships with business partners, competitors, and suppliers. Therefore we focus more intensely on the quality of accounting information. The definition of the quality of accounting information, however, goes beyond the boundaries and features of the accounting communication process and aims to determine "quality criteria" or "qualitative characteristics" from which to develop a measurement tool. Note that in the reviewed literature it was found that, between accounting normalization and accounting doctrine, the criteria for defining the quality of accounting information are not identical; their selection and ranking differ. Theory and practice also identify the fact that information by itself is worthless; instead, it becomes valuable once it is used in a decisional process. Thus, the economic value of accounting information depends on the earnings obtained after making a decision, diminished by the information cost. To be more specific, it depends on the decision table or decision tree used, on the informational cost, and on the optimal condition established by the decision maker (due to the fact that producing accounting information implies costs which are often considerable, and profits arise only from its use). The problem of convergence between the content and the interpretation of the information sent to users also arises, together with the requirement that the information be intelligible. In this case, those who use it, the users, should have sufficient

  5. Diurnal Regulation of Cellular Processes in the Cyanobacterium Synechocystis sp. Strain PCC 6803: Insights from Transcriptomic, Fluxomic, and Physiological Analyses

    Directory of Open Access Journals (Sweden)

    Rajib Saha

    2016-05-01

    Full Text Available Synechocystis sp. strain PCC 6803 is the most widely studied model cyanobacterium, with a well-developed omics level knowledgebase. Like the lifestyles of other cyanobacteria, that of Synechocystis PCC 6803 is tuned to diurnal changes in light intensity. In this study, we analyzed the expression patterns of all of the genes of this cyanobacterium over two consecutive diurnal periods. Using stringent criteria, we determined that the transcript levels of nearly 40% of the genes in Synechocystis PCC 6803 show robust diurnal oscillating behavior, with a majority of the transcripts being upregulated during the early light period. Such transcripts corresponded to a wide array of cellular processes, such as light harvesting, photosynthetic light and dark reactions, and central carbon metabolism. In contrast, transcripts of membrane transporters for transition metals involved in the photosynthetic electron transport chain (e.g., iron, manganese, and copper) were significantly upregulated during the late dark period. Thus, the pattern of global gene expression led to the development of two distinct transcriptional networks of coregulated oscillatory genes. These networks help describe how Synechocystis PCC 6803 regulates its metabolism toward the end of the dark period in anticipation of efficient photosynthesis during the early light period. Furthermore, in silico flux prediction of important cellular processes and experimental measurements of cellular ATP, NADP(H), and glycogen levels showed how this diurnal behavior influences its metabolic characteristics. In particular, the NADPH/NADP+ ratio showed a strong correlation with the majority of the genes whose expression peaks in the light. We conclude that this ratio is a key endogenous determinant of the diurnal behavior of this cyanobacterium.

  6. Chip Design Process Optimization Based on Design Quality Assessment

    Science.gov (United States)

    Häusler, Stefan; Blaschke, Jana; Sebeke, Christian; Rosenstiel, Wolfgang; Hahn, Axel

    2010-06-01

    Nowadays, managing product development projects is increasingly challenging. Especially the IC design of ASICs with both analog and digital components (mixed-signal design) is becoming more and more complex, while the time-to-market window narrows at the same time. Still, high quality standards must be fulfilled. Projects and their status are becoming less transparent due to this complexity. This makes the planning and execution of projects rather difficult. Therefore, there is a need for efficient project control. A main challenge is the objective evaluation of the current development status. Are all requirements successfully verified? Are all intermediate goals achieved? Companies often develop special solutions that are not reusable in other projects. This makes the quality measurement process itself less efficient and produces too much overhead. The method proposed in this paper is a contribution to solve these issues. It is applied at a German design house for analog mixed-signal IC design. This paper presents the results of a case study and introduces an optimized project scheduling on the basis of quality assessment results.

  7. Optimization of the LENS process for steady molten pool size

    Energy Technology Data Exchange (ETDEWEB)

    Wang, L. [Center for Advanced Vehicular Systems, Mississippi State University, Mississippi State, MS 39762 (United States); Felicelli, S. [Mechanical Engineering Department, Mississippi State University, Mississippi State, MS 39762 (United States)], E-mail: felicelli@me.msstate.edu; Gooroochurn, Y. [ESI Group, Bloomfield Hills, MI 48304 (United States); Wang, P.T.; Horstemeyer, M.F. [Center for Advanced Vehicular Systems, Mississippi State University, Mississippi State, MS 39762 (United States)

    2008-02-15

    A three-dimensional finite element model was developed and applied to analyze the temperature and phase evolution in deposited stainless steel 410 (SS410) during the Laser Engineered Net Shaping (LENS) rapid fabrication process. The effect of solid phase transformations is taken into account by using temperature and phase dependent material properties and the continuous cooling transformation (CCT) diagram. The laser beam is modeled as a Gaussian distribution of heat flux from a moving heat source with conical shape. The laser power and translational speed during deposition of a single-wall plate are optimized in order to maintain a steady molten pool size. It is found that, after an initial transient due to the cold substrate, the dependency of laser power with layer number is approximately linear for all travel speeds analyzed. The temperature distribution and cooling rate surrounding the molten pool are predicted and compared with experiments. Based upon the predicted thermal cycles and cooling rate, the phase transformations and their effects on the hardness of the part are discussed.

  8. Uniform, optimal signal processing of mapped deep-sequencing data.

    Science.gov (United States)

    Kumar, Vibhor; Muratani, Masafumi; Rayan, Nirmala Arul; Kraus, Petra; Lufkin, Thomas; Ng, Huck Hui; Prabhakar, Shyam

    2013-07-01

    Despite their apparent diversity, many problems in the analysis of high-throughput sequencing data are merely special cases of two general problems, signal detection and signal estimation. Here we adapt formally optimal solutions from signal processing theory to analyze signals of DNA sequence reads mapped to a genome. We describe DFilter, a detection algorithm that identifies regulatory features in ChIP-seq, DNase-seq and FAIRE-seq data more accurately than assay-specific algorithms. We also describe EFilter, an estimation algorithm that accurately predicts mRNA levels from as few as 1-2 histone profiles (R ∼0.9). Notably, the presence of regulatory motifs in promoters correlates more with histone modifications than with mRNA levels, suggesting that histone profiles are more predictive of cis-regulatory mechanisms. We show by applying DFilter and EFilter to embryonic forebrain ChIP-seq data that regulatory protein identification and functional annotation are feasible despite tissue heterogeneity. The mathematical formalism underlying our tools facilitates integrative analysis of data from virtually any sequencing-based functional profile.

  9. Supramodal processing optimizes visual perceptual learning and plasticity.

    Science.gov (United States)

    Zilber, Nicolas; Ciuciu, Philippe; Gramfort, Alexandre; Azizi, Leila; van Wassenhove, Virginie

    2014-06-01

    Multisensory interactions are ubiquitous in cortex and it has been suggested that sensory cortices may be supramodal i.e. capable of functional selectivity irrespective of the sensory modality of inputs (Pascual-Leone and Hamilton, 2001; Renier et al., 2013; Ricciardi and Pietrini, 2011; Voss and Zatorre, 2012). Here, we asked whether learning to discriminate visual coherence could benefit from supramodal processing. To this end, three groups of participants were briefly trained to discriminate which of a red or green intermixed population of random-dot-kinematograms (RDKs) was most coherent in a visual display while being recorded with magnetoencephalography (MEG). During training, participants heard no sound (V), congruent acoustic textures (AV) or auditory noise (AVn); importantly, congruent acoustic textures shared the temporal statistics - i.e. coherence - of visual RDKs. After training, the AV group significantly outperformed participants trained in V and AVn although they were not aware of their progress. In pre- and post-training blocks, all participants were tested without sound and with the same set of RDKs. When contrasting MEG data collected in these experimental blocks, selective differences were observed in the dynamic pattern and the cortical loci responsive to visual RDKs. First and common to all three groups, vlPFC showed selectivity to the learned coherence levels whereas selectivity in visual motion area hMT+ was only seen for the AV group. Second and solely for the AV group, activity in multisensory cortices (mSTS, pSTS) correlated with post-training performances; additionally, the latencies of these effects suggested feedback from vlPFC to hMT+ possibly mediated by temporal cortices in AV and AVn groups. Altogether, we interpret our results in the context of the Reverse Hierarchy Theory of learning (Ahissar and Hochstein, 2004) in which supramodal processing optimizes visual perceptual learning by capitalizing on sensory

  10. Multidisciplinary Design Optimization (MDO) Methods: Their Synergy with Computer Technology in Design Process

    Science.gov (United States)

    Sobieszczanski-Sobieski, Jaroslaw

    1998-01-01

    The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that, when examined in terms of these attributes, the presently available environment can be shown to be inadequate; a radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimization (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behavior by the interaction of a large number of very simple models may be an inspiration for the above algorithms; cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.

  11. Effect of in vitro gamma exposure on rat mesencephalic and striatal cellular types and processes length

    International Nuclear Information System (INIS)

    Coffigny, H.; Court, L.

    1994-01-01

    The isolated mesencephalic and striatal cells were irradiated in a dose range of 0.25 to 3 Gy, followed by 3 days of culture. The proportions of the monopolar, bipolar, tripolar and multipolar cell populations were not obviously modified by irradiation. Process length was similar to that of controls, except after the 3 Gy exposure for monopolar and bipolar mesencephalic cells and tripolar striatal cells, where it was increased. In these populations, only cells with long processes seemed to survive. (author)

  12. Optimization of a novel enzyme treatment process for early-stage processing of sheepskins.

    Science.gov (United States)

    Lim, Y F; Bronlund, J E; Allsop, T F; Shilton, A N; Edmonds, R L

    2010-01-01

    An enzyme treatment process for early-stage processing of sheepskins has been previously reported by the Leather and Shoe Research Association of New Zealand (LASRA) as an alternative to current industry operations. The newly developed process had marked benefits over conventional processing in terms of lower energy usage (73%), processing time (47%) and water use (49%), but had been developed only as a proof of principle. The objective of this work was to develop the process further to a stage ready for adoption by industry. Mass balancing was used to investigate potential modifications to the process, based on the understanding developed from a detailed analysis of preliminary design trials. Results showed that a configuration utilising a two-stage counter-current system for the washing stages, with segregation and recycling of the enzyme float prior to dilution in the neutralization stage, was a significant improvement. Benefits over conventional processing include a reduction of residual TDS by 50% at the washing stages and 70% savings on water use overall. Benefits over the un-optimized LASRA process are a reduction of solids in the product after the enzyme treatment and neutralization stages by 30%, additional water savings of 21%, as well as 10% savings on enzyme usage.
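
    As a rough illustration of the mass-balance reasoning behind a two-stage counter-current wash, the sketch below computes the residual solute (e.g. TDS) carried out on the skins and compares it with a single-stage wash using the same total water. All stream values and mixing assumptions are hypothetical, not data from the LASRA trials:

```python
def countercurrent_residual(c0, carry, wash, n_iter=200):
    """Steady-state solute concentration leaving on the skins after a
    two-stage counter-current wash, assuming perfect mixing in each stage.

    c0    -- solute concentration of the liquid carried in by the skins
    carry -- liquid carried by the skins through each stage (kg/batch)
    wash  -- fresh wash water fed to the final stage (kg/batch)
    """
    c1 = c2 = 0.0
    for _ in range(n_iter):                              # fixed-point iteration to steady state
        c1 = (carry * c0 + wash * c2) / (carry + wash)   # stage 1: dirty skins meet used wash water
        c2 = carry * c1 / (carry + wash)                 # stage 2: skins meet fresh water
    return c2

if __name__ == "__main__":
    c0, carry, wash = 100.0, 10.0, 50.0                  # hypothetical values
    two_stage = carry * countercurrent_residual(c0, carry, wash)   # residual solute mass on skins
    single_stage = carry * (carry * c0 / (carry + 2 * wash))       # same total water, one stage
    print("residual solute, counter-current: %.2f" % two_stage)
    print("residual solute, single stage   : %.2f" % single_stage)
```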

  13. Process optimization and particle engineering of micronized drug powders via milling.

    Science.gov (United States)

    Brunaugh, A; Smyth, H D C

    2017-11-13

    Process control and optimization is a critical aspect of process analytical technology (PAT), quality by design (QbD), and the implementation of continuous manufacturing procedures. While process control and optimization techniques have been utilized in other manufacturing industries for decades, the pharmaceutical industry has only recently begun to adopt these procedures. Micronization, particularly milling, is a generally low-yield, high-energy consumption process that is well suited for a process optimization mindset. This review discusses optimization of the pharmaceutical milling process through design space development, theoretical and empirical modeling, and monitoring of critical quality attributes.

  14. A model for Intelligent Random Access Memory architecture (IRAM) cellular automata algorithms on the Associative String Processing machine (ASTRA)

    CERN Document Server

    Rohrbach, F; Vesztergombi, G

    1997-01-01

    In the near future, computer performance will be largely determined by how long it takes to access memory. There are bottlenecks in memory latency and in memory-to-processor interface bandwidth. The IRAM initiative could be the answer, by putting the Processor-In-Memory (PIM). Starting from the massively parallel processing concept, a similar conclusion was reached. The MPPC (Massively Parallel Processing Collaboration) project and the 8K-processor ASTRA machine (Associative String Test bench for Research & Applications) developed at CERN [kuala] can be regarded as forerunners of the IRAM concept. The computing power of the ASTRA machine, regarded as an IRAM with 64 one-bit processors on a 64×64 bit-matrix memory chip, has been demonstrated by running statistical physics algorithms: one-dimensional stochastic cellular automata, as a simple model for dynamical phase transitions. As a relevant result for physics, the damage spreading of this model has been investigated.
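
    A one-dimensional stochastic cellular automaton of the kind used as such a benchmark fits in a few lines. The sketch below measures damage spreading by evolving two replicas that share the same random numbers but differ in a single site; the Domany-Kinzel-like rule and its probabilities are illustrative choices, not the ASTRA implementation:

```python
import numpy as np

def stochastic_ca_step(state, rnd, p1=0.7, p2=0.9):
    """One update of a Domany-Kinzel-like stochastic cellular automaton.

    A site turns on with probability p1 if exactly one neighbour is on
    and with probability p2 if both are on. Passing the same random
    field `rnd` to two replicas allows site-by-site comparison.
    """
    left, right = np.roll(state, 1), np.roll(state, -1)
    n_on = left + right
    prob = np.where(n_on == 2, p2, np.where(n_on == 1, p1, 0.0))
    return (rnd < prob).astype(np.int8)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n, steps = 256, 100
    a = rng.integers(0, 2, n).astype(np.int8)
    b = a.copy()
    b[n // 2] ^= 1                      # single-site "damage" in the second replica
    for _ in range(steps):
        rnd = rng.random(n)             # identical noise for both replicas
        a, b = stochastic_ca_step(a, rnd), stochastic_ca_step(b, rnd)
    print("damage (Hamming distance) after", steps, "steps:", int((a != b).sum()))
```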

  15. Process optimization in petrochemical industries; Otimizacao de processo nas industrias petroquimicas

    Energy Technology Data Exchange (ETDEWEB)

    Castro Filho, Paulo Farias; Chachamovitz, Joao Carlos

    1993-12-31

    The most recent and efficient technologies of simulation, modeling and process optimization are presented. Some practical problems are analyzed together with a methodology for applying the optimization technologies. (author) 2 refs., 7 figs., 2 tabs.

  16. Microtubule self-organisation by reaction-diffusion processes causes collective transport and organisation of cellular particles

    Directory of Open Access Journals (Sweden)

    Demongeot Jacques

    2004-06-01

    Background The transport of intracellular particles by microtubules is a major biological function. Under appropriate in vitro conditions, microtubule preparations behave as a 'complex' system and show 'emergent' phenomena. In particular, they form dissipative structures that self-organise over macroscopic distances by a combination of reaction and diffusion. Results Here, we show that self-organisation also gives rise to a collective transport of colloidal particles along a specific direction. Particles such as polystyrene beads, chromosomes, nuclei, and vesicles are carried at speeds of several microns per minute. The process also results in the macroscopic self-organisation of these particles. After self-organisation is completed, they show the same pattern of organisation as the microtubules. Numerical simulations of a population of growing and shrinking microtubules, incorporating experimentally realistic reaction dynamics, predict self-organisation. They forecast that during self-organisation, macroscopic parallel arrays of oriented microtubules form which cross the reaction space in successive waves. Such travelling waves are capable of transporting colloidal particles. The fact that, in the simulations, the aligned arrays move along the same direction and at the same speed as the particles suggests that this process forms the underlying mechanism for the observed transport properties. Conclusions This process constitutes a novel physical-chemical mechanism by which chemical energy is converted into collective transport of colloidal particles along a given direction. Self-organisation of this type provides a new mechanism by which intracellular particles such as chromosomes and vesicles can be displaced and simultaneously organised by microtubules. It is plausible that processes of this type occur in vivo.

  17. Microtubule self-organisation by reaction-diffusion processes causes collective transport and organisation of cellular particles

    Science.gov (United States)

    Glade, Nicolas; Demongeot, Jacques; Tabony, James

    2004-01-01

    Background The transport of intracellular particles by microtubules is a major biological function. Under appropriate in vitro conditions, microtubule preparations behave as a 'complex' system and show 'emergent' phenomena. In particular, they form dissipative structures that self-organise over macroscopic distances by a combination of reaction and diffusion. Results Here, we show that self-organisation also gives rise to a collective transport of colloidal particles along a specific direction. Particles such as polystyrene beads, chromosomes, nuclei, and vesicles are carried at speeds of several microns per minute. The process also results in the macroscopic self-organisation of these particles. After self-organisation is completed, they show the same pattern of organisation as the microtubules. Numerical simulations of a population of growing and shrinking microtubules, incorporating experimentally realistic reaction dynamics, predict self-organisation. They forecast that during self-organisation, macroscopic parallel arrays of oriented microtubules form which cross the reaction space in successive waves. Such travelling waves are capable of transporting colloidal particles. The fact that, in the simulations, the aligned arrays move along the same direction and at the same speed as the particles suggests that this process forms the underlying mechanism for the observed transport properties. Conclusions This process constitutes a novel physical-chemical mechanism by which chemical energy is converted into collective transport of colloidal particles along a given direction. Self-organisation of this type provides a new mechanism by which intracellular particles such as chromosomes and vesicles can be displaced and simultaneously organised by microtubules. It is plausible that processes of this type occur in vivo. PMID:15176973

  18. A differential genome-wide transcriptome analysis: impact of cellular copper on complex biological processes like aging and development.

    Directory of Open Access Journals (Sweden)

    Jörg Servos

    The regulation of cellular copper homeostasis is crucial in biology. Impairments lead to severe dysfunctions and are known to affect aging and development. Previously, a loss-of-function mutation in the gene encoding the copper-sensing and copper-regulated transcription factor GRISEA of the filamentous fungus Podospora anserina was reported to lead to cellular copper depletion and a pleiotropic phenotype with hypopigmentation of the mycelium and the ascospores, affected fertility and an increase in lifespan of approximately 60% compared to the wild type. This phenotype is linked to a switch from a copper-dependent standard respiration to an alternative respiration, leading to a reduced generation of both reactive oxygen species (ROS) and adenosine triphosphate (ATP). We performed a genome-wide comparative transcriptome analysis of a wild-type strain and the copper-depleted grisea mutant. We unambiguously assigned 9,700 sequences of the transcriptome in both strains to the more than 10,600 predicted and annotated open reading frames of the P. anserina genome, indicating 90% coverage of the transcriptome. 4,752 of the transcripts differed significantly in abundance, with 1,156 transcripts differing at least 3-fold. Selected genes were investigated by qRT-PCR analyses. Apart from this general characterization, we analyzed the data with special emphasis on molecular pathways related to the grisea mutation, taking advantage of the available complete genomic sequence of P. anserina. This analysis verified but also corrected conclusions from earlier data obtained by single-gene analysis, identified new candidate factors of the cellular copper homeostasis system, including target genes of the transcription factor GRISEA, and provides a rich reference source of quantitative data for further detailed investigations. Overall, the present study demonstrates the importance of systems biology approaches also in cases where mutations in single genes are analyzed to

  19. Potential and challenges in home care service process optimization : a route optimization approach

    OpenAIRE

    Nakari, Pentti J. E.

    2016-01-01

    Aging of the population is an increasing problem in many countries, including Finland, and it poses a challenge to public services such as home care. Vehicle routing problem (VRP) type optimization solutions are one possible way to decrease the time required for planning home visits and driving to customer addresses, as well as decreasing transportation costs. Although VRP optimization is widely and successfully applied to commercial and industrial logistics, the home care ...
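
    The routing component can be illustrated with a very small nearest-neighbour construction heuristic for sequencing home visits; real home-care VRP solvers add time windows, caregiver skills and working-hour constraints, none of which are modelled in this sketch, and all coordinates below are made up:

```python
import math

def nearest_neighbour_route(depot, customers):
    """Greedy visit order: always drive to the closest unvisited customer.

    depot, customers -- (x, y) coordinates; distances are Euclidean.
    Returns the visit order and the total driving distance (back to depot).
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    unvisited = list(range(len(customers)))
    route, pos, total = [], depot, 0.0
    while unvisited:
        nxt = min(unvisited, key=lambda i: dist(pos, customers[i]))
        total += dist(pos, customers[nxt])
        pos = customers[nxt]
        route.append(nxt)
        unvisited.remove(nxt)
    total += dist(pos, depot)           # return trip to the depot
    return route, total

if __name__ == "__main__":
    depot = (0.0, 0.0)                   # hypothetical coordinates
    homes = [(2, 1), (5, 4), (1, 6), (4, 0), (3, 3)]
    order, distance = nearest_neighbour_route(depot, homes)
    print("visit order:", order, "total distance:", round(distance, 2))
```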

  20. Cellular processing of the amyloidogenic cystatin C variant of hereditary cerebral hemorrhage with amyloidosis, Icelandic type

    DEFF Research Database (Denmark)

    Benedikz, Eirikur; Merz, G S; Schwenk, V

    1999-01-01

    of an amyloidogenic mutation on the intracellular processing of its protein product. The protein, a mutant of the cysteine protease inhibitor cystatin C, is the amyloid precursor protein in Hereditary Cerebral Hemorrhage with Amyloidosis--Icelandic type (HCHWA-I). The amyloid fibers are composed of mutant cystatin C...... (L68Q) that lacks the first 10 amino acids. We have previously shown that processing of wild-type cystatin C entails formation of a transient intracellular dimer that dissociates prior to secretion, such that extracellular cystatin C is monomeric. We report here that the cystatin C mutation engenders...

  1. Advanced landfill leachate treatment using iron-carbon microelectrolysis- Fenton process: Process optimization and column experiments

    International Nuclear Information System (INIS)

    Wang, Liqun; Yang, Qi; Wang, Dongbo; Li, Xiaoming; Zeng, Guangming; Li, Zhijun; Deng, Yongchao; Liu, Jun; Yi, Kaixin

    2016-01-01

    Highlights: • Fe-C microelectrolysis-Fenton process is proposed to pretreat landfill leachate. • Operating variables are optimized by response surface methodology (RSM). • 3D-EEMs and MW distribution explain the mechanism of enhanced biodegradability. • Fixed-bed column experiments are performed at different flow rates. - Abstract: A novel hydrogen peroxide-enhanced iron-carbon (Fe-C) microelectrolysis reactor was proposed for the pretreatment of mature landfill leachate. This reactor, combining microelectrolysis with the Fenton process, revealed high treatment efficiency. The operating variables, including Fe-C dosage, H2O2 concentration and initial pH, were optimized by response surface methodology (RSM), with the chemical oxygen demand (COD) removal efficiency and the ratio of biochemical oxygen demand to chemical oxygen demand (BOD5/COD) as the responses. The highest COD removal (74.59%) and BOD5/COD (0.50) were obtained at the optimal conditions of Fe-C dosage 55.72 g/L, H2O2 concentration 12.32 mL/L and initial pH 3.12. Three-dimensional excitation and emission matrix (3D-EEM) fluorescence spectroscopy and molecular weight (MW) distribution demonstrated that high molecular weight fractions such as refractory fulvic-like substances in the leachate were effectively destroyed during the combined processes, which should be attributed to the combined oxidative effect of microelectrolysis and Fenton. Fixed-bed column experiments were performed and the breakthrough curves at different flow rates were evaluated to determine the practical applicability of the combined process. All these results show that the hydrogen peroxide-enhanced iron-carbon (Fe-C) microelectrolysis reactor is a promising and efficient technology for the treatment of mature landfill leachate.
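
    Response surface methodology of this kind fits a second-order polynomial to the designed experiments and reads the optimum off the fitted surface. The sketch below shows only the fitting step, on synthetic data for three coded factors standing in for Fe-C dosage, H2O2 concentration and pH; the coefficients and responses are made up, not the authors' measurements:

```python
import numpy as np
from itertools import combinations_with_replacement

def quadratic_design_matrix(X):
    """Columns: intercept, x_i, and all second-order terms x_i*x_j (full quadratic RSM model)."""
    cols = [np.ones(len(X))]
    cols += [X[:, i] for i in range(X.shape[1])]
    cols += [X[:, i] * X[:, j]
             for i, j in combinations_with_replacement(range(X.shape[1]), 2)]
    return np.column_stack(cols)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical coded factor settings (dosage, H2O2, pH) and COD-removal responses.
    X = rng.uniform(-1, 1, size=(20, 3))
    y = (70 - 5 * (X[:, 0] - 0.2) ** 2 - 3 * (X[:, 1] + 0.1) ** 2
         - 4 * X[:, 2] ** 2 + rng.normal(0, 0.5, 20))
    A = quadratic_design_matrix(X)
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)   # least-squares fit of the response surface
    print("fitted coefficients:", np.round(beta, 2))
```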

  2. Advanced landfill leachate treatment using iron-carbon microelectrolysis- Fenton process: Process optimization and column experiments

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Liqun, E-mail: 691127317@qq.com [College of Environmental Science and Engineering, Hunan University, Changsha 410082 (China); Key Laboratory of Environmental Biology and Pollution Control (Hunan University), Ministry of Education, Changsha 410082 (China); Yang, Qi, E-mail: yangqi@hnu.edu.cn [College of Environmental Science and Engineering, Hunan University, Changsha 410082 (China); Key Laboratory of Environmental Biology and Pollution Control (Hunan University), Ministry of Education, Changsha 410082 (China); Wang, Dongbo [College of Environmental Science and Engineering, Hunan University, Changsha 410082 (China); Key Laboratory of Environmental Biology and Pollution Control (Hunan University), Ministry of Education, Changsha 410082 (China); Li, Xiaoming, E-mail: xmli121x@hotmail.com [College of Environmental Science and Engineering, Hunan University, Changsha 410082 (China); Key Laboratory of Environmental Biology and Pollution Control (Hunan University), Ministry of Education, Changsha 410082 (China); Zeng, Guangming; Li, Zhijun; Deng, Yongchao; Liu, Jun; Yi, Kaixin [College of Environmental Science and Engineering, Hunan University, Changsha 410082 (China); Key Laboratory of Environmental Biology and Pollution Control (Hunan University), Ministry of Education, Changsha 410082 (China)

    2016-11-15

    Highlights: • Fe-C microelectrolysis-Fenton process is proposed to pretreat landfill leachate. • Operating variables are optimized by response surface methodology (RSM). • 3D-EEMs and MW distribution explain the mechanism of enhanced biodegradability. • Fixed-bed column experiments are performed at different flow rates. - Abstract: A novel hydrogen peroxide-enhanced iron-carbon (Fe-C) microelectrolysis reactor was proposed for the pretreatment of mature landfill leachate. This reactor, combining microelectrolysis with the Fenton process, revealed high treatment efficiency. The operating variables, including Fe-C dosage, H2O2 concentration and initial pH, were optimized by response surface methodology (RSM), with the chemical oxygen demand (COD) removal efficiency and the ratio of biochemical oxygen demand to chemical oxygen demand (BOD5/COD) as the responses. The highest COD removal (74.59%) and BOD5/COD (0.50) were obtained at the optimal conditions of Fe-C dosage 55.72 g/L, H2O2 concentration 12.32 mL/L and initial pH 3.12. Three-dimensional excitation and emission matrix (3D-EEM) fluorescence spectroscopy and molecular weight (MW) distribution demonstrated that high molecular weight fractions such as refractory fulvic-like substances in the leachate were effectively destroyed during the combined processes, which should be attributed to the combined oxidative effect of microelectrolysis and Fenton. Fixed-bed column experiments were performed and the breakthrough curves at different flow rates were evaluated to determine the practical applicability of the combined process. All these results show that the hydrogen peroxide-enhanced iron-carbon (Fe-C) microelectrolysis reactor is a promising and efficient technology for the treatment of mature landfill leachate.

  3. Implementation of a configurable laboratory information management system for use in cellular process development and manufacturing.

    Science.gov (United States)

    Russom, Diana; Ahmed, Amira; Gonzalez, Nancy; Alvarnas, Joseph; DiGiusto, David

    2012-01-01

    Regulatory requirements for the manufacturing of cell products for clinical investigation require a significant level of record-keeping, starting early in process development and continuing through to the execution and requisite follow-up of patients on clinical trials. Central to record-keeping is the management of documentation related to patients, raw materials, processes, assays and facilities. To support these requirements, we evaluated several laboratory information management systems (LIMS), including their cost, flexibility, regulatory compliance, ongoing programming requirements and ability to integrate with laboratory equipment. After selecting a system, we performed a pilot study to develop a user-configurable LIMS for our laboratory in support of our pre-clinical and clinical cell-production activities. We report here on the design and utilization of this system to manage accrual with a healthy blood-donor protocol, as well as manufacturing operations for the production of a master cell bank and several patient-specific stem cell products. The system was used successfully to manage blood donor eligibility, recruiting, appointments, billing and serology, and to provide annual accrual reports. Quality management reporting features of the system were used to capture, report and investigate process and equipment deviations that occurred during the production of a master cell bank and patient products. Overall the system has served to support the compliance requirements of process development and phase I/II clinical trial activities for our laboratory and can be easily modified to meet the needs of similar laboratories.

  4. The effect of processing history on physical behavior and cellular response for tyrosine-derived polyarylates

    International Nuclear Information System (INIS)

    Doddi, S; Patlolla, A; Shanumunsgarundum, S; Jaffe, M; Collins, G; Arinzeh, T Livingston

    2009-01-01

    Polyarylates have shown promise as fully degradable polymers for drug delivery as well as for structural implant applications due to their range of physicomechanical properties. Processing history, however, could have a significant impact on their overall performance in biologically relevant environments. More specifically, structural changes at the molecular level can occur that will affect a polymer's physical properties and subsequent cell attachment and growth. The present study was aimed at comparing cell growth on tyrosine-derived polyarylates with that on polylactic acid (PLLA), in their original state and after processing (i.e. undrawn and drawn forms). Two polyarylates having distinct molecular structures were chosen: strictly amorphous poly(DTE adipate), denoted poly(DT 2,4), and poly(DTD dodecandioate), denoted poly(DT 12,10), which has a more complex, non-crystalline organization; both were compared with semi-crystalline PLLA. The degree of shrinkage, thermal characterization, air-water contact angle and surface morphology were determined for each polymer in its undrawn and drawn states. After processing, poly(DT 2,4) and PLLA showed greater shrinkage and a slight decrease in hydrophilicity, whereas poly(DT 12,10) had minimal shrinkage and became slightly more hydrophilic in its drawn state. Surface morphology, or roughness, was also altered by processing. In turn, the rate of cell growth and overall cell numbers were reduced significantly on the drawn forms of poly(DT 2,4) and PLLA, whereas more favorable growth rates were supported on drawn poly(DT 12,10). These findings indicate that processing effects in amorphous as well as oriented polymeric structures can significantly alter their biological performance.

  5. Design, characterization, and in vitro cellular inhibition and uptake of optimized genistein-loaded NLC for the prevention of posterior capsular opacification using response surface methodology.

    Science.gov (United States)

    Zhang, Wenji; Li, Xuedong; Ye, Tiantian; Chen, Fen; Sun, Xiao; Kong, Jun; Yang, Xinggang; Pan, Weisan; Li, Sanming

    2013-09-15

    This study aimed to design an innovative nanostructured lipid carrier (NLC) for the delivery of genistein, applied after cataract surgery for the prevention of posterior capsular opacification. NLC loaded with genistein (GEN-NLC) was produced with Compritol 888 ATO, Gelucire 44/14 and Miglyol 812N, stabilized by Solutol® HS15, using the melt emulsification method. A 2^4 central composite design of 4 independent variables was performed for optimization. The effects of drug concentration, Gelucire 44/14 concentration in total solid lipid, liquid lipid concentration, and surfactant concentration on the mean particle size, polydispersity index, zeta potential and encapsulation efficiency were investigated. An analysis of variance (ANOVA) statistical test was used to assess the optimization. The optimized GEN-NLC showed a homogeneous particle size of 90.16 nm (PI = 0.33), a negatively charged surface (-25.08 mV) and high encapsulation efficiency (91.14%). Particle morphology assessed by TEM revealed a spherical shape. DSC analyses confirmed that GEN was mostly entrapped in an amorphous state. In vitro release experiments indicated a prolonged and controlled genistein release over 72 h. An in vitro growth inhibition assay showed effective growth inhibition of GEN-NLCs on human lens epithelial cells (HLECs). A preliminary cellular uptake test demonstrated enhanced penetration of genistein into HLECs when delivered in NLC. Copyright © 2013 Elsevier B.V. All rights reserved.
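
    A 2^4 central composite design combines the 16 factorial points with axial and centre points. The sketch below simply enumerates the coded design points; the axial distance and number of centre replicates are assumed values for illustration, not the settings used in the study:

```python
from itertools import product

def central_composite(k=4, alpha=2.0, n_center=6):
    """Coded runs of a central composite design for k factors:
    full 2^k factorial + 2k axial points at +/-alpha + centre replicates."""
    factorial = [list(p) for p in product((-1.0, 1.0), repeat=k)]
    axial = []
    for i in range(k):
        for a in (-alpha, alpha):
            point = [0.0] * k
            point[i] = a
            axial.append(point)
    center = [[0.0] * k for _ in range(n_center)]
    return factorial + axial + center

if __name__ == "__main__":
    design = central_composite()
    print(len(design), "runs; first axial point:", design[16])
```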

  6. Cellular processing of gold nanoparticles: CE-ICP-MS evidence for the speciation changes in human cytosol.

    Science.gov (United States)

    Legat, Joanna; Matczuk, Magdalena; Timerbaev, Andrei R; Jarosz, Maciej

    2018-01-01

    The cellular uptake of gold nanoparticles (AuNPs) may (or may not) affect their speciation, but information on the chemical forms in which the particles exist in the cell remains obscure. An analytical method based on capillary electrophoresis hyphenated with inductively coupled plasma mass spectrometry (ICP-MS) has been proposed to shed light on the intracellular processing of AuNPs. It was observed that, when introduced into normal cytosol, the conjugates of 10-50 nm AuNPs with albumin that had evolved in human serum stayed intact. On the contrary, under simulated cancer cytosol conditions, the nanoconjugates underwent decomposition, the rate of which, and the resulting metal speciation patterns, were strongly influenced by particle size. The new peaks that appeared in the ICP-MS electropherograms could be ascribed to nanosized species, as upon ultracentrifugation they quantitatively precipitated whereas the supernatant showed only trace Au signals. Our present study is a first step toward unraveling the cellular chemistry of metal-based nanomedicines.

  7. Process Optimization of Bismaleimide (BMI) Resin Infused Carbon Fiber Composite

    Science.gov (United States)

    Ehrlich, Joshua W.; Tate, LaNetra C.; Cox, Sarah B.; Taylor, Brian J.; Wright, M. Clara; Caraccio, Anne J.; Sampson, Jeffery W.

    2013-01-01

    Bismaleimide (BMI) resins are an attractive new addition to worldwide composite applications. This type of thermosetting polyimide provides several unique characteristics, such as excellent physical property retention at elevated temperatures and in wet environments, constant electrical properties over a vast array of temperature settings, and nonflammability. This makes BMI a popular choice in advanced composites and electronics applications [1]. Bismaleimide-2 (BMI-2) resin was used to infuse intermediate modulus 7 (IM7) based carbon fiber. Two panel configurations consisting of 4 plies with [+45°, 90°]2 and [0°]4 orientations were fabricated. For tensile testing, a [90°]4 configuration was tested by rotating the [0°]4 configuration to lie orthogonal to the load direction of the test fixture. Curing of the BMI-2/IM7 system utilized an optimal infusion process which focused on the integration of the manufacturer-recommended ramp rates, hold times, and cure temperatures. Completion of the cure cycle for the BMI-2/IM7 composite yielded a product with multiple surface voids, determined through visual and metallographic observation. Although the curing cycle was the same for the three panel layups, the surface voids that remained within the material post-cure differed in abundance, shape, and size. In tensile testing, the [0°]4 layup had a 19.9% and 21.7% greater average tensile strain performance at failure than the [90°]4 and [+45°, 90°, 90°, -45°] layups, respectively. For tensile stress performance, the [0°]4 layup had a 5.8% and 34.0% greater average performance than the [90°]4 and [+45°, 90°, 90°, -45°] layups.

  8. Global Optimization Employing Gaussian Process-Based Bayesian Surrogates

    Directory of Open Access Journals (Sweden)

    Roland Preuss

    2018-03-01

    The simulation of complex physics models may lead to enormous computer running times. Since the simulations are expensive, it is necessary to exploit the computational budget in the best possible manner. If an output data set has been acquired for a few input parameter settings, one may be interested in taking these data as a basis for finding an extremum, and possibly an input parameter set for further computer simulations to determine it - a task which belongs to the realm of global optimization. Within the Bayesian framework we utilize Gaussian processes for the creation of a surrogate model function, adjusted self-consistently via hyperparameters to represent the data. Although the probability distribution of the hyperparameters may be widely spread over phase space, we assume that using only their expectation values is sufficient. While this shortcut facilitates a quickly accessible surrogate, it is somewhat justified by the fact that we are not interested in a full representation of the model by the surrogate, but only in revealing its maximum. To accomplish this, the surrogate is fed to a utility function whose extremum determines the new parameter set for the next data point to obtain. Moreover, we propose to alternate between two utility functions - expected improvement and maximum variance - in order to avoid the drawbacks of each. Subsequent data points are drawn from the model function until the procedure either remains at the points already found or the surrogate model no longer changes with the iteration. The procedure is applied to mock data in one and two dimensions in order to demonstrate proof of principle of the proposed approach.
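
    A compact sketch of the loop described: a Gaussian-process surrogate with hyperparameters fixed at point estimates, and new evaluations chosen by alternating between expected improvement and maximum posterior variance. The kernel, noise level and one-dimensional test function are illustrative choices, not those of the paper:

```python
import numpy as np
from scipy.stats import norm

def rbf(A, B, length=0.3, var=1.0):
    """Squared-exponential kernel between two 1D point sets."""
    return var * np.exp(-0.5 * (A[:, None] - B[None, :]) ** 2 / length**2)

def gp_posterior(X_obs, y_obs, X_test, noise=1e-6):
    """GP posterior mean and standard deviation at X_test given observations."""
    K = rbf(X_obs, X_obs) + noise * np.eye(len(X_obs))
    Ks = rbf(X_obs, X_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_obs))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(rbf(X_test, X_test)) - np.sum(v**2, axis=0)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sd, best):
    """EI for maximization relative to the best observed value."""
    z = (mu - best) / sd
    return (mu - best) * norm.cdf(z) + sd * norm.pdf(z)

def f(x):
    """Stand-in for the expensive simulation (illustrative)."""
    return np.sin(3 * x) + 0.5 * x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X, grid = rng.uniform(0, 2, 3), np.linspace(0, 2, 200)   # initial design and search grid
    y = f(X)
    for it in range(10):
        mu, sd = gp_posterior(X, y, grid)
        # Alternate utilities: expected improvement vs. maximum variance.
        util = expected_improvement(mu, sd, y.max()) if it % 2 == 0 else sd
        x_new = grid[np.argmax(util)]
        X, y = np.append(X, x_new), np.append(y, f(x_new))
    print("best input:", round(X[np.argmax(y)], 3), "best value:", round(y.max(), 3))
```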

  9. Advanced landfill leachate treatment using iron-carbon microelectrolysis- Fenton process: Process optimization and column experiments.

    Science.gov (United States)

    Wang, Liqun; Yang, Qi; Wang, Dongbo; Li, Xiaoming; Zeng, Guangming; Li, Zhijun; Deng, Yongchao; Liu, Jun; Yi, Kaixin

    2016-11-15

    A novel hydrogen peroxide-enhanced iron-carbon (Fe-C) microelectrolysis reactor was proposed for the pretreatment of mature landfill leachate. This reactor, combining microelectrolysis with the Fenton process, revealed high treatment efficiency. The operating variables, including Fe-C dosage, H2O2 concentration and initial pH, were optimized by response surface methodology (RSM), with the chemical oxygen demand (COD) removal efficiency and the ratio of biochemical oxygen demand to chemical oxygen demand (BOD5/COD) as the responses. The highest COD removal (74.59%) and BOD5/COD (0.50) were obtained at the optimal conditions of Fe-C dosage 55.72 g/L, H2O2 concentration 12.32 mL/L and initial pH 3.12. Three-dimensional excitation and emission matrix (3D-EEM) fluorescence spectroscopy and molecular weight (MW) distribution demonstrated that high molecular weight fractions such as refractory fulvic-like substances in the leachate were effectively destroyed during the combined processes, which should be attributed to the combined oxidative effect of microelectrolysis and Fenton. Fixed-bed column experiments were performed and the breakthrough curves at different flow rates were evaluated to determine the practical applicability of the combined process. All these results show that the hydrogen peroxide-enhanced iron-carbon (Fe-C) microelectrolysis reactor is a promising and efficient technology for the treatment of mature landfill leachate. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Global optimization of silicon nanowires for efficient parametric processes

    DEFF Research Database (Denmark)

    Vukovic, Dragana; Xu, Jing; Mørk, Jesper

    2013-01-01

    We present a global optimization of silicon nanowires for parametric single-pump mixing. For the first time, the effect of surface roughness-induced loss is included in the analysis, significantly influencing the optimum waveguide dimensions.

  11. Internalization and cellular processing of cholecystokinin in rat pancreatic acinar cells

    International Nuclear Information System (INIS)

    Izzo, R.S.; Pellecchia, C.; Praissman, M.

    1988-01-01

    To evaluate the internalization of cholecystokinin, the monoiodinated imidoester of cholecystokinin octapeptide [125I-(IE)-CCK-8] was bound to dispersed pancreatic acinar cells, and surface-bound and internalized radioligand were differentiated by treating with an acidified glycine buffer. The amount of internalized radioligand was four- and sevenfold greater at 24 and 37 °C than at 4 °C between 5 and 60 min of association. Specific binding of radioligand to cell surface receptors was not significantly different at these temperatures. Chloroquine, a lysosomotropic agent that blocks intracellular proteolysis, significantly increased the amount of CCK-8 internalized, by 18 and 16% at 30 and 60 min of binding, respectively, compared with control. Dithiothreitol (DTT), a sulfhydryl reducing agent, also augmented the amount of CCK-8 radioligand internalized, by 25 and 29% at 30 and 60 min, respectively. The effect of chloroquine and DTT on the processing of internalized radioligand was also considered after an initial 60 min of binding of radioligand to acinar cells. After 180 min of processing, the amount of radioligand internalized was significantly greater in the presence of chloroquine compared with controls, whereas the amount of radioligand declined in acinar cells treated with DTT. Internalized and released radioactivity from acinar cells was rebound to pancreatic membrane homogenates to determine the amount of intact radioligand during intracellular processing. Chloroquine significantly increased the amount of intact 125I-(IE)-CCK-8 radioligand in released and internalized radioactivity, while DTT increased the amount of intact radioligand only in internalized samples. This study shows that pancreatic acinar cells rapidly internalize large amounts of CCK-8, and that chloroquine and DTT inhibit its intracellular degradation.

  12. Network-Guided Key Gene Discovery for a Given Cellular Process

    DEFF Research Database (Denmark)

    He, Feng Q; Ollert, Markus

    2018-01-01

    Identification of key genes for a given physiological or pathological process is an essential but still very challenging task for the entire biomedical research community. Statistics-based approaches, such as genome-wide association study (GWAS)- or quantitative trait locus (QTL)-related analysis...... have already made enormous contributions to identifying key genes associated with a given disease or phenotype, the success of which is however very much dependent on a huge number of samples. Recent advances in network biology, especially network inference directly from genome-scale data...

  13. Multiphoton microscopy for the in-situ investigation of cellular processes and integrity in cryopreservation.

    Science.gov (United States)

    Doerr, Daniel; Stark, Martin; Ehrhart, Friederike; Zimmermann, Heiko; Stracke, Frank

    2009-08-01

    In this study we demonstrate a new noninvasive imaging method to monitor freezing processes in biological samples and to investigate life in the frozen state. It combines a laser scanning microscope with a computer-controlled cryostage. Near-infrared (NIR) femtosecond laser pulses evoke the fluorescence of endogenous fluorophores and fluorescent labels through multiphoton absorption. The inherent optical nonlinearity of multiphoton absorption allows 3D fluorescence imaging for optical tomography of frozen biological material in situ. As an example of functional imaging, we use fluorescence lifetime imaging (FLIM) to create images with chemical and physical contrast.

  14. Optimization benefits analysis in production process of fabrication components

    Science.gov (United States)

    Prasetyani, R.; Rafsanjani, A. Y.; Rimantho, D.

    2017-12-01

    The determination of an optimal number of product combinations is important. The main problem at the part and service department of PT. United Tractors Pandu Engineering (shortened to PT. UTPE) is the optimization of the combination of fabrication component products (known as Liner Plates), which influences the profit the company will obtain. A Liner Plate is a fabrication component that serves as a protector of the core structure of a heavy-duty attachment, such as an HD Vessel, HD Bucket, HD Shovel, or HD Blade. Liner plate sales from January to December 2016 fluctuated, and no direct conclusion about the optimal production of such fabrication components could be drawn. The optimal product combination can be achieved by calculating and plotting the amount of production output and input appropriately. The method used in this study is linear programming with primal, dual, and sensitivity analysis, using QM software for Windows, to obtain the optimal combination of fabrication components. At the optimal combination of components, PT. UTPE gains a profit increase of Rp. 105,285,000.00, for a total of Rp. 3,046,525,000.00 per month, with a combined production of 71 units across the product variants per month.
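
    The product-mix question is a standard linear program: maximize total contribution margin subject to shared resource constraints. The sketch below uses scipy's linprog with invented margins and machine-hour limits; none of the numbers are PT. UTPE's figures:

```python
from scipy.optimize import linprog

# Hypothetical data: profit margin per liner-plate variant (arbitrary units)
# and hours consumed per unit on two shared resources (cutting, welding).
margins = [4.0, 3.0, 5.0]
hours = [[2.0, 1.0, 3.0],                  # cutting hours per unit of each variant
         [1.0, 2.0, 2.0]]                  # welding hours per unit of each variant
capacity = [80.0, 60.0]                    # available hours per month

# linprog minimizes, so negate the margins to maximize profit.
res = linprog(c=[-m for m in margins],
              A_ub=hours, b_ub=capacity,
              bounds=[(0, None)] * 3, method="highs")

print("optimal units per variant:", [round(x, 1) for x in res.x])
print("maximum monthly profit   :", round(-res.fun, 1))
```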

  15. Morphology of Filamentous Fungi: Linking Cellular Biology to Process Engineering Using Aspergillus niger

    Science.gov (United States)

    Krull, Rainer; Cordes, Christiana; Horn, Harald; Kampen, Ingo; Kwade, Arno; Neu, Thomas R.; Nörtemann, Bernd

    In various biotechnological processes, filamentous fungi, e.g. Aspergillus niger, are widely applied for the production of high value-added products due to their secretion efficiency. There is, however, a tangled relationship between the morphology of these microorganisms, transport phenomena and the related productivity. The morphological characteristics vary between freely dispersed mycelia and distinct pellets of aggregated biomass, so the advantages and disadvantages of mycelial or pellet cultivation have to be balanced carefully. Owing to this inadequate understanding of the morphogenesis of filamentous microorganisms, fungal morphology, along with the reproducibility of inocula of consistent quality, is often a bottleneck of productivity in industrial production. To optimise the production process it is of great importance to gain a better understanding of the molecular and cell biology of these microorganisms, as well as of the approaches of biochemical engineering and particle technology, in particular to characterise the interactions between growth conditions, cell morphology, spore-hyphae interactions and product formation. Advances in particle and image analysis techniques, as well as micromechanical devices, and their application to fungal cultivations have made quantitative morphological data on filamentous cells available. This chapter addresses these aspects, focusing on the control and characterisation of morphology, transport gradients and approaches to understanding the metabolism of filamentous fungi. Based on these data, bottlenecks in the morphogenesis of A. niger within the complex production pathways from gene to product should be identified, which may improve production yield.

  16. The use of real-time cell analyzer technology in drug discovery: defining optimal cell culture conditions and assay reproducibility with different adherent cellular models.

    Science.gov (United States)

    Atienzar, Franck A; Tilmant, Karen; Gerets, Helga H; Toussaint, Gaelle; Speeckaert, Sebastien; Hanon, Etienne; Depelchin, Olympe; Dhalluin, Stephane

    2011-07-01

    The use of impedance-based label-free technology in drug discovery is nowadays receiving more and more attention. Such a simple and noninvasive assay, which interferes minimally with cell morphology and function, allows one to perform kinetic measurements and to obtain information on proliferation, migration, cytotoxicity, and receptor-mediated signaling. The objective of the study was to further assess the usefulness of a real-time cell analyzer (RTCA) platform based on impedance in the context of quality control and data reproducibility. The data indicate that this technology is useful for determining the best coating and cell-density conditions for different adherent cellular models, including hepatocytes, cardiomyocytes, fibroblasts, and hybrid neuroblastoma/neuronal cells. Based on 31 independent experiments, the reproducibility of cell index data generated from HepG2 cells exposed to DMSO and to Triton X-100 was satisfactory, with a coefficient of variation close to 10%. Cell index data were also well reproduced when cardiomyocytes and fibroblasts were exposed to 21 compounds three times (correlation >0.91). Overall, the RTCA technology appears to be a powerful and reliable tool in drug discovery because of its reasonable throughput, rapid and efficient performance, technical optimization, and cell quality control.
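
    The reproducibility figures quoted (a coefficient of variation near 10%, inter-run correlations above 0.9) are straightforward to compute from replicate cell-index readings. The sketch below uses invented numbers purely to show the two calculations:

```python
import numpy as np

def coefficient_of_variation(values):
    """CV (%) of replicate measurements: 100 * sample std / mean."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

if __name__ == "__main__":
    # Hypothetical normalized cell-index values for one treatment, 5 runs.
    cell_index = [1.02, 0.95, 1.10, 0.99, 1.05]
    print("CV: %.1f%%" % coefficient_of_variation(cell_index))

    # Hypothetical dose-response cell indices from two independent runs.
    run1 = np.array([1.00, 0.85, 0.60, 0.30, 0.10])
    run2 = np.array([0.98, 0.88, 0.55, 0.33, 0.12])
    r = np.corrcoef(run1, run2)[0, 1]    # Pearson correlation between runs
    print("inter-run correlation: %.3f" % r)
```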

  17. RNA-SeQC: RNA-seq metrics for quality control and process optimization.

    Science.gov (United States)

    DeLuca, David S; Levin, Joshua Z; Sivachenko, Andrey; Fennell, Timothy; Nazaire, Marc-Danie; Williams, Chris; Reich, Michael; Winckler, Wendy; Getz, Gad

    2012-06-01

    RNA-seq, the application of next-generation sequencing to RNA, provides transcriptome-wide characterization of cellular activity. Assessment of sequencing performance and library quality is critical to the interpretation of RNA-seq data, yet few tools exist to address this issue. We introduce RNA-SeQC, a program which provides key measures of data quality. These metrics include yield, alignment and duplication rates; GC bias, rRNA content, regions of alignment (exon, intron and intragenic), continuity of coverage, 3'/5' bias and count of detectable transcripts, among others. The software provides multi-sample evaluation of library construction protocols, input materials and other experimental parameters. The modularity of the software enables pipeline integration and the routine monitoring of key measures of data quality such as the number of alignable reads, duplication rates and rRNA contamination. RNA-SeQC allows investigators to make informed decisions about sample inclusion in downstream analysis. In summary, RNA-SeQC provides quality control measures critical to experiment design, process optimization and downstream computational analysis. See www.genepattern.org to run online, or www.broadinstitute.org/rna-seqc/ for a command line tool.
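
    To make a couple of the listed metrics concrete, the toy sketch below computes a duplication rate and overall GC content over a handful of reads; this is not the RNA-SeQC tool (which operates on aligned BAM files), just an illustration of what such metrics measure:

```python
from collections import Counter

def duplication_rate(reads):
    """Fraction of reads that are exact duplicates of an earlier read."""
    counts = Counter(reads)
    duplicates = sum(c - 1 for c in counts.values())
    return duplicates / len(reads) if reads else 0.0

def gc_content(reads):
    """Overall GC fraction across all read bases."""
    bases = "".join(reads)
    return (bases.count("G") + bases.count("C")) / len(bases) if bases else 0.0

if __name__ == "__main__":
    # Tiny made-up read set for illustration only.
    reads = ["ACGTACGT", "GGGCCCAT", "ACGTACGT", "TTTAAACG"]
    print("duplication rate:", duplication_rate(reads))
    print("GC content      :", round(gc_content(reads), 3))
```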

  18. A cellular automata model for social-learning processes in a classroom context

    Science.gov (United States)

    Bordogna, C. M.; Albano, E. V.

    2002-02-01

    A model for teaching-learning processes that take place in the classroom is proposed and simulated numerically. Recent ideas taken from the fields of sociology, educational psychology, statistical physics and computational science are key ingredients of the model. Results of simulations are consistent with well-established empirical results obtained in classrooms by means of different evaluation tools. It is shown that students engaged in collaborative groupwork reach higher achievements than those attending traditional lectures only. However, in many cases, this difference is subtle and consequently very difficult to be detected using tests. The influence of the number of students forming the collaborative groups on the average knowledge achieved is also studied and discussed.
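
    The flavour of such a model can be conveyed by a toy lattice of students who raise their knowledge by interacting with group neighbours; the rule and parameters below are illustrative caricatures, not the model of Bordogna and Albano:

```python
import numpy as np

def groupwork_round(knowledge, coupling=0.15, forgetting=0.01):
    """One collaborative round on a ring of students.

    Each student closes a fraction `coupling` of the gap to the most
    knowledgeable neighbour in their group, and forgets a little.
    """
    best_neighbour = np.maximum(np.roll(knowledge, 1), np.roll(knowledge, -1))
    gain = coupling * np.maximum(best_neighbour - knowledge, 0.0)
    return np.clip(knowledge + gain - forgetting * knowledge, 0.0, 1.0)

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    knowledge = rng.uniform(0.2, 0.6, 30)          # initial achievement levels
    start_mean = knowledge.mean()
    for _ in range(50):
        knowledge = groupwork_round(knowledge)
    print("class mean, start vs after 50 rounds: %.2f -> %.2f"
          % (start_mean, knowledge.mean()))
```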

  19. Cellular Aspects of Shigella Pathogenesis: Focus on the Manipulation of Host Cell Processes.

    Science.gov (United States)

    Killackey, Samuel A; Sorbara, Matthew T; Girardin, Stephen E

    2016-01-01

    Shigella is a Gram-negative bacterium that is responsible for shigellosis. Over the years, the study of Shigella has provided a greater understanding of how the host responds to bacterial infection, and how bacteria have evolved to effectively counter the host defenses. In this review, we provide an update on some of the most recent advances in our understanding of pivotal processes associated with Shigella infection, including the invasion into host cells, the metabolic changes that occur within the bacterium and the infected cell, cell-to-cell spread mechanisms, autophagy and membrane trafficking, inflammatory signaling and cell death. This recent progress sheds a new light into the mechanisms underlying Shigella pathogenesis, and also more generally provides deeper understanding of the complex interplay between host cells and bacterial pathogens in general.

  20. Isobutane Alkylation Process Synthesis by means of Hybrid Simulation-Multiobjective Optimization

    OpenAIRE

    Fernandez-Torres, Maria J.; García, Norberto; Caballero, José A.

    2014-01-01

    Multiobjective Generalized Disjunctive Programming (MO-GDP) optimization has been used for the synthesis of an important industrial process, isobutane alkylation. The two objective functions to be simultaneously optimized are the environmental impact, determined by means of LCA (Life Cycle Assessment), and the economic potential of the process. The main reason for including the minimization of the environmental impact in the optimization process is the widespread environmental concern by the ...

  1. Cellular responses of human astrocytoma cells to dust from the Acheson process: An in vitro study.

    Science.gov (United States)

    Arnoldussen, Yke Jildouw; Ervik, Torunn Kringlen; Berlinger, Balazs; Kero, Ida; Shaposhnikov, Sergey; Zienolddiny, Shanbeh

    2018-03-01

    Silicon carbide (SiC) is widely used in various products such as diesel particulate filters and solar panels. It is produced through the Acheson process, in which aerosolized fractions of SiC and other by-products are generated in the work environment and may potentially affect workers' health. In this study, dust was collected directly on a filter in a furnace hall over a period of 24 h. The collected dust was characterized by scanning electron microscopy and found to contain a high content of graphite particles and of carbon- and silicon-containing particles. Only 6% was classified as SiC, of which only 10% had a fibrous structure. To study effects of exposure beyond the respiratory system, neurotoxic effects on human astrocytic cells were investigated. Both low, occupationally relevant doses and high doses were used, ranging from 9E-6 μg/cm2 up to 4.5 μg/cm2. A cytotoxicity assay indicated no effects of the low doses but an effect of the higher doses after 24 h. Furthermore, investigation of intracellular reactive oxygen species (ROS) indicated no effects at low doses, whereas a higher dose of 0.9 μg/cm2 induced a significant increase in ROS and DNA damage. In summary, low doses of dust from the Acheson process may exert little or no toxic effect, at least experimentally on human astrocytes in the laboratory. Higher doses, however, do have effects, which are likely a result of the complex composition of the dust. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.

  2. The surfactant protein C mutation A116D alters cellular processing, stress tolerance, surfactant lipid composition, and immune cell activation

    Directory of Open Access Journals (Sweden)

    Zarbock Ralf

    2012-03-01

    Background Surfactant protein C (SP-C) is important for the function of pulmonary surfactant. Heterozygous mutations in SFTPC, the gene encoding SP-C, cause sporadic and familial interstitial lung disease (ILD) in children and adults. Mutations mapping to the BRICHOS domain located within the SP-C proprotein result in perinuclear aggregation of the proprotein. In this study, we investigated the effects of the mutation A116D in the BRICHOS domain of SP-C on cellular homeostasis. We also evaluated the ability of drugs currently used in ILD therapy to counteract these effects. Methods SP-CA116D was expressed in MLE-12 alveolar epithelial cells. We assessed in vitro the consequences for cellular homeostasis and the immune response, and the effects of azathioprine, hydroxychloroquine, methylprednisolone and cyclophosphamide. Results Stable expression of SP-CA116D in MLE-12 alveolar epithelial cells resulted in increased intracellular accumulation of proSP-C processing intermediates. SP-CA116D expression further led to reduced cell viability and increased levels of the chaperones Hsp90, Hsp70, calreticulin and calnexin. Lipid analysis revealed decreased intracellular levels of phosphatidylcholine (PC) and increased lyso-PC levels. Treatment with methylprednisolone or hydroxychloroquine partially restored these lipid alterations. Furthermore, SP-CA116D cells secreted soluble factors into the medium that modulated surface expression of CCR2 or CXCR1 receptors on CD4+ lymphocytes and neutrophils, suggesting a direct paracrine effect of SP-CA116D on neighboring cells in the alveolar space. Conclusions We show that the A116D mutation leads to impaired processing of proSP-C in alveolar epithelial cells, alters cell viability and lipid composition, and also activates cells of the immune system. In addition, we show that some of the effects of the mutation on cellular homeostasis can be antagonized by the application of pharmaceuticals commonly used in ILD therapy.

  3. An Optimized GD2-Targeting Retroviral Cassette for More Potent and Safer Cellular Therapy of Neuroblastoma and Other Cancers.

    Directory of Open Access Journals (Sweden)

    Simon Thomas

    Neuroblastoma is the commonest extracranial solid cancer of childhood. Despite escalation of treatment regimens, a significant minority of patients die of their disease. Disialoganglioside (GD2) is consistently expressed at high levels in neuroblastoma tumors, which have been targeted with some success using therapeutic monoclonal antibodies. GD2 is also expressed in a range of other cancers but, with the exception of some peripheral nerves, is largely absent from non-transformed tissues. Chimeric Antigen Receptors (CARs) are artificial type I proteins which graft the specificity of a monoclonal antibody onto a T-cell. Clinical data with early CAR designs directed against GD2 have shown some promise in neuroblastoma. Here, we describe a GD2-targeting CAR retroviral cassette which has been optimized for CAR T-cell persistence, efficacy and safety.

  4. Optimal control of stretching process of flexible solar arrays on spacecraft based on a hybrid optimization strategy

    Directory of Open Access Journals (Sweden)

    Qijia Yao

    2017-07-01

    The optimal control of multibody spacecraft during the stretching process of solar arrays is investigated, and a hybrid optimization strategy based on the Gauss pseudospectral method (GPM) and the direct shooting method (DSM) is presented. First, the elastic deformation of the flexible solar arrays was described approximately by the assumed mode method, and a dynamic model was established from the second Lagrangian equation. Then, the nonholonomic motion planning problem was transformed into a nonlinear programming problem using GPM. With a small number of LG points, initial values of the state variables and control variables were obtained. A serial optimization framework was adopted to obtain an approximate optimal solution from a feasible solution. Finally, the control variables were discretized at the LG points, and precise optimal control inputs were obtained by DSM. The optimal trajectory of the system can then be obtained through numerical integration. Numerical simulation shows that the stretching process of the solar arrays is stable, with no detours, and the control inputs satisfy the various constraints of actual conditions. The results indicate that the method is effective and robust. Keywords: Motion planning, Multibody spacecraft, Optimal control, Gauss pseudospectral method, Direct shooting method
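
    The direct-shooting idea — discretize the control, integrate the dynamics forward, and hand the resulting cost to a generic optimizer — can be shown on a toy double-integrator "deployment" problem. The dynamics, cost weights and horizon below are illustrative stand-ins, not the spacecraft model of the paper:

```python
import numpy as np
from scipy.optimize import minimize

DT, N = 0.1, 40                       # time step and number of control intervals

def simulate(u):
    """Integrate a double integrator x'' = u from rest and return the trajectory."""
    x, v, traj = 0.0, 0.0, []
    for uk in u:
        v += uk * DT
        x += v * DT
        traj.append((x, v))
    return np.array(traj)

def cost(u):
    """Penalize missing the target state (x=1, v=0) plus control effort."""
    x_end, v_end = simulate(u)[-1]
    return 100 * (x_end - 1.0) ** 2 + 100 * v_end ** 2 + 1e-2 * np.sum(np.square(u)) * DT

if __name__ == "__main__":
    u0 = np.zeros(N)                               # initial guess: no actuation
    res = minimize(cost, u0, method="L-BFGS-B",
                   bounds=[(-1.0, 1.0)] * N)       # bounded control inputs
    final = simulate(res.x)[-1]
    print("final position %.3f, final velocity %.3f" % tuple(final))
```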

  5. Cellular scanning strategy for selective laser melting: Evolution of optimal grid-based scanning path & parametric approach to thermal homogeneity

    DEFF Research Database (Denmark)

    Mohanty, Sankhya; Tutum, Cem Celal; Hattel, Jesper Henri

    2013-01-01

    Selective laser melting, as a rapid manufacturing technology, is uniquely poised to enforce a paradigm shift in the manufacturing industry by eliminating the gap between job- and batch-production techniques. Products from this process, however, tend to show an increased amount of defects such as ...... strategy has been developed for processing the standard sample, one unit cell at a time, using genetic algorithms, with an objective of reducing thermal asymmetries. © 2013 SPIE....

  6. Processing and characterization of multi-cellular monolithic bioceramics for bone regenerative scaffolds

    Science.gov (United States)

    Ari-Wahjoedi, Bambang; Ginta, Turnad Lenggo; Parman, Setyamartana; Abustaman, Mohd Zikri Ahmad

    2014-10-01

    A multicellular monolithic ceramic body is a ceramic material which has many gas or liquid passages partitioned by thin walls throughout the bulk material. There are many currently known advanced industrial applications of multicellular ceramic structures, i.e. as supports for various catalysts, electrode support structures for solid oxide fuel cells, refractories, electric/electronic materials, aerospace vehicle re-entry heat shields and biomaterials for dental as well as orthopaedic implants, to name only a few. Multicellular ceramic bodies are usually made of ceramic phases such as mullite, cordierite and aluminum titanate, or of pure oxides such as silica, zirconia and alumina. What makes alumina ceramics excellent for the above functions are their intrinsic properties: they are hard, wear resistant and biocompatible, have excellent dielectric properties, resist strong acid and alkali attack at elevated temperatures, and offer good thermal conductivity, high strength and stiffness. In this work the processing technology leading to truly multicellular monolithic alumina ceramic bodies and their characterization are reported. A ceramic slip with 66 wt.% solid loading was found to be optimal as the impregnant for the polyurethane foam template. A mullitic ceramic composite of alumina-sodium alumino disilicate-leucite-like phases was obtained, with bulk and true densities of 0.852 and 1.241 g cm-3, respectively, a pore linear density of ±35 cm-1, linear shrinkage of 7-16% and bulk volume shrinkage of 32 vol.%. The compressive strength and elastic modulus of the bioceramics are ≈0.5-1.0 and ≈20 MPa, respectively.

  7. Photostable bipolar fluorescent probe for video tracking plasma membranes related cellular processes.

    Science.gov (United States)

    Zhang, Xinfu; Wang, Chao; Jin, Liji; Han, Zhuo; Xiao, Yi

    2014-08-13

    Plasma membranes can sense the stimulations and transmit the signals from extracellular environment and then make further responses through changes in locations, shapes or morphologies. Common fluorescent membrane markers are not well suited for long time tracking due to their shorter retention time inside plasma membranes and/or their lower photostability. To this end, we develop a new bipolar marker, Mem-SQAC, which can stably insert into plasma membranes of different cells and exhibits a long retention time over 30 min. Mem-SQAC also inherits excellent photostability from the BODIPY dye family. Large two-photon absorption cross sections and long wavelength fluorescence emissions further enhance the competitiveness of Mem-SQAC as a membrane marker. By using Mem-SQAC, significant morphological changes of plasma membranes have been monitored during heavy metal poisoning and drug induced apoptosis of MCF-7 cells; the change tendencies are so distinctly different from each other that they can be used as indicators to distinguish different cell injuries. Further on, the complete processes of endocytosis toward Staphylococcus aureus and Escherichia coli by RAW 264.7 cells have been dynamically tracked. It is discovered that plasma membranes take quite different actions in response to the two bacteria, information unavailable in previous research reports.

  8. Coupled THM processes in EDZ of crystalline rocks using an elasto-plastic cellular automaton

    Science.gov (United States)

    Pan, Peng-Zhi; Feng, Xia-Ting; Huang, Xiao-Hua; Cui, Qiang; Zhou, Hui

    2009-05-01

    This paper presents a numerical study of coupled thermal, hydrological and mechanical processes in the excavation disturbed zone (EDZ) around nuclear waste emplacement drifts in fractured crystalline rocks. The study was conducted for two model domains close to an emplacement tunnel: (1) a near-field domain and (2) a smaller wall-block domain. Goodman elements and weak elements were used to represent the fractures in the rock mass, and the rock matrix was represented as an elasto-visco-plastic material. The Mohr-Coulomb criterion and a non-associated plastic flow rule were adopted to describe the viscoplastic deformation in the EDZ. A relation between volumetric strain and permeability was established. Using the self-developed EPCA2D code, elastic, elasto-plastic and creep analyses were conducted to study the evolution of stress and deformation, as well as failure and permeability evolution, in the EDZ. The results indicate a strong impact of fractures, plastic deformation and time effects on the behavior of the EDZ, especially the evolution of permeability around the drift.

  9. Signal processing for solar array monitoring, fault detection, and optimization

    CERN Document Server

    Braun, Henry; Spanias, Andreas

    2012-01-01

    Although the solar energy industry has experienced rapid growth recently, high-level management of photovoltaic (PV) arrays has remained an open problem. As sensing and monitoring technology continues to improve, there is an opportunity to deploy sensors in PV arrays in order to improve their management. In this book, we examine the potential role of sensing and monitoring technology in a PV context, focusing on the areas of fault detection, topology optimization, and performance evaluation/data visualization. First, several types of commonly occurring PV array faults are considered and detection algorithms are described. Next, the potential for dynamic optimization of an array's topology is discussed, with a focus on mitigation of fault conditions and optimization of power output under non-fault conditions. Finally, monitoring system design considerations such as type and accuracy of measurements, sampling rate, and communication protocols are considered. It is our hope that the benefits of monitoring presen...

  10. Processing and characterization of multi-cellular monolithic bioceramics for bone regenerative scaffolds

    International Nuclear Information System (INIS)

    Ari-Wahjoedi, Bambang; Ginta, Turnad Lenggo; Parman, Setyamartana; Abustaman, Mohd Zikri Ahmad

    2014-01-01

    A multicellular monolithic ceramic body is a ceramic material which has many gas or liquid passages partitioned by thin walls throughout the bulk material. There are many currently known advanced industrial applications of multicellular ceramic structures, e.g. as supports for various catalysts, electrode support structures for solid oxide fuel cells, refractories, electric/electronic materials, aerospace vehicle re-entry heat shields and biomaterials for dental as well as orthopaedic implants, to name only a few. Multicellular ceramic bodies are usually made of ceramic phases such as mullite, cordierite, aluminum titanate or pure oxides such as silica, zirconia and alumina. What makes alumina ceramics excellent for the above functions are the intrinsic properties of alumina: hardness, wear resistance, excellent dielectric properties, resistance to strong acid and alkali attack at elevated temperatures, good thermal conductivity, high strength and stiffness, as well as biocompatibility. In this work the processing technology leading to truly multicellular monolithic alumina ceramic bodies and their characterization are reported. A ceramic slip with 66 wt.% solid loading was found to be optimum as impregnant for the polyurethane foam template. A mullitic ceramic composite of alumina-sodium alumino disilicate-leucite-like phases with bulk and true densities of 0.852 and 1.241 g cm-3 respectively, a pore linear density of ±35 cm-1, and linear and bulk volume shrinkages of 7-16% and 32 vol.% was obtained. The compressive strength and elastic modulus of the bioceramics are ≈0.5-1.0 and ≈20 MPa respectively.

  11. Processing and characterization of multi-cellular monolithic bioceramics for bone regenerative scaffolds

    Energy Technology Data Exchange (ETDEWEB)

    Ari-Wahjoedi, Bambang, E-mail: bambang-ariwahjoedi@petronas.com.my [Department of Fundamental and Applied Sciences, Universiti Teknologi PETRONAS, Bandar Seri Iskandar, 31750 Tronoh, Perak Darul Ridzuan (Malaysia); Centre for Intelligent Signal and Imaging Research, Universiti Teknologi PETRONAS, Bandar Seri Iskandar (Malaysia); Ginta, Turnad Lenggo [Department of Mechanical Engineering, Universiti Teknologi PETRONAS, Bandar Seri Iskandar, 31750 Tronoh, Perak Darul Ridzuan (Malaysia); Centre for Intelligent Signal and Imaging Research, Universiti Teknologi PETRONAS, Bandar Seri Iskandar, 31750 Tro (Malaysia); Parman, Setyamartana [Department of Mechanical Engineering, Universiti Teknologi PETRONAS, Bandar Seri Iskandar, 31750 Tronoh, Perak Darul Ridzuan (Malaysia); Abustaman, Mohd Zikri Ahmad [Kebabangan Petroleum Operating Company Sdn Bhd, Lvl. 52, Tower 2, PETRONAS Twin Towers, KLCC, 50088 Kuala Lumpur (Malaysia)

    2014-10-24

    A multicellular monolithic ceramic body is a ceramic material which has many gas or liquid passages partitioned by thin walls throughout the bulk material. There are many currently known advanced industrial applications of multicellular ceramic structures, e.g. as supports for various catalysts, electrode support structures for solid oxide fuel cells, refractories, electric/electronic materials, aerospace vehicle re-entry heat shields and biomaterials for dental as well as orthopaedic implants, to name only a few. Multicellular ceramic bodies are usually made of ceramic phases such as mullite, cordierite, aluminum titanate or pure oxides such as silica, zirconia and alumina. What makes alumina ceramics excellent for the above functions are the intrinsic properties of alumina: hardness, wear resistance, excellent dielectric properties, resistance to strong acid and alkali attack at elevated temperatures, good thermal conductivity, high strength and stiffness, as well as biocompatibility. In this work the processing technology leading to truly multicellular monolithic alumina ceramic bodies and their characterization are reported. A ceramic slip with 66 wt.% solid loading was found to be optimum as impregnant for the polyurethane foam template. A mullitic ceramic composite of alumina-sodium alumino disilicate-leucite-like phases with bulk and true densities of 0.852 and 1.241 g cm-3 respectively, a pore linear density of ±35 cm-1, and linear and bulk volume shrinkages of 7-16% and 32 vol.% was obtained. The compressive strength and elastic modulus of the bioceramics are ≈0.5-1.0 and ≈20 MPa respectively.

  12. Cellular distribution and function of ion channels involved in transport processes in rat tracheal epithelium.

    Science.gov (United States)

    Hahn, Anne; Faulhaber, Johannes; Srisawang, Lalita; Stortz, Andreas; Salomon, Johanna J; Mall, Marcus A; Frings, Stephan; Möhrlen, Frank

    2017-06-01

    Transport of water and electrolytes in airway epithelia involves chloride-selective ion channels, which are controlled either by cytosolic Ca2+ or by cAMP. The contributions of the two pathways to chloride transport differ among vertebrate species. Because rats are becoming more important as an animal model for cystic fibrosis, we have examined how Ca2+-dependent and cAMP-dependent Cl- secretion is organized in the rat tracheal epithelium. We examined the expression of the Ca2+-gated Cl- channel anoctamin 1 (ANO1), the cystic fibrosis transmembrane conductance regulator (CFTR) Cl- channel, the epithelial Na+ channel ENaC, and the water channel aquaporin 5 (AQP5) in rat tracheal epithelium. The contribution of ANO1 channels to nucleotide-stimulated Cl- secretion was determined using the channel blocker Ani9 in short-circuit current recordings obtained from primary cultures of rat tracheal epithelial cells in Ussing chambers. We found that ANO1, CFTR and AQP5 proteins were expressed in nonciliated cells of the tracheal epithelium, whereas ENaC was expressed in ciliated cells. Among nonciliated cells, ANO1 occurred together with CFTR and Muc5b and, in addition, in a different cell type without CFTR and Muc5b. Bioelectrical studies with the ANO1 blocker Ani9 indicated that ANO1 mediated the secretory response to the nucleotide uridine-5'-triphosphate. Our data demonstrate that, in rat tracheal epithelium, Cl- secretion and Na+ absorption are routed through different cell types, and that ANO1 channels form the molecular basis of Ca2+-dependent Cl- secretion in this tissue. These characteristic features of Cl--dependent secretion reveal similarities and distinct differences to secretory processes in human airways. © 2017 The Authors. Physiological Reports published by Wiley Periodicals, Inc. on behalf of The Physiological Society and the American Physiological Society.

  13. Human myosin VIIa is a very slow processive motor protein on various cellular actin structures.

    Science.gov (United States)

    Sato, Osamu; Komatsu, Satoshi; Sakai, Tsuyoshi; Tsukasaki, Yoshikazu; Tanaka, Ryosuke; Mizutani, Takeomi; Watanabe, Tomonobu M; Ikebe, Reiko; Ikebe, Mitsuo

    2017-06-30

    Human myosin VIIa (MYO7A) is an actin-linked motor protein associated with human Usher syndrome (USH) type 1B, which causes human congenital hearing and visual loss. Although it has been thought that the role of human myosin VIIa is critical for USH1 protein tethering with actin and transportation along actin bundles in inner-ear hair cells, myosin VIIa's motor function remains unclear. Here, we studied the motor function of the tail-truncated human myosin VIIa dimer (HM7AΔTail/LZ) at the single-molecule level. We found that the HM7AΔTail/LZ moves processively on single actin filaments with a step size of 35 nm. Dwell-time distribution analysis indicated an average waiting time of 3.4 s, yielding ∼0.3 s-1 for the mechanical turnover rate; hence, the velocity of HM7AΔTail/LZ was extremely slow, at 11 nm·s-1. We also examined HM7AΔTail/LZ movement on various actin structures in demembranated cells. HM7AΔTail/LZ showed unidirectional movement on actin structures at cell edges, such as lamellipodia and filopodia. However, HM7AΔTail/LZ frequently missed steps on actin tracks and exhibited bidirectional movement at stress fibers, which was not observed with tail-truncated myosin Va. These results suggest that the movement of the human myosin VIIa motor protein is more efficient on lamellipodial and filopodial actin tracks than on stress fibers, which are composed of actin filaments with different polarity, and that the actin structures influence the characteristics of cargo transportation by human myosin VIIa. In conclusion, myosin VIIa movement appears to be suitable for translocating USH1 proteins on stereocilia actin bundles in inner-ear hair cells. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.
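
    As a quick consistency check of the numbers quoted above, the velocity implied by the step size and the mechanical turnover rate is

        v \approx \frac{d}{\langle \tau \rangle} = \frac{35\ \mathrm{nm}}{3.4\ \mathrm{s}} \approx 10\ \mathrm{nm\,s^{-1}},

    which agrees with the directly measured value of about 11 nm·s-1.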

  14. Multi-objective optimization model of CNC machining to minimize processing time and environmental impact

    Science.gov (United States)

    Hamada, Aulia; Rosyidi, Cucuk Nur; Jauhari, Wakhid Ahmad

    2017-11-01

    Minimizing processing time in a production system can increase the efficiency of a manufacturing company. Processing time is influenced by the application of modern technology and by the machining parameters. Modern technology can be applied through CNC machining; turning is one of the machining processes that can be performed on a CNC machine. However, the machining parameters affect not only the processing time but also the environmental impact. Hence, an optimization model is needed to select machining parameters that minimize both processing time and environmental impact. This research developed a multi-objective optimization model to minimize processing time and environmental impact in the CNC turning process, yielding optimal decision variables of cutting speed and feed rate. Environmental impact is converted from environmental burden through the use of Eco-indicator 99. The model was solved using the OptQuest optimization software from Oracle Crystal Ball.
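
    As an illustration of how such a bi-objective turning model can be handled numerically, the sketch below minimizes a weighted sum of two placeholder objectives (machining time and an eco-indicator-style environmental score) over cutting speed and feed rate. The cost functions, bounds and weights are assumptions, not the paper's model, and the commercial OptQuest solver is replaced here by a generic SciPy routine.

        import numpy as np
        from scipy.optimize import minimize

        def processing_time(v, f, length=200.0, diameter=40.0):
            # classic turning time estimate: t = pi * D * L / (1000 * v * f)
            # v in m/min, f in mm/rev, D and L in mm -> t in minutes
            return np.pi * diameter * length / (1000.0 * v * f)

        def environmental_impact(v, wear_coeff=0.01):
            # assumed to grow with cutting speed (tool wear / energy per part), toy form
            return wear_coeff * v ** 1.5

        def weighted_objective(x, w_time=0.6, w_env=0.4):
            v, f = x
            return w_time * processing_time(v, f) + w_env * environmental_impact(v)

        # assumed bounds: cutting speed 100-300 m/min, feed rate 0.1-0.4 mm/rev
        res = minimize(weighted_objective, x0=[150.0, 0.2],
                       bounds=[(100.0, 300.0), (0.1, 0.4)], method="L-BFGS-B")
        print("optimal cutting speed and feed rate:", res.x)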

  15. Analyses of Methods and Algorithms for Modelling and Optimization of Biotechnological Processes

    Directory of Open Access Journals (Sweden)

    Stoyan Stoyanov

    2009-08-01

    Full Text Available A review of the problems in modeling, optimization and control of biotechnological processes and systems is given in this paper. An analysis of existing and some new practical optimization methods for searching for a global optimum based on various advanced strategies - heuristic, stochastic, genetic and combined - is presented in the paper. Methods based on sensitivity theory, stochastic and mixed strategies for optimization with partial knowledge about kinetic, technical and economic parameters in optimization problems are discussed. Several approaches for multi-criteria optimization tasks are analyzed. The problems concerning optimal control of biotechnological systems are also discussed.

  16. Theoretical aspects and modelling of cellular decision making, cell killing and information-processing in photodynamic therapy of cancer.

    Science.gov (United States)

    Gkigkitzis, Ioannis

    2013-01-01

    The aim of this report is to provide a mathematical model of the mechanism for making binary fate decisions about cell death or survival, during and after Photodynamic Therapy (PDT) treatment, and to supply the logical design for this decision mechanism as an application of rate distortion theory to the biochemical processing of information by the physical system of a cell. Based on system biology models of the molecular interactions involved in the PDT processes previously established, and regarding a cellular decision-making system as a noisy communication channel, we use rate distortion theory to design a time dependent Blahut-Arimoto algorithm where the input is a stimulus vector composed of the time dependent concentrations of three PDT related cell death signaling molecules and the output is a cell fate decision. The molecular concentrations are determined by a group of rate equations. The basic steps are: initialize the probability of the cell fate decision, compute the conditional probability distribution that minimizes the mutual information between input and output, compute the cell probability of cell fate decision that minimizes the mutual information and repeat the last two steps until the probabilities converge. Advance to the next discrete time point and repeat the process. Based on the model from communication theory described in this work, and assuming that the activation of the death signal processing occurs when any of the molecular stimulants increases higher than a predefined threshold (50% of the maximum concentrations), for 1800s of treatment, the cell undergoes necrosis within the first 30 minutes with probability range 90.0%-99.99% and in the case of repair/survival, it goes through apoptosis within 3-4 hours with probability range 90.00%-99.00%. Although, there is no experimental validation of the model at this moment, it reproduces some patterns of survival ratios of predicted experimental data. Analytical modeling based on cell death
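
    The iterative procedure described above follows the standard Blahut-Arimoto scheme for rate-distortion problems. A minimal sketch of that generic scheme is given below, assuming a small hypothetical distortion matrix between stimulus states and the binary fate decision (survival/death); the actual stimulus vectors and rate equations of the paper are not reproduced here.

        import numpy as np

        def blahut_arimoto(p_x, d, beta, tol=1e-9, max_iter=1000):
            """Generic Blahut-Arimoto iteration for a rate-distortion problem.

            p_x  : (n,) source distribution over stimulus states
            d    : (n, m) distortion between stimulus state x and decision y
            beta : trade-off parameter (larger beta penalizes distortion more)
            """
            n, m = d.shape
            q_y = np.full(m, 1.0 / m)           # initial output (decision) distribution
            for _ in range(max_iter):
                # conditional p(y|x) minimizing mutual information for fixed q(y)
                w = q_y * np.exp(-beta * d)
                p_y_given_x = w / w.sum(axis=1, keepdims=True)
                # updated output distribution
                q_new = p_x @ p_y_given_x
                if np.max(np.abs(q_new - q_y)) < tol:
                    q_y = q_new
                    break
                q_y = q_new
            return p_y_given_x, q_y

        # hypothetical example: three stimulus levels, two fates (0 = survival, 1 = death)
        p_x = np.array([0.5, 0.3, 0.2])
        d = np.array([[0.0, 1.0], [0.5, 0.5], [1.0, 0.0]])   # assumed distortion values
        cond, marginal = blahut_arimoto(p_x, d, beta=5.0)
        print(marginal)     # probability of each cell-fate decision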

  17. Design of experiment approach for the process optimization of ...

    African Journals Online (AJOL)

    Mulberry is considered as food-medicine herb, with specific nutritional and medicinal values. In this study, response surface methodology (RSM) was employed to optimize the ultrasonic-assisted extraction of total polysaccharide from mulberry using Box-Behnken design (BBD). Based on single factor experiments, a three ...

  18. Optimization of injection moulding process parameters in the ...

    African Journals Online (AJOL)

    In this study, optimal injection moulding conditions for minimum shrinkage during moulding of High Density Polyethylene (HDPE) were obtained by Taguchi method. The result showed that melting temperature of 190 °C, injection pressure of 55 MPa, refilling pressure of 85 MPa and cooling time of 11 seconds gave ...

  19. Optimization of coagulation-flocculation process for colour removal ...

    African Journals Online (AJOL)

    Response surface methodology (RSM) using face-centered central composite design (FCCD) was used to optimize the four variables. Increase in the colour removal efficiency was higher in acidic solution pH. Accurate control of coagulant dosages gave optimum destabilization of charged particles and re-stabilization ...

  20. Response surface optimization of the process conditions for anti ...

    African Journals Online (AJOL)

    GREGORY

    2011-12-16

    Dec 16, 2011 ... These two phenolic acids are already known to have anti-diabetic properties from previous study. ... Key words: Anti-diabetic, Cucumis sativus, β-glucosidase inhibitor, optimization, phenolic acids. .... concentration of 0.022 unit/ml solution. ..... antioxidant properties (Srivastava et al., 2009). ... acids in beer.

  1. optimization of coagulation-flocculation process for colour removal

    African Journals Online (AJOL)

    user

    2DEPARTMENT OF CHEMICAL ENGINEERING, NNAMDI AZIKIWE UNIVERSITY, AWKA, ANAMBRA STATE. ... The ability of organic polymer rich coagulants for colour removal from acid dye was studied. ... Response surface methodology (RSM) using face-centered ...... successfully applied for modeling and optimizing the.

  2. Automatic generation of optimal business processes from business rules

    NARCIS (Netherlands)

    Steen, B.; Ferreira Pires, Luis; Iacob, Maria Eugenia

    2010-01-01

    In recent years, business process models are increasingly being used as a means for business process improvement. Business rules can be seen as requirements for business processes, in that they describe the constraints that must hold for business processes that implement these business rules.

  3. Parametric optimization of ultrasonic machining process using gravitational search and fireworks algorithms

    Directory of Open Access Journals (Sweden)

    Debkalpa Goswami

    2015-03-01

    Full Text Available Ultrasonic machining (USM) is a mechanical material removal process used to erode holes and cavities in hard or brittle workpieces by using shaped tools, high-frequency mechanical motion and an abrasive slurry. Unlike other non-traditional machining processes, such as laser beam and electrical discharge machining, the USM process does not thermally damage the workpiece or introduce significant levels of residual stress, which is important for survival of materials in service. To achieve enhanced machining performance and better machined job characteristics, it is often required to determine the optimal control parameter settings of a USM process. The earlier mathematical approaches for parametric optimization of USM processes have mostly yielded near-optimal or sub-optimal solutions. In this paper, two almost unexplored non-conventional optimization techniques, i.e. the gravitational search algorithm (GSA) and the fireworks algorithm (FWA), are applied for parametric optimization of USM processes. The optimization performance of these two algorithms is compared with that of other popular population-based algorithms, and the effects of their algorithm parameters on the derived optimal solutions and computational speed are also investigated. It is observed that FWA provides the best optimal results for the considered USM processes.

  4. Optimization Of PVDF-TrFE Processing Conditions For The Fabrication Of Organic MEMS Resonators.

    Science.gov (United States)

    Ducrot, Pierre-Henri; Dufour, Isabelle; Ayela, Cédric

    2016-01-21

    This paper reports a systematic optimization of processing conditions of PVDF-TrFE piezoelectric thin films, used as integrated transducers in organic MEMS resonators. Indeed, despite data on electromechanical properties of PVDF found in the literature, optimized processing conditions that lead to these properties remain only partially described. In this work, a rigorous optimization of parameters enabling state-of-the-art piezoelectric properties of PVDF-TrFE thin films has been performed via the evaluation of the actuation performance of MEMS resonators. Conditions such as annealing duration, poling field and poling duration have been optimized and repeatability of the process has been demonstrated.

  5. Optimal fabrication processes for unidirectional metal-matrix composites: A computational simulation

    Science.gov (United States)

    Saravanos, D. A.; Murthy, P. L. N.; Morel, M.

    1990-01-01

    A method is proposed for optimizing the fabrication process of unidirectional metal matrix composites. The temperature and pressure histories are optimized such that the residual microstresses of the composite at the end of the fabrication process are minimized and the material integrity throughout the process is ensured. The response of the composite during the fabrication is simulated based on a nonlinear micromechanics theory. The optimal fabrication problem is formulated and solved with non-linear programming. Application cases regarding the optimization of the fabrication cool-down phases of unidirectional ultra-high modulus graphite/copper and silicon carbide/titanium composites are presented.

  6. Optimal fabrication processes for unidirectional metal-matrix composites - A computational simulation

    Science.gov (United States)

    Saravanos, D. A.; Murthy, P. L. N.; Morel, M.

    1990-01-01

    A method is proposed for optimizing the fabrication process of unidirectional metal matrix composites. The temperature and pressure histories are optimized such that the residual microstresses of the composite at the end of the fabrication process are minimized and the material integrity throughout the process is ensured. The response of the composite during the fabrication is simulated based on a nonlinear micromechanics theory. The optimal fabrication problem is formulated and solved with nonlinear programming. Application cases regarding the optimization of the fabrication cool-down phases of unidirectional ultra-high modulus graphite/copper and silicon carbide/titanium composites are presented.

  7. Biodiesel production from microalgae Spirulina maxima by two step process: Optimization of process variable

    Directory of Open Access Journals (Sweden)

    M.A. Rahman

    2017-04-01

    Full Text Available Biodiesel from green energy sources is gaining tremendous attention for its eco-friendly and economic aspects. In this investigation, a two-step process was developed for the production of biodiesel from the microalga Spirulina maxima, and the best operating conditions for the steps were determined. In the first stage, acid esterification was conducted to lessen the acid value (AV) from 10.66 to 0.51 mgKOH/g of the feedstock, and optimal conditions for maximum esterified oil yield were found at a molar ratio of 12:1, temperature 60°C, 1 wt.% H2SO4, and mixing intensity 400 rpm for a reaction time of 90 min. The second stage, alkali transesterification, was carried out for maximum biodiesel yield (86.1%), and optimal conditions were found at a molar ratio of 9:1, temperature 65°C, mixing intensity 600 rpm, and catalyst concentration 0.75 wt.% KOH for a reaction time of 20 min. The biodiesel was analyzed according to ASTM standards and the results were within standard limits. The results will be helpful for producing third-generation algal biodiesel from the microalga Spirulina maxima in an efficient manner.

  8. Optimal scanning and image processing with the STEM

    International Nuclear Information System (INIS)

    Crewe, A.V.; Ohtsuki, M.

    1981-01-01

    We have recently published a theory of an optimal scanning system which is particularly suited for the STEM. One concludes from the theory that the diffraction limit of the electron probe should be a fixed fraction of the full-scale deflection in order to avoid scanning artifacts. More recently, we have confirmed the value of this technique by direct experiments. Our program now is to combine the use of optimal scanning with the use of a programmable digital refresh memory for image analysis. Limited experience to date indicates that false color conversion is probably more useful than histogram equalization in black and white and that this system is particularly valuable for rotational averaging and selected area Fourier transforms. (orig.)

  9. Optimal Input Strategy for Plug and Play Process Control Systems

    DEFF Research Database (Denmark)

    Kragelund, Martin Nygaard; Leth, John-Josef; Wisniewski, Rafal

    2010-01-01

    This paper considers the problem of optimal operation of a plant, whose goal is to maintain production at minimum cost. The system considered in this work consists of a joined plant and redundant input systems. It is assumed that each input system contributes to a flow of goods into the joined pa...... the performance of the plant. The results are applied to a coal fired power plant where an additional new fuel system, gas, becomes available.

  10. OPTIMIZATION OF THE PROCESS OF DRYING THE FILTRATE DISTILLERY DREGS

    Directory of Open Access Journals (Sweden)

    A. A. Shevtsov

    2013-01-01

    Full Text Available The interactions of various factors affecting the process of drying the filtrate distillery dregs are investigated. Rational conditions for the process of drying the filtrate distillery dregs in a spray dryer are obtained.

  11. Manufacturing and process optimization of porous rice straw board

    Science.gov (United States)

    Liu, Dejun; Dong, Bing; Bai, Xuewei; Gao, Wei; Gong, Yuanjuan

    2018-03-01

    Development and utilization of straw resources and the production of straw board can dramatically reduce straw waste and the environmental pollution associated with straw burning in China. However, straw board production faces several challenges, such as improving the physical and mechanical properties as well as eliminating its formaldehyde content. The present research was to develop a new straw board compound adhesive containing both inorganic (MgSO4, MgCO3, active silicon and ALSiO4) and organic (bean gum and modified Methyl Diphenyl Diisocyanate, MDI) gelling materials, to devise a new high-frequency straw board hot pressing technique and to optimize the straw board production parameters. The results identified the key hot pressing parameters leading to porous straw board with optimal physical and mechanical properties. These parameters are as follows: an adhesive containing a 4:1 ratio of inorganic-to-organic gelled material, a percentage of adhesive in the total mass of preloaded straw materials of 40%, a hot-pressing temperature in the range of 120 °C to 140 °C, and high-frequency hot pressing for 10 times at a pressure of 30 MPa. Finally, the present work demonstrated that porous straw board fabricated under the optimal manufacturing conditions is an environmentally friendly and renewable material, thereby meeting the national standard of medium density fiberboard (MDF), with potential applications in the building industry.

  12. Multi-stage and multi-response process optimization in Taguchi ...

    African Journals Online (AJOL)

    Product quality is all about reducing variations of key performance indicators. However, product manufacturing often requires multiple processes with multiple indicators, which makes reducing variation a complex task. There are tools used to optimize a single stage process independently which ensure local optimization ...

  13. Modelling on optimal portfolio with exchange rate based on discontinuous stochastic process

    Science.gov (United States)

    Yan, Wei; Chang, Yuwen

    2016-12-01

    Considering the stochastic exchange rate, this paper is concerned with dynamic portfolio selection in a financial market. The optimal investment problem is formulated as a continuous-time mathematical model under the mean-variance criterion. These processes follow jump-diffusion processes (Wiener process and Poisson process). Then the corresponding Hamilton-Jacobi-Bellman (HJB) equation of the problem is presented and its efficient frontier is obtained. Moreover, the optimal strategy is also derived under the safety-first criterion.
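
    For orientation, a generic jump-diffusion asset model with a mean-variance objective of the kind referred to above can be written as follows; the notation is illustrative and not taken from the paper itself:

        dS_t = S_{t^-}\left(\mu\,dt + \sigma\,dW_t + \gamma\,dN_t\right), \qquad
        \min_{\pi}\ \operatorname{Var}\!\left[X_T^{\pi}\right] \quad \text{s.t.} \quad \mathbb{E}\!\left[X_T^{\pi}\right] = z,

    where W_t is a Wiener process, N_t a Poisson process, \pi the portfolio strategy, X_T^{\pi} the terminal wealth converted at the stochastic exchange rate, and z a prescribed expected wealth target.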

  14. Optimal operation of integrated processes. Studies on heat recovery systems

    Energy Technology Data Exchange (ETDEWEB)

    Glemmestad, Bjoern

    1997-12-31

    Separators, reactors and a heat exchanger network (HEN) for heat recovery are important parts of an integrated plant. This thesis deals with the operation of HENs, in particular, optimal operation. The purpose of heat integration is to save energy, but the HEN also introduces new interactions and feedback into the overall plant. A prerequisite for optimisation is that there are extra degrees of freedom left after regulatory control is implemented. It is shown that extra degrees of freedom may not always be utilized for energy optimisation, and a quantitative expression for the degrees of freedom that can be so utilized are presented. A simplified expression that is often valid is also deduced. The thesis presents some improvements and generalisations of a structure based method that has been proposed earlier. Structural information is used to divide possible manipulations into three categories depending on how each manipulation affects the utility consumption. By means of these categories and two heuristic rules for operability, the possible manipulations are ordered in a priority table. This table is used to determine which manipulation should be preferred and which manipulation should be selected if an active manipulation is saturated. It is shown that the method may correspond to split-range control. A method that uses parametric information in addition to structural information is proposed. In this method, the optimal control structure is found through solving an integer programming problem. The thesis also proposes a method that combines the use of steady state optimisation and optimal selection of measurements. 86 refs., 46 figs., 8 tabs.

  15. Biologic phosphorus elimination - influencing parameters, boundary conditions, process optimisation

    International Nuclear Information System (INIS)

    Dai Xiaohu.

    1992-01-01

    This paper first presents a systematic study of the basic process of biologic phosphorus elimination as employed by the original 'Phoredox (Main Stream) Process'. The conditions governing the process and the factors influencing its performance were determined by trial operation. A stationary model was developed for the purpose of modelling biologic phosphorus elimination in such a main stream process and optimising the dimensioning. The validity of the model was confirmed by operational data given in the literature and by operational data from the authors' own semitechnical-scale experimental plant. The model permits simulation of the values to be expected for effluent phosphorus and phosphate concentrations for given influent data and boundary conditions. It is thus possible to dimension a plant for accommodation of the original Phoredox (Main Stream) Process or any similar phosphorus-eliminating plant that is to work according to the principle of the main stream process. (orig./EF) [de

  16. Integrated and Modular Design of an Optimized Process Architecture

    Directory of Open Access Journals (Sweden)

    Colin Raßfeld

    2013-07-01

    Full Text Available Global economic integration has increased the complexity of business activities, so organizations are forced to become more efficient each day. Process organization is a very useful way of aligning organizational systems towards business processes. However, an organization must do more than just focus its attention and efforts on processes. The layout design also has a significant impact on system performance. We contribute to this field by developing a tailored process-oriented organizational structure and a new layout design for the quality assurance of a leading German automotive manufacturer. The target concept we developed was evaluated by process owners and an IT-based process simulation. Our results provide solid empirical back-up in which the performance and effects are assessed from a qualitative and quantitative perspective.

  17. SIMULATION AS A TOOL FOR PROCESS OPTIMIZATION OF LOGISTIC SYSTEMS

    Directory of Open Access Journals (Sweden)

    Radko Popovič

    2015-09-01

    Full Text Available The paper deals with the simulation of production processes, especially with the Tecnomatix software module from Siemens. Tecnomatix Process Simulate is designed for building new or modifying existing production processes. A simulation created in this software allows fast testing of planned changes or improvements of the production processes. On the basis of the simulation you can imagine the future picture of the real production system. A 3D simulation can reflect the actual status and conditions of the running system and, of course, after some improvements it can show the possible shape of the production system.

  18. Optimization of frozen wild blueberry vacuum drying process

    Directory of Open Access Journals (Sweden)

    Šumić Zdravko M.

    2015-01-01

    Full Text Available The objective of this research was to optimize the vacuum drying of frozen blueberries in order to preserve health-beneficial phytochemicals, using response surface methodology. The drying was performed in a new design of vacuum dryer equipment. The investigated range of temperature was 46-74°C and of pressure 38-464 mbar. Total solids, total phenolics, vitamin C, anthocyanin content and total color change were used as quality indicators of dried blueberries. Within the experimental range of the studied variables, the optimum conditions of 60 °C and 100 mbar were established for vacuum drying of blueberries. Separate validation experiments were conducted at optimum conditions to verify predictions and adequacy of the second-order polynomial models. Under these optimal conditions, the predicted amount of total phenolics was 3.70 mgCAE/100gdw, vitamin C 59.79 mg/100gdw, anthocyanin content 2746.33 mg/100gdw, total solids 89.50% and total color change 88.83. [Projekat Ministarstva nauke Republike Srbije, br. TR 31044]

  19. Models for optimizing the conveying process; Modelle in der Foerderprozessoptimierung

    Energy Technology Data Exchange (ETDEWEB)

    Koehler, U. [Vattenfall Europe Mining AG, Cottbus (Germany)

    2007-05-15

    Load- and time controlled use of excavator-conveyor-spreader equipment combinations in the overburden operation is of essential importance for achieving economic cost structures in opencast lignite mines. These effects result from optimizations based on realistic models. Vattenfall Europe Mining AG has successfully implemented a constant linkage of information from the geological model to the direct GPS-based operational management. With the help of this large-scale system model it was possible for the first time to operate two modernized bucket wheel excavators simultaneously with a spreader adjusted to performance limits. At the same time, quality requirements of overburden dumping were fulfilled. Special importance is attached to an uninterrupted, continuous mode of operation at the real, current capacity limit in the systems characteristic field. The Article explains the initial situation and the state-of-the-art technology for the model design as basis for the optimization of linked excavation, conveying and dumping systems. Furthermore, potential considerations from reports presented on the occasion of the Colloquium for Innovative Lignite Mining (KIB) and possible steps for the further technological development are outlined. (orig.)

  20. Optimization of Deacetylation Process for Regenerated Cellulose Hollow Fiber Membranes

    Directory of Open Access Journals (Sweden)

    Xuezhong He

    2017-01-01

    Full Text Available Cellulose acetate (CA) hollow fibers were spun from a CA + Polyvinylpyrrolidone (PVP)/N-methyl-2-pyrrolidone (NMP)/H2O dope solution and regenerated by deacetylation. The complete deacetylation time of 0.5 h was found at a high concentration (0.2 M) of NaOH in ethanol (96%) solution. The reaction rate of deacetylation with 0.5 M NaOH was faster in 50% ethanol compared to 96 vol.% ethanol. The hydrogen bond between CA and the tertiary amide group of PVP was confirmed. The deacetylation parameters of NaOH concentration, reaction time, swelling time, and solution were investigated by the orthogonal experimental design (OED) method. The degree of cross-linking, the residual acetyl content, and the PVP content in the deacetylated membranes were determined by FTIR analysis. The conjoint analysis in the Statistical Product and Service Solutions (SPSS) software was used to analyze the OED results, and the importance of the deacetylation parameters was sorted as Solution > Swelling time > Reaction time > Concentration. The optimal deacetylation conditions of 96 vol.% ethanol solution, swelling time 24 h, NaOH concentration 0.075 M, and reaction time 2 h were identified. The regenerated cellulose hollow fibers obtained under the optimal deacetylation conditions can be further used as precursors for the preparation of hollow fiber carbon membranes.

  1. Optimizing the order processing of customized products using product configuration

    DEFF Research Database (Denmark)

    Hvam, Lars; Bonev, Martin; Denkena, B.

    2011-01-01

    . Product configuration based on integrated modular product structure and product family architecture has been recognized as an effective means for implementing mass customization. In order to evaluate the effects of product configuration on order processing, a study has been conducted by the Department...... and its benefits for the order processing have been evaluated....

  2. Rate-optimal Bayesian intensity smoothing for inhomogeneous Poisson processes

    NARCIS (Netherlands)

    Belitser, E.N.; Serra, P.; van Zanten, H.

    2015-01-01

    We apply nonparametric Bayesian methods to study the problem of estimating the intensity function of an inhomogeneous Poisson process. To motivate our results we start by analyzing count data coming from a call center which we model as a Poisson process. This analysis is carried out using a certain

  3. Optimal Control of Beer Fermentation Process Using Differential ...

    African Journals Online (AJOL)

    ADOWIE PERE

    ABSTRACT: In this paper, the mathematical model of batch fermentation process of ethanol was formulated. The method of differential transform was used to obtain the solution governing the fermentation process; the system of equation was transformed using the differential transform method. The result obtained from the ...

  4. Energy-saving management modelling and optimization for lead-acid battery formation process

    Science.gov (United States)

    Wang, T.; Chen, Z.; Xu, J. Y.; Wang, F. Y.; Liu, H. M.

    2017-11-01

    In this context, a typical lead-acid battery producing process is introduced. Based on the formation process, an efficiency management method is proposed. An optimization model with the objective to minimize the formation electricity cost in a single period is established. This optimization model considers several related constraints, together with two influencing factors including the transformation efficiency of IGBT charge-and-discharge machine and the time-of-use price. An example simulation is shown using PSO algorithm to solve this mathematic model, and the proposed optimization strategy is proved to be effective and learnable for energy-saving and efficiency optimization in battery producing industries.
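
    The abstract above names PSO only as the solver; a bare-bones particle swarm sketch for a time-of-use charging cost problem of this kind is given below. The tariff profile, charger efficiency, energy demand, bounds and penalty handling are all illustrative assumptions, not the parameters or model of the plant studied.

        import numpy as np

        rng = np.random.default_rng(0)

        price = np.array([0.3, 0.3, 0.8, 0.8, 1.2, 0.5])   # assumed time-of-use tariff per slot
        eta = 0.85                                          # assumed charge-and-discharge efficiency
        demand = 10.0                                       # assumed total formation energy (kWh)
        p_max = 4.0                                         # assumed max power per slot (kW)

        def cost(x):
            # electricity cost plus a penalty if the formation energy demand is not met
            energy_cost = np.sum(price * x) / eta
            penalty = 100.0 * abs(np.sum(x) - demand)
            return energy_cost + penalty

        n_particles, iters = 30, 200
        x = rng.uniform(0.0, p_max, (n_particles, len(price)))
        v = np.zeros_like(x)
        pbest, pbest_val = x.copy(), np.array([cost(p) for p in x])
        gbest = pbest[np.argmin(pbest_val)]

        w, c1, c2 = 0.7, 1.5, 1.5
        for _ in range(iters):
            r1, r2 = rng.random(x.shape), rng.random(x.shape)
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = np.clip(x + v, 0.0, p_max)
            vals = np.array([cost(p) for p in x])
            improved = vals < pbest_val
            pbest[improved], pbest_val[improved] = x[improved], vals[improved]
            gbest = pbest[np.argmin(pbest_val)]

        print("cheapest charging schedule per slot:", np.round(gbest, 2))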

  5. Interplay between cellular activity and three-dimensional scaffold-cell constructs with different foam structure processed by electron beam melting.

    Science.gov (United States)

    Nune, Krishna C; Misra, R Devesh K; Gaytan, Sara M; Murr, Lawrence E

    2015-05-01

    The cellular activity, biological response, and consequent integration of a scaffold-cell construct in the physiological system are governed by the ability of cells to adhere, proliferate, and biomineralize. In this regard, we combine cellular biology and materials science and engineering to fundamentally elucidate the interplay between cellular activity and an interconnected three-dimensional foamed architecture obtained by a novel process of electron beam melting and computational tools. Furthermore, the organization of key proteins, notably actin, vinculin, and fibronectin, involved in cellular activity and biological functions, and their relationship with the structure was explored. The interconnected foamed structure with ligaments was favorable to cellular activity, which includes cell attachment, proliferation, and differentiation. The primary rationale for the favorable modulation of cellular functions is that the foamed structure provided a channel for migration and communication between cells, leading to a highly mineralized extracellular matrix (ECM) deposited by the differentiating osteoblasts. The filopodial interaction amongst cells on the ligaments was a governing factor in the secretion of ECM, with consequent influence on maturation and mineralization. © 2014 Wiley Periodicals, Inc.

  6. Design optimization of single mixed refrigerant LNG process using a hybrid modified coordinate descent algorithm

    Science.gov (United States)

    Qyyum, Muhammad Abdul; Long, Nguyen Van Duc; Minh, Le Quang; Lee, Moonyong

    2018-01-01

    Design optimization of the single mixed refrigerant (SMR) natural gas liquefaction (LNG) process involves highly non-linear interactions between decision variables, constraints, and the objective function. These non-linear interactions lead to an irreversibility, which deteriorates the energy efficiency of the LNG process. In this study, a simple and highly efficient hybrid modified coordinate descent (HMCD) algorithm was proposed to cope with the optimization of the natural gas liquefaction process. The single mixed refrigerant process was modeled in Aspen Hysys® and then connected to a Microsoft Visual Studio environment. The proposed optimization algorithm provided an improved result compared to the other existing methodologies to find the optimal condition of the complex mixed refrigerant natural gas liquefaction process. By applying the proposed optimization algorithm, the SMR process can be designed with the 0.2555 kW specific compression power which is equivalent to 44.3% energy saving as compared to the base case. Furthermore, in terms of coefficient of performance (COP), it can be enhanced up to 34.7% as compared to the base case. The proposed optimization algorithm provides a deep understanding of the optimization of the liquefaction process in both technical and numerical perspectives. In addition, the HMCD algorithm can be employed to any mixed refrigerant based liquefaction process in the natural gas industry.
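
    The paper's hybrid modified coordinate descent is not spelled out in the abstract; the sketch below shows only plain cyclic coordinate descent with a per-coordinate grid line search over a black-box objective, which is the general idea such schemes refine. The specific_power function stands in for the Aspen Hysys flowsheet evaluation and is purely a toy assumption.

        import numpy as np

        def specific_power(x):
            # placeholder for the flowsheet simulation returning specific compression power;
            # a smooth toy function is used here instead of a process simulator
            target = np.array([0.4, 1.2, 2.5, 0.8])
            return 0.25 + np.sum((x - target) ** 2)

        def coordinate_descent(f, x0, bounds, sweeps=20, grid=41):
            x = np.array(x0, dtype=float)
            for _ in range(sweeps):
                for i, (lo, hi) in enumerate(bounds):
                    candidates = np.linspace(lo, hi, grid)      # 1-D search along coordinate i
                    trial = np.repeat(x[None, :], grid, axis=0)
                    trial[:, i] = candidates
                    values = np.array([f(t) for t in trial])
                    x[i] = candidates[np.argmin(values)]
            return x, f(x)

        # decision variables, e.g. refrigerant flows and pressures (assumed bounds)
        bounds = [(0.1, 2.0), (0.5, 3.0), (1.0, 5.0), (0.1, 2.0)]
        x_opt, val = coordinate_descent(specific_power, [1.0, 1.0, 2.0, 1.0], bounds)
        print(x_opt, val)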

  7. Process Optimization of EDM Cutting Process on Tool Steel using Zinc Coated Electrode

    Directory of Open Access Journals (Sweden)

    Hanizam H.

    2017-01-01

    Full Text Available In the WEDM machining process, surface finish quality depends on the intensity and duration of the spark plasma. The electrode wire diameter has a significant effect on the spark intensity, and yet studies on this matter are still scarce. Therefore, the main objective of this study is to compare different diameters of zinc-coated and uncoated brass electrodes with respect to the surface roughness of H13 tool steel. The experiments were conducted on a Sodick VZ300L WEDM with a workpiece of tool steel AISI H13 block. Electrodes of zinc-coated brass with diameters of 0.1 mm, 0.2 mm and 0.25 mm and uncoated brass of 0.2 mm were used. The surface roughness of the cut was measured using the SUR-FTEST SJ-410 Mitutoyo surface roughness tester. The results suggest that better surface roughness quality can be achieved with a smaller electrode wire diameter. The zinc coating improves flushing ability and spark intensity, resulting in a better surface finish of H13 tool steel. New alloys and coating materials should be investigated to optimize the process further.

  8. Cutting an NKG2D Ligand Short: Cellular Processing of the Peculiar Human NKG2D Ligand ULBP4

    Directory of Open Access Journals (Sweden)

    Tobias Zöller

    2018-03-01

    Full Text Available Stress-induced cell surface expression of MHC class I-related glycoproteins of the MIC and ULBP families allows for immune recognition of dangerous “self cells” by human cytotoxic lymphocytes via the NKG2D receptor. With two MIC molecules (MICA and MICB) and six ULBP molecules (ULBP1–6), there are a total of eight human NKG2D ligands (NKG2DL). Since the discovery of the NKG2D–NKG2DL system, the cause for both redundancy and diversity of NKG2DL has been a major and ongoing matter of debate. NKG2DL diversity has been attributed, among others, to the selective pressure by viral immunoevasins, to diverse regulation of expression, to differential tissue expression as well as to variations in receptor interactions. Here, we critically review the current state of knowledge on the poorly studied human NKG2DL ULBP4. Summarizing available facts and previous studies, we picture ULBP4 as a peculiar ULBP family member distinct from other ULBP family members by various aspects. In addition, we provide novel experimental evidence suggesting that cellular processing gives rise to mature ULBP4 glycoproteins different to previous reports. Finally, we report on the proteolytic release of soluble ULBP4 and discuss these results in the light of known mechanisms for generation of soluble NKG2DL.

  9. Production of Low Cost Carbon-Fiber through Energy Optimization of Stabilization Process

    Directory of Open Access Journals (Sweden)

    Gelayol Golkarnarenji

    2018-03-01

    Full Text Available To produce high quality and low cost carbon fiber-based composites, the optimization of the carbon fiber production process and its properties is one of the main keys. The stabilization process is the most important step in carbon fiber production; it consumes a large amount of energy, and its optimization can reduce the cost to a large extent. In this study, two intelligent optimization techniques, namely Support Vector Regression (SVR) and Artificial Neural Network (ANN), were studied and compared, using a limited dataset, to predict a physical property (density) of oxidatively stabilized PAN fiber (OPF) in the second zone of a stabilization oven within a carbon fiber production line. The results were then used to optimize the energy consumption in the process. The case study can be beneficial to chemical industries involved in carbon fiber manufacturing, for assessing and optimizing different stabilization process conditions at large.
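
    As a minimal illustration of the regression side of the comparison above, the sketch below fits scikit-learn's SVR to map stabilization conditions to OPF density. The synthetic data, feature choices (zone temperature, residence time, tension) and hyperparameters are assumptions for demonstration, not the paper's dataset or tuned model.

        import numpy as np
        from sklearn.svm import SVR
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(1)

        # synthetic process conditions: [zone temperature (C), residence time (min), tension (cN)]
        X = rng.uniform([230, 10, 50], [260, 40, 150], size=(40, 3))
        # synthetic OPF density response with noise (purely illustrative relationship)
        y = 1.30 + 0.002 * (X[:, 0] - 230) + 0.001 * X[:, 1] + rng.normal(0, 0.005, 40)

        model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.002))
        model.fit(X, y)

        print(model.predict([[245.0, 25.0, 100.0]]))   # predicted density for one condition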

  10. Optimal stopping and perpetual options for Lévy processes

    OpenAIRE

    Ernesto Mordecki

    2002-01-01

    Consider a model of a financial market with a stock driven by a Lévy process and constant interest rate. A closed formula for prices of perpetual American call options in terms of the overall supremum of the Lévy process, and a corresponding closed formula for perpetual American put options involving the infimum of the after-mentioned process are obtained. As a direct application of the previous results, a Black-Scholes type formula is given. Also as a consequence, simple explicit formulas fo...

  11. Bio-oil Production - Process Optimization and Product Quality

    DEFF Research Database (Denmark)

    Hoffmann, Jessica

    , fossil fuels still accounted for 87% of global and 81% of EU primary energy consumption. In an effort to reduce the carbon footprint of a continued supply of liquid fuels, processes utilizing biomass in general, and lignocellulosic biomass in particular, are being developed to replace their fossil...... such candidate is hydrothermal liquefaction (HTL), a thermochemical process that converts low-value biomass feedstocks to a high-value bio-crude through the use of hot compressed water and catalysts. As there is typically residual oxygen left in the bio-crude from HTL, further processing involves upgrading in order...

  12. Optimization of the encapsulation process of bituminized radioactive wastes

    International Nuclear Information System (INIS)

    Silva, Jarine E.C.; Tello, Clédola C.O.

    2017-01-01

    The objective of this paper is to propose alternatives for the deposition of bituminized waste in metallic packages coated with a cementitious matrix for surface repository, aiming to meet the standards criteria and increasing the integrity of the metallic packaging during the planned storage time, transportation and disposal. For this purpose, tests will be carried out to evaluate cement pastes and mortar with cementitious additives, aiming at the durability and reduction of pores. Leaching tests with different thicknesses will also be carried out, where optimization of the encapsulation can meet safety, durability and economy standards for the repository, as well as practices that contribute to reduce environmental impacts and the economic burden imposed on future generations

  13. Preparation of Highly Conductive Yarns by an Optimized Impregnation Process

    Science.gov (United States)

    Amba Sankar, K. N.; Mohanta, Kallol

    2018-03-01

    We report the development of electrical conductivity in textile yarns through impregnation and post-treatment of poly(3,4-ethylenedioxythiophene) polystyrene sulfonate (PEDOT:PSS). The conductive polymer is deposited on fibers, which fills the gap space within the hierarchical structure of the yarns. Organic nonpolar solvents act as reducing agents to increase the density of PEDOT moieties on the yarns, galvanizing an increment in conductivity values. Post-treatment by ethylene glycol transforms the resonance configuration of the conductive moieties of the conjugated polymer, which helps in further enhancement of the electrical conductivity of the yarns. We have optimized the method in terms of loading and conformal change of the polymer to obtain a lower resistance of the coated conductive yarns. The minimum resistance achieved has a value of 77 Ωcm-1. This technique of developing conductivity in conventional yarns enables retaining the flexibility of yarns and the feeling of softness, which would find suitable applications for wearable electronics.

  14. Optimization of Injection Moulding Process Parameters in the ...

    African Journals Online (AJOL)

    ADOWIE PERE

    https://www.ajol.info/index.php/jasem ... Cooling time was found to be the factor with most significant effect on ... Keywords: High Density Polyethylene (HDPE), Injection Moulding, Process .... value of shrinkage behavior is expected to be.

  15. Optimization of Wireless Transceivers under Processing Energy Constraints

    Science.gov (United States)

    Wang, Gaojian; Ascheid, Gerd; Wang, Yanlu; Hanay, Oner; Negra, Renato; Herrmann, Matthias; Wehn, Norbert

    2017-09-01

    Focus of the article is on achieving maximum data rates under a processing energy constraint. For a given amount of processing energy per information bit, the overall power consumption increases with the data rate. When targeting data rates beyond 100 Gb/s, the system's overall power consumption soon exceeds the power which can be dissipated without forced cooling. To achieve a maximum data rate under this power constraint, the processing energy per information bit must be minimized. Therefore, in this article, suitable processing efficient transmission schemes together with energy efficient architectures and their implementations are investigated in a true cross-layer approach. Target use cases are short range wireless transmitters working at carrier frequencies around 60 GHz and bandwidths between 1 GHz and 10 GHz.
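
    The trade-off described above can be summarized by a simple relation between the dissipatable power budget, the processing energy per information bit and the achievable data rate; the numerical values below are illustrative only, not figures from the article:

        R_{\max} = \frac{P_{\mathrm{diss}}}{E_b}, \qquad \text{e.g. } E_b = 10\ \mathrm{pJ/bit},\ P_{\mathrm{diss}} = 1\ \mathrm{W} \;\Rightarrow\; R_{\max} = 100\ \mathrm{Gb/s}.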

  16. Optimal linear filtering of Poisson process with dead time

    International Nuclear Information System (INIS)

    Glukhova, E.V.

    1993-01-01

    The paper presents a derivation of an integral equation defining the impulsed transient of optimum linear filtering for evaluation of the intensity of the fluctuating Poisson process with allowance for dead time of transducers

  17. Optimization of turning process parameters by using grey-Taguchi

    African Journals Online (AJOL)

    DR OKE

    ... India continue to choose the operating conditions solely on the basis of handbook values .... Surface Roughness Measuring instrument ... process control parameters like spindle speed, feed and depth of cut. ..... and Industrial Engineering.

  18. Optimized Laplacian image sharpening algorithm based on graphic processing unit

    Science.gov (United States)

    Ma, Tinghuai; Li, Lu; Ji, Sai; Wang, Xin; Tian, Yuan; Al-Dhelaan, Abdullah; Al-Rodhaan, Mznah

    2014-12-01

    In classical Laplacian image sharpening, all pixels are processed one by one, which leads to a large amount of computation. Traditional Laplacian sharpening processed on a CPU is considerably time-consuming, especially for large pictures. In this paper, we propose a parallel implementation of Laplacian sharpening based on the Compute Unified Device Architecture (CUDA), which is a computing platform for Graphics Processing Units (GPUs), and analyze the impact of picture size on performance as well as the relationship between data transfer time and parallel computing time. Further, according to the different features of different memories, an improved scheme of our method is developed, which exploits shared memory in the GPU instead of global memory and further increases the efficiency. Experimental results prove that the two novel algorithms outperform the traditional sequential method based on OpenCV in terms of computing speed.
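
    The CPU reference computation that such a GPU implementation parallelizes can be written compactly as below; this NumPy sketch covers only the per-pixel Laplacian arithmetic, not the CUDA kernels, memory transfers or shared-memory tiling discussed in the paper.

        import numpy as np

        def laplacian_sharpen(img, strength=1.0):
            """Sharpen a 2-D grayscale image: g = f - strength * laplacian(f)."""
            f = img.astype(float)
            # discrete Laplacian with the kernel [[0,1,0],[1,-4,1],[0,1,0]]
            lap = (np.roll(f, 1, axis=0) + np.roll(f, -1, axis=0) +
                   np.roll(f, 1, axis=1) + np.roll(f, -1, axis=1) - 4.0 * f)
            g = f - strength * lap
            return np.clip(g, 0, 255).astype(np.uint8)

        # usage on a random test image; in the GPU version each pixel maps to one thread
        test = (np.random.default_rng(0).random((256, 256)) * 255).astype(np.uint8)
        sharp = laplacian_sharpen(test)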

  19. Characterization and Optimization of Dual Anaerobic/Aerobic Biofilm Process

    National Research Council Canada - National Science Library

    Togna, A

    1997-01-01

    The purpose of this Phase I STTR effort was to develop and characterize a dual anaerobic/aerobic biofilm process that promotes anaerobic reductive dehalogenation and aerobic cometabolic biodegradation...

  20. Optimization of the soldering process by the DMAIC methodology

    Directory of Open Access Journals (Sweden)

    Michał Zasadzień

    2016-06-01

    Full Text Available The chapter presents the use of the DMAIC method for the analysis and improvement of the process of soldering pins in a plug connecting a bundle of wires to the board of a controller, a part of the steering system of a car. The main problem in the soldering process, namely an unsatisfactory and unstable share of badly soldered connections between the board and the plug, was identified by means of a five-phase improvement process. Key points and main causes of the defect were pointed out, and process improvement measures were suggested. Thanks to the analysis conducted and the correct implementation of the improvement measures, the share of defective connections has been reduced twofold.

  1. Process modelling and optimization of osmotic dehydration assisted ...

    African Journals Online (AJOL)

    ... ash content, water loss and solid gain were estimated as quality parameters. Model equations were developed with Essential Regression (ESSREG) software package which related output parameters to process variables and validated.

  2. Optimizing the availability of a buffered industrial process

    Science.gov (United States)

    Martz, Jr., Harry F.; Hamada, Michael S.; Koehler, Arthur J.; Berg, Eric C.

    2004-08-24

    A computer-implemented process determines optimum configuration parameters for a buffered industrial process. A population size is initialized by randomly selecting a first set of design and operation values associated with subsystems and buffers of the buffered industrial process to form a set of operating parameters for each member of the population. An availability discrete event simulation (ADES) is performed on each member of the population to determine the product-based availability of each member. A new population is formed having members with a second set of design and operation values related to the first set of design and operation values through a genetic algorithm and the product-based availability determined by the ADES. Subsequent population members are then determined by iterating the genetic algorithm with product-based availability determined by ADES to form improved design and operation values from which the configuration parameters are selected for the buffered industrial process.
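
    The loop described above (random initialization of design and operation values, availability evaluation, recombination of the better members by a genetic algorithm) can be sketched as follows. The availability function here is a stand-in toy for the ADES run, and all parameter names and ranges are assumptions for illustration.

        import numpy as np

        rng = np.random.default_rng(2)

        def availability(params):
            # stand-in for the ADES run: returns a product-based availability in [0, 1]
            repair_rate, buffer_size, redundancy = params
            a = repair_rate / (repair_rate + 0.05)          # toy subsystem availability
            return 1.0 - (1.0 - a) ** (1 + redundancy) * np.exp(-buffer_size / 10.0)

        bounds = np.array([[0.1, 2.0], [0.0, 20.0], [0.0, 3.0]])   # assumed design ranges
        pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(30, 3))

        for generation in range(50):
            fitness = np.array([availability(ind) for ind in pop])
            # tournament selection of parents
            idx = rng.integers(0, len(pop), size=(len(pop), 2))
            winners = np.where(fitness[idx[:, 0]] > fitness[idx[:, 1]], idx[:, 0], idx[:, 1])
            parents = pop[winners]
            # uniform crossover and Gaussian mutation
            mates = parents[rng.permutation(len(parents))]
            mask = rng.random(pop.shape) < 0.5
            children = np.where(mask, parents, mates)
            children = children + rng.normal(0.0, 0.05, pop.shape) * (bounds[:, 1] - bounds[:, 0])
            pop = np.clip(children, bounds[:, 0], bounds[:, 1])

        best = pop[np.argmax([availability(ind) for ind in pop])]
        print("best configuration found:", np.round(best, 3))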

  3. Optimization and Control of Pressure Swing Adsorption Processes Under Uncertainty

    KAUST Repository

    Khajuria, Harish; Pistikopoulos, Efstratios N.

    2012-01-01

    The real-time periodic performance of a pressure swing adsorption (PSA) system strongly depends on the choice of key decision variables and operational considerations such as processing steps and column pressure temporal profiles, making its design

  4. Probabilistic safety assessment and optimal control of hazardous technological systems. A marked point process approach

    International Nuclear Information System (INIS)

    Holmberg, J.

    1997-04-01

    The thesis models risk management as an optimal control problem for a stochastic process. The approach classes the decisions made by management into three categories according to the control methods of a point process: (1) planned process lifetime, (2) modification of the design, and (3) operational decisions. The approach is used for optimization of plant shutdown criteria and surveillance test strategies of a hypothetical nuclear power plant

  5. Probabilistic safety assessment and optimal control of hazardous technological systems. A marked point process approach

    Energy Technology Data Exchange (ETDEWEB)

    Holmberg, J [VTT Automation, Espoo (Finland)

    1997-04-01

    The thesis models risk management as an optimal control problem for a stochastic process. The approach classifies the decisions made by management into three categories according to the control methods of a point process: (1) planned process lifetime, (2) modification of the design, and (3) operational decisions. The approach is used for the optimization of plant shutdown criteria and surveillance test strategies of a hypothetical nuclear power plant. 62 refs. The thesis also includes five previous publications by the author.

  6. Optimization of dissolution process parameters for uranium ore concentrate powders

    Energy Technology Data Exchange (ETDEWEB)

    Misra, M.; Reddy, D.M.; Reddy, A.L.V.; Tiwari, S.K.; Venkataswamy, J.; Setty, D.S.; Sheela, S.; Saibaba, N. [Nuclear Fuel Complex, Hyderabad (India)

    2013-07-01

    Nuclear Fuel Complex processes Uranium Ore Concentrate (UOC) to produce the uranium dioxide powder required for the fabrication of fuel assemblies for Pressurized Heavy Water Reactors (PHWRs) in India. UOC is dissolved in nitric acid and further purified by a solvent extraction process to produce nuclear grade UO{sub 2} powder. Dissolution of UOC in nitric acid involves complex nitric oxide based reactions, since the concentrate is in the form of uranium octa-oxide (U{sub 3}O{sub 8}) or uranium dioxide (UO{sub 2}). The kinetics of UOC dissolution is largely influenced by parameters such as the concentration and flow rate of nitric acid, temperature and air flow rate, and these parameters were found to affect the recovery of nitric oxide as nitric acid. The plant-scale dissolution of a 2 MT batch in a single reactor was studied, and excellent recovery of oxides of nitrogen (NO{sub x}) as nitric acid was observed. The dissolution process is automated by a PLC-based Supervisory Control and Data Acquisition (SCADA) system for accurate control of the process parameters, and around 200 metric tons of UOC have been successfully dissolved. The paper covers the complex chemistry involved in the UOC dissolution process as well as the SCADA system. The solid and liquid reactions were studied along with the multiple stoichiometries of the nitrous oxide generated. (author)

  7. Cellular automata-based forecasting of the impact of accidental fire and toxic dispersion in process industries

    International Nuclear Information System (INIS)

    Sarkar, Chinmoy; Abbasi, S.A.

    2006-01-01

    The strategies to prevent accidents from occurring in a process industry, or to minimize the harm if an accident does take place, always revolve around forecasting the likely accidents and their impacts. Based on the likely frequency and severity of the accidents, resources are committed towards preventing the accidents. Nearly all techniques of ranking hazardous units, be it the hazard and operability studies, fault tree analysis, hazard indices, etc. - qualitative as well as quantitative - depend essentially on the assessment of the likely frequency and the likely harm accidents in different units may cause. This fact makes it exceedingly important that the forecasting of accidents and their likely impact is done as accurately as possible. In the present study we introduce a new approach to accident forecasting based on the discrete modeling paradigm of cellular automata. In this treatment an accident is modeled as a self-evolving phenomenon, the impact of which is strongly influenced by the size, nature, and position of the environmental components which lie in the vicinity of the accident site. The outward propagation of the mass, energy and momentum from the accident epicenter is modeled as a fast diffusion process occurring in discrete space-time coordinates. The quantum of energy and material that would flow into each discrete space element (cell) due to the accidental release is evaluated, and the degree of vulnerability posed to the receptors, if present in the cell, is measured at the end of each time element. This approach is able to effectively take into account the modifications in the flux of energy and material which occur as a result of the heterogeneous environment prevailing between the accident epicenter and the receptor. Consequently, more realistic accident scenarios are generated than is possible with the prevailing techniques. The efficacy of the approach has been illustrated with case studies
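
    As a rough illustration of the discrete space-time idea (not the authors' calibrated model), the sketch below lets the mass/energy released at an epicenter cell spread to its four neighbours each time step, modulated by a per-cell transmissivity that stands in for the surrounding environment; the grid size, transmissivity values and harm threshold are made-up numbers.

      import numpy as np

      N, STEPS = 41, 30
      load = np.zeros((N, N))                  # mass/energy accumulated in each cell
      load[N // 2, N // 2] = 1000.0            # accidental release at the epicenter cell
      trans = np.full((N, N), 0.25)            # fraction passed to each neighbour per time step
      trans[:, 25:] = 0.10                     # e.g. an obstructing structure east of the site

      for _ in range(STEPS):
          outflow = load * trans               # what each cell sends to each of its 4 neighbours
          new = load - 4 * outflow             # a cell keeps whatever it does not pass on
          new[1:, :] += outflow[:-1, :]        # flux to the neighbour below
          new[:-1, :] += outflow[1:, :]        # flux to the neighbour above
          new[:, 1:] += outflow[:, :-1]        # flux to the neighbour on the right
          new[:, :-1] += outflow[:, 1:]        # flux to the neighbour on the left
          load = new

      vulnerable = load > 5.0                  # receptors above an illustrative harm threshold
      print(int(vulnerable.sum()), "cells above threshold")

    Receptors in cells whose accumulated load exceeds the threshold at the end of the run would be flagged as vulnerable, which is the kind of cell-wise vulnerability measure the abstract describes.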

  8. Optimization of submerged arc welding process parameters using quasi-oppositional based Jaya algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Rao, R. Venkata; Rai, Dhiraj P. [Sardar Vallabhbhai National Institute of Technology, Gujarat (India)

    2017-05-15

    Submerged arc welding (SAW) is characterized as a multi-input process. Selection of optimum combination of process parameters of SAW process is a vital task in order to achieve high quality of weld and productivity. The objective of this work is to optimize the SAW process parameters using a simple optimization algorithm, which is fast, robust and convenient. Therefore, in this work a very recently proposed optimization algorithm named Jaya algorithm is applied to solve the optimization problems in SAW process. In addition, a modified version of Jaya algorithm with oppositional based learning, named “Quasi-oppositional based Jaya algorithm” (QO-Jaya) is proposed in order to improve the performance of the Jaya algorithm. Three optimization case studies are considered and the results obtained by Jaya algorithm and QO-Jaya algorithm are compared with the results obtained by well-known optimization algorithms such as Genetic algorithm (GA), Particle swarm optimization (PSO), Imperialist competitive algorithm (ICA) and Teaching learning based optimization (TLBO).

  9. Optimization of submerged arc welding process parameters using quasi-oppositional based Jaya algorithm

    International Nuclear Information System (INIS)

    Rao, R. Venkata; Rai, Dhiraj P.

    2017-01-01

    Submerged arc welding (SAW) is characterized as a multi-input process. Selection of optimum combination of process parameters of SAW process is a vital task in order to achieve high quality of weld and productivity. The objective of this work is to optimize the SAW process parameters using a simple optimization algorithm, which is fast, robust and convenient. Therefore, in this work a very recently proposed optimization algorithm named Jaya algorithm is applied to solve the optimization problems in SAW process. In addition, a modified version of Jaya algorithm with oppositional based learning, named “Quasi-oppositional based Jaya algorithm” (QO-Jaya) is proposed in order to improve the performance of the Jaya algorithm. Three optimization case studies are considered and the results obtained by Jaya algorithm and QO-Jaya algorithm are compared with the results obtained by well-known optimization algorithms such as Genetic algorithm (GA), Particle swarm optimization (PSO), Imperialist competitive algorithm (ICA) and Teaching learning based optimization (TLBO).
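
    The basic Jaya move is parameter-free: every candidate is pulled toward the current best solution and pushed away from the current worst. A compact sketch of that update on a placeholder objective follows; the quasi-oppositional refinement (QO-Jaya), which additionally injects quasi-opposite points into the population, is omitted, and nothing here encodes the actual SAW weld-quality models.

      import numpy as np

      def jaya_minimize(f, lower, upper, pop=20, iters=200, seed=0):
          rng = np.random.default_rng(seed)
          dim = len(lower)
          x = rng.uniform(lower, upper, size=(pop, dim))
          for _ in range(iters):
              fx = np.apply_along_axis(f, 1, x)
              best, worst = x[fx.argmin()], x[fx.argmax()]
              r1, r2 = rng.random((pop, dim)), rng.random((pop, dim))
              # Jaya update: move toward the best solution and away from the worst
              cand = x + r1 * (best - np.abs(x)) - r2 * (worst - np.abs(x))
              cand = np.clip(cand, lower, upper)
              fc = np.apply_along_axis(f, 1, cand)
              improved = fc < fx
              x[improved] = cand[improved]                 # greedy acceptance of better candidates
          fx = np.apply_along_axis(f, 1, x)
          return x[fx.argmin()], float(fx.min())

      # Placeholder objective standing in for a weld-quality model (not from the paper)
      sphere = lambda v: float(np.sum(v ** 2))
      print(jaya_minimize(sphere, np.array([-5.0, -5.0]), np.array([5.0, 5.0])))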

  10. Scaffolding as an effort for thinking process optimization on heredity

    Science.gov (United States)

    Azizah, N. R.; Masykuri, M.; Prayitno, B. A.

    2018-04-01

    Thinking is an activity and process of manipulating and transforming data or information into memory. Thinking process is different between one and other person. Thinking process can be developed by interaction between student and their environment, such as scaffolding. Given scaffolding is based on each student necessity. There are 2 level on scaffolding such as explaining, reviewing, and restructuring; and developing conceptual thinking. This research is aimed to describe student’s thinking process on heredity especially on inheritance that is before and after scaffolding. This research used descriptive qualitative method. There were three kinds of subject degree such as the students with high, middle, and low achieving students. The result showed that subjects had some difficulty in dihybrid inheritance question in different place. Most difficulty was on determining the number of different characteristic, parental genotype, gamete, and ratio of genotype and phenotype F2. Based on discussed during scaffolding showed that the subjects have some misunderstanding terms and difficulty to determine parental, gamete, genotype, and phenotype. Final result in this research showed that the subjects develop thinking process higher after scaffolding. Therefore the subjects can solve question properly.

  11. Optimal PID settings for first and second-order processes - Comparison with different controller tuning approaches

    OpenAIRE

    Pappas, Iosif

    2016-01-01

    PID controllers are extensively used in industry. Although many tuning methodologies exist, finding good controller settings is not an easy task and frequently optimization-based design is preferred to satisfy more complex criteria. In this thesis, the focus was to find which tuning approaches, if any, present close to optimal behavior. Pareto-optimal controllers were found for different first and second-order processes with time delay. Performance was quantified in terms of the integrat...

  12. Integrating multi-objective optimization with computational fluid dynamics to optimize boiler combustion process of a coal fired power plant

    International Nuclear Information System (INIS)

    Liu, Xingrang; Bansal, R.C.

    2014-01-01

    Highlights: • A coal fired power plant boiler combustion process model based on real data. • We propose multi-objective optimization with CFD to optimize boiler combustion. • The proposed method uses software CORBA C++ and ANSYS Fluent 14.5 with AI. • It optimizes heat flux transfers and maintains temperature to avoid ash melt. - Abstract: The dominant role of electricity generation and environmental considerations have placed strong requirements on coal fired power plants, requiring them to improve boiler combustion efficiency and decrease carbon emissions. Although neural network based optimization strategies are often applied to improve the coal fired power plant boiler efficiency, they are limited by some combustion related problems such as slagging. Slagging can seriously influence heat transfer rate and decrease the boiler efficiency. In addition, it is difficult to measure slag build-up. The lack of measurement for slagging can restrict conventional neural network based coal fired boiler optimization, because no data can be used to train the neural network. This paper proposes a novel method of integrating non-dominated sorting genetic algorithm (NSGA II) based multi-objective optimization with computational fluid dynamics (CFD) to decrease or even avoid slagging inside a coal fired boiler furnace and improve boiler combustion efficiency. Compared with conventional neural network based boiler optimization methods, the method developed in the work can control and optimize the fields of flue gas properties such as the temperature field inside the boiler by adjusting the temperature and velocity of primary and secondary air in coal fired power plant boiler control systems. The temperature in the vicinity of water wall tubes of a boiler can be maintained within the ash melting temperature limit. The incoming ash particles cannot melt and bond to the surface of the heat transfer equipment of the boiler. Thus the tendency towards slagging inside the furnace is controlled. Furthermore, the
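
    The NSGA-II coupling ultimately rests on Pareto dominance between candidate boiler settings, each evaluated (in the paper) by a CFD run. Only the dominance test and a non-dominated filter over pre-computed objective vectors are sketched below; the objective values are invented placeholders, not CFD results.

      def dominates(a, b):
          """True if objective vector a is no worse than b in every objective
          and strictly better in at least one (minimization)."""
          return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

      def non_dominated(points):
          return [p for p in points
                  if not any(dominates(q, p) for q in points if q is not p)]

      # Illustrative objective vectors: (heat-flux deviation, peak wall temperature)
      candidates = [(0.12, 1350.0), (0.10, 1420.0), (0.15, 1300.0), (0.11, 1310.0)]
      print(non_dominated(candidates))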

  13. A Draw-In Sensor for Process Control and Optimization

    International Nuclear Information System (INIS)

    Mahayotsanun, Numpon; Cao, Jian; Peshkin, Michael

    2005-01-01

    Sheet metal forming is one of the major processes in manufacturing and is broadly used due to its high degree of design flexibility and low cost. In the sheet metal forming process, draw-in (planar movement of a sheet periphery) frequently occurs and is one of the dominant indicators of the success of a forming process. Currently, monitoring and controlling draw-in during each stamping operation requires either a time-consuming setup or a significant die modification. Most devices have been used only in laboratory settings. Our goal is to design a draw-in sensor that provides high sensitivity in monitoring and ease of setup, measurement and control, and that can eventually be implemented in industry. Our design is based on the mutual inductance principle, for which we considered the physical factors affecting the characteristics of the draw-in sensor. Two different configurations of our draw-in sensor, single-transducer and double-transducer, have been designed and tested. The results showed good linearity, especially for the double-transducer case. The output of the draw-in sensor was affected by the type of sheet metal, the dimensions of the transducer, and the distance between the transducer and the testing sheet metal. It was found that the result was insensitive to the waviness of the sheet metal if the sheet thickness was thin. The invention, implementation, and integration of the draw-in sensor will have an enormous impact on revolutionizing the control of the stamping process, will provide solid ground for process variation and uncertainty studies, and ultimately will affect the design decision process.

  14. A Bayesian optimal design for degradation tests based on the inverse Gaussian process

    Energy Technology Data Exchange (ETDEWEB)

    Peng, Weiwen; Liu, Yu; Li, Yan Feng; Zhu, Shun Peng; Huang, Hong Zhong [University of Electronic Science and Technology of China, Chengdu (China)

    2014-10-15

    The inverse Gaussian process is recently introduced as an attractive and flexible stochastic process for degradation modeling. This process has been demonstrated as a valuable complement for models that are developed on the basis of the Wiener and gamma processes. We investigate the optimal design of the degradation tests on the basis of the inverse Gaussian process. In addition to an optimal design with pre-estimated planning values of model parameters, we also address the issue of uncertainty in the planning values by using the Bayesian method. An average pre-posterior variance of reliability is used as the optimization criterion. A trade-off between sample size and number of degradation observations is investigated in the degradation test planning. The effects of priors on the optimal designs and on the value of prior information are also investigated and quantified. The degradation test planning of a GaAs Laser device is performed to demonstrate the proposed method.
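
    For intuition, degradation paths of an inverse Gaussian process over an inspection grid can be simulated as sums of independent IG increments whose mean and shape follow the mean function; a rough sketch with an assumed linear mean function and SciPy's parameterization is shown below (the Bayesian pre-posterior design criterion itself is not reproduced).

      import numpy as np
      from scipy.stats import invgauss

      def simulate_ig_paths(times, mu, lam, n_units, seed=1):
          """Degradation paths Y(t) with increments dY ~ IG(mean=mu*dL, shape=lam*dL^2),
          using an assumed linear mean function L(t) = t."""
          rng = np.random.default_rng(seed)
          dL = np.diff(np.concatenate(([0.0], times)))
          paths = np.zeros((n_units, len(times)))
          level = np.zeros(n_units)
          for j, d in enumerate(dL):
              m, shape = mu * d, lam * d ** 2
              # scipy.stats.invgauss(mu=m/shape, scale=shape) has mean m and IG shape parameter `shape`
              level = level + invgauss.rvs(m / shape, scale=shape, size=n_units, random_state=rng)
              paths[:, j] = level
          return paths

      times = np.linspace(1.0, 10.0, 10)
      paths = simulate_ig_paths(times, mu=2.0, lam=4.0, n_units=5)
      print(paths[:, -1])   # simulated degradation levels at the last inspection time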

  15. Off-Policy Reinforcement Learning: Optimal Operational Control for Two-Time-Scale Industrial Processes.

    Science.gov (United States)

    Li, Jinna; Kiumarsi, Bahare; Chai, Tianyou; Lewis, Frank L; Fan, Jialu

    2017-12-01

    Industrial flow lines are composed of unit processes operating on a fast time scale and performance measurements known as operational indices measured at a slower time scale. This paper presents a model-free optimal solution to a class of two time-scale industrial processes using off-policy reinforcement learning (RL). First, the lower-layer unit process control loop with a fast sampling period and the upper-layer operational index dynamics at a slow time scale are modeled. Second, a general optimal operational control problem is formulated to optimally prescribe the set-points for the unit industrial process. Then, a zero-sum game off-policy RL algorithm is developed to find the optimal set-points by using data measured in real-time. Finally, a simulation experiment is employed for an industrial flotation process to show the effectiveness of the proposed method.

  16. Combining On-Line Characterization Tools with Modern Software Environments for Optimal Operation of Polymerization Processes

    Directory of Open Access Journals (Sweden)

    Navid Ghadipasha

    2016-02-01

    Full Text Available This paper discusses the initial steps towards the formulation and implementation of a generic and flexible model centric framework for integrated simulation, estimation, optimization and feedback control of polymerization processes. For the first time it combines the powerful capabilities of the automatic continuous on-line monitoring of polymerization system (ACOMP) with a modern simulation, estimation and optimization software environment towards an integrated scheme for the optimal operation of polymeric processes. An initial validation of the framework was performed for modelling and optimization using literature data, illustrating the flexibility of the method when applied to different systems and conditions. Subsequently, off-line capabilities of the system were fully tested experimentally for model validation, parameter estimation and process optimization using ACOMP data. Experimental results are provided for free radical solution polymerization of methyl methacrylate.

  17. Process Optimization for Spray Coating of Poly (vinyl pyrrolidone)

    DEFF Research Database (Denmark)

    Bose, Sanjukta; Keller, Stephan Sylvest; Boisen, Anja

    of drying of the spray on the substrate. The depositions can be broadly classified into a dry state, a wet state and an optimized condition in between. The profilometer scan in fig. 3 and the microscope images in fig. 4 show the surface for nozzle-to-substrate distances of (a) 100 mm, (b) 70 mm and (c) 90 mm, respectively. The further the nozzle is from the substrate, the faster the deposited polymer film dries. Spraying at a distance of 100 mm gives rise to the dry state (fig. 3a) with an average roughness (Ra) of 158 nm. Decreasing the distance between nozzle and substrate to 70 mm produces the wet state, while 90 mm gives a compromise between the dry and the wet state where Ra is 76 nm but there are no edge peaks as shown before. With an increase in temperature (fig. 5a, b and c) the deposition moves from the wet to the dry state, where roughness increases due to rapid drying of the sprayed drops. The same dry state is observed...

  18. Optimization of multi-objective integrated process planning and scheduling problem using a priority based optimization algorithm

    Science.gov (United States)

    Ausaf, Muhammad Farhan; Gao, Liang; Li, Xinyu

    2015-12-01

    For increasing the overall performance of modern manufacturing systems, effective integration of process planning and scheduling functions has been an important area of consideration among researchers. Owing to the complexity of handling process planning and scheduling simultaneously, most of the research work has been limited to solving the integrated process planning and scheduling (IPPS) problem for a single objective function. As there are many conflicting objectives when dealing with process planning and scheduling, real world problems cannot be fully captured considering only a single objective for optimization. Therefore considering the multi-objective IPPS (MOIPPS) problem is inevitable. Unfortunately, only a handful of research papers are available on solving the MOIPPS problem. In this paper, an optimization algorithm for solving the MOIPPS problem is presented. The proposed algorithm uses a set of dispatching rules coupled with priority assignment to optimize the IPPS problem for various objectives like makespan, total machine load, total tardiness, etc. A fixed-size external archive coupled with a crowding distance mechanism is used to store and maintain the non-dominated solutions. To compare the results with other algorithms, a C-metric based method has been used. Instances from four recent papers have been solved to demonstrate the effectiveness of the proposed algorithm. The experimental results show that the proposed method is an efficient approach for solving the MOIPPS problem.
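
    The fixed-size external archive mentioned above is typically pruned with a crowding-distance measure so that the retained non-dominated schedules stay spread along the front; a small sketch of that computation follows, with invented (makespan, total machine load) values rather than schedules from the paper.

      def crowding_distance(front):
          """front: list of objective tuples (minimization); returns one distance per solution."""
          n, m = len(front), len(front[0])
          dist = [0.0] * n
          for k in range(m):
              order = sorted(range(n), key=lambda i: front[i][k])
              lo, hi = front[order[0]][k], front[order[-1]][k]
              dist[order[0]] = dist[order[-1]] = float("inf")   # always keep the extreme points
              span = (hi - lo) or 1.0
              for a, b, c in zip(order, order[1:], order[2:]):
                  dist[b] += (front[c][k] - front[a][k]) / span  # gap between the two neighbours
          return dist

      # e.g. (makespan, total machine load) of archived non-dominated schedules
      archive = [(120, 410), (135, 395), (150, 380), (128, 402)]
      print(crowding_distance(archive))

    When the archive overflows, the solutions with the smallest crowding distance are the ones usually discarded first.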

  19. Developing a Quality Improvement Process to Optimize Faculty Success

    Science.gov (United States)

    Merillat, Linda; Scheibmeir, Monica

    2016-01-01

    As part of a major shift to embed quality improvement processes within a School of Nursing at a medium-sized Midwestern university, a faculty enrichment program using a Plan-Do-Study-Act design was implemented. A central focus for the program was the development and maintenance of an online faculty resource center identified as "My Faculty…

  20. Structural optimization for materially informed design to robotic production processes

    NARCIS (Netherlands)

    Bier, H.H.; Mostafavi, S.

    2015-01-01

    Hyperbody’s materially informed Design-to-Robotic-Production (D2RP) processes for additive and subtractive manufacturing aim to achieve performative porosity in architecture at various scales. An extended series of D2RP experiments aiming to produce prototypes at 1:1 scale wherein design materiality

  1. Biodiesel production from Jatropha curcas: Integrated process optimization

    International Nuclear Information System (INIS)

    Huerga, Ignacio R.; Zanuttini, María Soledad; Gross, Martín S.; Querini, Carlos A.

    2014-01-01

    Highlights: • The oil obtained from Jatropha curcas fruits has high variability in its properties. • A process for biodiesel production has been developed for small scale projects. • Oil neutralization with the glycerine phase has important advantages. • The glycerine phase and the meal are adequate to produce biogas. - Abstract: The share of energy obtained from renewable sources in the energy matrix has increased worldwide, and this tendency is expected to continue. At both large and small scales, there have been numerous developments and research efforts aimed at generating fuels and energy from different raw materials such as alternative crops, algae and lignocellulosic residues. In this work, a Jatropha curcas plantation in the north-west of Argentina was studied, with the objective of developing integrated processes for low and medium size farms. In these cases, glycerine purification and meal detoxification processes represent a very high cost and usually are not included in the project. Consequently, alternative uses for these products are proposed. This study includes the evaluation of the Jatropha curcas crop during two years, evaluating the yields and oil properties. The solids left after the oil extraction were evaluated as solid fuels, the glycerine and the meal were used to generate biogas, and the oil was used to produce biodiesel. The oil pretreatment was carried out with the glycerine obtained in the biodiesel production process, thus neutralizing the free fatty acids and decreasing the phosphorus and water content

  2. Optimizing an Immersion ESL Curriculum Using Analytic Hierarchy Process

    Science.gov (United States)

    Tang, Hui-Wen Vivian

    2011-01-01

    The main purpose of this study is to fill a substantial knowledge gap regarding reaching a uniform group decision in English curriculum design and planning. A comprehensive content-based course criterion model extracted from existing literature and expert opinions was developed. Analytical hierarchy process (AHP) was used to identify the relative…
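
    For readers unfamiliar with the mechanics behind AHP, criterion weights are commonly derived from a pairwise comparison matrix via the principal eigenvector or its geometric-mean approximation, followed by a consistency check; the matrix below is a made-up three-criterion example, not the study's data.

      import numpy as np

      # Illustrative pairwise comparison matrix for three curriculum criteria
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      weights = np.prod(A, axis=1) ** (1.0 / A.shape[0])      # geometric mean of each row
      weights /= weights.sum()                                # normalized priority vector

      lam_max = (A @ weights / weights).mean()                # approximate principal eigenvalue
      n = A.shape[0]
      CI = (lam_max - n) / (n - 1)                            # consistency index
      CR = CI / 0.58                                          # Saaty's random index for n = 3
      print(weights, round(CR, 3))

    A consistency ratio below about 0.1 is conventionally taken to mean the pairwise judgements are acceptably consistent.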

  3. Optimizing access to conditions data in ATLAS event data processing

    CERN Document Server

    Rinaldi, Lorenzo; The ATLAS collaboration

    2018-01-01

    The processing of ATLAS event data requires access to conditions data, which is stored in database systems. This data includes, for example, alignment, calibration, and configuration information that may be characterized by large volumes, diverse content, and/or information which evolves over time as refinements are made in those conditions. Additional layers of complexity are added by the need to provide this information across the world-wide ATLAS computing grid and the sheer number of simultaneously executing processes on the grid, each demanding a unique set of conditions to proceed. Distributing this data to all the processes that require it in an efficient manner has proven to be an increasing challenge with the growing needs and number of event-wise tasks. In this presentation, we briefly describe the systems in which we have collected information about the use of conditions in event data processing. We then proceed to explain how this information has been used to refine not only reconstruction software ...

  4. Optimal design of an extrusion process for a hinge bracket

    International Nuclear Information System (INIS)

    Na, Geum Ju; Jang, Myung Geun; Kim, Jong Bong

    2016-01-01

    This study considers process design in forming a hinge bracket. A thin hinge bracket is typically produced by bending a sheet panel or welding a hollow bar into a sheet panel. However, the hinge bracket made by bending or welding does not have sufficient durability in severe operating conditions because of the stress concentration in the bent region or the low corrosion resistance of the welded region. Therefore, this study uses forming to produce the hinge bracket part of a foldable container and to ensure durability in difficult operating conditions. An extrusion process for a T-shaped hinge bracket is studied using finite element analysis. Preliminary analysis shows that a very high forging load is required to form the bracket by forging. Therefore, extrusion is considered as a candidate process. Producing the part through the extrusion process enables many brackets to be made in a single extrusion and through successive cutting of the extruded part, thereby reducing the manufacturing cost. The design focuses on reducing the extrusion load and on ensuring shape accuracy. An initial billet is designed to reduce the extrusion load and to obtain a geometrically accurate part. The extruded part is bent frequently because of uneven material flow. Thus, extrusion die geometries are designed to obtain straight parts.

  5. Optimal design of an extrusion process for a hinge bracket

    Energy Technology Data Exchange (ETDEWEB)

    Na, Geum Ju; Jang, Myung Geun; Kim, Jong Bong [Seoul National University, Seoul (Korea, Republic of)

    2016-05-15

    This study considers process design in forming a hinge bracket. A thin hinge bracket is typically produced by bending a sheet panel or welding a hollow bar into a sheet panel. However, the hinge bracket made by bending or welding does not have sufficient durability in severe operating conditions because of the stress concentration in the bent region or the low corrosion resistance of the welded region. Therefore, this study uses forming to produce the hinge bracket part of a foldable container and to ensure durability in difficult operating conditions. An extrusion process for a T-shaped hinge bracket is studied using finite element analysis. Preliminary analysis shows that a very high forging load is required to form the bracket by forging. Therefore, extrusion is considered as a candidate process. Producing the part through the extrusion process enables many brackets to be made in a single extrusion and through successive cutting of the extruded part, thereby reducing the manufacturing cost. The design focuses on reducing the extrusion load and on ensuring shape accuracy. An initial billet is designed to reduce the extrusion load and to obtain a geometrically accurate part. The extruded part is bent frequently because of uneven material flow. Thus, extrusion die geometries are designed to obtain straight parts.

  6. Process optimization and mechanistic studies of lead (II): Aspergillus ...

    African Journals Online (AJOL)

    The lead (II) accumulation potential of various biosorbents has been widely studied in the last few years, but an outstanding Pb(II)-accumulating biomass still seems crucial for bringing the process to a successful application stage. This investigation describes the use of non-living biomass of Aspergillus caespitosus for ...

  7. Liquid radwaste process optimization at Catawba Nuclear Station

    International Nuclear Information System (INIS)

    Cauthen, B.E.; Taylor, J.C.

    1990-01-01

    Since commercial operation began in 1985, Catawba Nuclear Station has experienced significant filtration problems with the radioactive liquid waste system. The performance of the filtration and ion exchange equipment has been significantly worse than at other Duke Power stations. Full scale tests have been performed to investigate the causes of, and potential solutions to, the waste processing difficulties. The initial waste stream characterization study revealed a large percentage of sub-micron particles. This information immediately suggested that the existing filtration equipment was not adequately sized to effectively process the waste stream. New technologies which would effectively enhance the performance of the processing system and reduce both operating and maintenance costs were researched. These included bag filters, depth filtration, custom designed ion exchange vessels and radionuclide specific ion exchange media. The subsequent full scale testing yielded a processing scheme which provided extended filter life, an over 100 percent increase in ion exchange bed life, a 90 percent reduction in filter media costs and improved radionuclide removal. 4 refs., 4 figs., 1 tab

  8. Microfluidic chip designs process optimization and dimensional quality control

    DEFF Research Database (Denmark)

    Calaon, Matteo; Hansen, Hans Nørgaard; Tosello, Guido

    2015-01-01

    . Subsequent nickel electroplating was employed to replicate the obtained geometries on the tool, which was used to mold on transparent polymer substrates the functional structures. To assess the critical factors affecting the replication quality throughout the different steps of the proposed process chain...

  9. Optimization of business processes in banks through flexible workflow

    Science.gov (United States)

    Postolache, V.

    2017-08-01

    This article describes an integrated business model of a commercial bank. Examples of the components that go into its composition include wooden models and business processes, strategic goals, organizational structure, system architecture, operational and marketing risk models, etc. Practice has shown that the development and implementation of an integrated business model significantly increases a bank's operating efficiency and the quality of its management, and ensures stable organizational and technological development. Considering the evolution of business processes in the banking sector, their common characteristics should be analysed. From the author's point of view, a business process is a set of various activities of a commercial bank whose "input" is one or more financial and material resources and whose "output" is a banking product of some value to the consumer. With workflow technology, managing business process efficiency becomes a matter of managing the integration of resources and the sequence of actions aimed at achieving this goal. In turn, this implies managing the interaction of jobs or functions, synchronizing assignment periods, reducing delays in the transmission of results, etc. Workflow technology is very important for managers at all levels, as they can use it to easily strengthen control over what is happening in a particular unit and in the bank as a whole. The manager is able to plan, implement rules, interact within the framework of the company's procedures and the tasks entrusted to the system for function distribution and execution control, and receive alerts on execution together with statistical data on the effectiveness of operating procedures. Development and active use of an integrated bank business model is one of the key success factors that contribute to the long-term and stable development of the bank, increase employee efficiency and improve business processes, implement the

  10. Optimization of desalting process with centrifugation for condensation process of uranium from sea water

    International Nuclear Information System (INIS)

    Yamamoto, Tatsuya; Takase, Hisao; Fukuoka, Fumio

    1984-01-01

    Optimization of the desalting of the slurry in the condensation process by the deposited slurry method for the recovery of uranium from sea water was studied. We have already reported that a uranium-rich deposit containing seven ppm uranium can be formed on the sea bottom by the deposited slurry method. Uranium can be transferred from the titanic acid in the slurry to an anion exchange resin. In this case, however, Cl- ions obstruct the adsorption of uranium on the anion exchange resin, so the slurry must be desalted before the RIP method. It is considered that the cost of the slurry desalting stage would be a large portion of the capital cost of recovering uranium from sea water. The cost of the water required is comparable to the cost of energy, so the objective function consists of the cost of energy and the quantity of water. The consumption of energy and water required for desalting the slurry with multi-stage centrifugation was optimized based on dynamic programming. (author)
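
    The dynamic programming idea can be illustrated with a toy stage-wise model: choose a wash-water volume at each centrifugation stage so that the residual chloride reaches a target at minimum combined water-plus-energy cost. The separation model, cost figures and target below are purely illustrative assumptions, not the authors' plant data.

      WASH_OPTIONS = (0.5, 1.0, 2.0)     # wash-water volume per unit slurry per stage (illustrative)
      STAGE_ENERGY = 1.0                 # energy cost of running one centrifugation stage (illustrative)
      WATER_COST = 0.3                   # cost per unit of wash water (illustrative)
      TARGET = 0.01                      # required residual Cl- fraction before ion exchange
      STAGES = 4

      def residual(cl, wash):
          # toy separation model: more wash water removes a larger share of chloride
          return cl / (1.0 + 2.0 * wash)

      def best_plan(stage, cl):
          """Return (minimum cost, wash schedule) from this stage onwards."""
          if cl <= TARGET:
              return 0.0, ()
          if stage == STAGES:
              return float("inf"), ()
          choices = []
          for w in WASH_OPTIONS:
              cost, plan = best_plan(stage + 1, residual(cl, w))
              choices.append((STAGE_ENERGY + WATER_COST * w + cost, (w,) + plan))
          return min(choices)

      print(best_plan(0, 1.0))   # optimal cost and per-stage wash volumes for an initial Cl- fraction of 1.0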

  11. Optimal estimation of the intensity function of a spatial point process

    DEFF Research Database (Denmark)

    Guan, Yongtao; Jalilian, Abdollah; Waagepetersen, Rasmus

    easily computable estimating functions. We derive the optimal estimating function in a class of first-order estimating functions. The optimal estimating function depends on the solution of a certain Fredholm integral equation and reduces to the likelihood score in case of a Poisson process. We discuss...
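
    In the notation commonly used for such first-order estimating functions (sketched here for orientation; the weight function f and the exact regularity conditions are as in the paper), the estimating equation balances a sum over the observed points against an intensity-weighted integral over the observation window W, and the Poisson likelihood score is recovered for a particular choice of f:

      e_f(\beta) = \sum_{u \in X \cap W} f(u;\beta) - \int_{W} f(u;\beta)\,\lambda(u;\beta)\,\mathrm{d}u,
      \qquad
      f_{\mathrm{Poisson}}(u;\beta) = \frac{\nabla_{\beta}\,\lambda(u;\beta)}{\lambda(u;\beta)} .

    Unbiasedness of e_f at the true parameter follows from Campbell's formula for any choice of f; the optimal f, in the sense of minimal asymptotic variance, is the one characterized by the Fredholm integral equation mentioned above.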

  12. Optimization process for thin-walled High Performance Concrete sandwich panels

    DEFF Research Database (Denmark)

    Hodicky, Kamil; Hulin, Thomas; Schmidt, Jacob Wittrup

    2013-01-01

    Nearly zero energy buildings are to become a requirement as part of the European energy policy. There are many ways of designing nearly zero energy buildings, but there is a lack of knowledge on optimization processes in the sense of structurally and thermally efficient design with an optimal

  13. Optimization of Nano-Process Deposition Parameters Based on Gravitational Search Algorithm

    Directory of Open Access Journals (Sweden)

    Norlina Mohd Sabri

    2016-06-01

    Full Text Available This research focuses on the radio frequency (RF) magnetron sputtering process, a physical vapor deposition technique which is widely used in thin film production. This process requires an optimized combination of deposition parameters in order to obtain the desirable thin film. The conventional method for optimizing the deposition parameters has been reported to be costly and time consuming due to its trial and error nature. Thus, the gravitational search algorithm (GSA) technique has been proposed to solve this nano-process parameter optimization problem. In this research, the optimized parameter combination was expected to produce the desirable electrical and optical properties of the thin film. The performance of GSA in this research was compared with that of Particle Swarm Optimization (PSO), Genetic Algorithm (GA), Artificial Immune System (AIS) and Ant Colony Optimization (ACO). Based on the overall results, the GSA-optimized parameter combination generated the best electrical and acceptable optical properties of the thin film compared to the others. This computational experiment is expected to overcome the problem of having to conduct repetitive laboratory experiments to obtain the most optimized parameter combination. Based on this initial experiment, the adaptation of GSA to this problem could offer a more efficient and productive way of depositing quality thin films in the fabrication process.
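
    A bare-bones sketch of the gravitational search step is given below for orientation: fitter agents receive larger masses, every agent is accelerated by the gravitational pull of the others under a decaying constant G(t), and positions are updated from the resulting velocities. The objective is a placeholder for the measured film properties, and refinements such as the shrinking Kbest set are omitted.

      import numpy as np

      def gsa_minimize(f, lower, upper, pop=20, iters=100, G0=100.0, alpha=20.0, seed=0):
          rng = np.random.default_rng(seed)
          dim = len(lower)
          x = rng.uniform(lower, upper, (pop, dim))
          v = np.zeros((pop, dim))
          for t in range(iters):
              fit = np.apply_along_axis(f, 1, x)
              best, worst = fit.min(), fit.max()
              m = (fit - worst) / (best - worst) if best != worst else np.ones(pop)
              M = m / m.sum()                                   # normalized masses, heavier = fitter
              G = G0 * np.exp(-alpha * t / iters)               # decaying gravitational constant
              acc = np.zeros((pop, dim))
              for i in range(pop):
                  for j in range(pop):
                      if i != j:
                          diff = x[j] - x[i]
                          dist = np.linalg.norm(diff) + 1e-12
                          acc[i] += rng.random() * G * M[j] * diff / dist   # force on i divided by its mass
              v = rng.random((pop, dim)) * v + acc
              x = np.clip(x + v, lower, upper)
          fit = np.apply_along_axis(f, 1, x)
          return x[fit.argmin()], float(fit.min())

      # Placeholder objective standing in for measured film properties (not the paper's model)
      lower, upper = np.array([-5.0, -5.0]), np.array([5.0, 5.0])
      print(gsa_minimize(lambda z: float(np.sum(z ** 2)), lower, upper))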

  14. An algorithm for gradient-based dynamic optimization of UV flash processes

    DEFF Research Database (Denmark)

    Ritschel, Tobias Kasper Skovborg; Capolei, Andrea; Gaspar, Jozsef

    2017-01-01

    This paper presents a novel single-shooting algorithm for gradient-based solution of optimal control problems with vapor-liquid equilibrium constraints. Such optimal control problems are important in several engineering applications, for instance in control of distillation columns and in certain two-phase flow processes. ... software as well as the performance of different compilers in a Linux operating system. These tests indicate that real-time nonlinear model predictive control of UV flash processes is computationally feasible.

  15. Optimization process planning using hybrid genetic algorithm and intelligent search for job shop machining.

    Science.gov (United States)

    Salehi, Mojtaba; Bahreininejad, Ardeshir

    2011-08-01

    Optimization of process planning is considered as the key technology for computer-aided process planning which is a rather complex and difficult procedure. A good process plan of a part is built up based on two elements: (1) the optimized sequence of the operations of the part; and (2) the optimized selection of the machine, cutting tool and Tool Access Direction (TAD) for each operation. In the present work, the process planning is divided into preliminary planning, and secondary/detailed planning. In the preliminary stage, based on the analysis of order and clustering constraints as a compulsive constraint aggregation in operation sequencing and using an intelligent searching strategy, the feasible sequences are generated. Then, in the detailed planning stage, using the genetic algorithm which prunes the initial feasible sequences, the optimized operation sequence and the optimized selection of the machine, cutting tool and TAD for each operation based on optimization constraints as an additive constraint aggregation are obtained. The main contribution of this work is the optimization of sequence of the operations of the part, and optimization of machine selection, cutting tool and TAD for each operation using the intelligent search and genetic algorithm simultaneously.

  16. Applications of genetic algorithms to optimization problems in the solvent extraction process for spent nuclear fuel

    International Nuclear Information System (INIS)

    Omori, Ryota; Sakakibara, Yasushi; Suzuki, Atsuyuki

    1997-01-01

    Applications of genetic algorithms (GAs) to optimization problems in the solvent extraction process for spent nuclear fuel are described. Genetic algorithms have been considered a promising tool for use in solving optimization problems in complicated and nonlinear systems because they require no derivatives of the objective function. In addition, they have the ability to treat a set of many possible solutions and consider multiple objectives simultaneously, so they can calculate many pareto optimal points on the trade-off curve between the competing objectives in a single iteration, which leads to small computing time. Genetic algorithms were applied to two optimization problems. First, process variables in the partitioning process were optimized using a weighted objective function. It was observed that the average fitness of a generation increased steadily as the generation proceeded and satisfactory solutions were obtained in all cases, which means that GAs are an appropriate method to obtain such an optimization. Secondly, GAs were applied to a multiobjective optimization problem in the co-decontamination process, and the trade-off curve between the loss of uranium and the solvent flow rate was successfully obtained. For both optimization problems, CPU time with the present method was estimated to be several tens of times smaller than with the random search method

  17. Optimized Enhanced Bioremediation Through 4D Geophysical Monitoring and Autonomous Data Collection, Processing and Analysis

    Science.gov (United States)

    2014-09-01


  18. Optimal Stopping Problems Driven by Lévy Processes and Pasting Principles

    NARCIS (Netherlands)

    Surya, B.A.

    2007-01-01

    Solving optimal stopping problems driven by Lévy processes has been a challenging task and has found many applications in modern theory of mathematical finance. For example situations in which optimal stopping typically arise include the problem of finding the arbitrage-free price of the American

  19. NDDP multi-stage flash desalination process simulator design process optimization

    International Nuclear Information System (INIS)

    Sashi Kumar, G.N.; Mahendra, A.K.; Sanyal, A.; Gouthaman, G.

    2009-03-01

    The improvement of the NDDP-MSF plant's performance ratio (PR) from the design value of 9.0 to 13.1 was achieved by optimizing the plant's operating parameters within the feasible zone of operation. The plant has 20% excess heat transfer area over the design condition, which allowed a PR of 15.1 to be reached after optimization. Thus we have obtained (1) a 45% increase in output over the design value from the optimization carried out with the design heat transfer area, and (2) a 68% increase in output over the design value from the optimization carried out with the increased heat transfer area. This report discusses the approach, methodology and results of the optimization study carried out. A simulator, MSFSIM, which predicts the performance of a multi-stage flash (MSF) desalination plant, has been coupled with a Genetic Algorithm (GA) optimizer. Exhaustive optimization case studies have been conducted on this plant with the objective of increasing the performance ratio (PR). The steady-state optimization performed was based on obtaining the best stage-wise pressure profile to enhance thermal efficiency, which in turn improves the performance ratio. Apart from this, the recirculating brine flow rate was also optimized. This optimization study enabled us to increase the PR of the NDDP-MSF plant from the design value of 9.0 to an optimized value of 13.1. The actual plant is provided with 20% additional heat transfer area over and above the design heat transfer area; optimization with this additional heat transfer area has taken the PR to 15.1. The desire to maintain equal flashing rates in all stages of the MSF plant (a feature required for long plant life and for avoiding a cascading effect of non-flashing triggered by any stage) has also been achieved, and the deviation in the flashing rates between stages has been reduced. The startup characteristics of the plant (i.e. the variation of stage pressure and of recirculation flow rate with time) have been optimized with a target to minimize the
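
    For reference (definitions vary slightly between sources), the performance ratio of an MSF plant is commonly taken as the mass of distillate produced per unit mass of heating steam supplied to the brine heater,

      \mathrm{PR} = \frac{\dot{m}_{\mathrm{distillate}}}{\dot{m}_{\mathrm{steam}}},

    so raising PR at a fixed steam input raises output in proportion, which is consistent with the figures quoted above: 13.1/9.0 is roughly 1.45 (the 45% increase) and 15.1/9.0 is roughly 1.68 (the 68% increase).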

  20. Optimization of upstream and development of cellulose hydrolysis process for cellulosic bio-ethanol production

    International Nuclear Information System (INIS)

    Bae, Hyeun Jong; Wi, Seung Gon; Lee, Yoon Gyo; Kim, Ho Myung; Kim, Su Bae

    2011-10-01

    The purpose of this project is the optimization of upstream processing and the development of a cellulose hydrolysis process for cellulosic bio-ethanol production. The second-year research scope includes: 1) optimization of pre-treatment conditions for enzymatic hydrolysis of lignocellulosic biomass and 2) demonstration of enzymatic hydrolysis by recombinant enzymes. To optimize the pretreatment, we applied two processes: a wet process (wet milling + popping) and a dry process (popping + dry milling). Of these, the wet process gave the best glucose yield with a 93.1% conversion, while the dry process yielded 69.6% and the unpretreated process yielded <20%. The recombinant cellulolytic enzymes showed very high specific activity, about 80-1000 times on CMC and 13-70 times on filter paper at pH 3.5 and 55 °C

  1. Optimization of the ultrasonic processing in a melt flow

    OpenAIRE

    Tzanakis, I; Lebon, GSB; Eskin, DG; Pericleous, K

    2016-01-01

    Ultrasonic cavitation treatment of melt significantly improves the downstream properties and quality of conventional and advanced metallic materials. However, the transfer of this technology to treating large melt volumes has been hindered by a lack of the fundamental knowledge that would allow ultrasonic processing in the melt flow. In this study, we present the results of experimental validation of an advanced numerical model applied to the acoustic cavitation treatment of liquid aluminum duri...

  2. Boosting the IGCLC process efficiency by optimizing the desulfurization step

    International Nuclear Information System (INIS)

    Hamers, H.P.; Romano, M.C.; Spallina, V.; Chiesa, P.; Gallucci, F.; Sint Annaland, M. van

    2015-01-01

    Highlights: • Pre-CLC hot gas desulfurization and post-CLC desulfurization are assessed. • Process efficiency increases by 0.5–1% points with alternative desulfurization methods. • Alternative desulfurization methods are more beneficial for CFB configurations. - Abstract: In this paper the influence of the desulfurization method on the process efficiency of an integrated gasification chemical-looping combustion (IGCLC) systems is investigated for both packed beds and circulating fluidized bed CLC systems. Both reactor types have been integrated in an IGCLC power plant, in which three desulfurization methods have been compared: conventional cold gas desulfurization with Selexol (CGD), hot gas desulfurization with ZnO (HGD) and flue gas desulfurization after the CLC reactors (post-CLC). For CLC with packed bed reactors, the efficiency gain of the alternative desulfurization methods is about 0.5–0.7% points. This is relatively small, because of the relatively large amount of steam that has to be mixed with the fuel to avoid carbon deposition on the oxygen carrier. The HGD and post-CLC configurations do not contain a saturator and therefore more steam has to be mixed with a negative influence on the process efficiency. Carbon deposition is not an issue for circulating fluidized bed systems and therefore a somewhat higher efficiency gain of 0.8–1.0% point can be reached for this reactor system, assuming that complete fuel conversion can be reached and no sulfur species are formed on the solid, which is however thermodynamically possible for iron and manganese based oxygen carriers. From this study, it can be concluded that the adaptation of the desulfurization method results in higher process efficiencies, especially for the circulating fluidized bed system, while the number of operating units is reduced.

  3. Automatic Optimization of Hardware Accelerators for Image Processing

    OpenAIRE

    Reiche, Oliver; Häublein, Konrad; Reichenbach, Marc; Hannig, Frank; Teich, Jürgen; Fey, Dietmar

    2015-01-01

    In the domain of image processing, often real-time constraints are required. In particular, in safety-critical applications, such as X-ray computed tomography in medical imaging or advanced driver assistance systems in the automotive domain, timing is of utmost importance. A common approach to maintain real-time capabilities of compute-intensive applications is to offload those computations to dedicated accelerator hardware, such as Field Programmable Gate Arrays (FPGAs). Programming such arc...

  4. Deconvoluting the Friction Stir Weld Process for Optimizing Welds

    Science.gov (United States)

    Schneider, Judy; Nunes, Arthur C.

    2008-01-01

    In the friction stir welding process, the rotating surfaces of the pin and shoulder contact the weld metal and force a rotational flow within the weld metal. Heat, generated by the metal deformation as well as frictional slippage with the contact surface, softens the metal and makes it easier to deform. As in any thermo-mechanical processing of metal, the flow conditions are critical to the quality of the weld. For example, extrusion of metal from under the shoulder of an excessively hot weld may relax local pressure and result in wormhole defects. The trace of the weld joint in the wake of the weld may vary geometrically depending upon the flow streamlines around the tool, with some geometries more vulnerable to loss of strength from joint contamination than others. The material flow path around the tool cannot be seen in real time during the weld. By using analytical "tools" based upon the principles of mathematics and physics, a weld model can be created to compute features that can be observed. By comparing the computed observations with actual data, the weld model can be validated or adjusted to get better agreement. Inputs to the model to predict weld structures and properties include: hot working properties of the metal, pin tool geometry, travel rate, rotation and plunge force. Since metals record their prior hot working history, the hot working conditions imparted during FSW can be quantified by interpreting the final microstructure. Variations in texture and grain size result from variations in the strain accommodated at a given strain rate and temperature. Microstructural data from a variety of FSWs has been correlated with prior marker studies to contribute to our understanding of the FSW process. Once this stage is reached, the weld modeling process can save significant development costs by reducing costly trial-and-error approaches to obtaining quality welds.

  5. Mean-Variance Optimization in Markov Decision Processes

    OpenAIRE

    Mannor, Shie; Tsitsiklis, John N.

    2011-01-01

    We consider finite horizon Markov decision processes under performance measures that involve both the mean and the variance of the cumulative reward. We show that either randomized or history-based policies can improve performance. We prove that the complexity of computing a policy that maximizes the mean reward under a variance constraint is NP-hard for some cases, and strongly NP-hard for others. We finally offer pseudo-polynomial exact and approximation algorithms.
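
    One of the formulations covered by such performance measures, and the one referred to in the hardness statement above, is maximization of the expected cumulative reward subject to a variance constraint:

      \max_{\pi}\; \mathbb{E}^{\pi}\!\Big[\textstyle\sum_{t=0}^{T-1} r_t\Big]
      \quad \text{subject to} \quad
      \operatorname{Var}^{\pi}\!\Big(\textstyle\sum_{t=0}^{T-1} r_t\Big) \le c .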

  6. Decontamination of Chlorpyrifos packing using ionizing radiation: processing optimization

    International Nuclear Information System (INIS)

    Mori, Manoel Nunes; Sampa, Maria Helena de Oliveira; Duarte, Celina Lopes

    2007-01-01

    The discharge of empty plastic pesticide packing can be an environmental concern, causing problems to human health, animals and plants if done without inspection and monitoring. Among commercial pesticides, chlorpyrifos, O,O-diethyl O-(3,5,6-trichloro-2-pyridyl) phosphorothioate, has significant importance because of its wide distribution, extensive use and persistence. The most commonly used formulations include the emulsified concentrate, granule, wet powder and dispersible granule. Attack by the hydroxyl radical (.OH) is the most efficient process of chemical oxidation. The degradation of chlorpyrifos induced by gamma radiolysis was studied in three-layer coextruded high-density polyethylene packing, named COEX, irradiated intact and as fragments. The intact packing was irradiated with water, and the fragmented packing was irradiated with water and with a solution of 50% water and 50% acetonitrile. An AECL Gammacell 220 60Co source and a multipurpose gamma irradiator were used in the processing. The chemical analysis of chlorpyrifos and its by-products was made using gas chromatography coupled with mass spectrometry (GC-MS, Shimadzu QP5000). Radiation processing of the packing in pieces showed higher efficiency in removing chlorpyrifos than processing of the whole packing. The presence of water proved fundamental for promoting the formation of free radicals, and acetonitrile facilitates the dissolution of chlorpyrifos and consequently its removal. (author)

  7. Decontamination of Chlorpyrifos packing using ionizing radiation: processing optimization

    Energy Technology Data Exchange (ETDEWEB)

    Mori, Manoel Nunes; Sampa, Maria Helena de Oliveira; Duarte, Celina Lopes [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)]. E-mails: mnmori@ipen.br; mhosampa@ipen.br; clduarte@ipen.br

    2007-07-01

    The discharge of empty plastic pesticide packing can be an environmental concern, causing problems to human health, animals and plants if done without inspection and monitoring. Among commercial pesticides, chlorpyrifos, O,O-diethyl O-(3,5,6-trichloro-2-pyridyl) phosphorothioate, has significant importance because of its wide distribution, extensive use and persistence. The most commonly used formulations include the emulsified concentrate, granule, wet powder and dispersible granule. Attack by the hydroxyl radical (.OH) is the most efficient process of chemical oxidation. The degradation of chlorpyrifos induced by gamma radiolysis was studied in three-layer coextruded high-density polyethylene packing, named COEX, irradiated intact and as fragments. The intact packing was irradiated with water, and the fragmented packing was irradiated with water and with a solution of 50% water and 50% acetonitrile. An AECL Gammacell 220 {sup 60}Co source and a multipurpose gamma irradiator were used in the processing. The chemical analysis of chlorpyrifos and its by-products was made using gas chromatography coupled with mass spectrometry (GC-MS, Shimadzu QP5000). Radiation processing of the packing in pieces showed higher efficiency in removing chlorpyrifos than processing of the whole packing. The presence of water proved fundamental for promoting the formation of free radicals, and acetonitrile facilitates the dissolution of chlorpyrifos and consequently its removal. (author)

  8. Biodiesel production from vegetable oil: Process design, evaluation and optimization

    Directory of Open Access Journals (Sweden)

    Kianimanesh Hamid Reza

    2017-09-01

    Full Text Available To investigate the effect of reactor performance/configuration on the process parameters of biodiesel production (mass and energy consumption, required facilities, etc.), two different production processes (from vegetable oil) were designed and implemented in Aspen HYSYS V7.2. Two reactors in series were considered, with the overall conversion set to 97.7% and 70% in the first and second processes, respectively. Comparative analysis showed that the increase in conversion yield reduced the consumption of oil, methanol, cold energy and hot energy by up to 9.1%, 22%, 67.16% and 60.28%, respectively; furthermore, the number of required facilities (e.g. boiler, heat exchanger, distillation tower) was reduced. To reduce mass and energy consumption further, a mass/heat integration method was employed. Applying the integration method showed that in the first design methanol, cold energy and hot energy consumption were decreased by 49.81%, 17.46% and 36.17%, respectively, while in the second design oil, methanol, cold energy and hot energy consumption were decreased by 9%, 60.57%, 19.62% and 36.58%, respectively.

  9. PREFACE: Selected papers from the Fourth Annual q-bio Conference on Cellular Information Processing Selected papers from the Fourth Annual q-bio Conference on Cellular Information Processing

    Science.gov (United States)

    Nemenman, Ilya; Faeder, James R.; Hlavacek, William S.; Jiang, Yi; Wall, Michael E.; Zilman, Anton

    2011-10-01

    Summary This special issue consists of 11 original papers that elaborate on work presented at the Fourth Annual q-bio Conference on Cellular Information Processing, which was held on the campus of St John's College in Santa Fe, New Mexico, USA, 11-14 August 2010. Now in its fourth year, the q-bio conference has changed considerably over time. It is now well established and a major event in systems biology. The 2010 conference saw attendees from all continents (except Antarctica!) sharing novel results and participating in lively discussions at both the oral and poster sessions. The conference was oversubscribed and grew to 27 contributed talks, 16 poster spotlights and 137 contributed posters. We deliberately decreased the number of invited speakers to 21 to leave more space for contributed presentations, and the attendee feedback confirmed that the choice was a success. Although the q-bio conference has grown and matured, it has remained true to the original goal of being an intimate and dynamic event that brings together modeling, theory and quantitative experimentation for the study of cell regulation and information processing. Funded in part by a grant from NIGMS and by DOE funds through the Los Alamos National Laboratory Directed Research and Development program, the conference has continued to exhibit youth and vigor by attracting (and partially supporting) over 100 undergraduate, graduate and postdoctoral researchers. The associated q-bio summer school, which precedes the conference each year, further emphasizes the development of junior scientists and makes q-bio a singular event in its impact on the future of quantitative biology. In addition to an increased international presence, the conference has notably diversified its demographic representation within the USA, including increased participation from the southeastern corner of the country. One big change in the conference this year is our new publication partner, Physical Biology. Although we are very

  10. Comparative analysis of cogeneration power plants optimization based on stochastic method using superstructure and process simulator

    Energy Technology Data Exchange (ETDEWEB)

    Araujo, Leonardo Rodrigues de [Instituto Federal do Espirito Santo, Vitoria, ES (Brazil)], E-mail: leoaraujo@ifes.edu.br; Donatelli, Joao Luiz Marcon [Universidade Federal do Espirito Santo (UFES), Vitoria, ES (Brazil)], E-mail: joaoluiz@npd.ufes.br; Silva, Edmar Alino da Cruz [Instituto Tecnologico de Aeronautica (ITA/CTA), Sao Jose dos Campos, SP (Brazil); Azevedo, Joao Luiz F. [Instituto de Aeronautica e Espaco (CTA/IAE/ALA), Sao Jose dos Campos, SP (Brazil)

    2010-07-01

    Thermal systems are essential in facilities such as thermoelectric plants, cogeneration plants, refrigeration systems and air conditioning, among others, in which much of the energy consumed by humanity is processed. In a world with finite natural sources of fuels and growing energy demand, issues related to thermal system design, such as cost estimation, design complexity, environmental protection and optimization, are becoming increasingly important; hence the need to understand the mechanisms that degrade energy, to improve the use of energy sources, to reduce environmental impacts and to reduce project, operation and maintenance costs. In recent years, a consistent development of procedures and techniques for the computational design of thermal systems has occurred. In this context, the fundamental objective of this study is a comparative performance analysis of the structural and parametric optimization of a cogeneration system using stochastic methods: a genetic algorithm and simulated annealing. This research work uses a superstructure, modelled in a process simulator, IPSEpro of SimTech, in which the design options appropriate to the case studied are included. Accordingly, the optimal configuration of the cogeneration system is determined as a consequence of the optimization process, restricted to the configuration options included in the superstructure. The optimization routines are written in MS Excel Visual Basic, so that they work fully coupled to the process simulator. At the end of the optimization process, the optimal system configuration, given the characteristics of each specific problem, should be defined. (author)
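
    As a rough illustration of the stochastic search step described above, the sketch below applies simulated annealing to a toy superstructure in which binary flags switch optional components on or off and one continuous parameter is tuned. The cost function, the neighbourhood move and all numerical values are assumptions for illustration and stand in for the simulator-coupled evaluation used by the authors.

    import math
    import random

    # Toy superstructure: binary flags for optional components plus one continuous load parameter.
    # The cost function is an assumed stand-in for the process-simulator evaluation.
    def total_cost(config):
        flags, load = config
        capital = 100.0 * sum(flags) + 20.0
        fuel = 500.0 / (1.0 + 2.0 * sum(flags)) * (1.0 + (load - 0.8) ** 2)
        return capital + fuel

    def neighbour(config):
        flags, load = config
        flags = list(flags)
        if random.random() < 0.5:                      # flip one structural option
            i = random.randrange(len(flags))
            flags[i] = 1 - flags[i]
        else:                                          # perturb the parametric variable
            load = min(1.0, max(0.5, load + random.uniform(-0.05, 0.05)))
        return (tuple(flags), load)

    def simulated_annealing(x0, t0=50.0, alpha=0.95, iters=2000):
        x, fx = x0, total_cost(x0)
        best, fbest = x, fx
        t = t0
        for _ in range(iters):
            y = neighbour(x)
            fy = total_cost(y)
            # accept improvements always, worse moves with Boltzmann probability
            if fy < fx or random.random() < math.exp(-(fy - fx) / t):
                x, fx = y, fy
                if fx < fbest:
                    best, fbest = x, fx
            t *= alpha                                 # geometric cooling schedule
        return best, fbest

    print(simulated_annealing(((1, 0, 0), 0.8)))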

  11. Process Cost Modeling for Multi-Disciplinary Design Optimization

    Science.gov (United States)

    Bao, Han P.; Freeman, William (Technical Monitor)

    2002-01-01

    For early design concepts, the conventional approach to cost is normally some kind of parametric weight-based cost model. There is now ample evidence that this approach can be misleading and inaccurate. By the nature of its development, a parametric cost model requires historical data and is valid only if the new design is analogous to those for which the model was derived. Advanced aerospace vehicles have no historical production data and are nowhere near the vehicles of the past. Using an existing weight-based cost model would only lead to errors and distortions of the true production cost. This report outlines the development of a process-based cost model in which the physical elements of the vehicle are costed according to a first-order dynamics model. This theoretical cost model, first advocated by early work at MIT, has been expanded to cover the basic structures of an advanced aerospace vehicle. Elemental costs based on the geometry of the design can be summed up to provide an overall estimation of the total production cost for a design configuration. This capability to directly link any design configuration to realistic cost estimation is a key requirement for high payoff MDO problems. Another important consideration in this report is the handling of part or product complexity. Here the concept of cost modulus is introduced to take into account variability due to different materials, sizes, shapes, precision of fabrication, and equipment requirements. The most important implication of the development of the proposed process-based cost model is that different design configurations can now be quickly related to their cost estimates in a seamless calculation process easily implemented on any spreadsheet tool. In successive sections, the report addresses the issues of cost modeling as follows. First, an introduction is presented to provide the background for the research work. Next, a quick review of cost estimation techniques is made with the intention to

  12. Real-time economic optimization for a fermentation process using Model Predictive Control

    DEFF Research Database (Denmark)

    Petersen, Lars Norbert; Jørgensen, John Bagterp

    2014-01-01

    Fermentation is a widely used process in the production of many foods, beverages, and pharmaceuticals. The main goal of the control system is to maximize the profit of the fermentation process, and thus this is also the main goal of this paper. We present a simple dynamic model for a fermentation process ... and demonstrate its usefulness in economic optimization. The model is formulated as an index-1 differential algebraic equation (DAE), which guarantees conservation of mass and energy in discrete form. The optimization is based on recent advances within Economic Nonlinear Model Predictive Control (E-NMPC), and also utilizes the index-1 DAE model. The E-NMPC uses the single-shooting method and the adjoint method for computation of the optimization gradients. The process constraints are relaxed to soft constraints on the outputs. Finally we derive the analytical solution to the economic optimization problem...
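
    As a minimal illustration of single-shooting economic optimization (not the paper's index-1 DAE model or adjoint gradients), the sketch below maximizes a toy profit for a fed-batch-style fermentation with Monod kinetics over a piecewise-constant feed profile. All model parameters, prices and bounds are assumed, and the gradients come from finite differences.

    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import minimize

    MU_MAX, KS, YXS = 0.5, 0.2, 0.5            # assumed kinetic and yield parameters
    S_FEED, PRICE_X, COST_S = 10.0, 5.0, 1.0   # feed concentration and assumed prices
    T_END, N_INTERVALS = 10.0, 5               # horizon and number of control intervals

    def rhs(t, y, u):
        X, S, V = y                            # biomass, substrate, volume
        mu = MU_MAX * S / (KS + S)
        dX = mu * X - u / V * X                # growth minus dilution by feeding
        dS = -mu * X / YXS + u / V * (S_FEED - S)
        dV = u
        return [dX, dS, dV]

    def simulate(u_profile):
        y = [0.1, 5.0, 1.0]                    # initial biomass, substrate, volume
        t_edges = np.linspace(0.0, T_END, N_INTERVALS + 1)
        fed = 0.0
        for k in range(N_INTERVALS):           # single shooting over each control interval
            sol = solve_ivp(rhs, (t_edges[k], t_edges[k + 1]), y,
                            args=(u_profile[k],), rtol=1e-8)
            y = sol.y[:, -1]
            fed += u_profile[k] * (t_edges[k + 1] - t_edges[k])
        X, S, V = y
        return PRICE_X * X * V - COST_S * S_FEED * fed   # economic objective (profit)

    res = minimize(lambda u: -simulate(u), x0=np.full(N_INTERVALS, 0.05),
                   bounds=[(0.0, 0.2)] * N_INTERVALS, method="L-BFGS-B")
    print("optimal feed profile:", res.x, "profit:", -res.fun)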

  13. Data processing and optimization system to study prospective interstate power interconnections

    Science.gov (United States)

    Podkovalnikov, Sergei; Trofimov, Ivan; Trofimov, Leonid

    2018-01-01

    The paper presents a data processing and optimization system for studying and making rational decisions on the formation of interstate electric power interconnections, with the aim of increasing the effectiveness of their operation and expansion. The technologies for building and integrating the data processing and optimization system, including an object-oriented database and ORIRES, a predictive mathematical model for optimizing the expansion of electric power systems, are described. The technology for collecting and pre-processing unstructured data from various sources and loading it into the object-oriented database, as well as for processing and presenting information in the GIS system, is described. One approach to the graphical visualization of the optimization model results is considered using the example of calculating an expansion option for the South Korean electric power grid.

  14. Cellular gravity

    NARCIS (Netherlands)

    F.C. Gruau; J.T. Tromp (John)

    1999-01-01

    textabstractWe consider the problem of establishing gravity in cellular automata. In particular, when cellular automata states can be partitioned into empty, particle, and wall types, with the latter enclosing rectangular areas, we desire rules that will make the particles fall down and pile up on
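
    A minimal sketch of the idea, assuming a toy rule set rather than the authors' construction: particles fall one cell per update into empty cells and pile up on walls or on other particles.

    import numpy as np

    EMPTY, PARTICLE, WALL = 0, 1, 2

    def step(grid):
        new = grid.copy()
        rows, cols = grid.shape
        for r in range(rows - 2, -1, -1):          # scan bottom-up so each particle moves at most once
            for c in range(cols):
                if new[r, c] == PARTICLE and new[r + 1, c] == EMPTY:
                    new[r + 1, c] = PARTICLE       # the particle falls one cell
                    new[r, c] = EMPTY
        return new

    grid = np.array([[0, 1, 0],
                     [0, 0, 0],
                     [1, 0, 0],
                     [2, 2, 2]])                   # bottom row of wall cells
    for _ in range(3):
        grid = step(grid)
    print(grid)                                    # particles have piled up on the wall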

  15. Processing and optimization of functional ceramic coatings and inorganic nanomaterials

    Science.gov (United States)

    Nyutu, Edward Kennedy G.

    Processing of functional inorganic materials including zero-dimensional (0-D) (e.g. nanoparticles), 1-D (nanorods, nanofibers), and 2-D (films/coatings) structures is of fundamental and technological interest. This research will have two major sections. The first part of section one focuses on the deposition of silicon dioxide onto a pre-deposited molybdenum disilicide coating on molybdenum substrates for both high (>1000 °C) and moderate (500-600 °C) temperature oxidation protection. Chemical vapor deposition (CVD/MOCVD) techniques will be utilized to deposit the metal silicide and oxide coatings. The focus of this study will be to establish optimum deposition conditions and evaluate the metal oxide coating as an oxidation/thermal barrier for Mo substrates under both isothermal (static) and cyclic oxidation conditions. The second part of this section will involve a systematic evaluation of a boron nitride (BN) interface coating prepared by chemical vapor deposition. Ceramic matrix composites (CMCs) are prospective candidates for high (>1000 °C) temperature applications, and fiber-matrix interfaces are the dominant design parameters in CMCs. An important goal of the study is to determine a set of process parameters which would define a boron nitride (BN) interface coating deposited by a chemical vapor deposition (CVD) process with respect to coating. In the first part of the second section, we will investigate a new approach to synthesize ultrafine metal oxides that combines microwave heating and in-situ ultrasonic mixing of two or more liquid precursors with a tubular flow reactor. Different metal oxides such as nickel ferrite and zinc aluminate spinels will be studied. The synthesis of metal oxides was investigated in order to study the effects of the nozzle and microwave (INM process) on the purity, composition, and particle size of the resulting powders. The second part of this research section involves a study of microwave frequency

  16. Economic and process optimization of ethanol production by extractive fermentation

    Energy Technology Data Exchange (ETDEWEB)

    1992-01-01

    This report demonstrates by computer simulation the economic advantages of extractive fermentation on an industrial scale compared to the best alternative technology currently available. The simulations were based on a plant capacity of 100 × 10⁶ L/y of azeotropic ethanol. The simulation results were verified with a fully integrated, computer-controlled extractive fermentation process demonstration unit based around a 7 L fermentor operated with a synthetic glucose medium and using Saccharomyces cerevisiae. The system was also operated with natural substrates (blackstrap molasses and grain hydrolyzate). Preliminary tests with the organism Zymomonas mobilis were also carried out under extractive fermentation conditions.

  17. Optimization of blanking process using neural network simulation

    International Nuclear Information System (INIS)

    Hambli, R.

    2005-01-01

    The present work describes a methodology using the finite element method and neural network simulation in order to predict the optimum punch-die clearance during sheet metal blanking processes. A damage model is used in order to describe crack initiation and propagation into the sheet. The proposed approach combines predictive finite element and neural network modeling of the leading blanking parameters. Numerical results obtained by finite element computation, including damage and fracture modeling, were used to train the developed simulation environment based on back-propagation neural network modeling. The comparative study between the numerical and experimental results shows good agreement. (author)
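
    As a rough sketch of the surrogate-modelling step (with synthetic data in place of the paper's finite element results), the example below trains a back-propagation neural network on assumed blanking parameters and scans the clearance range for the predicted optimum.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    # Hypothetical training set: columns are clearance [% of thickness] and
    # sheet thickness [mm]; the target is a burr-height-like quality indicator.
    X = rng.uniform([2.0, 0.5], [20.0, 3.0], size=(200, 2))
    y = 0.02 * (X[:, 0] - 8.0) ** 2 + 0.1 * X[:, 1] + rng.normal(0, 0.02, 200)

    # Back-propagation-trained multilayer perceptron as the process surrogate.
    model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
    model.fit(X, y)

    clearances = np.linspace(2.0, 20.0, 181)
    grid = np.column_stack([clearances, np.full_like(clearances, 1.5)])  # fixed 1.5 mm sheet
    pred = model.predict(grid)
    print("predicted optimum clearance [%]:", clearances[np.argmin(pred)])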

  18. Process Optimization for Injection Moulding of Passive Microwave Components

    DEFF Research Database (Denmark)

    Scholz, Steffen G.; Mueller, Tobias; Santos Machado, Leonardo

    2016-01-01

    The demand for micro components has increased during the last decade following the overall trend towards miniaturization. Injection moulding is the favoured technique for the mass manufacturing of micro components or larger parts with micro-structured areas due to its ability to cost effectively ... algorithm for modelling, the influence of different moulding parameters on the final part quality was assessed. Firstly a process model and secondly a quality model have been calculated. The results show that part quality can be controlled by monitoring characteristic numbers ...

  19. LESSONS LEARNED THROUGH OPTIMIZATION OF THE VOLUNTARY CORRECTIVE ACTION PROCESS

    International Nuclear Information System (INIS)

    Thacker, M. S.; Freshour, P.; McDonald, W.

    2002-01-01

    Valuable experience in environmental remediation was gained at Sandia National Laboratories/New Mexico (Sandia) by concurrently conducting Voluntary Corrective Actions (VCAs) at three Solid Waste Management Units (SWMUs). Sandia combined the planning, implementation, and reporting phases of three VCAs with the goal of realizing significant savings in both cost and schedule. The lessons learned through this process have been successfully implemented within the Sandia Environmental Restoration (ER) Project and could be utilized at other locations with multiple ER sites. All lessons learned resulted from successful teaming with the New Mexico Environment Department (NMED) Hazardous Waste Bureau (HWB), Sandia management, a Sandia risk assessment team, and Sandia waste management personnel. Specific lessons learned included the following: (1) potential efficiencies can be exploited by reprioritization and rescheduling of activities; (2) cost and schedule reductions can be realized by combining similar work at contiguous sites into a single effort; (3) working with regulators to develop preliminary remediation goals (PRGs) and gain regulatory acceptance for VCA planning prior to project initiation results in significant time savings throughout the remediation and permit modification processes; (4) effective and thoughtful contingency planning removes uncertainties and defrays costs so that projects can be completed without interruption; (5) timely collection of waste characterization samples allows efficient disposal of waste streams, and (6) concurrent reporting of VCA activities results in significant savings in time for the authors and reviewers

  20. Drying of water based foundry coatings: Innovative test, process design and optimization methods

    DEFF Research Database (Denmark)

    Di Muoio, Giovanni Luca; Johansen, Bjørn Budolph

    ... of Denmark with the overall aim to optimize the drying process of water based foundry coatings. Drying of foundry coatings is a relatively new process in the foundry industry that followed the introduction of water as a solvent. In order to avoid moisture related quality problems and reach production capacity goals there is a need to understand how to design, control and optimize drying processes. The main focus of this project was on the critical parameters and properties to be controlled in production in order to achieve a stable and predictable drying process. We propose for each of these parameters ... on real industrial cases. These tools have been developed in order to simulate and optimize the drying process and reduce drying time and power consumption as well as production process design time and cost of expensive drying equipment. Results show that test methods from other industries can be used ...

  1. FPGA based hardware optimized implementation of signal processing system for LFM pulsed radar

    Science.gov (United States)

    Azim, Noor ul; Jun, Wang

    2016-11-01

    Signal processing is one of the main parts of any radar system. Different signal processing algorithms are used to extract information about parameters such as the range, speed and direction of a target in the field of radar communication. This paper presents LFM (Linear Frequency Modulation) pulsed radar signal processing algorithms which are used to improve target detection and range resolution and to estimate the speed of a target. Firstly, these algorithms are simulated in MATLAB to verify the concept and theory. After the conceptual verification in MATLAB, the simulation is converted into a hardware implementation on a Xilinx FPGA. The chosen FPGA is a Xilinx Virtex-6 (XC6LVX75T). For the hardware implementation, pipeline optimization is adopted and other factors are considered for resource optimization. The algorithms addressed in this work for improving target detection, range resolution and speed estimation are hardware-optimized, fast-convolution-based pulse compression and pulse Doppler processing.
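
    For illustration only, the NumPy sketch below shows the fast-convolution pulse compression idea (matched filtering in the frequency domain) on a synthetic LFM echo; the waveform parameters and noise level are assumptions, and this is not the MATLAB or FPGA implementation described above.

    import numpy as np

    fs, T, B = 10e6, 20e-6, 2e6                     # sample rate, pulse width, sweep bandwidth
    t = np.arange(0, T, 1 / fs)
    chirp = np.exp(1j * np.pi * (B / T) * t ** 2)   # baseband LFM reference pulse

    # Received signal: the chirp delayed by 30 us inside a 100 us listening window, plus noise.
    n_total = int(100e-6 * fs)
    rx = np.random.normal(0, 0.5, n_total) + 1j * np.random.normal(0, 0.5, n_total)
    delay = int(30e-6 * fs)
    rx[delay:delay + len(chirp)] += chirp

    # Fast convolution: FFT of the received block times the conjugate FFT of the reference.
    nfft = int(2 ** np.ceil(np.log2(n_total + len(chirp) - 1)))
    compressed = np.fft.ifft(np.fft.fft(rx, nfft) * np.conj(np.fft.fft(chirp, nfft)))
    peak = np.argmax(np.abs(compressed[:n_total]))
    print("estimated target delay: %.1f us" % (peak / fs * 1e6))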

  2. Performance improvements of binary diffractive structures via optimization of the photolithography and dry etch processes

    Science.gov (United States)

    Welch, Kevin; Leonard, Jerry; Jones, Richard D.

    2010-08-01

    Increasingly stringent requirements on the performance of diffractive optical elements (DOEs) used in wafer scanner illumination systems are driving continuous improvements in their associated manufacturing processes. Specifically, these processes are designed to improve the output pattern uniformity of off-axis illumination systems to minimize degradation in the ultimate imaging performance of a lithographic tool. In this paper, we discuss performance improvements in both the photolithographic patterning and the RIE etching of fused silica diffractive optical structures. In summary, optimized photolithographic processes were developed to increase critical dimension uniformity and feature-size linearity across the substrate. The photoresist film thickness was also optimized for integration with an improved etch process. This etch process was itself optimized for pattern transfer fidelity, sidewall profile (wall angle, trench bottom flatness), and across-wafer etch depth uniformity. Improvements observed with these processes on idealized test structures (for ease of analysis) led to their implementation in product flows, with comparable increases in performance and yield on customer designs.

  3. Radioisotope applications for troubleshooting and optimizing industrial processes

    International Nuclear Information System (INIS)

    2002-03-01

    This brochure is intended to present the state-of-the-art in techniques for gamma scanning and neutron backscattering for troubleshooting inspection of columns, vessels, pipes, and tanks in many industrial processing sectors. It aims to provide not only an extensive description of what can be achieved by the application of sealed radioisotope sources but also sound, experience-based guidance on all aspects of designing, carrying out and interpreting the results of industrial applications. Though it is written primarily for radioisotope practitioners, the brochure is also intended to function as an ambassador for the technology by promoting its benefits to governments, to the general public and to industrial end-users

  4. Numerical simulation of onshore separation processes - residence time optimization

    Energy Technology Data Exchange (ETDEWEB)

    Fonte, Clarissa Bergman; Oliveira Junior, Joao Americo Aguirre [Engineering Simulation and Scientific Software (ESSS), Florianopolis, SC (Brazil)], E-mails: clarissa@esss.com.br, joao.aguirre@esss.com.br; Dutra, Eduardo Stein Soares [PETROBRAS E e P Engenharia de Producao, Rio de Janeiro, RJ (Brazil). Gerencia de Engenharia de Instalacoes de Superficie e Automacao], E-mail: eduardodutra@petrobras.com.br

    2011-04-15

    Cylindrical tanks are commonly used in onshore facilities to process and treat oil and water streams. These tanks provide gravitational separation and, once the sedimentation velocity is reached, the residence time inside the tank is crucial to guarantee proper separation. The ideal geometry for a tank maximizes the effective residence time by providing the largest possible fluid path, along which sedimentation of the denser phase occurs. Large-volume tanks can be used for this purpose. However, internal devices, which increase the effective residence time and decrease undesirable hydrodynamic effects, are a commonly used alternative, allowing a reduction in tank size. This study focuses on the application of computational fluid dynamics as a tool to analyze four geometries found in gravitational separation tanks and to identify the one that offers the highest residence time values. (author)

  5. Process optimization mental capacity and memory in schoolchildren

    Directory of Open Access Journals (Sweden)

    Kaminska T.M.

    2016-03-01

    Full Text Available Purpose — to improve mental capacity and enhance antioxidant and detoxification effects in schoolchildren from different regions of residence through the use of succinic acid. Patients and methods. Studies were conducted in 3 groups of 30 children aged 7–10 years who took a succinic acid preparation for 1 month: group 1 — villages of the Irpen region; group 2 — an industrial city; group 3 — the city of Kyiv. Results. Before the course of the preparation containing succinic acid, the number of school days missed due to acute and recurrent respiratory infections during the month of rehabilitation was: in group 1 — 7.4±1.5 days; in group 2 — 8.8±1.9 days; in group 3 — 5.6±0.7 days. After taking the preparation, morbidity decreased significantly and amounted to: in group 1 — 1.4±0.2 days; in group 2 — 1.8±0.2 days; in group 3 — 1.2±0.1 days. The preparation was well tolerated by the children and no side effects were observed. There was a rapid improvement in visual memory and working memory capacity in all groups of children. Under the influence of the preparation, indices of the glutathione system changed significantly: the level of superoxide dismutase decreased, antioxidant activity increased, and a reduced level of glutathione-S-transferase in serum indicates an increased detoxification function of the liver. Conclusions. The pronounced detoxification effect of succinic acid and its ability to activate functional processes and mental capacity make it possible to recommend an annual course of the preparation to improve progress at school, memory and morbidity indicators.

  6. Multidisciplinary design optimization of the diesel particulate filter in the composite regeneration process

    International Nuclear Information System (INIS)

    Zhang, Bin; E, Jiaqiang; Gong, Jinke; Yuan, Wenhua; Zuo, Wei; Li, Yu; Fu, Jun

    2016-01-01

    Highlights:
    • The multidisciplinary design optimization (MDO) of the DPF is presented.
    • The MDO model and multi-objective functions of the DPF are established.
    • The optimal design parameters are obtained and the DPF's performance is improved.
    • The optimized results are verified by experiments.
    • The composite regeneration process of the optimized DPF allows higher energy savings.
    Abstract: In our previous work, the diesel particulate filter (DPF) using a new composite regeneration mode coupling microwave heating and ceria-manganese-based catalysts was verified as an effective way to reduce the particulate matter emissions of diesel engines. In order to improve the overall performance of this DPF, its multidisciplinary design optimization (MDO) model is established based on objective functions such as pressure drop, regeneration performance, microwave energy consumption, and thermal shock resistance. The DPF is then optimized using the MDO method based on an adaptive mutative-scale chaos optimization algorithm. The optimization results show that, with the help of MDO, the DPF's pressure drop is decreased by 14.5%, the regeneration efficiency is increased by 17.3%, the microwave energy consumption is decreased by 17.6%, and the thermal deformation is decreased by 25.3%. The optimization results are also verified by experiments, and the experimental results indicate that the optimized DPF has a larger filtration efficiency, better emission and regeneration performance, a smaller pressure drop, lower wall temperature and temperature gradient, and lower microwave energy consumption.
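
    As a generic stand-in for the multi-objective step (not the paper's adaptive mutative-scale chaos algorithm), the sketch below scalarizes assumed DPF objectives with a weighted sum and minimizes it over two hypothetical design variables with a standard optimizer.

    import numpy as np
    from scipy.optimize import minimize

    def objectives(x):
        wall_thickness, channel_density = x            # hypothetical design variables
        dp = 1.0 + 2.0 * wall_thickness + 0.01 * channel_density        # pressure drop (a.u.)
        regen = 1.0 / (0.5 + wall_thickness)                            # inverse regeneration efficiency (a.u.)
        energy = 0.5 + 0.8 * wall_thickness ** 2                        # microwave energy (a.u.)
        deform = 0.2 + 0.05 * channel_density                           # thermal deformation (a.u.)
        return np.array([dp, regen, energy, deform])

    WEIGHTS = np.array([0.3, 0.3, 0.2, 0.2])           # assumed relative importance
    SCALES = objectives(np.array([0.3, 200.0]))        # normalize by a reference design

    def scalarized(x):
        return float(WEIGHTS @ (objectives(x) / SCALES))

    res = minimize(scalarized, x0=[0.3, 200.0],
                   bounds=[(0.1, 0.6), (100.0, 400.0)], method="L-BFGS-B")
    print("design variables:", res.x, "scalarized objective:", res.fun)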

  7. Fast engineering optimization: A novel highly effective control parameterization approach for industrial dynamic processes.

    Science.gov (United States)

    Liu, Ping; Li, Guodong; Liu, Xinggao

    2015-09-01

    Control vector parameterization (CVP) is an important approach to engineering optimization for industrial dynamic processes. However, its major defect, the low optimization efficiency caused by repeatedly solving the relevant differential equations in the generated nonlinear programming (NLP) problem, limits its wide application. A novel, highly effective control parameterization approach, fast-CVP, is proposed to improve the optimization efficiency for industrial dynamic processes, where the costate gradient formula is employed and a fast approximate scheme is presented to solve the differential equations in the dynamic process simulation. Three well-known engineering optimization benchmark problems for industrial dynamic processes are used as illustrations. The results show that the proposed fast approach performs well: at least 90% of the computation time can be saved compared with the traditional CVP method, which reveals the effectiveness of the proposed fast engineering optimization approach for industrial dynamic processes. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
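
    A bare-bones CVP sketch on a classic benchmark-style problem is shown below; the control is parameterized as piecewise constant and the resulting NLP is solved with finite-difference gradients rather than the costate formula of the paper, so it illustrates only the baseline CVP structure. The model and all numbers are assumptions.

    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import minimize

    T_END, N_SEG = 2.0, 8
    edges = np.linspace(0.0, T_END, N_SEG + 1)

    def augmented_rhs(t, y, u):
        x, cost = y
        return [-x + u, x ** 2 + u ** 2]            # state dynamics and running cost

    def objective(u_params):
        y = [1.0, 0.0]                               # x(0) = 1, accumulated cost = 0
        for k in range(N_SEG):                       # one ODE solve per control segment
            sol = solve_ivp(augmented_rhs, (edges[k], edges[k + 1]), y,
                            args=(u_params[k],), rtol=1e-8)
            y = sol.y[:, -1]
        return y[1]

    res = minimize(objective, x0=np.zeros(N_SEG), method="L-BFGS-B",
                   bounds=[(-2.0, 2.0)] * N_SEG)
    print("piecewise-constant control:", np.round(res.x, 3), "cost:", round(res.fun, 4))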

  8. Genetic Algorithm-Based Optimization for Surface Roughness in Cylindrically Grinding Process Using Helically Grooved Wheels

    Science.gov (United States)

    Çaydaş, Ulaş; Çelik, Mahmut

    The present work is focused on the optimization of process parameters in the cylindrical surface grinding of AISI 1050 steel with grooved wheels. Response surface methodology (RSM) and genetic algorithm (GA) techniques were merged to optimize the input variable parameters of grinding. The revolution speed of the workpiece, the depth of cut and the number of grooves on the wheel were varied to explore their experimental effects on the surface roughness of the machined bars. Mathematical models were established between the input parameters and the response using RSM. The developed RSM model was then used as the objective function in the GA to optimize the process parameters.
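
    As an illustrative sketch of the RSM-plus-GA idea, the example below minimizes an assumed quadratic response surface for surface roughness with a small genetic algorithm; the coefficients, parameter ranges and GA settings are hypothetical, not the fitted model of the paper.

    import numpy as np

    rng = np.random.default_rng(1)
    LOW = np.array([50.0, 0.01, 0.0])      # workpiece speed [rpm], depth of cut [mm], grooves [-]
    HIGH = np.array([250.0, 0.05, 6.0])

    def ra(x):                              # assumed second-order response surface for Ra
        s, d, g = x
        return (1.2 - 0.004 * s + 25.0 * d - 0.05 * g
                + 1e-5 * s ** 2 + 300.0 * d ** 2 + 0.01 * g ** 2)

    def ga(pop_size=40, gens=60, mut=0.1):
        pop = rng.uniform(LOW, HIGH, size=(pop_size, 3))
        for _ in range(gens):
            fitness = np.array([ra(ind) for ind in pop])
            order = np.argsort(fitness)
            parents = pop[order[: pop_size // 2]]               # truncation selection
            children = []
            while len(children) < pop_size - len(parents):
                a, b = parents[rng.integers(len(parents), size=2)]
                w = rng.random(3)
                child = w * a + (1 - w) * b                      # blend crossover
                child += rng.normal(0, mut, 3) * (HIGH - LOW)    # Gaussian mutation
                children.append(np.clip(child, LOW, HIGH))
            pop = np.vstack([parents, children])
        best = pop[np.argmin([ra(ind) for ind in pop])]
        return best, ra(best)

    print(ga())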

  9. Optimal Reinsurance-Investment Problem for an Insurer and a Reinsurer with Jump-Diffusion Process

    Directory of Open Access Journals (Sweden)

    Hanlei Hu

    2018-01-01

    Full Text Available The optimal reinsurance-investment strategies considering the interests of both the insurer and the reinsurer are investigated. The surplus process is assumed to follow a jump-diffusion process and the insurer is permitted to purchase proportional reinsurance from the reinsurer. Applying the dynamic programming approach and dual theory, the corresponding Hamilton-Jacobi-Bellman equations are derived and the optimal strategies for the exponential utility function are obtained. In addition, several sensitivity analyses and numerical illustrations in the case of exponential claim distributions are presented to analyze the effects of the parameters on the optimal strategies.
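
    For context, one common textbook form of a diffusion-perturbed surplus process under proportional reinsurance is reproduced below; it is an illustrative assumption and need not coincide with the exact model analysed in the paper.

    % Generic jump-diffusion surplus under proportional reinsurance (illustrative form)
    \[
      dX_t = \Big[(1+\theta)\lambda\mu - (1+\eta)(1-q_t)\lambda\mu\Big]\,dt
             + \sigma\, dW_t - q_t \, d\!\left(\sum_{i=1}^{N_t} Z_i\right)
    \]
    % Here N_t is a Poisson claim-arrival process with rate \lambda, the Z_i are i.i.d.
    % claim sizes with mean \mu, W_t is a Brownian motion, q_t \in [0,1] is the retained
    % fraction of each claim, and \theta, \eta are the insurer's and reinsurer's safety loadings.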

  10. First-order Convex Optimization Methods for Signal and Image Processing

    DEFF Research Database (Denmark)

    Jensen, Tobias Lindstrøm

    2012-01-01

    In this thesis we investigate the use of first-order convex optimization methods applied to problems in signal and image processing. First we give a general introduction to convex optimization, first-order methods and their iteration complexity. Then we look at different techniques which can be used with first-order methods, such as smoothing, Lagrange multipliers and proximal gradient methods. We continue by presenting different applications of convex optimization and notable convex formulations, with an emphasis on inverse problems and sparse signal processing. We also describe the multiple...
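
    As a concrete instance of the proximal gradient methods mentioned above, the sketch below runs ISTA on an l1-regularized least-squares problem; the problem data are random and purely illustrative.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.normal(size=(80, 200))
    x_true = np.zeros(200)
    x_true[rng.choice(200, 10, replace=False)] = rng.normal(size=10)   # sparse ground truth
    b = A @ x_true + 0.01 * rng.normal(size=80)

    lam = 0.1
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient of 0.5*||Ax-b||^2

    def soft_threshold(v, t):                  # proximal operator of t*||.||_1
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    x = np.zeros(200)
    for _ in range(500):
        grad = A.T @ (A @ x - b)               # gradient of the smooth part
        x = soft_threshold(x - grad / L, lam / L)   # proximal gradient (ISTA) step

    print("nonzeros recovered:", np.count_nonzero(np.abs(x) > 1e-3))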

  11. Optimization of CO2 Laser Cutting Process using Taguchi and Dual Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    M. Madić

    2014-09-01

    Full Text Available Selection of optimal cutting parameter settings for obtaining high cut quality in the CO2 laser cutting process is of great importance. Among various analytical and experimental optimization methods, the application of Taguchi and response surface methodology is one of the most commonly used for laser cutting process optimization. Although the concept of dual response surface methodology for process optimization has been used with success, to date no experimental study has been reported in the field of laser cutting. In this paper an approach for optimization of the CO2 laser cutting process using Taguchi and dual response surface methodology is presented. The goal was to determine near-optimal laser cutting parameter values in order to ensure robust conditions for minimization of average surface roughness. To obtain the experimental database for development of the response surface models, Taguchi's L25 orthogonal array was implemented as the experimental plan. Three cutting parameters, the cutting speed (3, 4, 5, 6, 7 m/min), the laser power (0.7, 0.9, 1.1, 1.3, 1.5 kW), and the assist gas pressure (3, 4, 5, 6, 7 bar), were used in the experiment. To obtain near-optimal cutting parameter settings, a multi-stage Monte Carlo simulation procedure was performed on the developed response surface models.
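
    As a rough sketch of the Monte Carlo search over a response surface, the example below samples the three cutting parameters uniformly over their experimental ranges and picks the settings with the lowest predicted roughness; the response-surface coefficients are assumptions, not the fitted models of the study.

    import numpy as np

    rng = np.random.default_rng(42)

    def ra(v, p, g):                         # assumed second-order response surface [um]
        return (4.0 - 0.5 * v + 0.8 * p - 0.2 * g
                + 0.05 * v ** 2 + 0.3 * p ** 2 + 0.02 * g ** 2 - 0.1 * v * p)

    N = 100_000
    v = rng.uniform(3.0, 7.0, N)             # cutting speed [m/min]
    p = rng.uniform(0.7, 1.5, N)             # laser power [kW]
    g = rng.uniform(3.0, 7.0, N)             # assist gas pressure [bar]

    roughness = ra(v, p, g)
    best = np.argmin(roughness)
    print("near-optimal settings: v=%.2f m/min, p=%.2f kW, g=%.2f bar, Ra=%.3f um"
          % (v[best], p[best], g[best], roughness[best]))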

  12. Optimization of the production process using virtual model of a workspace

    Science.gov (United States)

    Monica, Z.

    2015-11-01

    Optimization of the production process is an element of the design cycle consisting of problem definition, modelling, simulation, optimization and implementation. Without the use of simulation techniques, the only thing that can be achieved is a greater or smaller improvement of the process, not its optimization (i.e., the best result obtainable for the conditions under which the process operates). Optimization generally consists of management actions that ultimately bring savings in time, resources and raw materials and improve the performance of a specific process; it does not matter whether it is a service or a manufacturing process. Optimization generates savings by improving and increasing the efficiency of processes. It consists primarily of organizational activities that require very little investment or rely solely on changing the organization of work. Modern companies operating in a market economy show a significant increase in interest in modern methods of production and service management. This trend is due to high competitiveness: companies that want to succeed are forced to continually modify the way they manage and to respond flexibly to changing demand. Modern methods of production management not only imply a stable position of the company in its sector, but also influence the improvement of health and safety within the company and contribute to the implementation of more efficient rules for the standardization of work. This is why the paper presents the application of an environment such as Siemens NX to create a virtual model of a production system and to simulate as well as optimize its work. The analyzed system is a robotized workcell consisting of machine tools, industrial robots, conveyors, auxiliary equipment and buffers. The control program realizing the main task in the virtual workcell can be defined in the software. It is possible, using this tool, to optimize both the

  13. Devolatilization studies of oil palm biomass for torrefaction process optimization

    International Nuclear Information System (INIS)

    Daud, D; Rahman, A Abd; Shamsuddin, A H

    2013-01-01

    Torrefaction of palm biomass, namely Empty Fruit Bunch (EFB) and Palm Kernel Shell (PKS), was conducted using a thermogravimetric analyser (TGA). The experiments were conducted at temperatures of 200 °C, 260 °C and 300 °C with a constant residence time of 30 minutes. During the torrefaction process, the samples went through identifiable drying and devolatilization stages, as observed from the TGA mass loss. The percentage of volatile gases released was then derived for each condition with reference to the proximate analysis results for both biomasses. On average, 96.64% and 87.53% of the total moisture was released for EFB and PKS respectively. In all cases the volatiles released were observed to increase as the torrefaction temperature was increased, with significant variation between EFB and PKS. At 300 °C EFB lost almost half of its volatile matter while PKS lost slightly over one third. The results obtained can be used to optimise torrefaction conditions for different types of oil palm biomass.

  14. Optimization of biodiesel production process using recycled vegetable oil

    Science.gov (United States)

    Lugo, Yarely

    Toxic emissions from petroleum diesel and its limited resources have created interest in the development of new energy resources, such as biodiesel. Biodiesel is traditionally produced by a transesterification reaction between vegetable oil and an alcohol in the presence of a catalyst. However, this process is slow and expensive due to the high cost of raw materials. Low-cost feedstocks such as recycled oils and animal fats are available, but they cannot be transesterified with alkaline catalysts due to their high content of free fatty acids, which can lead to undesirable reactions such as saponification. In this study, we reduce the free fatty acid content by using an acid pre-treatment. We compare sulfuric acid, hydrochloric acid and p-toluenesulfonic acid (PTSA) to pre-treat recycled vegetable oil. PTSA removes water after 60 minutes of treatment at room temperature or within 15 minutes at 50°C. The pre-treatment was followed by a transesterification reaction using an alkaline catalyst. To minimize costs and accelerate the reaction, the pre-treatment and transesterification of the recycled vegetable oil were conducted at atmospheric pressure in a microwave oven. The biodiesel was characterized using a GC-MS method.

  15. Optimization of Hydrogen Production in Anaerobic Digestion Processes

    International Nuclear Information System (INIS)

    Cesar-Arturo Aceves-Lara; Eric Latrille; Thierry Conte; Nicolas Bernet; Pierre Buffiere; Jean-Philippe Steyer

    2006-01-01

    Hydrogen production using anaerobic digestion processes is strongly related to operational conditions such as the pH in the reactor, the agitation of the liquid phase and the hydraulic retention time (HRT). In this study, an experimental design was carried out and the main effects and interactions between the three above-mentioned factors were evaluated. Experiments were performed in a continuous bioreactor with an HRT of 6, 10 or 14 h; the pH was regulated to 5.5, 5.75 or 6 and the agitation speed was maintained at 150, 225 or 300 rpm. Molasses was used as the substrate with a feed concentration of 10 g COD·L⁻¹. The maximum hydrogen production rate was 5.4 L·Lreactor⁻¹·d⁻¹. It was obtained for a pH of 5.5, a retention time of 6 h and an agitation speed of 300 rpm. The mathematical analysis of the experimental data revealed that two reactions could explain 89% of the total variance of the experimental data. Finally, the pseudo-stoichiometric coefficients were estimated and the effects of the operational conditions on the hydrogen production rates were calculated. (authors)

  16. Optimization of electrocoagulation process to treat biologically pretreated bagasse effluent

    Directory of Open Access Journals (Sweden)

    Thirugnanasambandham K.

    2014-01-01

    Full Text Available The main objective of the present study was to investigate the efficiency of the electrocoagulation process as a post-treatment for biologically pretreated bagasse effluent using iron electrodes. The removal of chemical oxygen demand (COD) and total suspended solids (TSS) was studied under different operating conditions, such as the amount of dilution, initial pH, applied current and electrolyte dose, using response surface methodology (RSM) coupled with a four-factor, three-level Box-Behnken experimental design (BBD). The experimental results were analyzed by Pareto analysis of variance (ANOVA), and second-order polynomial mathematical models were developed with high correlation coefficients (R²) for COD removal, TSS removal and electrical energy consumption (EEC). The individual and combined effects of the variables on the responses were studied using three-dimensional response surface plots. Under the optimum operating conditions (dilution of 30%, initial pH of 6.5, applied current of 8 mA cm⁻² and electrolyte dose of 740 mg l⁻¹), the highest removal efficiencies of COD (98%) and TSS (93%) were obtained with an EEC of 2.40 Wh, which was confirmed by validation experiments.
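
    As a minimal sketch of the response-surface fitting step (with synthetic data in place of the Box-Behnken measurements), the example below fits a second-order polynomial model of COD removal in two of the factors (initial pH and applied current) by ordinary least squares.

    import numpy as np

    rng = np.random.default_rng(3)
    pH = rng.uniform(4.0, 9.0, 30)
    current = rng.uniform(2.0, 14.0, 30)               # mA/cm^2
    # Synthetic "measured" COD removal with an assumed optimum near pH 6.5 and 8 mA/cm^2.
    cod = 98.0 - 2.0 * (pH - 6.5) ** 2 - 0.3 * (current - 8.0) ** 2 + rng.normal(0, 1.0, 30)

    # Design matrix for a full quadratic model: 1, x1, x2, x1^2, x2^2, x1*x2.
    X = np.column_stack([np.ones_like(pH), pH, current, pH ** 2, current ** 2, pH * current])
    coef, *_ = np.linalg.lstsq(X, cod, rcond=None)

    def predict(p, c):
        return coef @ np.array([1.0, p, c, p ** 2, c ** 2, p * c])

    print("fitted coefficients:", np.round(coef, 3))
    print("predicted removal at pH 6.5, 8 mA/cm^2: %.1f %%" % predict(6.5, 8.0))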

  17. Design strategy for optimal iterative learning control applied on a deep drawing process

    DEFF Research Database (Denmark)

    Endelt, Benny Ørtoft

    2017-01-01

    Metal forming processes in general can be characterised as repetitive processes; this work takes advantage of this characteristic by developing an algorithm or control system which transfers process information from part to part, reducing the impact of repetitive uncertainties, e.g. gradual changes in the material properties. The process is highly non-linear and the system plant is modelled using a non-linear finite element model, and the gain factors for the iterative learning controller are identified by solving a non-linear optimal control problem. The optimal control problem is formulated as a non
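
    A generic part-to-part ILC sketch is given below, using a toy linear plant instead of the non-linear finite element model of the paper; the learning gain and plant coefficients are assumptions, and the error is shifted by one sample to account for the plant's input delay.

    import numpy as np

    N = 50                                        # samples per part (one repetition)
    ref = np.linspace(0.0, 1.0, N)                # desired process trajectory

    def plant(u):
        # Toy first-order discrete plant with a repeating disturbance.
        y = np.zeros(N)
        for i in range(1, N):
            y[i] = 0.8 * y[i - 1] + 0.3 * u[i - 1] - 0.02
        return y

    L_GAIN = 1.5                                  # assumed learning gain
    u = np.zeros(N)
    for part in range(20):                        # each iteration corresponds to one formed part
        e = ref - plant(u)
        u[:-1] = u[:-1] + L_GAIN * e[1:]          # ILC update u_{k+1}[i] = u_k[i] + L * e_k[i+1]
        print("part %2d, RMS tracking error: %.4f" % (part + 1, np.sqrt(np.mean(e ** 2))))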

  18. Fundamental Theories and Key Technologies for Smart and Optimal Manufacturing in the Process Industry

    Directory of Open Access Journals (Sweden)

    Feng Qian

    2017-04-01

    Full Text Available Given the significant requirements for transforming and promoting the process industry, we present the major limitations of current petrochemical enterprises, including limitations in decision-making, production operation, efficiency and security, information integration, and so forth. To promote a vision of the process industry with efficient, green, and smart production, modern information technology should be utilized throughout the entire optimization process for production, management, and marketing. To focus on smart equipment in manufacturing processes, as well as on the adaptive intelligent optimization of the manufacturing process, operating mode, and supply chain management, we put forward several key scientific problems in engineering in a demand-driven and application-oriented manner, namely: ① intelligent sensing and integration of all process information, including production and management information; ② collaborative decision-making in the supply chain, industry chain, and value chain, driven by knowledge; ③ cooperative control and optimization of plant-wide production processes via human-cyber-physical interaction; and ④ life-cycle assessments for safety and environmental footprint monitoring, in addition to tracing analysis and risk control. In order to solve these limitations and core scientific problems, we further present fundamental theories and key technologies for smart and optimal manufacturing in the process industry. Although this paper discusses the process industry in China, the conclusions in this paper can be extended to the process industry around the world.

  19. Optimization of TRPO process parameters for americium extraction from high level waste

    International Nuclear Information System (INIS)

    Chen Jing; Wang Jianchen; Song Chongli

    2001-01-01

    The numerical calculations for Am multistage fractional extraction by trialkyl phosphine oxide (TRPO) were verified by a hot test. 1750 L/t-U high-level waste (HLW) was used as the feed to the TRPO process. The analysis used a simple objective function to minimize the total waste content in the TRPO process streams. Some process parameters were optimized after the other parameters were selected. The optimal process parameters for Am extraction by TRPO are: 10 stages for extraction and 2 stages for scrubbing; a flow rate ratio of 0.931 for extraction and 4.42 for scrubbing; a nitric acid concentration of 1.35 mol/L for the feed and 0.5 mol/L for the scrubbing solution. Finally, the nitric acid and Am concentration profiles in the optimal TRPO extraction process are given.

  20. Modeling, Simulation and Optimization of Hydrogen Production Process from Glycerol using Steam Reforming

    International Nuclear Information System (INIS)

    Park, Jeongpil; Cho, Sunghyun; Kim, Tae-Ok; Shin, Dongil; Lee, Seunghwan; Moon, Dong Ju

    2014-01-01

    For improved sustainability of the biorefinery industry, the biorefinery by-product glycerol is being investigated as an alternative source for hydrogen production. This research designs and optimizes a hydrogen production process for small hydrogen stations using steam reforming of purified glycerol as the main reaction, replacing existing processes that rely on steam methane reforming. Modeling, simulation and optimization using a commercial process simulator are performed for the proposed hydrogen production process from glycerol. A mixture of glycerol and steam is used to produce syngas in the reforming process. Hydrogen is then produced from carbon monoxide and steam through the water-gas shift reaction. Finally, hydrogen is separated from carbon dioxide using PSA. This study shows a higher yield than previous U.S. DOE and Linde studies. Economic evaluations are performed for the optimal planning of domestic hydrogen energy infrastructure construction based on the proposed glycerol-based hydrogen station.