WorldWideScience

Sample records for optimize cellular processes

  1. Cellular scanning strategy for selective laser melting: Generating reliable, optimized scanning paths and processing parameters

    DEFF Research Database (Denmark)

    Mohanty, Sankhya; Hattel, Jesper Henri

    2015-01-01

... gradients that occur during the process. While process monitoring and control of selective laser melting is an active area of research, establishing the reliability and robustness of the process still remains a challenge. In this paper, a methodology for generating reliable, optimized scanning paths ... to generate optimized cellular scanning strategies and processing parameters, with an objective of reducing thermal asymmetries and mechanical deformations. The optimized scanning strategies are used for selective laser melting of the standard samples, and experimental and numerical results are compared...

  2. Cellular scanning strategy for selective laser melting: Generating reliable, optimized scanning paths and processing parameters

    DEFF Research Database (Denmark)

    Mohanty, Sankhya; Hattel, Jesper Henri

    2015-01-01

Selective laser melting is yet to become a standardized industrial manufacturing technique. The process continues to suffer from defects such as distortions, residual stresses, localized deformations and warpage, caused primarily by the localized heating, rapid cooling and high temperature gradients that occur during the process. While process monitoring and control of selective laser melting is an active area of research, establishing the reliability and robustness of the process still remains a challenge. In this paper, a methodology for generating reliable, optimized scanning paths and process parameters for selective laser melting of a standard sample is introduced. The processing of the sample is simulated by sequentially coupling a calibrated 3D pseudo-analytical thermal model with a 3D finite element mechanical model. The optimized processing parameters are subjected to a Monte Carlo...
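The Monte Carlo robustness check mentioned in the abstract can be sketched in miniature: perturb nominal process parameters with random noise and inspect the spread of a response metric. Everything below (the energy-density surrogate, the 2.0 J/mm optimum, the 5% noise level) is an invented stand-in for the paper's thermo-mechanical models:

```python
import random
import statistics

def deformation(power, speed):
    """Hypothetical surrogate response: squared deviation of the linear
    energy density (power/speed) from an assumed optimum of 2.0 J/mm."""
    return (power / speed - 2.0) ** 2

def monte_carlo_robustness(power, speed, rel_noise=0.05, trials=1000, seed=42):
    """Perturb the nominal parameters with Gaussian noise and collect the
    distribution of the response, as a stand-in for the paper's analysis."""
    rng = random.Random(seed)
    samples = []
    for _ in range(trials):
        p = power * (1.0 + rng.gauss(0.0, rel_noise))
        v = speed * (1.0 + rng.gauss(0.0, rel_noise))
        samples.append(deformation(p, v))
    return statistics.mean(samples), statistics.stdev(samples)

mean_dev, std_dev = monte_carlo_robustness(power=200.0, speed=100.0)
print(f"mean deviation {mean_dev:.5f}, spread {std_dev:.5f}")
```

A small spread relative to the mean would indicate that the optimized parameter set is robust to process noise; the real analysis would replace the one-line surrogate with the coupled thermal-mechanical simulation.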

  3. Stochastic Nature in Cellular Processes

    Institute of Scientific and Technical Information of China (English)

    刘波; 刘圣君; 王祺; 晏世伟; 耿轶钊; SAKATA Fumihiko; GAO Xing-Fa

    2011-01-01

    The importance of stochasticity in cellular processes is increasingly recognized in both theoretical and experimental studies. General features of stochasticity in gene regulation and expression are briefly reviewed in this article, which include the main experimental phenomena, classification, quantization and regulation of noises. The correlation and transmission of noise in cascade networks are analyzed further and the stochastic simulation methods that can capture effects of intrinsic and extrinsic noise are described.

  4. Optimal temporal patterns for dynamical cellular signaling

    Science.gov (United States)

    Hasegawa, Yoshihiko

    2016-11-01

    Cells use temporal dynamical patterns to transmit information via signaling pathways. As optimality with respect to the environment plays a fundamental role in biological systems, organisms have evolved optimal ways to transmit information. Here, we use optimal control theory to obtain the dynamical signal patterns for the optimal transmission of information, in terms of efficiency (low energy) and reliability (low uncertainty). Adopting an activation-deactivation decoding network, we reproduce several dynamical patterns found in actual signals, such as steep, gradual, and overshooting dynamics. Notably, when minimizing the energy of the input signal, the optimal signals exhibit overshooting, which is a biphasic pattern with transient and steady phases; this pattern is prevalent in actual dynamical patterns. We also identify conditions in which these three patterns (steep, gradual, and overshooting) confer advantages. Our study shows that cellular signal transduction is governed by the principle of minimizing free energy dissipation and uncertainty; these constraints serve as selective pressures when designing dynamical signaling patterns.

  5. Efficiency of cellular information processing

    CERN Document Server

    Barato, Andre C; Seifert, Udo

    2014-01-01

    We show that a rate of conditional Shannon entropy reduction, characterizing the learning of an internal process about an external process, is bounded by the thermodynamic entropy production. This approach allows for the definition of an informational efficiency that can be used to study cellular information processing. We analyze three models of increasing complexity inspired by the E. coli sensory network, where the external process is an external ligand concentration jumping between two values. We start with a simple model for which ATP must be consumed so that a protein inside the cell can learn about the external concentration. With a second model for a single receptor we show that the rate at which the receptor learns about the external environment can be nonzero even without any dissipation inside the cell since chemical work done by the external process compensates for this learning rate. The third model is more complete, also containing adaptation. For this model we show inter alia that a bacterium i...

  6. Optimal flux patterns in cellular metabolic networks

    Energy Technology Data Exchange (ETDEWEB)

    Almaas, E

    2007-01-20

The availability of whole-cell level metabolic networks of high quality has made it possible to develop a predictive understanding of bacterial metabolism. Using the optimization framework of flux balance analysis, I investigate metabolic response and activity patterns to variations in the availability of nutrient and chemical factors such as oxygen and ammonia by simulating 30,000 random cellular environments. The distribution of reaction fluxes is heavy-tailed for the bacteria H. pylori and E. coli, and the eukaryote S. cerevisiae. While the majority of flux balance investigations have relied on implementations of the simplex method, it is necessary to use interior-point optimization algorithms to adequately characterize the full range of activity patterns on metabolic networks. The interior-point activity pattern is bimodal for E. coli and S. cerevisiae, suggesting that most metabolic reactions are either in frequent use or are rarely active. The trimodal activity pattern of H. pylori indicates that a group of its metabolic reactions (20%) are active in approximately half of the simulated environments. Constructing the high-flux backbone of the network for every environment, there is a clear trend that the more frequently a reaction is active, the more likely it is a part of the backbone. Finally, I briefly discuss the predicted activity patterns of the central-carbon metabolic pathways for the sample of random environments.
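As a toy illustration of flux balance analysis (not the whole-cell models used in the paper), the sketch below maximizes a "biomass" flux over a one-metabolite network subject to the steady-state constraint S·v = 0, by brute-force grid search rather than the simplex or interior-point methods discussed above; the network and flux bounds are invented:

```python
import itertools

# Toy network: nutrient uptake v1, biomass production v2, byproduct secretion v3.
# Steady state for the single internal metabolite M requires v1 - v2 - v3 = 0.
BOUNDS = {"v1": (0.0, 10.0), "v2": (0.0, 8.0), "v3": (0.0, 10.0)}

def fba_brute_force(step=0.1):
    """Grid search over fluxes satisfying the steady-state constraint,
    maximizing the biomass flux v2 (the usual FBA objective)."""
    best = None
    v1_lo, v1_hi = BOUNDS["v1"]
    v2_lo, v2_hi = BOUNDS["v2"]
    n1 = round((v1_hi - v1_lo) / step) + 1
    n2 = round((v2_hi - v2_lo) / step) + 1
    for i, j in itertools.product(range(n1), range(n2)):
        v1 = v1_lo + i * step
        v2 = v2_lo + j * step
        v3 = v1 - v2                      # steady state fixes v3
        if BOUNDS["v3"][0] <= v3 <= BOUNDS["v3"][1]:
            if best is None or v2 > best[1]:
                best = (v1, v2, v3)
    return best

v1, v2, v3 = fba_brute_force()
print(v1, v2, v3)  # the biomass flux v2 hits its upper bound of 8.0
```

A real FBA problem has thousands of fluxes and is solved as a linear program; the grid search only serves to make the constraint-plus-objective structure concrete.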

  8. EXTERIOR PRESSURE OF THE GASEOUS MEDIUM AS AN ADDITIONAL TECHNOLOGICAL FACTOR FOR OPTIMIZING THE VAPORIZATION PROCESS IN THE PRODUCTION OF CELLULAR SILICATE CONCRETE

    Directory of Open Access Journals (Sweden)

    A. A. Rezanov

    2012-11-01

Statement of the problem. The quality of silicate porous concrete is largely determined by vaporization processes at the stage of the formation of the macrostructure of the obtained material. In the production of cellular concrete by injection molding, existing manufacturing technologies do not allow the vaporization process to be controlled operatively. There is therefore a growing need for additional, efficient methods of controlling the vaporization process and thus improving cellular silicate concrete. Results. Based on modelling and a detailed examination of the balance of pressures acting on developing gas pores, the mechanisms and factors governing a defect-free structure are identified. An additional governing factor, the pressure of the external gaseous medium, was discovered. Approaches to controlling the vaporization process have been developed, and a plant has been designed that is fitted with a system for automatic control of vaporization through deliberate, operative pressure applied by the external gaseous phase to the poring mixture. Conclusions. Theoretical validation, together with the results of the experimental study, supports the conclusion that the suggested vaporization-control system is effective: it complements traditional injection molding and makes the process more robust to varying characteristics of the raw materials.

  9. Optimized Cellular Core for Rotorcraft Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Patz Materials and Technologies proposes to develop a unique structural cellular core material to improve mechanical performance, reduce platform weight and lower...

  10. Optimal Band Allocation for Cognitive Cellular Networks

    CERN Document Server

    Liu, Tingting

    2011-01-01

The FCC's new regulation for cognitive use of the TV white-space spectrum provides a new means for improving traditional cellular network performance, but it also introduces a number of technical challenges. This letter studies one of these challenges: given the significant differences in propagation properties and transmit power limitations between the cellular band and the TV white space, how to jointly utilize both bands such that the benefit from the TV white space for improving cellular network performance is maximized. Both analytical and simulation results are provided.

  11. Optimized Cellular Core for Rotorcraft Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Patz Materials and Technologies has developed, produced and tested, as part of the Phase-I SBIR, a new form of composite cellular core material, named Interply Core,...

  12. Optimal signal patterns for dynamical cellular communication

    CERN Document Server

    Hasegawa, Yoshihiko

    2015-01-01

Cells transmit information via signaling pathways, using temporal dynamical patterns. As optimality with respect to the environment is a universal principle in biological systems, organisms have acquired an optimal way of transmitting information. Here we use optimal control theory to obtain dynamical signal patterns that can transmit information efficiently (low power) and reliably (high accuracy). Adopting an activation-inactivation decoding network, we reproduce several dynamical patterns found in actual signals, such as steep, gradual, and overshooting dynamics. Notably, when minimizing the power of the input signal, the optimal signals exhibit the overshooting pattern, a biphasic pattern with transient and steady phases; this pattern is prevalent in actual dynamical patterns, as it can be generated by an incoherent feed-forward loop (FFL), a common motif in biochemical networks. We also identify conditions under which the three patterns (steep, gradual, and overshooting) confer advantages.

  13. Estimating cellular parameters through optimization procedures: elementary principles and applications

    Directory of Open Access Journals (Sweden)

Akatsuki Kimura

    2015-03-01

Construction of quantitative models is a primary goal of quantitative biology, which aims to understand cellular and organismal phenomena in a quantitative manner. In this article, we introduce optimization procedures to search for parameters in a quantitative model that can reproduce experimental data. The aim of optimization is to minimize the sum of squared errors (SSE) in a prediction or to maximize likelihood. A (local) maximum of likelihood or (local) minimum of the SSE can be identified efficiently using gradient approaches. The addition of a stochastic process enables us to identify the global maximum/minimum without becoming trapped in local maxima/minima. Sampling approaches take advantage of increasing computational power to test numerous sets of parameters in order to determine the optimum set. By combining Bayesian inference with gradient or sampling approaches, we can estimate both the optimum parameters and the form of the likelihood function related to the parameters. Finally, we introduce four examples of research that utilize parameter optimization to obtain biological insights from quantified data: transcriptional regulation, bacterial chemotaxis, morphogenesis, and cell cycle regulation. With practical knowledge of parameter optimization, cell and developmental biologists can develop realistic models that reproduce their observations and thus obtain mechanistic insights into the phenomena of interest.
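The SSE-minimization-with-restarts idea from this abstract can be sketched as follows: fit a single decay-rate parameter to synthetic data by finite-difference gradient descent, with random restarts standing in for the stochastic global-search component. The model, data, learning rate, and restart count are all illustrative choices, not taken from the article:

```python
import math
import random

# Synthetic "experimental" data from y = exp(-k t) with k = 0.5 (assumed).
TRUE_K = 0.5
DATA = [(t, math.exp(-TRUE_K * t)) for t in range(10)]

def sse(k):
    """Sum of squared errors between the data and the one-parameter model."""
    return sum((y - math.exp(-k * t)) ** 2 for t, y in DATA)

def gradient_descent(k0, lr=0.01, steps=500, eps=1e-6):
    """Minimize the SSE by finite-difference gradient descent from k0."""
    k = k0
    for _ in range(steps):
        grad = (sse(k + eps) - sse(k - eps)) / (2 * eps)
        k -= lr * grad
    return k

# Random restarts guard against getting stuck in a local minimum.
rng = random.Random(0)
best_k = min((gradient_descent(rng.uniform(0.01, 2.0)) for _ in range(5)),
             key=sse)
print(round(best_k, 3))
```

In practice one would replace the finite-difference gradient with an analytic or automatic derivative, and the restart loop with a proper stochastic or Bayesian sampling scheme as the article describes.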

  14. Optimization of Inter Cellular Movement of Parts in Cellular Manufacturing System Using Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Siva Prasad Darla

    2014-01-01

In the modern manufacturing environment, Cellular Manufacturing Systems (CMS) have gained greater importance in job-shop or batch-type production for achieving economic advantages similar to those of mass production. Successful implementation of CMS depends highly on the determination of part families and machine cells and on minimizing inter-cellular movement. This study considers the machine-component grouping problem, namely inter-cellular movement and cell load variation, by developing a mathematical model and optimizing the solution using a Genetic Algorithm to arrive at a cell formation that minimizes inter-cellular movement and cell load variation. The results are presented with a numerical example.
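A minimal sketch of this kind of Genetic Algorithm, on an invented 6-machine, 2-cell instance: the fitness combines inter-cellular movements with a simple cell-load-imbalance penalty, and the GA operators (elitism, truncation selection, one-point crossover, point mutation) are generic textbook choices, not the authors' exact formulation:

```python
import random

# Hypothetical instance: 6 machines, 2 cells; each part visits a machine route.
ROUTES = [[0, 1, 2], [1, 2, 0], [3, 4, 5], [4, 5, 3], [0, 2, 1], [5, 3, 4]]
N_MACHINES, N_CELLS = 6, 2

def fitness(assign):
    """Inter-cellular movements plus a cell-load-imbalance penalty (lower is better)."""
    moves = sum(assign[a] != assign[b]
                for route in ROUTES for a, b in zip(route, route[1:]))
    load = [assign.count(c) for c in range(N_CELLS)]
    return moves + 2 * (max(load) - min(load))

def ga(pop_size=40, gens=100, pmut=0.2, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randrange(N_CELLS) for _ in range(N_MACHINES)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        nxt = pop[:2]                               # elitism
        while len(nxt) < pop_size:
            p1, p2 = rng.sample(pop[:10], 2)        # truncation selection
            cut = rng.randrange(1, N_MACHINES)      # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < pmut:                 # point mutation
                i = rng.randrange(N_MACHINES)
                child[i] = rng.randrange(N_CELLS)
            nxt.append(child)
        pop = nxt
    return min(pop, key=fitness)

best = ga()
print(best, fitness(best))
```

For this separable instance, the assignment {0,1,2} vs {3,4,5} achieves zero inter-cellular moves with balanced load, so the GA should drive the fitness to (or near) zero.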

  15. Cellular automata in image processing and geometry

    CERN Document Server

    Adamatzky, Andrew; Sun, Xianfang

    2014-01-01

    The book presents findings, views and ideas on what exact problems of image processing, pattern recognition and generation can be efficiently solved by cellular automata architectures. This volume provides a convenient collection in this area, in which publications are otherwise widely scattered throughout the literature. The topics covered include image compression and resizing; skeletonization, erosion and dilation; convex hull computation, edge detection and segmentation; forgery detection and content based retrieval; and pattern generation. The book advances the theory of image processing, pattern recognition and generation as well as the design of efficient algorithms and hardware for parallel image processing and analysis. It is aimed at computer scientists, software programmers, electronic engineers, mathematicians and physicists, and at everyone who studies or develops cellular automaton algorithms and tools for image processing and analysis, or develops novel architectures and implementations of mass...

  16. EFFECTIVENESS OF CELLULAR INJECTION MOLDING PROCESS

    Directory of Open Access Journals (Sweden)

    Tomasz Garbacz

    2013-06-01

In this study of the cellular injection molding process, polyvinyl chloride (PVC) was used. The polymers were modified by introducing blowing agents, in the Laboratory of the Department of Technologies and Materials of the Technical University of Kosice. For technological reasons, the blowing agents have the form of granules. In the experiment, the content of the blowing agent (0–2.0% by mass) fed into the processed polymer was adopted as the variable factor. Chemical blowing agents in granulated form, with diameters of 1.2 to 1.4 mm, were used. The technological line for cellular injection molding and the injection mold cavity with injection moldings are shown in Figure 1. The results of the determination of selected properties of injection-molded parts for various polymeric materials, obtained with different contents of blowing agents, are shown in Figures 4-7. Microscopic examination of the cross-sectional structure of the moldings was carried out using the authors' image-analysis system for porous structures. Based on the analysis of the photographs taken (Figures 7, 8, 9), it was found that moldings containing 1.0% of blowing agent have a clearly visible solid outer layer and a uniform distribution of pores of similar size.

  17. Gateway Deployment optimization in Cellular Wi-Fi Mesh Networks

    Directory of Open Access Journals (Sweden)

    Rajesh Prasad

    2006-07-01

With the standardization of IEEE 802.11, there has been an explosive growth of wireless local area networks (WLAN). Recently, this cost-effective technology has been developed aggressively for establishing metro-scale "cellular Wi-Fi" networks to support seamless Internet access in urban areas. We envision a large-scale WLAN system in the future where Access Points (APs) will be scattered over an entire city, enabling people to use their mobile devices ubiquitously. The problem addressed in this paper involves finding the minimum number of gateways and their optimal placement so as to minimize network installation costs while maintaining reliability, flexibility, and an acceptable grade of service. The problem is modeled on a network graph whose nodes represent either IEEE 802.11 Access Points or wired backbone gateways. In this paper, we present two methods: (1) an innovative approach using integer linear programming (ILP) for gateway selection in the cellular Wi-Fi network, and (2) a completely new heuristic (OPEN/CLOSE) to solve the gateway selection problem. In the ILP model, we developed a set of linear inequalities based on various constraints; the model is solved using lp-solve, simplex-based software for linear and integer programming problems. The second approach is the OPEN/CLOSE heuristic, tailored for cellular Wi-Fi, which arrives at a sub-optimal solution; the OPEN/CLOSE simulations are implemented in Java. Extensive simulations are carried out for performance evaluation. Simulation results show that the proposed approaches can effectively identify a set of gateways at optimal locations in a cellular Wi-Fi network, resulting in an overall cost reduction of up to 50%. The technique presented in this paper is general and can be used for gateway selection in other networks as well.
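The gateway-selection problem can be illustrated with a simple greedy heuristic (in the spirit of, but not identical to, the paper's ILP model or OPEN/CLOSE heuristic): on an invented 8-AP path topology, repeatedly pick the candidate gateway that covers the most APs not yet within MAX_HOPS of a gateway:

```python
from collections import deque

# Hypothetical 8-AP mesh as an adjacency list; an AP is served if some gateway
# lies within MAX_HOPS of it (a crude stand-in for the grade-of-service constraint).
GRAPH = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5],
         5: [4, 6], 6: [5, 7], 7: [6]}
MAX_HOPS = 1

def within_hops(src):
    """BFS: the set of nodes reachable from src in at most MAX_HOPS hops."""
    seen = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        if seen[u] == MAX_HOPS:
            continue
        for v in GRAPH[u]:
            if v not in seen:
                seen[v] = seen[u] + 1
                q.append(v)
    return set(seen)

def greedy_gateways():
    """Greedy set cover: always place the gateway covering the most uncovered APs."""
    cover = {g: within_hops(g) for g in GRAPH}
    uncovered = set(GRAPH)
    gateways = []
    while uncovered:
        g = max(GRAPH, key=lambda g: len(cover[g] & uncovered))
        gateways.append(g)
        uncovered -= cover[g]
    return gateways

gws = greedy_gateways()
print(sorted(gws), len(gws))  # → [1, 4, 6] 3
```

Greedy set cover gives a logarithmic approximation guarantee; the paper's ILP formulation would instead certify the true minimum, at higher computational cost.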

  18. Cellular Neural Networks for NP-Hard Optimization

    Directory of Open Access Journals (Sweden)

    Mária Ercsey-Ravasz

    2009-02-01

A cellular neural/nonlinear network (CNN) is used for NP-hard optimization. We prove that a CNN in which the parameters of all cells can be separately controlled is the analog counterpart of a two-dimensional Ising-type (Edwards-Anderson) spin-glass system. Using the properties of the CNN, we show that a single operation (template) always yields a local minimum of the spin-glass energy function. In this way, a very fast optimization method, similar to simulated annealing, can be built. Estimating the simulation time needed on CNN-based computers, and comparing it with the time needed on conventional digital computers using the simulated annealing algorithm, the results are astonishing: CNN computers could be faster than digital computers already at 10×10 lattice sizes. Local control of the template parameters has already been partially realized in some hardware implementations; we think this study could further motivate development in this direction.
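The spin-glass connection can be made concrete with a plain simulated-annealing sketch on a small Edwards-Anderson lattice with random ±1 couplings; this is the digital-computer baseline the authors compare against, with the lattice size, cooling schedule, and seed chosen arbitrarily for illustration:

```python
import math
import random

N = 8  # toy lattice (the paper discusses sizes such as 10x10)
rng = random.Random(3)

# Random +/-1 nearest-neighbour couplings (Edwards-Anderson model).
Jh = [[rng.choice([-1, 1]) for _ in range(N - 1)] for _ in range(N)]
Jv = [[rng.choice([-1, 1]) for _ in range(N)] for _ in range(N - 1)]

def energy(s):
    """H = -sum over bonds of J_ij * s_i * s_j (horizontal and vertical)."""
    e = 0
    for i in range(N):
        for j in range(N - 1):
            e -= Jh[i][j] * s[i][j] * s[i][j + 1]
    for i in range(N - 1):
        for j in range(N):
            e -= Jv[i][j] * s[i][j] * s[i + 1][j]
    return e

def local_field(s, i, j):
    """Flipping s[i][j] changes the energy by 2 * s[i][j] * local_field."""
    h = 0
    if j > 0:
        h += Jh[i][j - 1] * s[i][j - 1]
    if j < N - 1:
        h += Jh[i][j] * s[i][j + 1]
    if i > 0:
        h += Jv[i - 1][j] * s[i - 1][j]
    if i < N - 1:
        h += Jv[i][j] * s[i + 1][j]
    return h

def anneal(sweeps=200, t_hot=3.0, t_cold=0.05):
    s = [[rng.choice([-1, 1]) for _ in range(N)] for _ in range(N)]
    e_start = energy(s)
    for k in range(sweeps):
        t = t_hot * (t_cold / t_hot) ** (k / (sweeps - 1))  # geometric cooling
        for i in range(N):
            for j in range(N):
                dE = 2 * s[i][j] * local_field(s, i, j)
                if dE <= 0 or rng.random() < math.exp(-dE / t):
                    s[i][j] = -s[i][j]  # Metropolis acceptance
    return e_start, energy(s)

e_start, e_final = anneal()
print(e_start, e_final)
```

The CNN template operation described in the abstract would replace the inner Metropolis sweep with a single analog relaxation to a local minimum, which is the source of the claimed speed-up.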

  19. Fuel management optimization based on power profile by Cellular Automata

    Energy Technology Data Exchange (ETDEWEB)

    Fadaei, Amir Hosein, E-mail: Fadaei_amir@aut.ac.i [Faculty of Nuclear Engineering and Physics, Amirkabir University of Technology (Tehran Polytechnique), Hafez Street, Tehran (Iran, Islamic Republic of); Moghaddam, Nader Maleki [Faculty of Nuclear Engineering and Physics, Amirkabir University of Technology (Tehran Polytechnique), Hafez Street, Tehran (Iran, Islamic Republic of); Zahedinejad, Ehsan [Department of Energy Engineering, Sharif University of Technology, Azadi Str., Tehran (Iran, Islamic Republic of); Fadaei, Mohammad Mehdi [Department of Electrical Engineering, Faculty of Engineering, Central Tehran Branch, Islamic Azad University, Punak Square, Tehran (Iran, Islamic Republic of); Kia, Shabnam [Faculty of Engineering, Islamic Azad University, Science and Research Branch, Punak Square, Tehran (Iran, Islamic Republic of)

    2010-12-15

Fuel management in PWR nuclear reactors comprises a collection of principles and practices required for the planning, scheduling, refueling, and safe operation of nuclear power plants, so as to minimize total plant and system energy costs to the extent possible. Despite remarkable advancements in optimization procedures, inherent complexities in the nuclear reactor structure and strong inter-dependency among the fundamental parameters of the core make it necessary to evaluate the most efficient arrangement of the core. Several patterns have been presented so far to determine the best configuration of fuels in the reactor core, with emphasis on minimizing the local power peaking factor (P_q). In this research, a new strategy for optimizing the fuel arrangement in a VVER-1000 reactor core is developed, with lowering P_q as the main target. For this purpose, a Fuel Quality Factor, Z(r), serves to depict the reactor core pattern. Mapping to an ideal pattern is tracked over the optimization procedure, in which the ideal pattern is prepared by considering the Z(r) constraints and their effects on flux and P_q uniformity. For finding the best configuration corresponding to the desired pattern, Cellular Automata (CA) is applied as a powerful and reliable optimization tool. To obtain the Z(r) constraints, the MCNP code was used, and core calculations were performed with the WIMS and CITATION codes. The results are compared with the predictions of a Neural Network, as a smart optimization method, and with the Final Safety Analysis Report (FSAR) proposed by the designer as a reference.

  20. Multi-objective optimization of cellular scanning strategy in selective laser melting

    DEFF Research Database (Denmark)

    Ahrari, Ali; Deb, Kalyanmoy; Mohanty, Sankhya

    2017-01-01

The scanning strategy for selective laser melting - an additive manufacturing process - determines the temperature fields during the manufacturing process, which in turn affect residual stresses and distortions, two of the main sources of process-induced defects. The goal of this study is to develop a multi-objective approach to optimize the cellular scanning strategy such that the two aforementioned defects are minimized. The decision variable in the chosen problem is a combination of the sequence in which cells are processed and one of six scanning strategies applied to each cell. Thus...

  1. Optimal Control of Teaching Process

    Institute of Scientific and Technical Information of China (English)

    BAO Man; ZHANG Guo-zhi

    2002-01-01

The authors first put forward a quadratic-form performance index as a criterion for measuring the merits and demerits of the teaching process. On this basis, after quantifying the teacher's functions, we obtain a law of optimal control. This plays a leading role in how the teacher controls the whole teaching process.

  2. Rejuvenating cellular respiration for optimizing respiratory function: targeting mitochondria.

    Science.gov (United States)

    Agrawal, Anurag; Mabalirajan, Ulaganathan

    2016-01-15

Altered bioenergetics with increased mitochondrial reactive oxygen species production and degradation of epithelial function are key aspects of pathogenesis in asthma and chronic obstructive pulmonary disease (COPD). This motif is not unique to obstructive airway disease; it has been reported in related airway diseases such as bronchopulmonary dysplasia and in parenchymal diseases such as pulmonary fibrosis. Similarly, mitochondrial dysfunction in vascular endothelium or skeletal muscles contributes to the development of pulmonary hypertension and systemic manifestations of lung disease. In experimental models of COPD or asthma, the use of mitochondria-targeted antioxidants, such as MitoQ, has substantially improved mitochondrial health and restored respiratory function. Modulation of noncoding RNA or protein regulators of mitochondrial biogenesis, dynamics, or degradation has been found to be effective in models of fibrosis, emphysema, asthma, and pulmonary hypertension. Transfer of healthy mitochondria to epithelial cells has been associated with remarkable therapeutic efficacy in models of acute lung injury and asthma. Together, these form a 3R model (repair, reprogramming, and replacement) for mitochondria-targeted therapies in lung disease. This review highlights the key role of mitochondrial function in lung health and disease, with a focus on asthma and COPD, and provides an overview of mitochondria-targeted strategies for rejuvenating cellular respiration and optimizing respiratory function in lung diseases. Copyright © 2016 the American Physiological Society.

  3. Additional force field in cooling process of cellular Al alloy

    Institute of Scientific and Technical Information of China (English)

    郑明军; 何德坪; 戴戈

    2002-01-01

The foaming process of an Al alloy is similar to that of Al, but there is a solid-liquid zone in the solidification of cellular Al alloy that does not exist in the case of Al. In the unidirectional solidification of cellular Al alloy, the proportion of the solid phase gradually decreases from the solid front to the liquid front. This introduces a force and results in serious rapid shrinkage. The solidification of the cellular Al alloy is studied using a mathematical and physical model. The experimentally measured data are close to the results calculated with the model. This kind of shrinkage can be avoided by a suitable cooling method applied at the appropriate growth stage. The compressive strength of the cellular Al alloy made in this way is 40% higher than that of cellular Al.

  4. Minimal model for complex dynamics in cellular processes.

    Science.gov (United States)

    Suguna, C; Chowdhury, K K; Sinha, S

    1999-11-01

    Cellular functions are controlled and coordinated by the complex circuitry of biochemical pathways regulated by genetic and metabolic feedback processes. This paper aims to show, with the help of a minimal model of a regulated biochemical pathway, that the common nonlinearities and control structures present in biomolecular interactions are capable of eliciting a variety of functional dynamics, such as homeostasis, periodic, complex, and chaotic oscillations, including transients, that are observed in various cellular processes.

  5. Green Cellular - Optimizing the Cellular Network for Minimal Emission from Mobile Stations

    CERN Document Server

    Ezri, Doron

    2009-01-01

Wireless systems, which include cellular phones, have become an essential part of modern life. However, the mounting evidence that cellular radiation might adversely affect the health of its users leads to growing concern among authorities and the general public. Radiating antennas in the proximity of the user, such as the antennas of mobile phones, are of special interest in this matter. In this paper we suggest a new architecture for wireless networks, aiming at minimal emission from mobile stations without any additional radiation sources. The new architecture, dubbed Green Cellular, abandons the classical transceiver base station design and suggests the augmentation of transceiver base stations with receive-only devices. These devices, dubbed Green Antennas, are not aimed at coverage extension but rather at minimizing the emission from mobile stations. We discuss the implications of the Green Cellular architecture for 3G and 4G cellular technologies. We conclude by showing that employing the Green Cell...

  6. Manipulation of Cellular Processing Bodies and Their Constituents by Viruses

    OpenAIRE

    Pattnaik, Asit K.; Dinh, Phat X.

    2013-01-01

    The processing bodies (PBs) are a form of cytoplasmic aggregates that house the cellular RNA decay machinery as well as many RNA-binding proteins and mRNAs. The PBs are constitutively present in eukaryotic cells and are involved in maintaining cellular homeostasis by regulating RNA metabolism, cell signaling, and survival. Virus infections result in modification of the PBs and their constituents. Many viruses induce compositionally altered PBs, while many others use specific components of the...

  7. Triple Bioluminescence Imaging for In Vivo Monitoring of Cellular Processes

    Directory of Open Access Journals (Sweden)

    Casey A Maguire

    2013-01-01

Bioluminescence imaging (BLI) has been shown to be crucial for monitoring in vivo biological processes. So far, only dual bioluminescence imaging using firefly (Fluc) and Renilla or Gaussia (Gluc) luciferase has been achieved, due to the lack of availability of other efficiently expressed luciferases using different substrates. Here, we characterized a codon-optimized luciferase from Vargula hilgendorfii (Vluc) as a reporter for mammalian gene expression. We showed that Vluc can be multiplexed with Gluc and Fluc for sequential imaging of three distinct cellular phenomena in the same biological system using vargulin, coelenterazine, and D-luciferin substrates, respectively. We applied this triple imaging system to monitor the effect of soluble tumor necrosis factor-related apoptosis-inducing ligand (sTRAIL), delivered using an adeno-associated viral vector (AAV), on brain tumors in mice. Vluc imaging showed efficient sTRAIL gene delivery to the brain, while Fluc imaging revealed a robust antiglioma therapy. Further, nuclear factor-κB (NF-κB) activation in response to sTRAIL binding to glioma cell death receptors was monitored by Gluc imaging. This work is the first demonstration of trimodal in vivo bioluminescence imaging and will have broad applicability in many different fields, including immunology, oncology, virology, and neuroscience.

  8. OPTIMIZING AN ALUMINUM EXTRUSION PROCESS

    Directory of Open Access Journals (Sweden)

    Mohammed Ali Hajeeh

    2013-01-01

This work minimizes the amount of scrap generated in an aluminum extrusion process. An optimization model is constructed to select the best cutting patterns for aluminum logs and billets of various sizes and shapes. The model is applied to real data obtained from an existing extrusion factory in Kuwait. Results from using the suggested model provided substantial reductions in the amount of scrap generated. Using sound mathematical approaches contributes significantly to reducing waste and to savings when compared with the existing non-scientific techniques.

  9. Cellular Particle Dynamics simulation of biomechanical relaxation processes of multi-cellular systems

    Science.gov (United States)

    McCune, Matthew; Kosztin, Ioan

    2013-03-01

    Cellular Particle Dynamics (CPD) is a theoretical-computational-experimental framework for describing and predicting the time evolution of biomechanical relaxation processes of multi-cellular systems, such as fusion, sorting and compression. In CPD, cells are modeled as an ensemble of cellular particles (CPs) that interact via short range contact interactions, characterized by an attractive (adhesive interaction) and a repulsive (excluded volume interaction) component. The time evolution of the spatial conformation of the multicellular system is determined by following the trajectories of all CPs through numerical integration of their equations of motion. Here we present CPD simulation results for the fusion of both spherical and cylindrical multi-cellular aggregates. First, we calibrate the relevant CPD model parameters for a given cell type by comparing the CPD simulation results for the fusion of two spherical aggregates to the corresponding experimental results. Next, CPD simulations are used to predict the time evolution of the fusion of cylindrical aggregates. The latter is relevant for the formation of tubular multi-cellular structures (i.e., primitive blood vessels) created by the novel bioprinting technology. Work supported by NSF [PHY-0957914]. Computer time provided by the University of Missouri Bioinformatics Consortium.
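As a toy illustration of the CPD scheme described above (cellular particles interacting through short-range repulsive-adhesive contact forces, with positions advanced by integrating equations of motion), the sketch below relaxes two cellular particles toward their equilibrium spacing under overdamped dynamics. The linear force law, parameter values, and two-particle setup are illustrative assumptions, not the calibrated model from the record:

```python
import math

def pair_force(r, r_eq=1.0, k=5.0, cutoff=2.5):
    """Contact force: repulsive below the equilibrium spacing, adhesive up to a cutoff."""
    if r == 0.0 or r >= cutoff:
        return 0.0
    return -k * (r - r_eq)  # > 0 pushes apart, < 0 pulls together

def relax(pos, steps=2000, dt=0.01):
    """Overdamped dynamics: the velocity of each particle equals the net force on it."""
    for _ in range(steps):
        forces = [[0.0, 0.0] for _ in pos]
        for i in range(len(pos)):
            for j in range(i + 1, len(pos)):
                dx = pos[j][0] - pos[i][0]
                dy = pos[j][1] - pos[i][1]
                r = math.hypot(dx, dy)
                f = pair_force(r)
                if r > 0.0:
                    fx, fy = f * dx / r, f * dy / r
                    forces[j][0] += fx; forces[j][1] += fy
                    forces[i][0] -= fx; forces[i][1] -= fy
        for p, frc in zip(pos, forces):
            p[0] += dt * frc[0]
            p[1] += dt * frc[1]
    return pos

# Two cellular particles placed too far apart adhere and settle at r_eq.
pos = relax([[0.0, 0.0], [2.0, 0.0]])
```

In the actual framework, many such CPs per cell and noise terms are used, and the parameters are calibrated against the measured fusion of spherical aggregates before predicting cylindrical fusion.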

  10. The Algorithm of Continuous Optimization Based on the Modified Cellular Automaton

    Directory of Open Access Journals (Sweden)

    Oleg Evsutin

    2016-08-01

This article is devoted to the application of the cellular automata mathematical apparatus to the problem of continuous optimization. A cellular automaton with an objective function is introduced as a new modification of the classic cellular automaton. An algorithm of continuous optimization, based on the dynamics of a cellular automaton having the property of geometric symmetry, is obtained. The results of simulation experiments with the obtained algorithm on standard test functions are provided, and a comparison with analogous algorithms is shown.

  11. STATISTICAL OPTIMIZATION OF PROCESS VARIABLES FOR ...

    African Journals Online (AJOL)

    2012-11-03

The osmotic dehydration process of okra in sucrose solution was optimized for water loss and solutes gain.

  12. Dynamic Optimization of UV Flash Processes

    DEFF Research Database (Denmark)

    Ritschel, Tobias Kasper Skovborg; Capolei, Andrea; Jørgensen, John Bagterp

    2017-01-01

UV flash processes, also referred to as isoenergetic-isochoric flash processes, occur in the dynamic simulation and optimization of vapor-liquid equilibrium processes. Dynamic optimization and nonlinear model predictive control of distillation columns, certain two-phase flow problems, as well as oil reservoirs ... that the optimization solver, the compiler, and high-performance linear algebra software are all important for efficient dynamic optimization of UV flash processes.

  13. MODERNIZATION OF TECHNOLOGICAL LINE FOR CELLULAR EXTRUSION PROCESS

    Directory of Open Access Journals (Sweden)

    Tomasz Garbacz

    2014-06-01

As part of the modernization of the cellular extrusion technology, an extrusion head was designed and made. AutoCAD was used during the design and modeling of the head. After prototyping, the extrusion head was tested. The article presents the specification of the cellular extrusion process for thermoplastics. In the research, endothermal chemical blowing agents in an amount of 1.0% by mass were used. The quantity of blowing agent used has a direct influence on the density and structure of the extruded product of modified polymers. These properties, in turn, influence porosity, impact strength, hardness, tensile strength, and other properties.

  14. Piezo proteins: regulators of mechanosensation and other cellular processes.

    Science.gov (United States)

    Bagriantsev, Sviatoslav N; Gracheva, Elena O; Gallagher, Patrick G

    2014-11-14

    Piezo proteins have recently been identified as ion channels mediating mechanosensory transduction in mammalian cells. Characterization of these channels has yielded important insights into mechanisms of somatosensation, as well as other mechano-associated biologic processes such as sensing of shear stress, particularly in the vasculature, and regulation of urine flow and bladder distention. Other roles for Piezo proteins have emerged, some unexpected, including participation in cellular development, volume regulation, cellular migration, proliferation, and elongation. Mutations in human Piezo proteins have been associated with a variety of disorders including hereditary xerocytosis and several syndromes with muscular contracture as a prominent feature.

  15. Optimal control of induction heating processes

    CERN Document Server

    Rapoport, Edgar

    2006-01-01

    This book introduces new approaches to solving optimal control problems in induction heating process applications. Optimal Control of Induction Heating Processes demonstrates how to apply and use new optimization techniques for different types of induction heating installations. Focusing on practical methods for solving real engineering optimization problems, the text features a variety of specific optimization examples for induction heater modes and designs, particularly those used in industrial applications. The book describes basic physical phenomena in induction heating and induction

  16. Resource Partitioning and Routing Optimization in Relay Enhanced Cellular Network

    Institute of Scientific and Technical Information of China (English)

    LIU Tao; RONG Meng-tian; SHI Hong-kui; XUE Yi-sheng

    2007-01-01

A joint routing and resource partitioning scheme is proposed to improve the cell capacity and user throughput of a cellular network enhanced with two-hop fixed relay nodes (FRNs). Radio resources are partitioned under a reuse-partitioning-based framework, which guarantees effective and efficient inter-cell interference management. At the same time, each mobile terminal is assigned a channel-dependent route by the routing controller, which tries to maximize the cell capacity under the constraint imposed by reuse partitioning. Intensive computer simulations demonstrate the performance superiority of the FRN-enhanced cellular network employing this scheme in comparison with a conventional network, as well as the validity of the channel-dependent routing mechanism.

  17. Simulation Based Optimization of Complex Monolithic Composite Structures Using Cellular Core Technology

    Science.gov (United States)

    Hickmott, Curtis W.

    Cellular core tooling is a new technology which has the capability to manufacture complex integrated monolithic composite structures. This novel tooling method utilizes thermoplastic cellular cores as inner tooling. The semi-rigid nature of the cellular cores makes them convenient for lay-up, and under autoclave temperature and pressure they soften and expand providing uniform compaction on all surfaces including internal features such as ribs and spar tubes. This process has the capability of developing fully optimized aerospace structures by reducing or eliminating assembly using fasteners or bonded joints. The technology is studied in the context of evaluating its capabilities, advantages, and limitations in developing high quality structures. The complex nature of these parts has led to development of a model using the Finite Element Analysis (FEA) software Abaqus and the plug-in COMPRO Common Component Architecture (CCA) provided by Convergent Manufacturing Technologies. This model utilizes a "virtual autoclave" technique to simulate temperature profiles, resin flow paths, and ultimately deformation from residual stress. A model has been developed simulating the temperature profile during curing of composite parts made with the cellular core technology. While modeling of composites has been performed in the past, this project will look to take this existing knowledge and apply it to this new manufacturing method capable of building more complex parts and develop a model designed specifically for building large, complex components with a high degree of accuracy. The model development has been carried out in conjunction with experimental validation. A double box beam structure was chosen for analysis to determine the effects of the technology on internal ribs and joints. Double box beams were manufactured and sectioned into T-joints for characterization. Mechanical behavior of T-joints was performed using the T-joint pull-off test and compared to traditional

  18. Energy Management Optimization for Cellular Networks under Renewable Energy Generation Uncertainty

    KAUST Repository

    Rached, Nadhir Ben

    2017-03-28

The integration of renewable energy (RE) as an alternative power source for cellular networks has been deeply investigated in the literature. However, RE generation is often assumed to be deterministic, an impractical assumption for realistic scenarios. In this paper, an efficient energy procurement strategy is proposed for cellular networks powered simultaneously by the smart grid (SG) and locally deployed RE sources characterized by uncertain processes. For a one-day operation cycle, the mobile operator aims to reduce its total energy cost by optimizing the amounts of energy to be procured from the local RE sources and the SG at each time period. Additionally, it aims to determine the amount of extra generated RE to be sold back to the SG. A chance-constrained optimization is first proposed to deal with the RE generation uncertainty. Then, two convex approximation approaches, the Chernoff and Chebyshev methods, characterized by different levels of knowledge about the RE generation, are developed to determine the energy procurement strategy for different risk levels. In addition, their performances are analyzed for various daily scenarios through selected simulation results. It is shown that the more complex Chernoff method outperforms the Chebyshev method for the different risk levels set by the operator.

  19. A synthetic biology approach to understanding cellular information processing.

    Science.gov (United States)

    Riccione, Katherine A; Smith, Robert P; Lee, Anna J; You, Lingchong

    2012-09-21

    The survival of cells and organisms requires proper responses to environmental signals. These responses are governed by cellular networks, which serve to process diverse environmental cues. Biological networks often contain recurring network topologies called "motifs". It has been recognized that the study of such motifs allows one to predict the response of a biological network and thus cellular behavior. However, studying a single motif in complete isolation of all other network motifs in a natural setting is difficult. Synthetic biology has emerged as a powerful approach to understanding the dynamic properties of network motifs. In addition to testing existing theoretical predictions, construction and analysis of synthetic gene circuits has led to the discovery of novel motif dynamics, such as how the combination of simple motifs can lead to autonomous dynamics or how noise in transcription and translation can affect the dynamics of a motif. Here, we review developments in synthetic biology as they pertain to increasing our understanding of cellular information processing. We highlight several types of dynamic behaviors that diverse motifs can generate, including the control of input/output responses, the generation of autonomous spatial and temporal dynamics, as well as the influence of noise in motif dynamics and cellular behavior.

  20. Optimal channel utilization and service protection in cellular communication systems

    DEFF Research Database (Denmark)

    Iversen, Villy Bæk

    1997-01-01

In mobile communications an efficient utilization of the channels is of great importance. In this paper we consider the basic principles for obtaining the maximum utilization, and we study strategies for obtaining these limits. In general a high degree of sharing is efficient, but requires service protection mechanisms for protecting services and subscriber groups. We study cellular systems with overlaid cells and the effect of overlapping cells, and we show that by dynamic channel allocation we obtain a high utilization. The models are generalizations of the Erlang-B formula, and can be evaluated
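The Erlang-B formula mentioned above has a standard numerically stable recursion, E_B(n, A) = A·E_B(n−1, A) / (n + A·E_B(n−1, A)) with E_B(0, A) = 1. A short sketch (the channel and load values below are made up for illustration):

```python
def erlang_b(channels: int, offered_load: float) -> float:
    """Blocking probability for `channels` servers offered `offered_load` Erlangs,
    computed with the stable Erlang-B recursion."""
    b = 1.0  # E_B(0, A) = 1: with no channels, every call is blocked
    for n in range(1, channels + 1):
        b = offered_load * b / (n + offered_load * b)
    return b

# A cell with 10 channels carrying 5 Erlangs blocks about 1.8% of calls.
blocking = erlang_b(10, 5.0)
```

The recursion also illustrates the sharing point made in the abstract: a pooled group of 20 channels offered 10 Erlangs blocks far fewer calls than two separate 10-channel groups offered 5 Erlangs each, which is why a high degree of sharing is efficient but needs protection mechanisms.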

  1. An Optimization Framework for Travel Pattern Interpretation of Cellular Data

    Directory of Open Access Journals (Sweden)

    Sarit Freund

    2013-09-01

    This paper explores methods for identifying travel patterns from cellular data. A primary challenge in this research is to provide an interpretation of the raw data that distinguishes between activity durations and travel durations. A novel framework is proposed for this purpose, based on a grading scheme for candidate interpretations of the raw data. A genetic algorithm is used to find interpretations with high grades, which are considered as the most reasonable ones. The proposed method is tested on a dataset of records covering 9454 cell-phone users over a period of one week. Preliminary evaluation of the resulting interpretations is presented.

  2. Complement-Mediated Regulation of Metabolism and Basic Cellular Processes.

    Science.gov (United States)

    Hess, Christoph; Kemper, Claudia

    2016-08-16

Complement is well appreciated as a critical arm of innate immunity. It is required for the removal of invading pathogens and works by directly destroying them through the activation of innate and adaptive immune cells. However, complement activation and function are not confined to the extracellular space but also occur within cells. Recent work indicates that complement activation regulates key metabolic pathways and thus can impact fundamental cellular processes, such as survival, proliferation, and autophagy. Newly identified functions of complement include a key role in shaping metabolic reprogramming, which underlies T cell effector differentiation, and a role as a nexus for interactions with other effector systems, in particular the inflammasome and Notch transcription-factor networks. This review focuses on the contributions of complement to basic processes of the cell, in particular the integration of complement with cellular metabolism and the potential implications in infection and other disease settings.

  3. Misconceptions on ATP thermodynamic role in cellular processes

    Directory of Open Access Journals (Sweden)

    R. M. Martins

    2011-04-01

The occurrence and permanence of misconceptions have negative implications for learning, since they impair the construction of meaningful learning. Correcting misconceptions is a complex task due to the difficulty of detecting them and their high resistance to removal. The main objective of the present work was to investigate misconceptions about the thermodynamic role of ATP in cellular processes. Tests were carried out with high school students (HS), undergraduates (UG), and graduate students enrolled in PhD programs (G). In this survey, students answered a 15-item questionnaire dealing with the role of ATP as the cellular energy source. The stability of such misconceptions was verified: one result shows that 68% of HS, 92% of UG, and 91% of G students state that the energy from ATP hydrolysis is responsible for driving cellular processes. Overall, the results show that students carry misconceptions about basic thermodynamic concepts such as energy transfer and the spontaneity of chemical reactions. One source of the prevalence of these misconceptions is textbooks, where schemes, figures, and even text introduce false concepts about the role of ATP early on.

  4. Topology optimization of adaptive fluid-actuated cellular structures with arbitrary polygonal motor cells

    Science.gov (United States)

    Lv, Jun; Tang, Liang; Li, Wenbo; Liu, Lei; Zhang, Hongwu

    2016-05-01

This paper mainly focuses on a fast and efficient design method for plant-bioinspired fluidic cellular materials and structures composed of polygonal motor cells. Here we developed a novel structural optimization method with arbitrary polygonal coarse-grid elements based on multiscale finite element frameworks. The fluidic cellular structures are meshed with irregular polygonal coarse-grid elements according to their natural size and the shape of the embedded motor cells. The multiscale base functions of solid displacement and hydraulic pressure are then constructed to bring the small-scale information of the irregular motor cells to the large-scale simulations on the polygonal coarse-grid elements. On this basis, a new topology optimization method based on the resulting polygonal coarse-grid elements is proposed to determine the optimal distribution or number of motor cells in the smart cellular structures. Three types of optimization problems are solved according to the usages of the fluidic cellular structures. First, the proposed optimization method is utilized to minimize the system compliance of the load-bearing fluidic cellular structures. Second, the method is further extended to design biomimetic compliant actuators of the fluidic cellular materials, because non-uniform volume expansions of fluid in the cells can induce elastic action. Third, the optimization problem focuses on the weight minimization of the cellular structure under constraints on the compliance of the whole system. Several representative examples are investigated to validate the effectiveness of the proposed polygon-based topology optimization method for the smart materials.

  5. Multiscale mathematical modeling and simulation of cellular dynamical process.

    Science.gov (United States)

    Nakaoka, Shinji

    2014-01-01

    Epidermal homeostasis is maintained by dynamic interactions among molecules and cells at different spatiotemporal scales. Mathematical modeling and simulation is expected to provide clear understanding and precise description of multiscaleness in tissue homeostasis under systems perspective. We introduce a stochastic process-based description of multiscale dynamics. Agent-based modeling as a framework of multiscale modeling to achieve consistent integration of definitive subsystems is proposed. A newly developed algorithm that particularly aims to perform stochastic simulations of cellular dynamical process is introduced. Finally we review applications of multiscale modeling and quantitative study to important aspects of epidermal and epithelial homeostasis.

  6. Cellular Automaton Simulation of Evacuation Process in Story

    Institute of Scientific and Technical Information of China (English)

    MA Chang-Qun; ZHENG Rong-Sen; GAO Chun-Yuan; QIU Bing; DENG Min-Yi; KONC Ling-Jiang; LIU Mu-Ren

    2008-01-01

Computer simulations of the evacuation process in a building story are carried out with a cellular automaton in this article. The story is composed of five rooms and one corridor. The influence of various parameters on the evacuation process is investigated. It is shown that the width of the room doors has little influence, but the width of the corridor and the maximum velocity of the pedestrians have great influence on the evacuation time. The relation between evacuation time and corridor width is found to be tc ∝ W^(−0.84). It is also found that an appropriate room shape is helpful to evacuation.
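A scaling law such as tc ∝ W^(−0.84) is typically extracted from simulation runs by a least-squares fit in log-log coordinates. The sketch below illustrates that extraction step on synthetic data generated to follow the reported exponent (the corridor widths and the prefactor 100 are arbitrary assumptions, not data from the paper):

```python
import math

def power_law_exponent(widths, times):
    """Least-squares slope of log(t) versus log(W), i.e. alpha in t ∝ W^alpha."""
    xs = [math.log(w) for w in widths]
    ys = [math.log(t) for t in times]
    n = len(xs)
    xm, ym = sum(xs) / n, sum(ys) / n
    return (sum((x - xm) * (y - ym) for x, y in zip(xs, ys))
            / sum((x - xm) ** 2 for x in xs))

widths = [1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0]
times = [100.0 * w ** -0.84 for w in widths]  # synthetic, by construction
alpha = power_law_exponent(widths, times)     # recovers -0.84
```

On real simulation output the points scatter around the line, and the fitted slope (here exactly −0.84 by construction) is the reported exponent.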

  7. Wave dynamic processes in cellular detonation reflection from wedges

    Institute of Scientific and Technical Information of China (English)

    Zongmin Hu; Zonglin Jiang

    2007-01-01

When the cell width of the incident detonation wave (IDW) is comparable to or larger than the Mach stem height, self-similarity will fail during IDW reflection from a wedge surface. In this paper, detonation reflection from wedges is investigated for the wave dynamic processes occurring in the wave front, including transverse shock motion and detonation cell variations behind the Mach stem. A detailed reaction model is implemented to simulate two-dimensional cellular detonations in stoichiometric mixtures of H2/O2 diluted by argon. The numerical results show that the transverse waves, which cross the triple point trajectory of Mach reflection, travel along the Mach stem and reflect back from the wedge surface, controlling the size of the cells in the region swept by the Mach stem. It is the energy carried by these transverse waves that sustains the triple-wave collision with a higher frequency within the over-driven Mach stem. In some cases, local wave dynamic processes and wave structures play a dominant role in determining the pattern of the cellular record, leading to the fact that the cellular patterns behind the Mach stem exhibit some peculiar modes.

  8. Cellular trade-offs and optimal resource allocation during cyanobacterial diurnal growth.

    Science.gov (United States)

    Reimers, Alexandra-M; Knoop, Henning; Bockmayr, Alexander; Steuer, Ralf

    2017-07-18

Cyanobacteria are an integral part of Earth's biogeochemical cycles and a promising resource for the synthesis of renewable bioproducts from atmospheric CO2. Growth and metabolism of cyanobacteria are inherently tied to the diurnal rhythm of light availability. As yet, however, insight into the stoichiometric and energetic constraints of cyanobacterial diurnal growth is limited. Here, we develop a computational framework to investigate the optimal allocation of cellular resources during diurnal phototrophic growth using a genome-scale metabolic reconstruction of the cyanobacterium Synechococcus elongatus PCC 7942. We formulate phototrophic growth as an autocatalytic process and solve the resulting time-dependent resource allocation problem using constraint-based analysis. Based on a narrow and well-defined set of parameters, our approach results in an ab initio prediction of growth properties over a full diurnal cycle. The computational model allows us to study the optimality of metabolite partitioning during diurnal growth. The cyclic pattern of glycogen accumulation, an emergent property of the model, has timing characteristics that are in qualitative agreement with experimental findings. The approach presented here provides insight into the time-dependent resource allocation problem of phototrophic diurnal growth and may serve as a general framework to assess the optimality of metabolic strategies that evolved in phototrophic organisms under diurnal conditions.

  9. Two-material optimization of plate armour for blast mitigation using hybrid cellular automata

    Science.gov (United States)

    Goetz, J.; Tan, H.; Renaud, J.; Tovar, A.

    2012-08-01

    With the increased use of improvised explosive devices in regions at war, the threat to military and civilian life has risen. Cabin penetration and gross acceleration are the primary threats in an explosive event. Cabin penetration crushes occupants, damaging the lower body. Acceleration causes death at high magnitudes. This investigation develops a process of designing armour that simultaneously mitigates cabin penetration and acceleration. The hybrid cellular automaton (HCA) method of topology optimization has proven efficient and robust in problems involving large, plastic deformations such as crash impact. Here HCA is extended to the design of armour under blast loading. The ability to distribute two metallic phases, as opposed to one material and void, is also added. The blast wave energy transforms on impact into internal energy (IE) inside the solid medium. Maximum attenuation occurs with maximized IE. The resulting structures show HCA's potential for designing blast mitigating armour structures.

  10. Optimization and control of metal forming processes

    NARCIS (Netherlands)

    Havinga, Gosse Tjipke

    2016-01-01

Inevitable variations in process and material properties limit the accuracy of metal forming processes. Robust optimization methods or control systems can be used to improve the production accuracy. Robust optimization methods are used to design production processes with low sensitivity to the disturbances.

  11. Optimal operation of batch membrane processes

    CERN Document Server

    Paulen, Radoslav

    2016-01-01

    This study concentrates on a general optimization of a particular class of membrane separation processes: those involving batch diafiltration. Existing practices are explained and operational improvements based on optimal control theory are suggested. The first part of the book introduces the theory of membrane processes, optimal control and dynamic optimization. Separation problems are defined and mathematical models of batch membrane processes derived. The control theory focuses on problems of dynamic optimization from a chemical-engineering point of view. Analytical and numerical methods that can be exploited to treat problems of optimal control for membrane processes are described. The second part of the text builds on this theoretical basis to establish solutions for membrane models of increasing complexity. Each chapter starts with a derivation of optimal operation and continues with case studies exemplifying various aspects of the control problems under consideration. The authors work their way from th...

  12. A simple yeast-based strategy to identify host cellular processes targeted by bacterial effector proteins.

    Directory of Open Access Journals (Sweden)

    Eran Bosis

Bacterial effector proteins, which are delivered into the host cell via the type III secretion system, play a key role in the pathogenicity of gram-negative bacteria by modulating various host cellular processes to the benefit of the pathogen. To identify cellular processes targeted by bacterial effectors, we developed a simple strategy that uses an array of yeast deletion strains fitted into a single 96-well plate. The array is unique in that it was optimized computationally such that despite the small number of deletion strains, it covers the majority of genes in the yeast synthetic lethal interaction network. The deletion strains in the array are screened for hypersensitivity to the expression of a bacterial effector of interest. The hypersensitive deletion strains are then analyzed for their synthetic lethal interactions to identify potential targets of the bacterial effector. We describe the identification, using this approach, of a cellular process targeted by the Xanthomonas campestris type III effector XopE2. Interestingly, we discover that XopE2 affects the yeast cell wall and the endoplasmic reticulum stress response. More generally, the use of a single 96-well plate makes the screening process accessible to any laboratory and facilitates the analysis of a large number of bacterial effectors in a short period of time. It therefore provides a promising platform for studying the functions and cellular targets of bacterial effectors and other virulence proteins.

  13. Parallelizing the Cellular Potts Model on graphics processing units

    Science.gov (United States)

    Tapia, José Juan; D'Souza, Roshan M.

    2011-04-01

The Cellular Potts Model (CPM) is a lattice-based modeling technique used for simulating cellular structures in computational biology. The computational complexity of the model means that current serial implementations restrict the size of simulations to a level well below biological relevance. Parallelization on computing clusters enables scaling the size of the simulation but only marginally addresses computational speed, due to the limited memory bandwidth between nodes. In this paper we present new data-parallel algorithms and data structures for simulating the Cellular Potts Model on graphics processing units. Our implementations handle most terms in the Hamiltonian, including the cell-cell adhesion constraint, cell volume constraint, cell surface area constraint, and cell haptotaxis. We use fine-level checkerboards with lock mechanisms using atomic operations to enable consistent updates while maintaining a high level of parallelism. A new data-parallel memory allocation algorithm has been developed to handle cell division. Tests show that our implementation enables simulations of >10⁶ cells with lattice sizes of up to 256³ on a single graphics card. Benchmarks show that our implementation runs ~80× faster than serial implementations, and ~5× faster than previous parallel implementations on computing clusters consisting of 25 nodes. The wide availability and economy of graphics cards mean that our techniques will enable simulation of realistically sized models at a fraction of the time and cost of previous implementations, and are expected to greatly broaden the scope of CPM applications.
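The serial baseline that such GPU implementations accelerate is the basic CPM Metropolis update: pick a lattice site, attempt to copy a neighbor's cell index into it, and accept with a Boltzmann probability based on the change in adhesion and volume-constraint energy. The sketch below is a minimal single-cell, 4-neighbor version with illustrative parameter values, not the paper's data-parallel algorithm:

```python
import math
import random

def cpm_sweep(lattice, target_vol, J=2.0, lam=2.0, T=0.5, rng=random):
    """One Monte Carlo sweep of Metropolis copy attempts on a periodic 2D lattice.
    Cell index 0 is the medium; any other index is a cell with a volume constraint."""
    n = len(lattice)

    def neighbors(x, y):
        return [((x + dx) % n, (y + dy) % n)
                for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]

    def volume(sigma):
        return sum(row.count(sigma) for row in lattice)

    def mismatch(x, y, sigma):  # adhesion energy of site (x, y) if it held sigma
        return sum(J for (a, b) in neighbors(x, y) if lattice[a][b] != sigma)

    for _ in range(n * n):
        x, y = rng.randrange(n), rng.randrange(n)
        sx, sy = rng.choice(neighbors(x, y))
        src, tgt = lattice[sx][sy], lattice[x][y]
        if src == tgt:
            continue
        d_h = mismatch(x, y, src) - mismatch(x, y, tgt)
        for sigma, dv in ((src, 1), (tgt, -1)):  # volume-constraint terms
            if sigma != 0:
                v = volume(sigma)
                d_h += lam * ((v + dv - target_vol) ** 2 - (v - target_vol) ** 2)
        if d_h <= 0 or rng.random() < math.exp(-d_h / T):
            lattice[x][y] = src
```

Because each copy attempt reads and writes neighboring sites, naive parallel execution of these attempts races; the checkerboard partitioning with atomic locks described in the abstract exists precisely to make many such attempts safe to run concurrently.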

  14. Smallest-Small-World Cellular Harmony Search for Optimization of Unconstrained Benchmark Problems

    Directory of Open Access Journals (Sweden)

    Sung Soo Im

    2013-01-01

We present a new hybrid method that combines cellular harmony search algorithms with Smallest-Small-World theory. The harmony search (HS) algorithm is based on the musical performance process that occurs when a musician searches for a better state of harmony. Harmony search has been successfully applied to a wide variety of practical optimization problems. Most previous research has sought to improve the performance of the HS algorithm by changing the pitch adjusting rate and the harmony memory considering rate. However, there has been a lack of studies improving the performance of the algorithm through the formation of population structures. Therefore, we propose an improved HS algorithm that uses a cellular automata formation and the topological structure of a Smallest-Small-World network. The improved HS algorithm has a high clustering coefficient and a short characteristic path length, giving it good exploration and exploitation efficiencies. Nine benchmark functions were applied to evaluate the performance of the proposed algorithm. Unlike existing improved HS algorithms, the proposed algorithm gains algorithmic efficiency from the formation of its population structure.
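The basic HS loop that this record builds on improvises each component of a new harmony either from harmony memory (optionally pitch-adjusted) or at random, and replaces the worst memory entry whenever the new harmony is better. A plain, non-cellular sketch on a sphere test function, with typical but assumed parameter values (hmcr, par, bw, memory size):

```python
import random

def harmony_search(f, dim, bounds, hms=10, hmcr=0.9, par=0.3, bw=0.1,
                   iters=4000, seed=0):
    """Basic harmony search: improvise a new harmony, keep it if it beats the worst."""
    rng = random.Random(seed)
    lo, hi = bounds
    memory = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    scores = [f(h) for h in memory]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if rng.random() < hmcr:          # recall from harmony memory
                x = rng.choice(memory)[d]
                if rng.random() < par:       # pitch adjustment
                    x += rng.uniform(-bw, bw)
            else:                            # random improvisation
                x = rng.uniform(lo, hi)
            new.append(min(hi, max(lo, x)))
        score = f(new)
        worst = max(range(hms), key=scores.__getitem__)
        if score < scores[worst]:
            memory[worst], scores[worst] = new, score
    best = min(range(hms), key=scores.__getitem__)
    return memory[best], scores[best]

sphere = lambda v: sum(x * x for x in v)
best, val = harmony_search(sphere, dim=3, bounds=(-5.0, 5.0))
```

The cellular variant in the record restricts which memory entries each cell may recall from (its neighborhood in a Smallest-Small-World topology) rather than the whole memory, which is what shapes the clustering coefficient and path length mentioned above.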

  15. Public Evacuation Process Modeling and Simulation Based on Cellular Automata

    Directory of Open Access Journals (Sweden)

    Zhikun Wang

    2013-11-01

Considering the attraction of the nearest exit, the repulsive force of the fire, barriers and their layout, and the effect of fire exit location on escape time in a fire hazard, a mathematical model of the evacuation process was built based on cellular automata theory. The program was developed in JavaScript. The influencing factors of evacuation were obtained through the simulation model by inputting the crowd size and stochastically generating the initial positions of the crowd and the fire seat. The experimental results show that the evacuation simulation model is authentic and valid, and has guiding significance for evacuation and the design of public escape systems.

  16. Optimal planning for cellular networks for smart metering infrastructure in rural and remote areas

    Directory of Open Access Journals (Sweden)

    Andrés Masache

    2015-07-01

Smart metering is used to control, monitor, and know the system status in real time; to this end, the incorporation of smart grids primarily benefits the electrical system, and reusing cellular infrastructure and spectrum helps mitigate the time and cost of implementation. In order to reduce traffic and saturation of cellular networks, this paper aims to determine the optimal route for information transmission, analyzing optimal routing by distance and optimal routing by traffic flow. This analysis helps determine the optimal route when there is no traffic on the wireless network or when there is prolonged traffic, and the traffic tendencies that may arise from excessive transmission of information from smart meters to electric distribution companies.
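Routing by distance, as described above, is typically solved with a shortest-path algorithm over the network graph. A sketch using Dijkstra's algorithm on a hypothetical meter-to-utility topology (all node names and link costs below are invented for illustration; the link weight could equally be a congestion delay for routing by traffic flow):

```python
import heapq

def shortest_route(graph, src, dst):
    """Dijkstra's algorithm over weighted links; returns (path, total cost)."""
    dist = {src: 0.0}
    prev = {}
    heap = [(0.0, src)]
    visited = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in visited:
            continue
        visited.add(u)
        if u == dst:
            break
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    path, node = [], dst
    while node != src:
        path.append(node)
        node = prev[node]
    path.append(src)
    return list(reversed(path)), dist[dst]

# Hypothetical topology: smart meter -> base stations -> core -> utility.
links = {
    "meter": [("bs1", 2.0), ("bs2", 5.0)],
    "bs1": [("core", 3.0)],
    "bs2": [("core", 1.0)],
    "core": [("utility", 1.0)],
}
path, cost = shortest_route(links, "meter", "utility")
```

Swapping the static distances for time-varying congestion weights turns the same computation into the traffic-flow routing the paper contrasts with distance-based routing.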

  17. Optimization of Cellular Resources Evading Intra and Inter Tier Interference in Femto cells Equipped Macro cell Networks

    CERN Document Server

    Shakhakarmi, Niraj

    2012-01-01

    Cellular network resources need to be optimized in macro cell networks equipped with femto cells. This is achieved by increasing cellular coverage and channel capacity, and by reducing power usage and interference between femto cells and macro cells. In this paper, the optimization of cellular resources in macro cell networks with installed femto cells is addressed by deploying smart-antenna applications and an effective power-adaptation method, which significantly optimize cellular coverage, channel capacity, power usage, and intra- and inter-tier interference. The simulation results illustrate the outstanding performance of this optimization methodology.

  18. Active Cellular Mechanics and Information Processing in the Living Cell

    Science.gov (United States)

    Rao, M.

    2014-07-01

    I will present our recent work on the organization of signaling molecules on the surface of living cells. Using novel experimental and theoretical approaches we have found that many cell surface receptors are organized as dynamic clusters driven by active currents and stresses generated by the cortical cytoskeleton adjoining the cell surface. We have shown that this organization is optimal for both information processing and computation. In connecting active mechanics in the cell with information processing and computation, we bring together two of the seminal works of Alan Turing.

  19. Multiple Objective Optimization and Optimal Control of Fermentation Processes

    Directory of Open Access Journals (Sweden)

    Mitko Petrov

    2008-10-01

    Full Text Available A multiple-objective optimization is applied to find an optimal policy for fed-batch processes of whey fermentation and L-lysine production. The multiple-objective optimization problems are transformed into a standard single-objective optimization problem using a general utility function with a weight coefficient for each individual criterion. A combined algorithm is applied to solve the resulting maximizing-decision problem. The algorithm combines a random-search method for finding an initial point with a method based on fuzzy-set theory, in order to find the best solution of the optimization problem. The combined algorithm eliminates the main disadvantage of the fuzzy optimization method used, namely it decreases the number of discrete values of the control variables, and thus allows larger-scale problems to be solved. After this multiple-objective optimization, the quality of the useful product rises and the residual substrate concentration at the end of the process decreases; in this way, process productivity is increased.
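The weighted-utility transformation described here can be sketched generically; the toy objective functions, weights, and grid search below are illustrative stand-ins for the paper's fermentation model and its combined random-search/fuzzy-set algorithm:

```python
def scalarize(objectives, weights):
    """Combine single-objective utilities (to maximize) into one weighted utility."""
    assert abs(sum(weights) - 1.0) < 1e-9
    def utility(x):
        return sum(w * f(x) for w, f in zip(weights, objectives))
    return utility

# Toy trade-off: maximize product quality while minimizing residual
# substrate (expressed here as a utility to maximize).
quality  = lambda x: -(x - 2.0) ** 2 + 4.0
residual = lambda x: -(x - 3.0) ** 2 + 1.0
u = scalarize([quality, residual], [0.7, 0.3])

# A crude grid search over one control variable stands in for the
# combined random-search / fuzzy-set algorithm of the paper.
best = max((i * 0.01 for i in range(501)), key=u)
```

With these weights the compromise optimum sits between the two individual optima (x = 2 and x = 3), which is the intended effect of the scalarization.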

  20. Controlled inflation of voids in cellular polymer ferroelectrets: Optimizing electromechanical transducer properties

    Science.gov (United States)

    Wegener, M.; Wirges, W.; Gerhard-Multhaupt, R.; Dansachmüller, M.; Schwödiauer, R.; Bauer-Gogonea, S.; Bauer, S.; Paajanen, M.; Minkkinen, H.; Raukola, J.

    2004-01-01

    When exposed to sufficiently high electric fields, polymer-foam electret materials with closed cells exhibit ferroelectric-like behavior and may therefore be called ferroelectrets. In cellular ferroelectrets, the influence of the cell size and shape distributions on the application-relevant properties is not yet understood. Therefore, controlled inflation experiments were carried out on cellular polypropylene films, and the resulting elastic and electromechanical parameters were determined. The elastic modulus in the thickness direction shows a minimum with a corresponding maximum in the electromechanical transducer coefficient. The resonance frequency shifts as a function of the elastic modulus and the relative density of the inflated cellular films. Therefore, the transducer properties of cellular ferroelectrets can be optimized by means of controlled inflation.

  1. Using Electromagnetic Algorithm for Total Costs of Sub-contractor Optimization in the Cellular Manufacturing Problem

    Directory of Open Access Journals (Sweden)

    Mohammad Reza Shahriari

    2016-12-01

    Full Text Available In this paper, we present a non-linear binary program for optimizing a specific cost in a cellular manufacturing system under controlled production conditions. The system parameters are determined by continuous distribution functions. The aim of the presented model is to optimize the total cost imposed by sub-contractors on the manufacturing system by determining how to allocate the machines and parts to each seller. In this system, the decision maker can control the occupation level of each machine. To solve the presented model, we used the electromagnetic meta-heuristic algorithm, with the Taguchi method for determining the optimal algorithm parameters.

  2. Optimizing Classification in Intelligence Processing

    Science.gov (United States)

    2010-12-01

    ACC (Classification Accuracy), AUC (Area Under the ROC Curve), CI (Competitive Intelligence), COMINT (Communications Intelligence), DoD (Department of...) ... an indispensable tool to support a national leader's decision-making process, competitive intelligence (CI) has emerged in recent decades as an environment meant... effectiveness for the intelligence product in a competitive-intelligence environment: accuracy, objectivity, usability, relevance, readiness, and timeliness

  3. Optimization of the ion implantation process

    Science.gov (United States)

    Maczka, D.; Latuszynski, A.; Kuduk, R.; Partyka, J.

    This work is devoted to the optimization of the ion implantation process in the Unimas implanter at the Institute of Physics, Maria Curie-Sklodowska University, Lublin. The results obtained during several years of operation allow us to determine the optimal operating parameters of the device [1-3].

  4. Scalar Parameters Optimization in PDE Based Medical Image Denoising by using Cellular Wave Computing

    Directory of Open Access Journals (Sweden)

    GACSÁDI Alexandru

    2016-10-01

    Full Text Available For processing biomedical images, a set of complex and effective mathematical models based on partial differential equations (PDEs) is available. Effective implementation of these methods is difficult, on the one hand because of the difficulty of determining the scalar parameter values on which the efficiency of the image processing depends, and on the other hand because of the considerable computing power needed to perform it in real time. Currently, there are no analytical and/or experimental methods in the literature for determining the exact values of the scalar parameters that provide the best results for a specific image-processing task. This paper proposes a method for optimizing the values of a set of scalar parameters, which ensures effective noise reduction in medical images by using cellular wave computing. To assess the overall performance of noise extraction, an error function (quantitative component) and direct visualization (qualitative component) are used at the same time. Moreover, this analysis shows the degree to which the CNN templates are robust against the range of values of the scalar parameters.

  5. Optimization of the diabetic nephropathy treatment with attention to the special features of cellular inflammation mechanisms

    Directory of Open Access Journals (Sweden)

    Тетяна Дмитрівна Щербань

    2016-02-01

    Full Text Available Aim. Optimization of the treatment of diabetic nephropathy (DN) associated with hypertonic disease (HD), based on a study of the neutrophil chain of the pathogenic cellular mechanisms underlying the development of these diseases and the special features of their clinical course. Materials and methods. 86 patients with HD associated with DN and 30 patients with isolated HD were comprehensively examined; the control group comprised 30 practically healthy persons. The activity of NO-synthases in neutrophils was detected by a colorimetric method using Griess reagent. The expression of ICAM-1 (CD54), CD11b-integrin, and inducible NO-synthase on neutrophils was detected by an indirect immunocytochemical method. The oxygen-dependent activity of neutrophils was assessed by NBT test. Results. Expression of the adhesive molecules CD54 and CD11b-integrin on peripheral-blood neutrophils was essentially increased (p < 0.001) in patients with DN associated with HD compared with isolated HD and the control group. In the associated pathology, against the background of high oxygen-dependent activity of neutrophils, their functional reserve decreased, resulting in intensification of inflammatory processes in the kidneys (p < 0.001). In comorbid patients, chronization of the pathological process resulted in an imbalance of the NO-synthase system in neutrophils: against the background of decreased activity of the constitutive NO-synthases, the expression and activity of inducible NO-synthase increased (p < 0.001). The use of L-arginine hydrochloride in the complex therapy of patients with DN associated with HD intensified the organoprotective effect of basal therapy, facilitated the clinical course, decreased albuminuria, corrected the functional indices of neutrophils, and diminished the imbalance in the NO-synthase system. Conclusions. In patients with DN associated with HD, the neutrophil chain of cellular inflammation mechanisms is activated: the expression of adhesive molecules grows and oxygen-dependent metabolism is

  6. Optimizing Processes to Minimize Risk

    Science.gov (United States)

    Loyd, David

    2017-01-01

    NASA, like other hazardous industries, has suffered catastrophic losses. Human error will likely never be completely eliminated as a factor in our failures. When you can't eliminate risk, focus on mitigating the worst consequences and on recovering operations. Bolstering processes to emphasize the role of integration and problem solving is key to success. Building an effective safety culture bolsters skill-based performance that minimizes risk and encourages successful engagement.

  7. Optimization of the Processing of Mo Disks

    Energy Technology Data Exchange (ETDEWEB)

    Tkac, Peter [Argonne National Lab. (ANL), Argonne, IL (United States); Rotsch, David A. [Argonne National Lab. (ANL), Argonne, IL (United States); Stepinski, Dominique [Argonne National Lab. (ANL), Argonne, IL (United States); Makarashvili, Vakhtang [Argonne National Lab. (ANL), Argonne, IL (United States); Harvey, James [NorthStar Medical Technologies, LLC, Madison, WI (United States); Vandegrift, George F. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-01-01

    The objective of this work is to decrease the processing time for irradiated disks of enriched Mo for the production of 99Mo. Results are given for the dissolution of nonirradiated Mo disks, optimization of the process for large-scale dissolution of sintered disks, optimization of the removal of the main side products (Zr and Nb) from dissolved targets, and dissolution of irradiated Mo disks.

  8. Fatigue design of a cellular phone folder using regression model-based multi-objective optimization

    Science.gov (United States)

    Kim, Young Gyun; Lee, Jongsoo

    2016-08-01

    In a folding cellular phone, the folding device is repeatedly opened and closed by the user, which eventually results in fatigue damage, particularly to the front of the folder. Hence, it is important to improve the safety and endurance of the folder while also reducing its weight. This article presents an optimal design for the folder front that maximizes its fatigue endurance while minimizing its thickness. Design data for analysis and optimization were obtained experimentally using a test jig. Multi-objective optimization was carried out using a nonlinear regression model. Three regression methods were employed: back-propagation neural networks, logistic regression and support vector machines. The AdaBoost ensemble technique was also used to improve the approximation. Two-objective Pareto-optimal solutions were identified using the non-dominated sorting genetic algorithm (NSGA-II). Finally, a numerically optimized solution was validated against experimental product data, in terms of both fatigue endurance and thickness index.
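The Pareto-ranking step at the heart of NSGA-II, referenced above, can be sketched as a simple O(n^2) non-dominated sort; the crowding-distance computation and the genetic operators are omitted, and the sample objective values are illustrative, not the paper's data:

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one (minimization convention)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated_sort(points):
    """Return Pareto fronts, best front first."""
    remaining = list(points)
    fronts = []
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining if q != p)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts

# Illustrative (thickness index, -fatigue endurance) pairs, both minimized.
pts = [(1.0, -5.0), (2.0, -9.0), (1.5, -4.0), (3.0, -9.5), (2.5, -8.0)]
fronts = nondominated_sort(pts)
```

The first front is the set of Pareto-optimal trade-offs from which a final design is selected.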

  9. Cellular phosphatases facilitate combinatorial processing of receptor-activated signals

    Directory of Open Access Journals (Sweden)

    Siddiqui Zaved

    2008-09-01

    Full Text Available Abstract. Background: Although reciprocal regulation of protein phosphorylation represents a key aspect of signal transduction, a larger perspective on how these various interactions integrate to contribute towards signal processing is presently unclear. For example, a key unanswered question is that of how phosphatase-mediated regulation of phosphorylation at the individual nodes of the signaling network translates into modulation of the net signal output and, thereby, the cellular phenotypic response. Results: To address the above question we, in the present study, examined the dynamics of signaling from the B cell antigen receptor (BCR) under conditions where individual cellular phosphatases were selectively depleted by siRNA. Results from such experiments revealed a highly enmeshed structure for the signaling network, where each signaling node was linked to multiple phosphatases on the one hand, and each phosphatase to several nodes on the other. This resulted in a configuration where individual signaling intermediates could be influenced by a spectrum of regulatory phosphatases, but with the composition of the spectrum differing from one intermediate to another. Consequently, each node differentially experienced perturbations in phosphatase activity, yielding a unique fingerprint of nodal signals characteristic to that perturbation. This heterogeneity in nodal experiences, to a given perturbation, led to combinatorial manipulation of the corresponding signaling axes for the downstream transcription factors. Conclusion: Our cumulative results reveal that it is the tight integration of phosphatases into the signaling network that provides the plasticity by which perturbation-specific information can be transmitted in the form of a multivariate output to the downstream transcription factor network. This output in turn specifies a context-defined response when translated into the resulting gene expression profile.

  10. Optimized Energy Procurement for Cellular Networks with Uncertain Renewable Energy Generation

    KAUST Repository

    Rached, Nadhir B.

    2017-02-07

    Renewable energy (RE) is an emerging solution for reducing carbon dioxide (CO2) emissions from cellular networks. One of the challenges of using RE sources is handling their inherent uncertainty. In this paper, an RE-powered cellular network is investigated. Over a one-day operation cycle, the cellular network aims to reduce its energy procurement costs by optimizing the amounts of energy procured from its locally deployed RE sources as well as from the smart grid. In addition, it aims to determine the extra amount of energy to be sold back to the electrical grid in each time period. Chance-constrained optimization is first proposed to deal with the randomness in the RE generation. Then, to make the optimization problem tractable, two well-known convex approximation methods, namely Chernoff- and Chebyshev-based approaches, are analyzed in detail. Numerical results investigate the optimized energy procurement for various daily scenarios and compare the performance of the two convex approximation approaches.
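The Chebyshev-style convex approximation mentioned above can be illustrated with a single-constraint example: the one-sided Chebyshev (Cantelli) inequality turns a chance constraint on meeting demand into a deterministic lower bound on the grid purchase. The model below is a simplified one-period illustration under assumed demand and RE statistics, not the paper's formulation:

```python
import math

def chebyshev_grid_purchase(demand, mu, sigma, eps):
    """Smallest grid purchase g such that the Cantelli (one-sided Chebyshev)
    bound guarantees P(g + R >= demand) >= 1 - eps, where the renewable
    output R has mean mu and standard deviation sigma.

    Cantelli: P(R <= mu - t) <= sigma^2 / (sigma^2 + t^2), so choosing
    t = sigma * sqrt((1 - eps) / eps) caps the shortfall probability at eps.
    """
    t = sigma * math.sqrt((1 - eps) / eps)
    return max(0.0, demand - mu + t)

# E.g. demand 100 kWh, renewable mean 60 kWh, std 10 kWh, 5% outage tolerance.
g = chebyshev_grid_purchase(100.0, 60.0, 10.0, 0.05)
```

Being distribution-free, this bound is conservative: it buys more grid energy than a method that exploits the actual RE distribution, which is the trade-off the paper's comparison of approximation methods examines.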

  11. Conventional and novel processing methods for cellular ceramics

    National Research Council Canada - National Science Library

    Paolo Colombo

    2006-01-01

    Cellular ceramics are a class of highly porous materials that covers a wide range of structures, such as foams, honeycombs, interconnected rods, interconnected fibres, interconnected hollow spheres...

  12. Optimized Energy Efficiency and Spectral Efficiency Resource Allocation Strategies for Phantom Cellular Networks

    KAUST Repository

    Abdelhady, Amr, M.

    2016-01-06

    Multi-tier heterogeneous networks have become an essential constituent of next-generation cellular networks. Meanwhile, energy efficiency (EE) has been considered a critical design criterion along with the traditional spectral efficiency (SE) metric. In this context, we study power and spectrum allocation for the recently proposed two-tier architecture known as Phantom cellular networks. The optimization framework includes both EE and SE, and we propose an algorithm that computes the SE and EE resource allocations for Phantom cellular networks. We then compare the performance of both design strategies versus the number of users and the ratio of Phantom-cell resource blocks to the total number of resource blocks. We aim to investigate how the system parameters can be tuned to achieve improved SE or EE performance at a non-significant loss in EE or SE performance, respectively. It was found that the system parameters can be tuned so that the EE solution does not incur a significant loss in SE performance.

  13. Heterogeneous architecture to process swarm optimization algorithms

    Directory of Open Access Journals (Sweden)

    Maria A. Dávila-Guzmán

    2014-01-01

    Full Text Available In recent years, parallel processing has been embedded in personal computers through the inclusion of co-processing units such as graphics processing units, resulting in a heterogeneous platform. This paper presents the implementation of swarm algorithms on this platform to solve several functions from optimization problems, highlighting their inherent parallel processing and distributed control features. In the swarm algorithms, each individual and each problem dimension are parallelized at the granularity of the processing system, which also offers low communication latency between individuals through the embedded processing. To evaluate the potential of swarm algorithms on graphics processing units, we implemented two of them: the particle swarm optimization algorithm and the bacterial foraging optimization algorithm. The algorithms' performance is measured as the acceleration between a typical sequential processing platform and the NVIDIA GeForce GTX480 heterogeneous platform; the results show that the particle swarm algorithm obtained an acceleration of up to 36.82x and the bacterial foraging algorithm up to 9.26x. Finally, the effect of increasing the population size is evaluated; we show that both the dispersion and the quality of the solutions decrease despite the high acceleration, since the initial distribution of the individuals can converge to a local optimum.
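A sequential reference version of the particle swarm update that such GPU implementations parallelize can be sketched as follows; the parameter values are illustrative defaults, not those of the paper, and each particle/dimension update is independent, which is what makes the GPU mapping natural:

```python
import random

def pso(f, dim, bounds, n=30, iters=300, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimize f with a basic (sequential) particle swarm."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]           # personal bests
    gbest = min(pbest, key=f)[:]          # global best
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                # Inertia plus cognitive and social attraction terms.
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pos[i]) < f(gbest):
                    gbest = pos[i][:]
    return gbest

sphere = lambda v: sum(x * x for x in v)
best = pso(sphere, dim=5, bounds=(-10.0, 10.0))
```

On a GPU, the inner loops over particles and dimensions become one thread per particle-dimension pair, with the global best shared through device memory.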

  14. Variance optimal stopping for geometric Levy processes

    DEFF Research Database (Denmark)

    Gad, Kamille Sofie Tågholt; Pedersen, Jesper Lund

    2015-01-01

    The main result of this paper is the solution to the optimal stopping problem of maximizing the variance of a geometric Lévy process. We call this problem the variance problem. We show that, for some geometric Lévy processes, we achieve higher variances by allowing randomized stopping. Furthermore...

  15. Greedy scheduling of cellular self-replication leads to optimal doubling times with a log-Frechet distribution.

    Science.gov (United States)

    Pugatch, Rami

    2015-02-24

    Bacterial self-replication is a complex process composed of many de novo synthesis steps catalyzed by a myriad of molecular processing units, e.g., the transcription-translation machinery, metabolic enzymes, and the replisome. Successful completion of all production tasks requires a schedule: a temporal assignment of each of the production tasks to its respective processing units that respects ordering and resource constraints. Most intracellular growth processes are well characterized. However, the manner in which they are coordinated under the control of a scheduling policy is not well understood. When fast replication is favored, a schedule that minimizes the completion time is desirable. However, if resources are scarce, finding such a schedule is typically computationally hard in the worst case. Here, we show that optimal scheduling naturally emerges in cellular self-replication. Optimal doubling time is obtained by maintaining a sufficiently large inventory of the intermediate metabolites and processing units required for self-replication and additionally requiring that these processing units be "greedy," i.e., not idle if they can perform a production task. We calculate the distribution of doubling times of such optimally scheduled self-replicating factories, and find it has a universal form, log-Fréchet, that is not sensitive to many microscopic details. Analyzing two recent datasets of Escherichia coli growing in a stationary medium, we find excellent agreement between the observed doubling-time distribution and the predicted universal distribution, suggesting E. coli is optimally scheduling its replication. Greedy scheduling appears to be a simple generic route to optimal scheduling when speed is the optimization criterion. Other criteria, such as efficiency, require more elaborate scheduling policies and tighter regulation.
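The "greedy" principle invoked here, that a processing unit never idles while a task is available, is the same one behind classical list scheduling. A minimal sketch on identical parallel machines (task durations and machine count are illustrative, and this is not the paper's self-replication model):

```python
import heapq

def greedy_makespan(tasks, machines):
    """Greedy list scheduling: assign each task to the earliest-free machine,
    so no machine idles while a task is waiting. Returns the makespan."""
    loads = [0.0] * machines          # current finish time of each machine
    heapq.heapify(loads)
    for t in sorted(tasks, reverse=True):   # longest-processing-time order
        finish = heapq.heappop(loads) + t   # earliest-free machine takes the task
        heapq.heappush(loads, finish)
    return max(loads)

span = greedy_makespan([3, 3, 2, 2, 2], machines=2)
```

Greedy list scheduling is not always optimal (here the optimum is 6, splitting the tasks as 3+3 versus 2+2+2, while the greedy rule yields 7), but it is provably within a constant factor of optimal, which is the sense in which greediness is a cheap route to near-optimal completion times.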

  16. Characterization of Deciliation-Regeneration Process of Tetrahymena Pyriformis for Cellular Robot Fabrication

    Institute of Scientific and Technical Information of China (English)

    Dal Hyung Kim; Sean E. Brigandi; Paul Kim; Doyoung Byun; Min Jun Kim

    2011-01-01

    Artificial magnetotactic Tetrahymena pyriformis GL (T. pyriformis) cells were created by the internalization of iron oxide nanoparticles and became controllable with a time-varying external magnetic field. Thus, T. pyriformis can be utilized as a cellular robot to conduct micro-scale tasks such as transportation and manipulation. To complete these tasks, loading inorganic or organic materials onto the cell body is essential, but functionalization of the cell membrane is obstructed by the cells' motile organelles, the cilia. Dibucaine HCl, a local anesthetic, removes the cilia from the cell body, and a functional group can be absorbed more efficiently during cilia regeneration. In this paper, we characterize the recovery of artificial magnetotactic T. pyriformis after the deciliation process in order to optimize a cellular-robot fabrication process. After sufficient time to recover, the motility rate and the average velocity of the deciliated cells were six and ten percent lower, respectively, than those of non-deciliated cells. We showed that the motile cells can still be controlled using magnetotaxis after recovery, making T. pyriformis a good candidate for use as a cellular robot.

  17. Energy-Efficient Relay Selection and Optimal Relay Location in Cooperative Cellular Networks with Asymmetric Traffic

    CERN Document Server

    Yang, Wei; Sun, Wanlu

    2010-01-01

    Energy-efficient communication is an important requirement for mobile relay networks due to the limited battery power of user terminals. This paper considers energy-efficient relaying schemes through selection of mobile relays in cooperative cellular systems with asymmetric traffic. The total energy consumption per information bit of the battery-powered terminals, i.e., the mobile station (MS) and the relay, is derived in theory. In the Joint Uplink and Downlink Relay Selection (JUDRS) scheme we proposed, the relay which minimizes the total energy consumption is selected. Additionally, the energy-efficient cooperation regions are investigated, and the optimal relay location is found for cooperative cellular systems with asymmetric traffic. The results reveal that the MS-relay and the relay-base station (BS) channels have different influence over relay selection decisions for optimal energy-efficiency. Information theoretic analysis of the diversity-multiplexing tradeoff (DMT) demonstrates that the proposed sc...

  18. Topological optimization for the design of microstructures of isotropic cellular materials

    Science.gov (United States)

    Radman, A.; Huang, X.; Xie, Y. M.

    2013-11-01

    The aim of this study was to design isotropic periodic microstructures of cellular materials using the bidirectional evolutionary structural optimization (BESO) technique. The goal was to determine the optimal distribution of material phase within the periodic base cell. Maximizing bulk modulus or shear modulus was selected as the objective of the material design subject to an isotropy constraint and a volume constraint. The effective properties of the material were found using the homogenization method based on finite element analyses of the base cell. The proposed BESO procedure utilizes the gradient-based sensitivity method to impose the isotropy constraint and gradually evolve the microstructures of cellular materials to an optimum. Numerical examples show the computational efficiency of the approach. A series of new and interesting microstructures of isotropic cellular materials that maximize the bulk or shear modulus have been found and presented. The methodology can be extended to incorporate other material properties of interest such as designing isotropic cellular materials with negative Poisson's ratio.

  19. Manufacturing processes of cellular concrete products for the construction

    Directory of Open Access Journals (Sweden)

    Fakhratov Muhammet

    2017-01-01

    Full Text Available Cellular concrete takes the lead in the world of construction as a structural insulation material used in the construction and reconstruction of buildings and structures of various purposes. In this artificial stone building material, pores are distributed relatively evenly and occupy from 20 to 90% of the concrete volume, ensuring good thermal qualities that allow cellular concrete houses to retain warmth well. For the production of cellular concrete, Portland cement, burnt lime, and finely pulverized blast-furnace slags with a hardening activator are used as binders. As silica components, quartz sand or fly ash obtained from the combustion of pulverized fuel in power plants, as well as secondary products of various ore-dressing treatments, are used. The low density and high thermal insulation properties of cellular concrete enable walls 3 times lighter than brick walls and 1.7 times lighter than walls of ceramsite concrete. The thermal insulation and mechanical properties of cellular concrete make it possible to construct single-layer enclosing structures with the desired thermal resistance. Cellular concrete is divided into aerated concrete and foam concrete, whose physical-mechanical and operational performance is, other things being equal, almost identical. By the method of hydrothermal treatment, cellular concretes are divided into two groups: autoclave-cured and non-autoclave-cured.

  20. Optimal Design of Multistage Two-Dimensional Cellular-Cored Sandwich Panel Heat Exchanger

    Directory of Open Access Journals (Sweden)

    Yongcun Zhang

    2014-08-01

    Full Text Available For a two-dimensional (2D) cellular-cored sandwich panel heat exchanger, there exists an optimum cell size that achieves the maximum heat transfer for a prescribed pressure drop when the length is fixed and the two plates are isothermal. In engineering design, however, it is difficult to find 2D cellular materials with the ideal cell size, because the cell size must be selected from those commercially available, which are discrete, not continuous. In order to obtain the maximum heat dissipation, an innovative design scheme is proposed in which the sandwich panel heat exchanger is divided into multiple stages in the direction of fluid flow, with the 2D cellular material in each stage having a specific cell size. An analytical model is presented to evaluate the thermal performance of the multistage sandwich panel heat exchanger when all 2D cellular materials have the same porosity. A new parameter named the equivalent cell size (ECS) is also defined, which depends on the cell size and length of the cellular material in each stage. Results show that the maximum-heat-dissipation design of the multistage sandwich panel heat exchanger reduces to making the ECS equal to the optimal cell size of the single-stage exchanger.

  1. Novel Optimization Approach to Mixing Process Intensification

    Institute of Scientific and Technical Information of China (English)

    Guo Kai; Liu Botan; Li Qi; Liu Chunjiang

    2015-01-01

    An approach is presented to intensify the mixing process. First, a novel concept, the dissipation of mass transfer ability (DMA) associated with convective mass transfer, is defined via an analogy to heat-work conversion. Accordingly, the focus of mass transfer enhancement can be shifted to seeking the extremum of the DMA of the system. To this end, an optimization principle is proposed, and a mathematical model is developed to formulate the optimization as a variational problem. Subsequently, the intensification of the mixing process for a gas mixture in a micro-tube is provided to demonstrate the proposed principle. In this demonstration example, an optimized velocity field is obtained in which the mixing ability is improved, i.e., the mixing process can be intensified by adjusting the velocity field in the related equipment. A specific procedure is therefore provided to produce a mixer with geometric irregularities associated with the ideal velocity field.

  2. Synthesis of optimal adsorptive carbon capture processes.

    Energy Technology Data Exchange (ETDEWEB)

    chang, Y.; Cozad, A.; Kim, H.; Lee, A.; Vouzis, P.; Konda, M.; Simon, A.; Sahinidis, N.; Miller, D.

    2011-01-01

    Solid sorbent carbon capture systems have the potential to require significantly lower regeneration energy compared to aqueous monoethanolamine (MEA) systems. To date, the majority of work on solid sorbents has focused on developing the sorbent materials themselves. In order to advance these technologies, it is necessary to design systems that can exploit the full potential and unique characteristics of these materials. The Department of Energy (DOE) recently initiated the Carbon Capture Simulation Initiative (CCSI) to develop computational tools to accelerate the commercialization of carbon capture technology. Solid sorbents is the first Industry Challenge Problem considered under this initiative. An early goal of the initiative is to demonstrate a superstructure-based framework to synthesize an optimal solid sorbent carbon capture process. For a given solid sorbent, there are a number of potential reactors and reactor configurations consisting of various fluidized bed reactors, moving bed reactors, and fixed bed reactors. Detailed process models for these reactors have been built using Aspen Custom Modeler; however, such models are computationally intractable for large optimization-based process synthesis. Thus, in order to facilitate the use of these models for process synthesis, we have developed an approach for generating simple algebraic surrogate models that can be used in an optimization formulation. This presentation will describe the superstructure formulation, which uses these surrogate models to choose among various process alternatives, and will describe the resulting optimal process configuration.

  3. Cellular modelling using P systems and process algebra

    Institute of Scientific and Technical Information of China (English)

    Francisco J.Romero-Campero; Marian Gheorghe; Gabriel Ciobanu; John M. Auld; Mario J. Pérez-Jiménez

    2007-01-01

    In this paper various molecular chemical interactions are modelled under different computational paradigms. P systems and π-calculus are used to describe intra-cellular reactions like protein-protein interactions and gene regulation control.

  4. Synthesis and Optimization of a Methanol Process

    DEFF Research Database (Denmark)

    Grue, J.; Bendtsen, Jan Dimon

    2003-01-01

    In the present paper, a simulation model for a methanol process is proposed. The objective is to develop a model for flowsheet optimization, which requires simple thermodynamic and unit operation models. Simplified thermodynamic models are combined with a more advanced model for the rate of reaction...

  5. Bidirectional optimization of the melting spinning process.

    Science.gov (United States)

    Liang, Xiao; Ding, Yongsheng; Wang, Zidong; Hao, Kuangrong; Hone, Kate; Wang, Huaping

    2014-02-01

    A bidirectional optimizing approach for the melting spinning process based on an immune-enhanced neural network is proposed. The proposed bidirectional model can not only reveal the internal nonlinear relationship between the process configuration and the quality indices of the fibers as final product, but also provide a tool for engineers to develop new fiber products with expected quality specifications. A neural network is taken as the basis for the bidirectional model, and an immune component is introduced to enlarge the searching scope of the solution field so that the neural network has a greater chance of finding an appropriate and reasonable solution, and the error of prediction can therefore be eliminated. The proposed intelligent model can also help to determine what kind of process configuration should be made in order to produce satisfactory fiber products. To make the proposed model practical for manufacturing, a software platform is developed. Simulation results show that the proposed model can eliminate the approximation error raised by the neural network-based optimizing model, which is due to the extension of focusing scope by the artificial immune mechanism. Meanwhile, the proposed model with the corresponding software can conduct optimization in two directions, namely, process optimization and category development, and the corresponding results outperform those with an ordinary neural network-based intelligent model. It is also proved that the proposed model has the potential to act as a valuable tool from which the engineers and decision makers of the spinning process could benefit.

  6. Study on Parameter Optimization Design of Drum Brake Based on Hybrid Cellular Multiobjective Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Yi Zhang

    2012-01-01

    Full Text Available In consideration of the significant role the brake plays in ensuring the fast and safe running of vehicles, and since existing parameter optimization models of brakes are far from practical application, this paper proposes a multiobjective optimization model of the drum brake, aiming at maximizing the braking efficiency and minimizing the volume and temperature rise of the drum brake. As commonly used optimization algorithms suffer from certain deficiencies, we present a differential evolution cellular multiobjective genetic algorithm (DECell) by introducing a differential evolution strategy into the canonical cellular genetic algorithm for tackling this problem. With DECell, the obtained Pareto front can be as close as possible to the exact Pareto front, and the diversity of nondominated individuals is better maintained. The experiments on the test functions reveal that DECell performs well in solving high-dimension nonlinear multiobjective problems. The results of optimizing the new brake model indicate that DECell clearly outperforms the popular algorithm NSGA-II concerning the number of obtained brake design parameter sets and the speed and stability of finding them.
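The core DECell mechanic, a differential-evolution step restricted to a cellular (grid) neighbourhood, can be sketched in a few lines. The toy below is single-objective (the paper's algorithm is Pareto-based and multiobjective) and minimizes the sphere function purely to show the neighbourhood-restricted update; all parameter values are illustrative.

```python
import random

# Single-objective toy of the DECell mechanic: a cellular GA on a toroidal
# grid where each individual is updated with a differential-evolution step
# built only from its von Neumann neighbours, then greedily replaced.

random.seed(1)
SIDE, DIM, F, GENS = 5, 3, 0.7, 300

def sphere(x):
    return sum(v * v for v in x)

grid = [[[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(SIDE)]
        for _ in range(SIDE)]

def neighbours(i, j):
    # von Neumann neighbourhood with wrap-around (toroidal grid)
    return [grid[(i - 1) % SIDE][j], grid[(i + 1) % SIDE][j],
            grid[i][(j - 1) % SIDE], grid[i][(j + 1) % SIDE]]

for _ in range(GENS):
    for i in range(SIDE):
        for j in range(SIDE):
            a, b, c = random.sample(neighbours(i, j), 3)
            trial = [a[k] + F * (b[k] - c[k]) for k in range(DIM)]
            if sphere(trial) < sphere(grid[i][j]):   # greedy replacement
                grid[i][j] = trial

best = min(sphere(ind) for row in grid for ind in row)
print(best)   # the grid contracts towards the optimum at the origin
```

Restricting donors to the neighbourhood is what slows down takeover and preserves diversity, which is the property the paper relies on for maintaining a spread-out Pareto front.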

  7. On the optimization of endoreversible processes

    Science.gov (United States)

    Pescetti, D.

    2014-03-01

    This paper is intended for undergraduates and specialists in thermodynamics and related areas. We consider and discuss the optimization of endoreversible thermodynamic processes under the condition of maximum work production. Explicit thermodynamic analyses of the solutions are carried out for the Novikov and Agrawal processes. It is shown that the efficiencies at maximum work production and maximum power output are not necessarily equal. They are for the Novikov process but not for the Agrawal process. The role of the constraints is put into evidence. The physical aspects are enhanced by the simplicity of the involved mathematics.
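For reference, the Novikov result alluded to above can be sketched in a few lines; this is the standard endoreversible textbook derivation (notation assumed here: hot reservoir T_h, cold reservoir T_c, internal hot-side temperature T_i, Newtonian heat-transfer conductance K), not reproduced from the paper itself:

```latex
% Novikov engine: heat enters through a finite conductance K from T_h to the
% working fluid at T_i, and the internal cycle is reversible (endoreversible).
P = K\,(T_h - T_i)\left(1 - \frac{T_c}{T_i}\right),
\qquad
\frac{\partial P}{\partial T_i} = 0 \;\Rightarrow\; T_i = \sqrt{T_h T_c},
\qquad
\eta_{\max P} = 1 - \sqrt{\frac{T_c}{T_h}}.
```

Maximizing work over a fixed process duration need not give the same operating point, which is the distinction the paper develops for the Agrawal process.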

  8. Optimal control of a CSTR process

    Directory of Open Access Journals (Sweden)

    A. Soukkou

    2008-12-01

    Full Text Available Designing an effective criterion and learning algorithm for finding the best structure is a major problem in the control design process. In this paper, the fuzzy optimal control methodology is applied to the design of the feedback loops of an Exothermic Continuous Stirred Tank Reactor system. The objective of the design process is to find an optimal structure and gains for the Robust and Optimal Takagi-Sugeno Fuzzy Controller (ROFLC). The control signal thus obtained will minimize a performance index, which is a function of the tracking/regulating errors, the energy of the control signal applied to the system, and the number of fuzzy rules. Genetic learning is proposed for constructing the ROFLC. The chromosome genes are arranged into two parts: the binary-coded part contains the control genes, and the real-coded part contains the gene parameters representing the fuzzy knowledge base. This chromosome formulation enables the fuzzy sets and rules to be optimally reduced. The performance of the ROFLC is compared to that found by the traditional PD controller with Genetic Optimization (PD_GO). Simulations demonstrate that the proposed ROFLC and PD_GO have successfully met the design specifications.
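The mixed binary/real chromosome described above can be sketched generically. Everything in the toy below is a hypothetical assumption (the target parameters, the cost function, the GA settings) standing in for the paper's performance index and fuzzy knowledge base, not the CSTR model itself:

```python
import random

# Generic sketch of a mixed binary/real chromosome GA: the binary part
# switches "rules" on/off, the real part holds their parameters, and the
# cost rewards accuracy while charging a small penalty per active rule
# (favouring a compact rule base, as in the abstract).

random.seed(6)
N_RULES = 6
TARGET = [0.5, -1.2, 0.0, 2.0, 0.0, 0.3]   # hypothetical ideal parameters

def cost(ind):
    bits, params = ind
    err = sum((p - t) ** 2 for b, p, t in zip(bits, params, TARGET) if b)
    missing = sum(abs(t) for b, t in zip(bits, TARGET) if not b)
    return err + missing + 0.05 * sum(bits)   # small charge per active rule

def random_ind():
    return ([random.randint(0, 1) for _ in range(N_RULES)],
            [random.uniform(-3, 3) for _ in range(N_RULES)])

pop = [random_ind() for _ in range(40)]
for _ in range(200):
    pop.sort(key=cost)
    pop = pop[:20]                                   # elitist truncation
    children = []
    while len(children) < 20:
        (b1, p1), (b2, p2) = random.sample(pop, 2)
        bits = [random.choice(pair) for pair in zip(b1, b2)]   # uniform crossover
        params = [(x + y) / 2 for x, y in zip(p1, p2)]         # blend crossover
        i = random.randrange(N_RULES)
        if random.random() < 0.3:
            bits[i] ^= 1                                       # bit-flip mutation
        params[i] += random.gauss(0, 0.3)                      # Gaussian mutation
        children.append((bits, params))
    pop += children

best = min(pop, key=cost)
print(best[0], round(cost(best), 3))
```

The per-rule penalty is what drives the bit string to switch off rules whose contribution does not pay for their cost, mirroring the paper's optimal reduction of fuzzy sets and rules.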

  9. PROPOSAL OF SPATIAL OPTIMIZATION OF PRODUCTION PROCESS IN PROCESS DESIGNER

    Directory of Open Access Journals (Sweden)

    Peter Malega

    2015-03-01

    Full Text Available This contribution is focused on optimizing the use of space in the production process using the software Process Designer. The aim of this contribution is to suggest possible improvements to the existing layout of the selected production process. The production process was analysed in terms of inputs, outputs and course of actions. Nowadays there are many software solutions aimed at optimizing the use of space. One of these software products is Process Designer, which belongs to the product line Tecnomatix. This software is primarily aimed at production planning. With Process Designer, it is possible to design the production layout and subsequently to analyse it or change it according to the current needs of the company.

  10. Computer performance optimization systems, applications, processes

    CERN Document Server

    Osterhage, Wolfgang W

    2013-01-01

    Computing power performance was important at times when hardware was still expensive, because hardware had to be put to the best use. Later on this criterion was no longer critical, since hardware had become inexpensive. Meanwhile, however, people have realized that performance again plays a significant role, because of the major drain on system resources involved in developing complex applications. This book distinguishes between three levels of performance optimization: the system level, application level and business processes level. On each, optimizations can be achieved and cost-cutting p

  11. Calibrating floor field cellular automaton models for pedestrian dynamics by using likelihood function optimization

    Science.gov (United States)

    Lovreglio, Ruggiero; Ronchi, Enrico; Nilsson, Daniel

    2015-11-01

    The formulation of pedestrian floor field cellular automaton models is generally based on hypothetical assumptions to represent reality. This paper proposes a novel methodology to calibrate these models using experimental trajectories. The methodology is based on likelihood function optimization and allows verifying whether the parameters defining a model statistically affect pedestrian navigation. Moreover, it allows comparing different model specifications or the parameters of the same model estimated using different data collection techniques, e.g. virtual reality experiment, real data, etc. The methodology is here implemented using navigation data collected in a Virtual Reality tunnel evacuation experiment including 96 participants. A trajectory dataset in the proximity of an emergency exit is used to test and compare different metrics, i.e. Euclidean and modified Euclidean distance, for the static floor field. In the present case study, modified Euclidean metrics provide better fitting with the data. A new formulation using random parameters for pedestrian cellular automaton models is also defined and tested.
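The calibration idea is easy to sketch end to end. All data below are synthetic, not the paper's VR trajectories: we assume a one-parameter logit choice model over candidate cells driven by the static floor field, generate choices with a known sensitivity k, and recover k by maximizing the log-likelihood.

```python
import math
import random

# Synthetic illustration of likelihood-based floor-field calibration: at each
# step a pedestrian picks one of its candidate cells with probability
# proportional to exp(-k * S), where S is the static floor field value
# (distance to the exit) and k is the sensitivity parameter to estimate.

random.seed(0)
TRUE_K = 1.5

def choose(options, k):
    # sample one option from the logit model
    w = [math.exp(-k * s) for s in options]
    r = random.uniform(0, sum(w))
    for i, wi in enumerate(w):
        r -= wi
        if r <= 0:
            return i
    return len(w) - 1

obs = []
for _ in range(2000):                      # 2000 observed single-step choices
    options = [random.uniform(0, 3) for _ in range(random.randint(3, 5))]
    obs.append((options, choose(options, TRUE_K)))

def log_likelihood(k):
    ll = 0.0
    for options, chosen in obs:
        z = sum(math.exp(-k * s) for s in options)
        ll += -k * options[chosen] - math.log(z)
    return ll

# The log-likelihood of a one-parameter logit model is concave in k, so a
# simple ternary search suffices as the likelihood optimization here.
lo, hi = 0.0, 5.0
for _ in range(80):
    m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
    if log_likelihood(m1) < log_likelihood(m2):
        lo = m1
    else:
        hi = m2
k_hat = (lo + hi) / 2
print(round(k_hat, 2))   # close to TRUE_K = 1.5
```

The same machinery supports the paper's statistical comparisons: two model specifications (e.g. Euclidean versus modified Euclidean fields) can be compared through their maximized log-likelihoods.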

  12. Simulation and optimization of fractional crystallization processes

    DEFF Research Database (Denmark)

    Thomsen, Kaj; Rasmussen, Peter; Gani, Rafiqul

    1998-01-01

    A general method for the calculation of various types of phase diagrams for aqueous electrolyte mixtures is outlined. It is shown how the thermodynamic equilibrium precipitation process can be used to satisfy the operational needs of industrial crystallizer/centrifuge units. Examples of simulation and optimization of fractional crystallization processes are shown. In one of these examples, a process with multiple steady states is analyzed. The thermodynamic model applied for describing the highly non-ideal aqueous electrolyte systems is the Extended UNIQUAC model. (C) 1998 Published by Elsevier Science Ltd...

  13. Comprehensive Optimization Process of Paranasal Sinus Radiography

    Energy Technology Data Exchange (ETDEWEB)

    Saarakkala, S. (Dept. of Clinical Radiology, Kuopio Univ. Hospital, Kuopio (Finland)); Nironen, K.; Hermunen, H.; Aarnio, J.; Heikkinen, J.O. (Dept. of Radiology, Etelä-Savo Hospital District, Mikkeli Central Hospital, Mikkeli (Finland))

    2009-04-15

    Background: The optimization of radiological examinations is important in order to reduce unnecessary patient radiation exposure. Purpose: To perform a comprehensive optimization process for paranasal sinus radiography at Mikkeli Central Hospital (Finland). Material and Methods: Patients with suspicion of acute sinusitis were imaged with a Kodak computed radiography (CR) system (n=20) and with a Philips digital radiography (DR) system (n=30) using focus-detector distances (FDDs) of 110 cm, 150 cm, or 200 cm. Patients' radiation exposure was determined in terms of entrance surface dose and dose-area product. Furthermore, an anatomical phantom was used for the estimation of point doses inside the head. Clinical image quality was evaluated by an experienced radiologist, and physical image quality was evaluated from the digital radiography phantom. Results: Patient doses were significantly lower and image quality better with the DR system compared to the CR system. The differences in patient dose and physical image quality were small with varying FDD. Clinical image quality of the DR system was lowest with FDD of 200 cm. Further, imaging with FDD of 150 cm was technically easier for the technologist to perform than with FDD of 110 cm. Conclusion: After optimization, it was recommended that the DR system with FDD of 150 cm should always be used at Mikkeli Central Hospital. We recommend this kind of comprehensive approach in all optimization processes of radiological examinations.

  14. Reducing residual stresses and deformations in selective laser melting through multi-level multi-scale optimization of cellular scanning strategy

    Science.gov (United States)

    Mohanty, Sankhya; Hattel, Jesper H.

    2016-04-01

    Residual stresses and deformations continue to remain one of the primary challenges towards expanding the scope of selective laser melting as an industrial scale manufacturing process. While process monitoring and feedback-based process control have shown significant potential, there is still a dearth of techniques to tackle the issue. Numerical modelling of the selective laser melting process has thus been an active area of research in the last few years. However, large computational resource requirements have slowed the usage of these models for optimizing the process. In this paper, a calibrated, fast, multiscale thermal model coupled with a 3D finite element mechanical model is used to simulate residual stress formation and deformations during selective laser melting. The resulting reduction in thermal model computation time allows evolutionary algorithm-based optimization of the process. A multilevel optimization strategy is adopted using a customized genetic algorithm developed for optimizing cellular scanning strategy for selective laser melting, with an objective of reducing residual stresses and deformations. The resulting thermo-mechanically optimized cellular scanning strategies are compared with standard scanning strategies and have been used to manufacture standard samples.

  15. Optimization Techniques for RFID Complex Event Processing

    Institute of Scientific and Technical Information of China (English)

    Hai-Long Liu; Qun Chen; Zhan-Huai Li

    2009-01-01

    One research issue crucial to wider adoption of Radio Frequency Identification (RFID) technology is how to efficiently transform sequences of RFID readings into meaningful business events. Contrary to traditional events, RFID readings are usually of high volume and velocity, and have attributes representing their reading objects, occurrence times and spots. Based on these characteristics and the Non-deterministic Finite Automata (NFA) implementation framework, this paper studies the performance issues of RFID complex event processing and proposes corresponding optimization techniques. Our techniques include: (1) taking advantage of negation events or exclusiveness between events to prune intermediate results, thus reducing memory consumption; (2) purposefully reordering the join operations between events according to their different selectivities, to improve overall efficiency and achieve higher stream throughput; (3) utilizing slot-based or B+-tree-based approaches to optimize processing performance under the time window constraint. We present analytical results for these techniques and validate their effectiveness through experiments.
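The time-window pruning in item (3) can be illustrated with a minimal sketch. The event types, tags, and timestamps below are hypothetical, and a single SEQ(A, B) pattern stands in for the paper's much more general NFA framework:

```python
from collections import deque

# Windowed complex-event detection sketch: detect SEQ(A, B) per tag within a
# time window W over RFID-style readings (tag, event_type, timestamp),
# discarding stale partial matches so memory stays bounded -- the point of
# the time-window optimization.

W = 10                 # window length (seconds)
partial = {}           # tag -> deque of timestamps of pending A events
matches = []

def on_reading(tag, etype, t):
    q = partial.setdefault(tag, deque())
    while q and t - q[0] > W:        # prune partial matches outside the window
        q.popleft()
    if etype == "A":
        q.append(t)
    elif etype == "B" and q:
        matches.append((tag, q.popleft(), t))

stream = [("t1", "A", 0), ("t2", "A", 1), ("t1", "B", 4),
          ("t2", "B", 15),           # too late: t2's A at t=1 was pruned
          ("t1", "A", 20), ("t1", "B", 25)]
for r in stream:
    on_reading(*r)
print(matches)   # [('t1', 0, 4), ('t1', 20, 25)]
```

Because pruning happens on every reading, the per-tag state never holds partial matches older than W, which is exactly the memory bound the optimization buys.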

  16. TDMA Achieves the Optimal Diversity Gain in Relay-Assisted Cellular Networks

    CERN Document Server

    Bi, Suzhi; Zhang,

    2011-01-01

    In multi-access wireless networks, transmission scheduling is a key component that determines the efficiency and fairness of wireless spectrum allocation. At one extreme, greedy opportunistic scheduling that allocates airtime to the user with the largest instantaneous channel gain achieves the optimal spectrum efficiency and transmission reliability but the poorest user-level fairness. At the other extreme, fixed TDMA scheduling achieves the fairest airtime allocation but the lowest spectrum efficiency and transmission reliability. To balance the two competing objectives, extensive research efforts have been spent on designing opportunistic scheduling schemes that reach certain tradeoff points between the two extremes. In this paper and in contrast to the conventional wisdom, we find that in relay-assisted cellular networks, fixed TDMA achieves the same optimal diversity gain as greedy opportunistic scheduling. In addition, by incorporating very limited opportunism, a simple relaxed-TDMA scheme asymptotically...

  17. Optimization Framework and Graph-Based Approach for Relay-Assisted Bidirectional OFDMA Cellular Networks

    CERN Document Server

    Liu, Yuan; Li, Bin; Shen, Hui

    2010-01-01

    This paper considers a relay-assisted bidirectional cellular network where the base station (BS) communicates with each mobile station (MS) using OFDMA for both uplink and downlink. The goal is to improve the overall system performance by exploring the full potential of the network in various dimensions including user, subcarrier, relay, and bidirectional traffic. In this work, we first introduce a novel three-time-slot time-division duplexing (TDD) transmission protocol. This protocol unifies direct transmission, one-way relaying and network-coded two-way relaying between the BS and each MS. Using the proposed three-time-slot TDD protocol, we then propose an optimization framework for resource allocation to achieve the following gains: cooperative diversity (via relay selection), network coding gain (via bidirectional transmission mode selection), and multiuser diversity (via subcarrier assignment). We formulate the problem as a combinatorial optimization problem, which is NP-complete. To make it more tracta...

  18. Design Process Optimization Based on Design Process Gene Mapping

    Institute of Scientific and Technical Information of China (English)

    LI Bo; TONG Shu-rong

    2011-01-01

    The idea of genetic engineering is introduced into the area of product design to improve design efficiency. A method for design process optimization based on the design process gene is proposed through analyzing the correlation between the design process gene and the characteristics of the design process. The concept of the design process gene is analyzed and categorized into five categories - the task specification gene, the concept design gene, the overall design gene, the detailed design gene and the processing design gene - in the light of the five design phases. The elements and their interactions involved in each kind of design process gene are analyzed, and a design process gene map is drawn, with its structure disclosed on the basis of its function.

  19. Linear programming embedded particle swarm optimization for solving an extended model of dynamic virtual cellular manufacturing systems

    Directory of Open Access Journals (Sweden)

    H. Rezazadeh

    2009-04-01

    Full Text Available The concept of the virtual cellular manufacturing system (VCMS) is finding acceptance among researchers as an extension to group technology. In fact, in order to realize the benefits of a cellular manufacturing system in the functional layout, the VCMS creates provisional groups of resources (machines, parts and workers) in the production planning and control system. This paper develops a mathematical model to design the VCMS under a dynamic environment with a more integrated approach where production planning, system reconfiguration and workforce requirement decisions are incorporated. The advantages of the proposed model are as follows: considering the operations sequence, alternative process plans for part types, machine time-capacity, worker time-capacity, cross-training, lot splitting, maximal cell size, and balanced workload for cells and workers. An efficient linear programming embedded particle swarm optimization algorithm is used to solve the proposed model. The algorithm searches over the 0-1 integer variables, and for each 0-1 integer solution visited, the corresponding values of the continuous variables are determined by solving a linear programming sub-problem using the simplex algorithm. Numerical examples show that the proposed method is efficient and effective in searching for near optimal solutions.
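The decomposition idea, searching over the 0-1 variables while solving a continuous sub-problem exactly for each visited point, can be sketched on a toy instance. Everything below is illustrative, not the paper's VCMS model: the machine data are made up, a sigmoid-based binary PSO replaces the paper's variant, and the LP sub-problem is solved by a greedy cheapest-first fill, which is exact only because this toy LP has a single coupling constraint.

```python
import math
import random

# Toy analogue of LP-embedded PSO: PSO searches over the 0-1 decision of
# which machines to open; for each visited 0-1 vector the continuous
# lot-splitting sub-problem is an LP solved exactly (here by greedy fill).

random.seed(2)
FIXED = [40, 30, 25, 50]        # fixed cost of opening each machine
CAP = [60, 50, 40, 80]          # machine capacities
COST = [1.0, 1.5, 2.0, 0.8]     # unit processing costs
DEMAND = 120

def lp_subproblem(mask):
    # LP: min sum c_i x_i  s.t.  sum x_i = DEMAND, 0 <= x_i <= CAP[i] if open.
    # One coupling constraint => cheapest-first greedy fill is LP-optimal.
    left, total = DEMAND, 0.0
    for i in sorted(range(4), key=lambda k: COST[k]):
        if mask[i]:
            x = min(left, CAP[i])
            total += COST[i] * x
            left -= x
    return total if left == 0 else float("inf")   # not enough open capacity

def objective(mask):
    return sum(f for f, m in zip(FIXED, mask) if m) + lp_subproblem(mask)

# Binary PSO: velocities are squashed by a sigmoid into bit probabilities.
n = 4
swarm = [{"x": [random.randint(0, 1) for _ in range(n)], "v": [0.0] * n}
         for _ in range(12)]
for p in swarm:
    p["best"], p["bf"] = p["x"][:], objective(p["x"])
g = min(swarm, key=lambda q: q["bf"])
gx, gf = g["best"][:], g["bf"]

for _ in range(60):
    for p in swarm:
        for i in range(n):
            p["v"][i] = (0.7 * p["v"][i]
                         + 1.4 * random.random() * (p["best"][i] - p["x"][i])
                         + 1.4 * random.random() * (gx[i] - p["x"][i]))
            p["x"][i] = 1 if random.random() < 1 / (1 + math.exp(-p["v"][i])) else 0
        f = objective(p["x"])
        if f < p["bf"]:
            p["best"], p["bf"] = p["x"][:], f
            if f < gf:
                gx, gf = p["x"][:], f

print(gx, gf)   # best opening decision found and its total cost
```

The split mirrors the paper's scheme: the swarm handles the combinatorial decisions, while each continuous sub-problem is dispatched to an exact LP solver.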

  20. On process optimization considering LCA methodology.

    Science.gov (United States)

    Pieragostini, Carla; Mussati, Miguel C; Aguirre, Pío

    2012-04-15

    The goal of this work is to research the state-of-the-art in process optimization techniques and tools based on LCA, focused on the process engineering field. A collection of methods, approaches, applications, specific software packages, and insights regarding experiences and progress made in applying the LCA methodology coupled to optimization frameworks is provided, and general trends are identified. The "cradle-to-gate" concept to define the system boundaries is the most used approach in practice, instead of the "cradle-to-grave" approach. Normally, the relationship between inventory data and impact category indicators is linearly expressed by the characterization factors; then, synergic effects of the contaminants are neglected. Among the LCIA methods, the eco-indicator 99, which is based on the endpoint category and the panel method, is the most used in practice. A single environmental impact function, resulting from the aggregation of environmental impacts, is formulated as the environmental objective in most analyzed cases. SimaPro is the most used software for LCA applications in the literature analyzed. Multi-objective optimization is the most used approach for dealing with this kind of problem, where the ε-constraint method for generating the Pareto set is the most applied technique. However, a renewed interest in formulating a single economic objective function in optimization frameworks can be observed, favored by the development of life cycle cost software and progress made in assessing costs of environmental externalities. Finally, a trend to deal with multi-period scenarios in integrated LCA-optimization frameworks can be distinguished, providing more accurate results upon data availability.
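The ε-constraint method mentioned above is easy to state concretely: optimize one objective while capping the other, then sweep the cap to trace the Pareto set. The process alternatives and their cost/impact numbers below are hypothetical, used only to show the mechanics:

```python
# Sketch of the epsilon-constraint method on a toy discrete design space:
# minimize cost subject to an environmental-impact cap, sweeping the cap
# over all attained impact levels to enumerate the Pareto set.

designs = [  # (name, cost, impact) -- hypothetical process alternatives
    ("A", 100, 80), ("B", 120, 55), ("C", 150, 40),
    ("D", 160, 42), ("E", 200, 30),
]

def eps_constraint(eps):
    # cheapest design whose impact does not exceed the cap eps
    feasible = [d for d in designs if d[2] <= eps]
    return min(feasible, key=lambda d: d[1]) if feasible else None

pareto = []
for eps in sorted({d[2] for d in designs}):   # sweep the impact cap
    sol = eps_constraint(eps)
    if sol and sol not in pareto:
        pareto.append(sol)
print([d[0] for d in pareto])   # → ['E', 'C', 'B', 'A']
```

Design D never appears: it is dominated by C (higher cost and higher impact), which is exactly the filtering the Pareto set performs.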

  1. Optimized Hybrid Resource Allocation in Wireless Cellular Networks with and without Channel Reassignment

    Directory of Open Access Journals (Sweden)

    Xin Wu

    2010-01-01

    Full Text Available In cellular networks, it is important to determine an optimal channel assignment scheme so that the available channels, which are considered as “limited” resources in cellular networks, are used as efficiently as possible. The objective of the channel assignment scheme is to minimize the call-blocking and the call-dropping probabilities. In this paper, we present two efficient integer linear programming (ILP formulations, for optimally allocating a channel (from a pool of available channels to an incoming call such that both “hard” and “soft” constraints are satisfied. Our first formulation, ILP1, does not allow channel reassignment of the existing calls, while our second formulation, ILP2, allows such reassignment. Both formulations can handle hard constraints, which includes co-site and adjacent channel constraints, in addition to the standard co-channel constraints. The simplified problem (with only co-channel constraints can be treated as a special case of our formulation. In addition to the hard constraints, we also consider soft constraints, such as, the packing condition, resonance condition, and limiting rearrangements, to further improve the network performance. We present the simulation results on a benchmark 49 cell environment with 70 channels that validate the performance of our approach.
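The hard constraints described above can be made concrete on a toy instance (a 3-cell problem, not the paper's 49-cell, 70-channel benchmark, and exhaustive search rather than ILP). Co-site, co-channel, and adjacent-channel requirements are all encoded in one separation matrix:

```python
from itertools import combinations

# Toy exact solver for channel-assignment hard constraints: sep[i][j] is the
# minimum frequency separation required between any channel in cell i and
# any channel in cell j. The diagonal encodes the co-site constraint,
# sep >= 2 off-diagonal an adjacent-channel constraint, sep = 0 no constraint.

sep = [[3, 2, 0],
       [2, 3, 2],
       [0, 2, 3]]
demand = [2, 2, 2]              # channels required per cell

def feasible(assign):
    for i in range(len(demand)):
        for j in range(len(demand)):
            for a in assign[i]:
                for b in assign[j]:
                    if (i, a) != (j, b) and abs(a - b) < sep[i][j]:
                        return False
    return True

def solve(n_channels):
    # exhaustive search over per-cell channel subsets (fine at this scale)
    def rec(i, assign):
        if i == len(demand):
            return assign if feasible(assign) else None
        for combo in combinations(range(n_channels), demand[i]):
            found = rec(i + 1, assign + [list(combo)])
            if found:
                return found
        return None
    return rec(0, [])

span = next(c for c in range(1, 30) if solve(c))
sol = solve(span)
print(span, sol)   # minimum channel span and one feasible assignment
```

An ILP formulation like the paper's encodes the same pairwise separation conditions as linear constraints over 0-1 assignment variables; the brute-force search here just makes the feasibility structure visible at toy scale.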

  2. THE OPTIMIZATION OF PLUSH YARNS BULKING PROCESS

    Directory of Open Access Journals (Sweden)

    VINEREANU Adam

    2014-05-01

    Full Text Available This paper presents the experiments that were conducted on the installation of continuous bulking and thermofixing “SUPERBA” type TVP-2S for optimization of the plush yarns bulking process. There were considered plush yarns Nm 6.5/2, made of the fibrous blend of 50% indigenous wool sort 41 and 50% PES. In the first stage, it performs a thermal treatment with a turboprevaporizer at a temperature lower than thermofixing temperature, at atmospheric pressure, such that the plush yarns - deposed in a freely state on a belt conveyor - are uniformly bulking and contracting. It was followed the mathematical modeling procedure, working with a factorial program, rotatable central composite type, and two independent variables. After analyzing the parameters that have a direct influence on the bulking degree, there were selected the pre-vaporization temperature (coded x1,oC and the velocity of belt inside pre-vaporizer (coded x 2, m/min. As for the dependent variable, it was chosen the plush yarn diameter (coded y, mm. There were found the coordinates of the optimal point, and then this pair of values was verified in practice. These coordinates are: x1optim= 90oC and x 2optim= 6.5 m/min. The conclusion is that the goal was accomplished: it was obtained a good cover degree f or double-plush carpets by reducing the number of tufts per unit surface.

  3. Discovery of Transition Rules for Cellular Automata Using Artificial Bee Colony and Particle Swarm Optimization Algorithms in Urban Growth Modeling

    Directory of Open Access Journals (Sweden)

    Fereydoun Naghibi

    2016-12-01

    Full Text Available This paper presents an advanced method in urban growth modeling to discover transition rules of cellular automata (CA) using the artificial bee colony (ABC) optimization algorithm. Also, comparisons between the simulation results of CA models optimized by the ABC algorithm and by the particle swarm optimization algorithm (PSO) as intelligent approaches were performed to evaluate the potential of the proposed methods. According to previous studies, swarm intelligence algorithms for solving optimization problems such as discovering transition rules of CA in land use change/urban growth modeling can produce reasonable results. Modeling urban growth as a dynamic process is not straightforward because of the nonlinearity and heterogeneity among the effective variables involved, which can pose a number of challenges for traditional CA. The ABC algorithm, a new and powerful swarm-based optimization algorithm, can be used to capture optimized transition rules of CA. This paper has proposed a methodology based on remote sensing data for modeling urban growth with CA calibrated by the ABC algorithm. The performance of the ABC-CA, PSO-CA, and CA-logistic models in land use change detection is tested for the city of Urmia, Iran, between 2004 and 2014. Validations of the models based on statistical measures such as overall accuracy, figure of merit, and total operating characteristic were made. We showed that the overall accuracy of the ABC-CA model was 89%, which was 1.5% and 6.2% higher than those of the PSO-CA and CA-logistic models, respectively. Moreover, the allocation disagreement (simulation error) of the simulation results for the ABC-CA, PSO-CA, and CA-logistic models is 11%, 12.5%, and 17.2%, respectively. Finally, for all evaluation indices including running time, convergence capability, flexibility, statistical measurements, and the produced spatial patterns, the ABC-CA model showed relative improvement and therefore its superiority was...
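A minimal artificial bee colony loop can be sketched on a toy objective. The paper couples ABC to a CA urban-growth model; here the "transition-rule parameters" are just a 2-D vector, the fitness is a simple quadratic with a known peak, and all settings are illustrative:

```python
import random

# Minimal ABC sketch: employed bees improve their food sources locally,
# onlookers reinforce sources in proportion to fitness, and scouts replace
# sources that have stagnated past a trial limit.

random.seed(5)
SN, DIM, LIMIT, CYCLES = 10, 2, 20, 300
LB, UB = -5.0, 5.0

def fitness(x):                      # maximize: peak at (1, -2)
    return -((x[0] - 1) ** 2 + (x[1] + 2) ** 2)

foods = [[random.uniform(LB, UB) for _ in range(DIM)] for _ in range(SN)]
trials = [0] * SN

def neighbour(i):
    # perturb one coordinate towards/away from a random other source
    k = random.choice([j for j in range(SN) if j != i])
    d = random.randrange(DIM)
    v = foods[i][:]
    v[d] += random.uniform(-1, 1) * (foods[i][d] - foods[k][d])
    v[d] = min(UB, max(LB, v[d]))
    return v

def try_improve(i):
    v = neighbour(i)
    if fitness(v) > fitness(foods[i]):
        foods[i], trials[i] = v, 0
    else:
        trials[i] += 1

for _ in range(CYCLES):
    for i in range(SN):              # employed bee phase
        try_improve(i)
    fits = [fitness(f) for f in foods]
    base = min(fits)
    probs = [f - base + 1e-9 for f in fits]
    for _ in range(SN):              # onlooker phase: fitness-proportional pick
        r = random.uniform(0, sum(probs))
        i = 0
        while i < SN - 1 and r > probs[i]:
            r -= probs[i]
            i += 1
        try_improve(i)
    for i in range(SN):              # scout phase: reset exhausted sources
        if trials[i] > LIMIT:
            foods[i] = [random.uniform(LB, UB) for _ in range(DIM)]
            trials[i] = 0

best = max(foods, key=fitness)
print([round(v, 2) for v in best])
```

In the paper's setting, the fitness evaluation would instead run the CA with the candidate transition-rule parameters and score the simulated map against the observed land use change.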

  4. Utility Optimal Scheduling in Processing Networks

    CERN Document Server

    Huang, Longbo

    2010-01-01

    We consider the problem of utility optimal scheduling in general processing networks with random arrivals and network conditions. These are generalizations of traditional data networks where commodities in one or more queues can be combined to produce new commodities that are delivered to other parts of the network. This can be used to model problems such as in-network data fusion, stream processing, and grid computing. Scheduling actions are complicated by the underflow problem that arises when some queues with required components go empty. In this paper, we develop the Perturbed Max-Weight algorithm (PMW) to achieve optimal utility. The idea of PMW is to perturb the weights used by the usual Max-Weight algorithm to "push" queue levels towards non-zero values (avoiding underflows). We show that when the perturbations are carefully chosen, PMW is able to achieve a utility that is within O(1/V) of the optimal value for any V ≥ 1, while ensuring an average network backlog of O(V).
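The perturbation idea can be illustrated with a toy two-queue combine server. This is a simplified caricature of PMW, not the utility-optimal algorithm itself: the admission rule, service rule, and parameter values below are assumptions chosen only to show how a positive perturbation keeps component queues away from empty.

```python
import random

# Toy PMW-style illustration: a combine server needs one unit from each of
# two component queues. Admission admits to queue i iff q_i < theta + V
# (i.e. V >= q_i - theta), and the server activates iff the sum of perturbed
# backlogs (q1 - theta) + (q2 - theta) is positive. theta > 0 pushes queue
# levels towards theta, so activations rarely find an empty input.

def run(theta, V=5, slots=10000, seed=3):
    random.seed(seed)
    q1 = q2 = blocked = 0
    for _ in range(slots):
        if random.random() < 0.7 and q1 < theta + V:
            q1 += 1
        if random.random() < 0.7 and q2 < theta + V:
            q2 += 1
        if (q1 - theta) + (q2 - theta) > 0:   # Max-Weight with perturbed backlogs
            if q1 >= 1 and q2 >= 1:
                q1 -= 1                       # combine one unit from each queue
                q2 -= 1
            else:
                blocked += 1                  # underflow: a component is missing
    return blocked

print(run(theta=0), run(theta=5))   # perturbation suppresses underflow events
```

With theta = 0 the server fires whenever anything is queued and frequently finds one input empty; with theta = 5 (and this admission cap) an activation requires q1 + q2 > 10 while each queue is capped at 10, so both inputs are always stocked.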

  5. Analysis and Optimization of Central Processing Unit Process Parameters

    Science.gov (United States)

    Kaja Bantha Navas, R.; Venkata Chaitana Vignan, Budi; Durganadh, Margani; Rama Krishna, Chunduri

    2017-05-01

    The rapid growth of computing has made it possible to process ever more data, which increases heat dissipation. Hence the CPU in the system unit must be cooled to stay within its operating temperature. This paper presents a novel approach for the optimization of operating parameters on a Central Processing Unit with a single response, based on the response graph method. The proposed approach consists of a series of steps capable of decreasing the uncertainty caused by engineering judgement in the Taguchi method. Orthogonal array values were taken from the ANSYS report. The method shows good convergence with the experimental results and yields the optimum process parameters.
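The Taguchi-style analysis behind such a study can be sketched with hypothetical numbers: an L4(2^3) orthogonal array, a smaller-the-better signal-to-noise (S/N) ratio for temperature, and per-factor level means on the S/N scale to pick the optimum setting. The array and responses below are made up for illustration, not the paper's ANSYS data.

```python
import math

# Taguchi sketch: 3 two-level factors in an L4 orthogonal array, replicated
# temperature responses, smaller-the-better S/N ratio, and level selection
# by the larger mean S/N per factor.

L4 = [(1, 1, 1), (1, 2, 2), (2, 1, 2), (2, 2, 1)]   # factor levels per run
temps = [[71, 73], [66, 65], [69, 70], [61, 62]]     # replicated responses (degC)

def sn_smaller_better(ys):
    # S/N = -10 log10(mean of squared responses)
    return -10 * math.log10(sum(y * y for y in ys) / len(ys))

sn = [sn_smaller_better(t) for t in temps]

best = []
for f in range(3):
    # mean S/N at each level of factor f; the larger mean wins
    m = {lv: sum(sn[r] for r in range(4) if L4[r][f] == lv) / 2 for lv in (1, 2)}
    best.append(max(m, key=m.get))
print(best)   # recommended level per factor
```

A response-graph analysis plots those per-level means factor by factor; the recommended setting is simply the level with the higher mean S/N on each graph.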

  6. Mobile Phone Service Process Hiccups at Cellular Inc.

    Science.gov (United States)

    Edgington, Theresa M.

    2010-01-01

    This teaching case documents an actual case of process execution and failure. The case is useful in MIS introductory courses seeking to demonstrate the interdependencies within a business process, and the concept of cascading failure at the process level. This case demonstrates benefits and potential problems with information technology systems,…

  8. (Sub-)Optimality of Treating Interference as Noise in the Cellular Uplink With Weak Interference

    KAUST Repository

    Gherekhloo, Soheil

    2015-11-09

    Despite the simplicity of the scheme of treating interference as noise (TIN), it was shown to be sum-capacity optimal in the Gaussian interference channel (IC) with very-weak (noisy) interference. In this paper, the two-user IC is altered by introducing an additional transmitter that wants to communicate with one of the receivers of the IC. The resulting network thus consists of a point-to-point channel interfering with a multiple access channel (MAC) and is denoted by PIMAC. The sum-capacity of the PIMAC is studied with main focus on the optimality of TIN. It turns out that TIN in its naive variant, where all transmitters are active and both receivers use TIN for decoding, is not the best choice for the PIMAC. In fact, a scheme that combines both time division multiple access and TIN (TDMA-TIN) strictly outperforms the naive-TIN scheme. Furthermore, it is shown that in some regimes, TDMA-TIN achieves the sum-capacity for the deterministic PIMAC and the sum-capacity within a constant gap for the Gaussian PIMAC. In addition, it is shown that, even for very-weak interference, there are some regimes where a combination of interference alignment with power control and TIN at the receiver side outperforms TDMA-TIN. As a consequence, on the one hand, TIN in a cellular uplink is approximately optimal in certain regimes. On the other hand, those regimes cannot be simply described by the strength of interference.

  9. Discrete stochastic processes and optimal filtering

    CERN Document Server

    Bertein, Jean-Claude

    2012-01-01

    Optimal filtering applied to stationary and non-stationary signals provides the most efficient means of dealing with problems arising from the extraction of noise signals. Moreover, it is a fundamental feature in a range of applications, such as in navigation in aerospace and aeronautics, filter processing in the telecommunications industry, etc. This book provides a comprehensive overview of this area, discussing random and Gaussian vectors, outlining the results necessary for the creation of Wiener and adaptive filters used for stationary signals, as well as examining Kalman filters which ar

  10. Optimal Nesting for Continuous Shape Stamping Processes

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    This paper discusses the optimal nesting problem for minimizing the scrap in continuous shape stamping processes. The shape sliding technique is used to propose a new approach, OVERLAP-and-ESCAPE, to solve the problem of continuously nesting shapes onto a metal coil of fixed or selectable width. The approach is used to construct the objective function of the mathematical model of the problem using the Simulated Annealing Algorithm to determine the globally minimal configurations for the nesting problems. Some representative cases are studied and the results are encouraging. An automatic nesting software package for manufacturing bicycle chain link blanks is also described.
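
    The simulated-annealing loop that such a nesting search could sit inside can be sketched as follows; the one-dimensional "pitch" variable and the scrap function are invented placeholders, not the paper's OVERLAP-and-ESCAPE objective:

```python
import math
import random

def scrap(pitch: float) -> float:
    """Toy scrap-per-part objective: an assumed placeholder with a single
    minimum at pitch = 2.0, standing in for the real nesting objective."""
    return (pitch - 2.0) ** 2 + 0.5

def anneal(x0, step=0.5, t0=1.0, cooling=0.95, iters=2000, seed=0):
    rng = random.Random(seed)
    x, fx = x0, scrap(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        fc = scrap(cand)
        # accept downhill moves always, uphill moves with prob e^(-delta/T)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling  # geometric cooling schedule
    return best, fbest

best, fbest = anneal(x0=5.0)
print(f"best pitch = {best:.3f}, scrap = {fbest:.3f}")
```

    The uphill-acceptance probability is what lets the search escape local configurations early on, which is the property the paper relies on to reach globally minimal nestings.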

  11. Optimal Hamiltonian Simulation by Quantum Signal Processing

    Science.gov (United States)

    Low, Guang Hao; Chuang, Isaac L.

    2017-01-01

    The physics of quantum mechanics is the inspiration for, and underlies, quantum computation. As such, one expects physical intuition to be highly influential in the understanding and design of many quantum algorithms, particularly simulation of physical systems. Surprisingly, this has been challenging, with current Hamiltonian simulation algorithms remaining abstract and often the result of sophisticated but unintuitive constructions. We contend that physical intuition can lead to optimal simulation methods by showing that a focus on simple single-qubit rotations elegantly furnishes an optimal algorithm for Hamiltonian simulation, a universal problem that encapsulates all the power of quantum computation. Specifically, we show that the query complexity of implementing time evolution by a d-sparse Hamiltonian Ĥ for time-interval t with error ε is O[td‖Ĥ‖_max + log(1/ε)/log log(1/ε)], which matches lower bounds in all parameters. This connection is made through the general three-step "quantum signal processing" methodology, comprised of (i) transducing eigenvalues of Ĥ into a single ancilla qubit, (ii) transforming these eigenvalues through an optimal-length sequence of single-qubit rotations, and (iii) projecting this ancilla with near-unity success probability.

  12. Mathematical Analysis and Optimization of Infiltration Processes

    Science.gov (United States)

    Chang, H.-C.; Gottlieb, D.; Marion, M.; Sheldon, B. W.

    1997-01-01

    A variety of infiltration techniques can be used to fabricate solid materials, particularly composites. In general these processes can be described with at least one time dependent partial differential equation describing the evolution of the solid phase, coupled to one or more partial differential equations describing mass transport through a porous structure. This paper presents a detailed mathematical analysis of a relatively simple set of equations which is used to describe chemical vapor infiltration. The results demonstrate that the process is controlled by only two parameters, alpha and beta. The optimization problem associated with minimizing the infiltration time is also considered. Allowing alpha and beta to vary with time leads to significant reductions in the infiltration time, compared with the conventional case where alpha and beta are treated as constants.

  13. Big Data Components for Business Process Optimization

    Directory of Open Access Journals (Sweden)

    Mircea Raducu TRIFU

    2016-01-01

    Full Text Available These days, more and more people talk about Big Data, Hadoop, noSQL and so on, but very few technical people have the necessary expertise and knowledge to work with those concepts and technologies. The present paper explains one of the concepts that stands behind two of those keywords: the MapReduce concept. The MapReduce model is what makes Big Data and Hadoop so powerful, fast, and diverse for business process optimization. MapReduce is a programming model with an implementation built to process and generate large data sets. In addition, the benefits of integrating Hadoop in the context of Business Intelligence and Data Warehousing applications are presented. The concepts and technologies behind big data let organizations reach a variety of objectives. Like other new information technologies, one of the most important objectives of big data technology is to bring dramatic cost reduction.
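
    The MapReduce model mentioned above can be sketched in a few lines; this toy word count runs the map, shuffle, and reduce phases serially in one process, whereas real Hadoop distributes each phase across a cluster:

```python
from collections import defaultdict

def map_phase(doc: str):
    """Map: emit (word, 1) for every word in the input split."""
    for word in doc.lower().split():
        yield word, 1

def shuffle(pairs):
    """Shuffle: group intermediate values by key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    """Reduce: aggregate all values emitted for one key."""
    return key, sum(values)

docs = ["big data big ideas", "big data tools"]
pairs = [pair for doc in docs for pair in map_phase(doc)]
counts = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
print(counts)
```

    The same three-phase contract — map to key/value pairs, group by key, reduce per key — is what allows the framework to parallelize the computation over large data sets.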

  14. The Brewing Process: Optimizing the Fermentation

    Directory of Open Access Journals (Sweden)

    Teodora Coldea

    2014-11-01

    Full Text Available Beer is a carbonated alcoholic beverage obtained by alcoholic fermentation of malt wort boiled with hops. The brown beer obtained at the Beer Pilot Station of the University of Agricultural Sciences and Veterinary Medicine Cluj-Napoca was the result of a recipe based on blond, caramel and black malt in different proportions, water, hops and yeast. This study aimed to monitor the evolution of the wort through primary and secondary alcoholic fermentation in order to optimize the process. Two wort batches were assembled in order to increase the brewing yeast fermentation performance. The primary fermentation took 14 days, followed by another 14 days of secondary fermentation (maturation). The wort fermentation was monitored by the automatic FermentoStar analyzer, and the whole fermentation process was tracked (temperature, pH, alcohol concentration, apparent and total wort extract).

  15. Nearly Optimal Resource Allocation for Downlink OFDMA in 2-D Cellular Networks

    CERN Document Server

    Ksairi, Nassar; Ciblat, Philippe

    2010-01-01

    In this paper, we propose a resource allocation algorithm for the downlink of sectorized two-dimensional (2-D) OFDMA cellular networks assuming statistical Channel State Information (CSI) and fractional frequency reuse. The proposed algorithm can be implemented in a distributed fashion without the need for any central controlling units. Its performance is analyzed assuming fast fading Rayleigh channels and Gaussian distributed multicell interference. We show that the transmit power of this simple algorithm tends, as the number of users grows to infinity, to the same limit as the minimal power required to satisfy all users' rate requirements, i.e., the proposed resource allocation algorithm is asymptotically optimal. As a byproduct of this asymptotic analysis, we characterize a relevant value of the reuse factor that only depends on an average state of the network.

  16. Research on generalized optimization process for mechanical product

    Institute of Scientific and Technical Information of China (English)

    冯培恩; 邱清盈; 潘双夏; 董黎刚; 李善春

    1999-01-01

    A generalized optimization process for mechanical products is proposed which includes a functional optimization phase, a conceptual design optimization phase, a technical design optimization phase (further divided into a product modeling phase, an optimization process scheduling phase, an optimization modeling phase and a multi-computer collaborative optimizing phase), and a result analysis and evaluation phase. The generalized optimization is characterized by being oriented to the design of the entire system, the whole process and the overall performance of a product, and by combining human and artificial intelligence in optimization. The functions and implementation strategies of each key phase in the generalized optimization process are discussed. A prototype of the generalized optimization support system for mechanical products has been preliminarily established.

  17. Kanban simulation model for production process optimization

    Directory of Open Access Journals (Sweden)

    Golchev Riste

    2015-01-01

    Full Text Available A long time has passed since the KANBAN system was established as an efficient method for coping with excessive inventory. Still, the possibilities for its improvement through integration with other approaches should be investigated further. The basic research challenge of this paper is to present the benefits of KANBAN implementation supported by Discrete Event Simulation (DES). In that direction, the basics of the KANBAN system are first presented, with emphasis on the information and material flow, together with a methodology for implementation of the KANBAN system. An analysis of combining simulation with this methodology is then presented. The paper concludes with a practical example which shows that, by understanding the philosophy of the KANBAN implementation methodology and the simulation methodology, a simulation model can be created that serves as a basis for a variety of experiments conducted within a short period of time, resulting in production process optimization.
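
    The WIP-capping behavior that a KANBAN simulation study exploits can be sketched with a simple time-stepped model; the demand and production probabilities are arbitrary assumptions, and a DES tool such as the one discussed above would use event scheduling rather than fixed time steps:

```python
import random

def simulate_kanban(n_kanbans=5, steps=1000, p_demand=0.5,
                    p_produce=0.6, seed=1):
    """Single-stage kanban loop: the stage may only hold one finished
    item per free kanban card, so WIP can never exceed n_kanbans."""
    rng = random.Random(seed)
    stock = 0            # finished items waiting (each holds a kanban)
    served = lost = 0
    max_stock = 0
    for _ in range(steps):
        # production is authorized only while a kanban card is free
        if stock < n_kanbans and rng.random() < p_produce:
            stock += 1
        if rng.random() < p_demand:
            if stock > 0:
                stock -= 1   # demand consumes an item, freeing its kanban
                served += 1
            else:
                lost += 1    # stockout
        max_stock = max(max_stock, stock)
    return served, lost, max_stock

served, lost, max_stock = simulate_kanban()
print(f"served={served}, lost={lost}, peak WIP={max_stock}")
```

    Experiments of the kind described above would vary n_kanbans and the rates to trade off stockouts against inventory.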

  18. The Cilium: Cellular Antenna and Central Processing Unit

    OpenAIRE

    Malicki, Jarema J.; Johnson, Colin A.

    2017-01-01

    Cilia mediate an astonishing diversity of processes. Recent advances provide unexpected insights into the regulatory mechanisms of cilium formation, and reveal diverse regulatory inputs that are related to the cell cycle, cytoskeleton, proteostasis, and cilia-mediated signaling itself. Ciliogenesis and cilia maintenance are regulated by reciprocal antagonistic or synergistic influences, often acting in parallel to each other. By receiving parallel inputs, cilia appear to integrate multiple si...

  19. Optimization of machining processes using pattern search algorithm

    OpenAIRE

    Miloš Madić; Miroslav Radovanović

    2014-01-01

    Optimization of machining processes not only increases machining efficiency and economics, but also the end product quality. In recent years, among the traditional optimization methods, stochastic direct search optimization methods such as meta-heuristic algorithms are being increasingly applied for solving machining optimization problems. Their ability to deal with complex, multi-dimensional and ill-behaved optimization problems made them the preferred optimization tool by most researchers a...

  20. The Cilium: Cellular Antenna and Central Processing Unit.

    Science.gov (United States)

    Malicki, Jarema J; Johnson, Colin A

    2017-02-01

    Cilia mediate an astonishing diversity of processes. Recent advances provide unexpected insights into the regulatory mechanisms of cilium formation, and reveal diverse regulatory inputs that are related to the cell cycle, cytoskeleton, proteostasis, and cilia-mediated signaling itself. Ciliogenesis and cilia maintenance are regulated by reciprocal antagonistic or synergistic influences, often acting in parallel to each other. By receiving parallel inputs, cilia appear to integrate multiple signals into specific outputs and may have functions similar to logic gates of digital systems. Some combinations of input signals appear to impose higher hierarchical control related to the cell cycle. An integrated view of these regulatory inputs will be necessary to understand ciliogenesis and its wider relevance to human biology. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  1. Optimization of a biological sulfate reduction process

    Energy Technology Data Exchange (ETDEWEB)

    Lebel, A.

    1985-01-01

    A biological sulfate reduction process is presented. It is intended to treat sulfate wastes by converting them to hydrogen sulfide, which can be further oxidized to elemental sulfur. An optimization study of a completely-mixed reactor system was performed. Major operating parameters were determined at the bench-scale level. The study was conducted in batch-culture experiments, using a mixed Desulfovibrio culture from sewage. Kinetic values were extrapolated using the Michaelis-Menten model, which best fitted the experimental data. The iron loading and the sulfate loading significantly affected the growth and metabolism of sulfate reducing bacteria (SRB). A model to determine V_m from the iron and sulfate loading values was explored. The model is limited to sulfate loadings below 4.3 g/l, above which bacterial growth is inhibited. Iron loading is not anticipated to suppress bacterial metabolic efficiency, since it remained in the linear regime even at inhibitory levels. Studies of the metabolic behavior of SRB, using lactic acid as the carbon source, showed a requirement of 2.7 moles of lactate for each mole of sulfate. This technique and its application to the sulfur recovery process are discussed.
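
    The Michaelis-Menten extrapolation step mentioned above can be illustrated with a double-reciprocal (Lineweaver-Burk) least-squares fit; the kinetic constants and substrate values below are invented for illustration, not the study's measurements:

```python
def michaelis_menten(s, vmax, km):
    """Reaction rate v = Vm * S / (Km + S)."""
    return vmax * s / (km + s)

def fit_lineweaver_burk(s_vals, v_vals):
    """Estimate (Vm, Km) from rate data via the linearization
    1/v = (Km/Vm)*(1/S) + 1/Vm, fitted by ordinary least squares."""
    xs = [1.0 / s for s in s_vals]
    ys = [1.0 / v for v in v_vals]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    vmax = 1.0 / intercept
    km = slope * vmax
    return vmax, km

# synthetic, noise-free data from assumed constants Vm=2.5, Km=1.2
s_vals = [0.5, 1.0, 2.0, 4.0, 8.0]
v_vals = [michaelis_menten(s, 2.5, 1.2) for s in s_vals]
vmax, km = fit_lineweaver_burk(s_vals, v_vals)
print(f"Vm = {vmax:.3f}, Km = {km:.3f}")
```

    With real, noisy measurements a nonlinear fit is usually preferred, but the linearized form shows how V_m is extrapolated from loading data.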

  2. Tensegrity II. How structural networks influence cellular information processing networks

    Science.gov (United States)

    Ingber, Donald E.

    2003-01-01

    The major challenge in biology today is biocomplexity: the need to explain how cell and tissue behaviors emerge from collective interactions within complex molecular networks. Part I of this two-part article described a mechanical model of cell structure based on tensegrity architecture that explains how the mechanical behavior of the cell emerges from physical interactions among the different molecular filament systems that form the cytoskeleton. Recent work shows that the cytoskeleton also orients much of the cell's metabolic and signal transduction machinery and that mechanical distortion of cells and the cytoskeleton through cell surface integrin receptors can profoundly affect cell behavior. In particular, gradual variations in this single physical control parameter (cell shape distortion) can switch cells between distinct gene programs (e.g. growth, differentiation and apoptosis), and this process can be viewed as a biological phase transition. Part II of this article covers how combined use of tensegrity and solid-state mechanochemistry by cells may mediate mechanotransduction and facilitate integration of chemical and physical signals that are responsible for control of cell behavior. In addition, it examines how cell structural networks affect gene and protein signaling networks to produce characteristic phenotypes and cell fate transitions during tissue development.

  3. Regulation of mammalian microRNA processing and function by cellular signaling and subcellular localization

    OpenAIRE

    2008-01-01

    For many microRNAs, in many normal tissues and in cancer cells, the cellular levels of mature microRNAs are not simply determined by transcription of microRNA genes. This mini-review will discuss how microRNA biogenesis and function can be regulated by various nuclear and cytoplasmic processing events, including emerging evidence that microRNA pathway components can be selectively regulated by control of their subcellular localization and by modifications that occur during dynamic cellular si...

  4. GA based CNC turning center exploitation process parameters optimization

    Directory of Open Access Journals (Sweden)

    Z. Car

    2009-01-01

    Full Text Available This paper presents machining parameter optimization for the turning process based on the use of artificial intelligence. To obtain greater efficiency and productivity of the machine tool, optimal cutting parameters have to be obtained. In order to find optimal cutting parameters, the genetic algorithm (GA) has been used as an optimal solution finder. The optimization has to yield minimum machining time and minimum production cost, while considering technological and material constraints.

  5. A Generic Methodology for Superstructure Optimization of Different Processing Networks

    DEFF Research Database (Denmark)

    Bertran, Maria-Ona; Frauzem, Rebecca; Zhang, Lei

    2016-01-01

    In this paper, we propose a generic computer-aided methodology for synthesis of different processing networks using superstructure optimization. The methodology can handle different network optimization problems of various application fields. It integrates databases with a common data architectur...

  6. A Generic Methodology for Superstructure Optimization of Different Processing Networks

    DEFF Research Database (Denmark)

    Bertran, Maria-Ona; Frauzem, Rebecca; Zhang, Lei;

    2016-01-01

    In this paper, we propose a generic computer-aided methodology for synthesis of different processing networks using superstructure optimization. The methodology can handle different network optimization problems of various application fields. It integrates databases with a common data architecture...

  7. Parameter estimation with a novel gradient-based optimization method for biological lattice-gas cellular automaton models.

    Science.gov (United States)

    Mente, Carsten; Prade, Ina; Brusch, Lutz; Breier, Georg; Deutsch, Andreas

    2011-07-01

    Lattice-gas cellular automata (LGCAs) can serve as stochastic mathematical models for collective behavior (e.g. pattern formation) emerging in populations of interacting cells. In this paper, a two-phase optimization algorithm for global parameter estimation in LGCA models is presented. In the first phase, local minima are identified through gradient-based optimization. Algorithmic differentiation is adopted to calculate the necessary gradient information. In the second phase, for global optimization of the parameter set, a multi-level single-linkage method is used. As an example, the parameter estimation algorithm is applied to a LGCA model for early in vitro angiogenic pattern formation.
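
    The two-phase idea above — gradient-based local search wrapped in a global multistart — can be sketched on a toy multimodal objective. The objective and start points are placeholders; the paper's algorithmic differentiation is replaced here by finite differences, and its multi-level single-linkage method by a plain multistart:

```python
def f(x):
    """Toy multimodal objective with local minima near x = +/-1;
    the +0.3*x tilt makes the minimum near x = -1 the global one."""
    return (x * x - 1.0) ** 2 + 0.3 * x

def grad(x, h=1e-6):
    """Phase-1 gradient via central finite differences (the paper
    uses algorithmic differentiation instead)."""
    return (f(x + h) - f(x - h)) / (2 * h)

def local_min(x, lr=0.01, iters=5000):
    """Phase 1: plain gradient descent to the nearest local minimum."""
    for _ in range(iters):
        x -= lr * grad(x)
    return x

def global_min(starts):
    """Phase 2: a multistart wrapper keeps the best local minimum."""
    return min((local_min(x0) for x0 in starts), key=f)

best = global_min(starts=[-2.0, -0.5, 0.5, 2.0])
print(f"global minimizer = {best:.3f}")
```

    Runs started near +1 get trapped in the inferior local minimum; the phase-2 wrapper is what recovers the global one, mirroring the role of the multi-level single-linkage step in the paper.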

  8. A cellular automata model for simulating fed-batch penicillin fermentation process

    Institute of Scientific and Technical Information of China (English)

    Yu Naigong; Ruan Xiaogang

    2006-01-01

    A cellular automata model to simulate the penicillin fed-batch fermentation process (CAPFM) was established in this study, based on a morphologically structured dynamic penicillin production model, which is in turn based on the growth mechanism of penicillin-producing microorganisms and the characteristics of penicillin fed-batch fermentation. CAPFM uses a three-dimensional cellular automaton as the growth space, and a Moore-type neighborhood as the cellular neighborhood. The transition rules of CAPFM are designed based on mechanistic and structural kinetic models of the penicillin fed-batch fermentation process. Every cell of CAPFM represents a single or specific number of penicillin-producing microorganisms, and can take on various states. The simulation experimental results show that CAPFM replicates the evolutionary behavior of penicillin fed-batch fermentation processes described by the structured penicillin production kinetic model accordingly.
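
    The 26-cell Moore neighborhood in three dimensions, as used for the cellular neighborhood above, can be enumerated directly. The two-state rule in this sketch is an arbitrary placeholder to show the shape of a synchronous update, not CAPFM's kinetic transition rules:

```python
from itertools import product

def moore_neighbors(x, y, z):
    """All 26 cells adjacent to (x, y, z) in a 3D Moore neighborhood."""
    return [(x + dx, y + dy, z + dz)
            for dx, dy, dz in product((-1, 0, 1), repeat=3)
            if (dx, dy, dz) != (0, 0, 0)]

def step(alive: set) -> set:
    """One synchronous update under a placeholder rule: an empty cell
    becomes occupied with 3+ occupied Moore neighbors, and an occupied
    cell persists with 2+ occupied neighbors."""
    candidates = set(alive)
    for cell in alive:
        candidates.update(moore_neighbors(*cell))
    new = set()
    for cell in candidates:
        n = sum(nb in alive for nb in moore_neighbors(*cell))
        if (cell in alive and n >= 2) or (cell not in alive and n >= 3):
            new.add(cell)
    return new

print(len(moore_neighbors(0, 0, 0)))  # 26
```

    A model like CAPFM would replace the placeholder rule with state transitions derived from the fermentation kinetics.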

  9. Process optimization of friction stir welding based on thermal models

    DEFF Research Database (Denmark)

    Larsen, Anders Astrup

    2010-01-01

    This thesis investigates how to apply optimization methods to numerical models of a friction stir welding process. The work is intended as a proof-of-concept using different methods that are applicable to models of high complexity, possibly with high computational cost, and without the possibility of obtaining gradient information from the high-fidelity model. The optimization schemes are applied to stationary thermal models of differing complexity of the friction stir welding process. The optimization problems considered are based on optimizing the temperature field in the workpiece by finding the optimal translational speed. Also, an optimization problem based on a microstructure model is solved, allowing the hardness distribution in the plate to be optimized. The use of purely thermal models represents a simplification of the real process; nonetheless, it shows the applicability of the optimization methods considered...

  10. Image Processing Oriented to Security Optimization

    Directory of Open Access Journals (Sweden)

    Ion Ivan

    2010-06-01

    Full Text Available This paper presents the main aspects of the digital content security. It describes the content of watermarking, presenting the steganography concept. SteganoGraphy application is presented and the algorithm used is analyzed. Optimization techniques are introduces to minimize the risk of discovering the information embedded into digital content by means of invisible watermarking. Techniques of analyzing the digital content results and identify the possible countermeasures for optimizing the steganography algorithm are presented.

  11. OPTIMAL CONTROL OF CNC CUTTING PROCESS

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    An intelligent optimization method for cutting parameters and a method for searching the stable cutting regions are set up. The cutting parameters of each cutting pass can be optimized automatically; cutting chatter is predicted by establishing a dynamic cutting force AR(2) model on-line, and the spindle rotation speed is adjusted according to the prediction results so as to ensure that the cutting system works in a stable region.
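
    The on-line AR(2) cutting-force model mentioned above amounts to fitting x_t = a1*x_{t-1} + a2*x_{t-2} by least squares and predicting one step ahead; the coefficients and the synthetic signal below are placeholders, not measured cutting-force data:

```python
def fit_ar2(x):
    """Least-squares fit of x[t] = a1*x[t-1] + a2*x[t-2] by solving
    the 2x2 normal equations directly."""
    s11 = s12 = s22 = b1 = b2 = 0.0
    for t in range(2, len(x)):
        s11 += x[t - 1] * x[t - 1]
        s12 += x[t - 1] * x[t - 2]
        s22 += x[t - 2] * x[t - 2]
        b1 += x[t] * x[t - 1]
        b2 += x[t] * x[t - 2]
    det = s11 * s22 - s12 * s12
    a1 = (b1 * s22 - b2 * s12) / det
    a2 = (b2 * s11 - b1 * s12) / det
    return a1, a2

# synthetic noise-free series from assumed coefficients a1=1.5, a2=-0.7
x = [1.0, 0.5]
for _ in range(200):
    x.append(1.5 * x[-1] - 0.7 * x[-2])

a1, a2 = fit_ar2(x)
one_step = a1 * x[-1] + a2 * x[-2]  # chatter-style one-step prediction
print(f"a1 = {a1:.3f}, a2 = {a2:.3f}")
```

    In a chatter monitor, a growing amplitude in the predicted signal would trigger the spindle-speed adjustment described above.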

  12. Utility Theory for Evaluation of Optimal Process Condition of SAW: A Multi-Response Optimization Approach

    Science.gov (United States)

    Datta, Saurav; Biswas, Ajay; Bhaumik, Swapan; Majumdar, Gautam

    2011-01-01

    A multi-objective optimization problem has been solved in order to estimate an optimal process environment consisting of an optimal parametric combination to achieve desired quality indicators (related to bead geometry) of submerged arc welds of mild steel. The quality indicators selected in the study were bead height, penetration depth, bead width and percentage dilution. The Taguchi method followed by the utility concept has been adopted to evaluate the optimal process condition achieving the multiple objective requirements of the desired weld quality.

  13. Application of GA in optimization of pore network models generated by multi-cellular growth algorithms

    Science.gov (United States)

    Jamshidi, Saeid; Boozarjomehry, Ramin Bozorgmehry; Pishvaie, Mahmoud Reza

    2009-10-01

    In pore network modeling, the void space of a rock sample is represented at the microscopic scale by a network of pores connected by throats. Construction of a reasonable representation of the geometry and topology of the pore space will lead to a reliable prediction of the properties of porous media. Recently, the theory of multi-cellular growth (or L-systems) has been used as a flexible tool for the generation of pore network models which do not require any special information such as 2D SEM or 3D pore space images. In general, the networks generated by this method are irregular pore network models, which are inherently closer to the complicated nature of porous media than regular lattice networks. In this approach, the construction process is controlled only by the production rules that govern the development process of the network. In this study, a genetic algorithm has been used to obtain the optimum values of the uncertain parameters of these production rules to build an appropriate irregular lattice network capable of predicting both the static and hydraulic information of the target porous medium.

  14. Application of simulation models for the optimization of business processes

    Science.gov (United States)

    Jašek, Roman; Sedláček, Michal; Chramcov, Bronislav; Dvořák, Jiří

    2016-06-01

    The paper deals with applications of modeling and simulation tools to the optimization of business processes, especially in solving the optimization of signal flow in a security company. Simul8 software was selected as the modeling tool; it performs process modeling based on discrete event simulation and enables the creation of a visual model of production and distribution processes.

  15. Waste Minimization Through Process Integration and Multi-objective Optimization

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    By avoiding or reducing the production of waste, waste minimization is an effective approach to solve the pollution problem in chemical industry. Process integration supported by multi-objective optimization provides a framework for process design or process retrofit by simultaneously optimizing on the aspects of environment and economics. Multi-objective genetic algorithm is applied in this area as the solution approach for the multi-objective optimization problem.

  16. Optimization of machining processes using pattern search algorithm

    Directory of Open Access Journals (Sweden)

    Miloš Madić

    2014-04-01

    Full Text Available Optimization of machining processes not only increases machining efficiency and economics, but also the end product quality. In recent years, among the traditional optimization methods, stochastic direct search optimization methods such as meta-heuristic algorithms are being increasingly applied for solving machining optimization problems. Their ability to deal with complex, multi-dimensional and ill-behaved optimization problems made them the preferred optimization tool by most researchers and practitioners. This paper introduces the use of pattern search (PS algorithm, as a deterministic direct search optimization method, for solving machining optimization problems. To analyze the applicability and performance of the PS algorithm, six case studies of machining optimization problems, both single and multi-objective, were considered. The PS algorithm was employed to determine optimal combinations of machining parameters for different machining processes such as abrasive waterjet machining, turning, turn-milling, drilling, electrical discharge machining and wire electrical discharge machining. In each case study the optimization solutions obtained by the PS algorithm were compared with the optimization solutions that had been determined by past researchers using meta-heuristic algorithms. Analysis of obtained optimization results indicates that the PS algorithm is very applicable for solving machining optimization problems showing good competitive potential against stochastic direct search methods such as meta-heuristic algorithms. Specific features and merits of the PS algorithm were also discussed.
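
    The deterministic compass-search core of a pattern search (PS) algorithm can be sketched as follows; this is a generic textbook variant with a placeholder 2-D objective, not the specific PS implementation or machining models used in the paper:

```python
def pattern_search(f, x0, step=1.0, tol=1e-6, max_iter=10000):
    """Compass search: poll +/- step along each coordinate axis; move
    to the first improving point, otherwise halve the step size."""
    x = list(x0)
    fx = f(x)
    for _ in range(max_iter):
        if step < tol:
            break
        improved = False
        for i in range(len(x)):
            for d in (+step, -step):
                cand = x[:]
                cand[i] += d
                fc = f(cand)
                if fc < fx:
                    x, fx, improved = cand, fc, True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5  # refine the mesh around the current point
    return x, fx

# placeholder objective with its minimum at (3, -1)
f = lambda v: (v[0] - 3.0) ** 2 + (v[1] + 1.0) ** 2
x, fx = pattern_search(f, [0.0, 0.0])
print(x, fx)
```

    Because the method needs only function values — no gradients — it handles the ill-behaved, simulation-driven objectives common in machining optimization.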

  17. Evolutionary optimization of production materials workflow processes

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    We present an evolutionary optimisation technique for stochastic production processes, which is able to find improved production materials workflow processes with respect to arbitrary combinations of numerical quantities associated with the production process. Working from a core fragment...

  18. Parameter optimization model in electrical discharge machining process

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The electrical discharge machining (EDM) process is, at present, still an experience-driven process in which the selected parameters are often far from optimal, while selecting optimized parameters experimentally is costly and time consuming. In this paper, an artificial neural network (ANN) and a genetic algorithm (GA) are used together to establish a parameter optimization model. An ANN model using the Levenberg-Marquardt algorithm has been set up to represent the relationship between the material removal rate (MRR) and the input parameters, and the GA is used to optimize the parameters, so that optimization results are obtained. The model is shown to be effective, and MRR is improved using the optimized machining parameters.
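
    The ANN-plus-GA pairing described above can be sketched with a toy surrogate standing in for the trained network; the response surface, parameter names, and GA hyperparameters below are all invented for illustration:

```python
import random

def surrogate_mrr(current, on_time):
    """Placeholder for the trained ANN: a smooth made-up response
    surface whose peak sits at current=12, on_time=50."""
    return 100 - (current - 12.0) ** 2 - 0.01 * (on_time - 50.0) ** 2

def ga_maximize(f, bounds, pop_size=30, gens=60, seed=3):
    """Tiny real-coded GA: elitist truncation selection, blend
    crossover, and clamped Gaussian mutation."""
    rng = random.Random(seed)
    lo = [b[0] for b in bounds]
    hi = [b[1] for b in bounds]
    pop = [[rng.uniform(l, h) for l, h in zip(lo, hi)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda ind: -f(*ind))
        elite = pop[:pop_size // 2]           # keep the fitter half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            w = rng.random()
            child = [w * p + (1 - w) * q for p, q in zip(a, b)]
            if rng.random() < 0.2:            # Gaussian mutation
                i = rng.randrange(len(child))
                child[i] = min(hi[i], max(lo[i], child[i] + rng.gauss(0, 0.5)))
            children.append(child)
        pop = elite + children
    return max(pop, key=lambda ind: f(*ind))

best = ga_maximize(surrogate_mrr, bounds=[(5, 20), (10, 100)])
print(best)
```

    In the paper's setup the surrogate is the Levenberg-Marquardt-trained ANN rather than an analytic function, but the GA's role — searching the surrogate instead of the costly physical process — is the same.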

  19. Combining support vector regression and cellular genetic algorithm for multi-objective optimization of coal-fired utility boilers

    Energy Technology Data Exchange (ETDEWEB)

    Feng Wu; Hao Zhou; Tao Ren; Ligang Zheng; Kefa Cen [Zhejiang University, Hangzhou (China). State Key Laboratory of Clean Energy Utilization

    2009-10-15

    Support vector regression (SVR) was employed to establish mathematical models for the NOx emissions and carbon burnout of a 300 MW coal-fired utility boiler. Combined with the SVR models, the cellular genetic algorithm for multi-objective optimization (MOCell) was used for multi-objective optimization of the boiler combustion. Meanwhile, a comparison between MOCell and the improved non-dominated sorting genetic algorithm (NSGA-II) shows that MOCell has superior performance to NSGA-II on this problem. Field experiments were carried out to verify the accuracy of the results obtained by MOCell; the results were in good agreement with the measurement data. The proposed approach provides an effective tool for multi-objective optimization of coal combustion performance, whose feasibility and validity are experimentally validated. A time period of less than 4 s was required for a run of optimization on a PC system, which is suitable for online application. 19 refs., 8 figs., 2 tabs.

  20. Melt-processed polymeric cellular dosage forms for immediate drug release.

    Science.gov (United States)

    Blaesi, Aron H; Saka, Nannaji

    2015-12-28

    The present immediate-release solid dosage forms, such as the oral tablets and capsules, comprise granular matrices. While effective in releasing the drug rapidly, they are fraught with difficulties inherent in processing particulate matter. By contrast, liquid-based processes would be far more predictable; but the standard cast microstructures are unsuited for immediate-release because they resist fluid percolation and penetration. In this article, we introduce cellular dosage forms that can be readily prepared from polymeric melts by incorporating the nucleation, growth, and coalescence of microscopic gas bubbles in a molding process. We show that the cell topology and formulation of such cellular structures can be engineered to reduce the length-scale of the mass-transfer step, which determines the time of drug release, from as large as the dosage form itself to as small as the thickness of the cell wall. This allows the cellular dosage forms to achieve drug release rates over an order of magnitude faster compared with those of cast matrices, spanning the entire spectrum of immediate-release and beyond. The melt-processed polymeric cellular dosage forms enable predictive design of immediate-release solid dosage forms by tailoring microstructures, and could be manufactured efficiently in a single step.

  1. Microstructure optimization design methods of the forging process and applications

    Institute of Scientific and Technical Information of China (English)

    WANG Guangchun; ZHAO Guoqun; GUAN Jing

    2007-01-01

    A microstructure optimization design method for the forging process is proposed. The optimization goals are a fine grain size and a homogeneous grain distribution. The optimization objects are the forging process parameters and the shape of the preform die. The grain size sub-objective function, the forging shape sub-objective function and the whole objective function including both shape and grain size are established, respectively. The detailed optimization steps are given. The microstructure optimization program is developed using the micro-genetic algorithm and the finite element method. Then, the upsetting process of a cylindrical billet is analyzed using the self-developed program, and the forging parameters and the shape of the preform die for the upsetting process are optimized respectively. A fine grain size and a homogeneous grain distribution can be achieved by controlling the shape of the preform die and improving the friction conditions.

  2. Maximum process problems in optimal control theory

    Directory of Open Access Journals (Sweden)

    Goran Peskir

    2005-01-01

    Full Text Available Given a standard Brownian motion (Bt)t≥0 and the equation of motion dXt = vt dt + 2 dBt, we set St = max0≤s≤t Xs and consider the optimal control problem supv E(Sτ − cτ), where c > 0 and the supremum is taken over all admissible controls v satisfying vt ∈ [μ0, μ1] for all t up to τ = inf{t > 0 | Xt ∉ (ℓ0, ℓ1)}, with μ0 < 0 < μ1 and ℓ0 < 0 < ℓ1. The optimal control is of bang-bang type, switching between the extreme values μ0 and μ1 according to whether Xt lies below or above g∗(St), where s ↦ g∗(s) is a switching curve that is determined explicitly (as the unique solution to a nonlinear differential equation). The solution found demonstrates that problem formulations based on a maximum functional can be successfully included in optimal control theory (calculus of variations), in addition to the classic problem formulations due to Lagrange, Mayer, and Bolza.

  3. Modeling and Optimization of Cement Raw Materials Blending Process

    Directory of Open Access Journals (Sweden)

    Xianhong Li

    2012-01-01

Full Text Available This paper focuses on modelling and solving the ingredient-ratio optimization problem in the cement raw material blending process. A general nonlinear time-varying (G-NLTV) model is established for the cement raw material blending process, considering chemical composition, feed flow fluctuation, and various craft and production constraints. Different objective functions are presented to acquire optimal ingredient ratios under various production requirements. The ingredient-ratio optimization problem is transformed into a discrete-time, single-objective or multi-objective, rolling nonlinear constrained optimization problem, and a grid interior-point method framework is presented to solve it. Based on the MATLAB-GUI platform, corresponding ingredient-ratio software is devised to obtain optimal ingredient ratios. Finally, several numerical examples are presented to study and solve ingredient-ratio optimization problems.
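
The blending step can be sketched in miniature. The toy example below is not the paper's G-NLTV model: the oxide compositions, target composition, and grid-search solver are invented purely for illustration. It picks non-negative ingredient ratios summing to one that best approximate a target oxide composition:

```python
import numpy as np

# Toy ingredient-ratio optimization: three raw materials with assumed
# oxide fractions (columns: CaO, SiO2, Al2O3), blended toward a target.
compositions = np.array([
    [0.52, 0.05, 0.02],   # limestone (hypothetical)
    [0.03, 0.65, 0.15],   # clay (hypothetical)
    [0.10, 0.30, 0.08],   # corrective additive (hypothetical)
])
target = np.array([0.42, 0.15, 0.04])

def blend_error(ratios):
    """Squared deviation of the blended composition from the target."""
    blended = ratios @ compositions
    return float(np.sum((blended - target) ** 2))

def best_blend(step=0.01):
    """Coarse grid search over ratios >= 0 that sum to 1."""
    best_r, best_e = None, float("inf")
    for r1 in np.arange(0.0, 1.0 + step, step):
        for r2 in np.arange(0.0, 1.0 - r1 + step, step):
            r = np.array([r1, r2, 1.0 - r1 - r2])
            e = blend_error(r)
            if e < best_e:
                best_r, best_e = r, e
    return best_r, best_e

ratios, err = best_blend()
```

A production optimizer would replace the grid search with an interior-point method such as the one the paper proposes, and add feed-flow and craft constraints.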

  4. Cellular compartments cause multistability and allow cells to process more information

    DEFF Research Database (Denmark)

    Harrington, Heather A; Feliu, Elisenda; Wiuf, Carsten;

    2013-01-01

Many biological, physical, and social interactions have a particular dependence on where they take place; e.g., in living cells, protein movement between the nucleus and cytoplasm affects cellular responses (i.e., proteins must be present in the nucleus to regulate their target genes). Here we use recent developments from dynamical systems and chemical reaction network theory to identify and characterize the key role of the spatial organization of eukaryotic cells in cellular information processing. In particular, the existence of distinct compartments plays a pivotal role in whether a system... outcomes for cellular decision-making. We combine different mathematical techniques to provide a heuristic procedure to determine if a system has the capacity for multiple steady states, and find conditions that ensure that multiple steady states cannot occur. Notably, we find that introducing species...

  5. OPTIMAL PROCESSES IN IRREVERSIBLE THERMODYNAMICS AND MICROECONOMICS

    Directory of Open Access Journals (Sweden)

    Vladimir A. Kazakov

    2004-06-01

Full Text Available This paper describes a general methodology that allows one to extend the Carnot efficiency of classical thermodynamics for zero-rate processes onto thermodynamic systems operating at finite rate. We define the class of minimal dissipation processes and show that it represents a generalization of reversible processes and determines the limiting possibilities of finite-rate systems. The described methodology is then applied to microeconomic exchange systems, yielding novel estimates of the limiting efficiencies of such systems.

  6. Optimized Execution of Business Processes on Blockchain

    OpenAIRE

    García-Bañuelos, Luciano; Ponomarev, Alexander; Dumas, Marlon; Weber, Ingo

    2016-01-01

    Blockchain technology enables the execution of collaborative business processes involving untrusted parties without requiring a central authority. Specifically, a process model comprising tasks performed by multiple parties can be coordinated via smart contracts operating on the blockchain. The consensus mechanism governing the blockchain thereby guarantees that the process model is followed by each party. However, the cost required for blockchain use is highly dependent on the volume of data...

  7. Optimization of the investment casting process

    Directory of Open Access Journals (Sweden)

    M. Martinez-Hernandez

    2012-04-01

Full Text Available Rapid prototyping is an important manufacturing technique. This work concerns the manufacture of hollow patterns made of polymeric materials by rapid prototyping technologies, for use in preparing ceramic molds in the investment casting process. It focuses on developing a process for manufacturing patterns that differ from existing ones in their hollow interior design, which allows their direct use in the fabrication of ceramic molds and avoids cracking and fracture during investment casting, an important process for the foundry industry.

  8. A new cellular nonlinear network emulation on FPGA for EEG signal processing in epilepsy

    Science.gov (United States)

    Müller, Jens; Müller, Jan; Tetzlaff, Ronald

    2011-05-01

For the processing of EEG signals, we propose a new architecture for the hardware emulation of discrete-time Cellular Nonlinear Networks (DT-CNN). Our results show the importance of high computational accuracy in EEG signal prediction, which cannot be achieved with existing analogue VLSI circuits. The refined architecture of the processing elements and its resource schedule, the cellular network structure with local couplings, the FPGA-based embedded system containing the DT-CNN, and the data flow in the entire system are discussed in detail. The proposed DT-CNN design has been implemented and tested on a Xilinx FPGA development platform. The embedded co-processor with a multi-threading kernel is utilised for control and pre-processing tasks and for data exchange with the host via Ethernet. The performance of the implemented DT-CNN has been determined for a popular example and compared to that of a conventional computer.

  9. Optimal control of switched systems arising in fermentation processes

    CERN Document Server

    Liu, Chongyang

    2014-01-01

The book presents, in a systematic manner, the optimal controls under different mathematical models of fermentation processes. Various mathematical models – i.e., those for multistage systems; switched autonomous systems; time-dependent and state-dependent switched systems; multistage time-delay systems; and switched time-delay systems – for fed-batch fermentation processes are proposed, and the theories and algorithms of their optimal control problems are studied and discussed. By putting forward novel methods and innovative tools, the book provides a state-of-the-art, comprehensive and systematic treatment of optimal control problems arising in fermentation processes. It not only develops nonlinear dynamical systems, optimal control theory and optimization algorithms, but can also help to increase productivity, and it provides valuable reference material on commercial fermentation processes.

  10. Optimizing signal and image processing applications using Intel libraries

    Science.gov (United States)

    Landré, Jérôme; Truchetet, Frédéric

    2007-01-01

    This paper presents optimized signal and image processing libraries from Intel Corporation. Intel Performance Primitives (IPP) is a low-level signal and image processing library developed by Intel Corporation to optimize code on Intel processors. Open Computer Vision library (OpenCV) is a high-level library dedicated to computer vision tasks. This article describes the use of both libraries to build flexible and efficient signal and image processing applications.

  11. Process Model Construction and Optimization Using Statistical Experimental Design,

    Science.gov (United States)

    1988-04-01

Memo No. 88-442, March 1988. PROCESS MODEL CONSTRUCTION AND OPTIMIZATION USING STATISTICAL EXPERIMENTAL DESIGN. Emmanuel Sachs and George Prueger. Abstract: A methodology is presented for the construction of process models by the combination of physically based mechanistic... "Process Model Construction and Optimization Using Statistical Experimental Design" by Emanuel Sachs, Assistant Professor, and George...

  12. Intelligent Signal Processing for Detection System Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Fu, C Y; Petrich, L I; Daley, P F; Burnham, A K

    2004-06-18

    A wavelet-neural network signal processing method has demonstrated approximately tenfold improvement in the detection limit of various nitrogen and phosphorus compounds over traditional signal-processing methods in analyzing the output of a thermionic detector attached to the output of a gas chromatograph. A blind test was conducted to validate the lower detection limit. All fourteen of the compound spikes were detected when above the estimated threshold, including all three within a factor of two above. In addition, two of six were detected at levels 1/2 the concentration of the nominal threshold. We would have had another two correct hits if we had allowed human intervention to examine the processed data. One apparent false positive in five nulls was traced to a solvent impurity, whose presence was identified by running a solvent aliquot evaporated to 1% residual volume, while the other four nulls were properly classified. We view this signal processing method as broadly applicable in analytical chemistry, and we advocate that advanced signal processing methods be applied as directly as possible to the raw detector output so that less discriminating preprocessing and post-processing does not throw away valuable signal.
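
The report does not describe its wavelet-neural network architecture in the abstract. As a minimal sketch of the wavelet half of such a pipeline only, the following one-level Haar transform with soft thresholding of the detail coefficients (the signal shape, noise level, and threshold are all assumptions) illustrates the idea of processing the raw detector trace before any downstream classification:

```python
import numpy as np

def haar_denoise(signal, threshold):
    """One-level Haar wavelet decomposition, soft-threshold the
    (noise-dominated) detail coefficients, then invert the transform."""
    x = np.asarray(signal, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-pass: trend
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)   # high-pass: mostly noise
    # soft thresholding shrinks small detail coefficients to zero
    detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)
    out = np.empty_like(x)
    out[0::2] = (approx + detail) / np.sqrt(2)
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out

rng = np.random.default_rng(7)
t = np.linspace(0, 1, 256)
clean = np.exp(-((t - 0.5) ** 2) / 0.01)        # smooth detector-like peak
noisy = clean + 0.05 * rng.standard_normal(t.size)
denoised = haar_denoise(noisy, threshold=0.1)
```

A real pipeline would use a multi-level transform and feed the thresholded coefficients to the neural classifier; this sketch only shows why thresholding in the wavelet domain suppresses noise while keeping the peak.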

  13. Intelligent Signal Processing for Detection System Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Fu, C Y; Petrich, L I; Daley, P F; Burnham, A K

    2004-12-05

    A wavelet-neural network signal processing method has demonstrated approximately tenfold improvement over traditional signal-processing methods for the detection limit of various nitrogen and phosphorus compounds from the output of a thermionic detector attached to a gas chromatograph. A blind test was conducted to validate the lower detection limit. All fourteen of the compound spikes were detected when above the estimated threshold, including all three within a factor of two above the threshold. In addition, two of six spikes were detected at levels of 1/2 the concentration of the nominal threshold. Another two of the six would have been detected correctly if we had allowed human intervention to examine the processed data. One apparent false positive in five nulls was traced to a solvent impurity, whose presence was subsequently identified by analyzing a solvent aliquot evaporated to 1% residual volume, while the other four nulls were properly classified. We view this signal processing method as broadly applicable in analytical chemistry, and we advocate that advanced signal processing methods should be applied as directly as possible to the raw detector output so that less discriminating preprocessing and post-processing does not throw away valuable signal.

  14. Energy optimization of integrated process plants

    Energy Technology Data Exchange (ETDEWEB)

    Sandvig Nielsen, J.

    1996-10-01

A general approach viewing process synthesis as an evolutionary process is proposed, in which each step is taken according to the present level of information and knowledge. This is formulated in a Process Synthesis Cycle. Initially, the synthesis is conducted at a high abstraction level, maximizing the use of heuristics (prior experience, rules of thumb, etc.). As further knowledge and information become available, heuristics are gradually replaced by exact problem formulations. The principles of the Process Synthesis Cycle are used to develop a general procedure for energy synthesis based on available tools. The procedure relies on efficient use of process simulators with integrated Pinch capabilities (energy targeting). The proposed general procedure is tailored to three specific problems (Humid Air Turbine power plant synthesis, Nitric Acid process synthesis and Sulphuric Acid synthesis). Using the procedure reduces the problem dimension considerably and thus allows for faster evaluation of more alternatives. At a more detailed level, a new framework for the Heat Exchanger Network synthesis problem is proposed. The new framework is object-oriented, based on a general functional description of all elements potentially present in the heat exchanger network (streams, exchangers, pumps, furnaces, etc.). (LN) 116 refs.

  15. Soil restoration with organic amendments: linking cellular functionality and ecosystem processes

    Science.gov (United States)

    Bastida, F.; Selevsek, N.; Torres, I. F.; Hernández, T.; García, C.

    2015-10-01

A hot topic in recent decades, the application of organic amendments to arid degraded soils has been shown to benefit microbially mediated processes. However, despite the importance of soils for global sustainability, a gap has not yet been addressed in soil science: is there any connection between ecosystem-community processes, cellular functionality, and microbial lifestyles (i.e. oligotrophy-copiotrophy) in restored soils? Together with classical ecosystem indicators (fatty acids, extracellular enzyme activities, basal respiration), state-of-the-art metaproteomics was applied to fill this gap in a model restoration experiment initiated 10 years ago by the addition of sewage sludge and compost. Organic amendment strongly impacted ecosystem processes. Furthermore, the type of material used induced differences in cellular functionality through variations in the percentages of proteins involved in translation, transcription, energy production and C-fixation. We conclude that the long-term impact of organic restoration goes beyond ecosystem processes and affects cellular functionalities and phyla lifestyles, coupled with differences in microbial community structures.

  16. Optimal design of metabolic flux analysis experiments for anchorage-dependent mammalian cells using a cellular automaton model.

    Science.gov (United States)

    Meadows, Adam L; Roy, Siddhartha; Clark, Douglas S; Blanch, Harvey W

    2007-09-01

    Metabolic flux analysis (MFA) is widely used to quantify metabolic pathway activity. Typical applications involve isotopically labeled substrates, which require both metabolic and isotopic steady states for simplified data analysis. For bacterial systems, these steady states are readily achieved in chemostat cultures. However, mammalian cells are often anchorage dependent and experiments are typically conducted in batch or fed-batch systems, such as tissue culture dishes or microcarrier-containing bioreactors. Surface adherence may cause deviations from exponential growth, resulting in metabolically heterogeneous populations and a varying number of cellular "nearest neighbors" that may affect the observed metabolism. Here, we discuss different growth models suitable for deconvoluting these effects and their application to the design and optimization of MFA experiments employing surface-adherent mammalian cells. We describe a stochastic two-dimensional (2D) cellular automaton model, with empirical descriptions of cell number and non-growing cell fraction, suitable for easy application to most anchorage-dependent mammalian cell cultures. Model utility was verified by studying the impact of contact inhibition on the growth rate, specific extracellular flux rates, and isotopic labeling in lactate for MCF7 cells, a commonly studied breast cancer cell line. The model successfully defined the time over which exponential growth and a metabolically homogeneous growing cell population could be assumed. The cellular automaton model developed is shown to be a useful tool in designing optimal MFA experiments.
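
A stripped-down version of the kind of 2D cellular automaton described, in which a cell divides into a randomly chosen empty neighbor site and becomes non-growing once surrounded, can be sketched as follows. The grid size, seeding, and division rule are illustrative assumptions, not the authors' calibrated model:

```python
import random

def step(occupied, size):
    """One division round: each cell with an empty 4-neighbor divides into
    one randomly chosen empty neighbor site (sites fill as the round runs)."""
    new = set(occupied)
    for (x, y) in occupied:
        empty = [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                 if 0 <= x + dx < size and 0 <= y + dy < size
                 and (x + dx, y + dy) not in new]
        if empty:
            new.add(random.choice(empty))
    return new

def growing_fraction(occupied, size):
    """Fraction of cells not yet contact-inhibited (some empty neighbor left)."""
    growing = sum(
        1 for (x, y) in occupied
        if any(0 <= x + dx < size and 0 <= y + dy < size
               and (x + dx, y + dy) not in occupied
               for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)))
    )
    return growing / len(occupied)

random.seed(0)
size = 25
colony = {(size // 2, size // 2)}          # single seeded cell mid-dish
history = []
for _ in range(20):
    colony = step(colony, size)
    history.append((len(colony), growing_fraction(colony, size)))
```

Tracking (colony size, growing fraction) over rounds mimics how such a model identifies the window in which exponential growth and a metabolically homogeneous growing population can be assumed.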

  17. Optimization of Steel Bar Manufacturing Process Using Six Sigma

    Institute of Scientific and Technical Information of China (English)

    NAEEM Khawar; ULLAH Misbah; TARIQ Adnan; MAQSOOD Shahid; AKHTAR Rehman; NAWAZ Rashid; HUSSAIN Iftikhar

    2016-01-01

Optimization of a manufacturing process results in higher productivity and reduced waste. Production parameters of a local steel bar manufacturing industry of Pakistan are optimized using the Six Sigma (define, measure, analyze, improve, and control) methodology. Production data are collected and analyzed. After analysis, the experimental design results are used to identify significant factors affecting process performance. The significant factors are controlled at optimized levels using a two-level factorial design method. A regression model is developed that helps in the estimation of the response under multi-variable input values. The model is tested, verified, and validated using industrial data collected at a local steel bar manufacturing industry of Peshawar (Khyber Pakhtunkhwa, Pakistan). The sigma level of the manufacturing process is improved to 4.01 from 3.58. The novelty of the research is the identification of the significant factors, along with the optimum levels that affect the process yield, and the methodology to optimize the steel bar manufacturing process.
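
The two-level factorial analysis named above can be illustrated generically; the factors and yield values below are invented, since the abstract does not report them. Each main effect is the mean response at the factor's high level minus the mean at its low level:

```python
# Generic 2^2 factorial analysis with made-up yield data (the paper's
# actual factors and responses are not reproduced here).
# Coded levels: -1 (low), +1 (high) for factors A and B.
runs = [
    # (A, B, observed yield %)
    (-1, -1, 88.0),
    (+1, -1, 91.5),
    (-1, +1, 89.0),
    (+1, +1, 95.5),
]

def effect(runs, column):
    """Main effect = mean response at high level minus mean at low level."""
    high = [y for *levels, y in runs if levels[column] == +1]
    low = [y for *levels, y in runs if levels[column] == -1]
    return sum(high) / len(high) - sum(low) / len(low)

def interaction(runs):
    """AB interaction effect = contrast of the A*B column over n/2 runs."""
    return sum(a * b * y for a, b, y in runs) / (len(runs) / 2)

effect_a = effect(runs, 0)
effect_b = effect(runs, 1)
effect_ab = interaction(runs)
```

Halving each effect gives the corresponding coefficient of the fitted regression model in coded units, which is how a response-estimation model like the paper's is built from the design.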

  18. Optimization of steel bar manufacturing process using six sigma

    Science.gov (United States)

    Naeem, Khawar; Ullah, Misbah; Tariq, Adnan; Maqsood, Shahid; Akhtar, Rehman; Nawaz, Rashid; Hussain, Iftikhar

    2016-03-01

Optimization of a manufacturing process results in higher productivity and reduced waste. Production parameters of a local steel bar manufacturing industry of Pakistan are optimized using the Six Sigma (define, measure, analyze, improve, and control) methodology. Production data are collected and analyzed. After analysis, the experimental design results are used to identify significant factors affecting process performance. The significant factors are controlled at optimized levels using a two-level factorial design method. A regression model is developed that helps in the estimation of the response under multi-variable input values. The model is tested, verified, and validated using industrial data collected at a local steel bar manufacturing industry of Peshawar (Khyber Pakhtunkhwa, Pakistan). The sigma level of the manufacturing process is improved to 4.01 from 3.58. The novelty of the research is the identification of the significant factors, along with the optimum levels that affect the process yield, and the methodology to optimize the steel bar manufacturing process.

  19. Process control and optimization with simple interval calculation method

    DEFF Research Database (Denmark)

    Pomerantsev, A.; Rodionova, O.; Høskuldsson, Agnar

    2006-01-01

Methods of process control and optimization are presented and illustrated with a real world example. The optimization methods are based on PLS block modeling as well as on the simple interval calculation methods of interval prediction and object status classification. It is proposed to employ the series of expanding PLS/SIC models in order to support the on-line process improvements. This method helps to predict the effect of planned actions on the product quality and thus enables passive quality control. We have also considered an optimization approach that proposes the correcting actions for the quality improvement in the course of production. The latter is an active quality optimization, which takes into account the actual history of the process. The advocated approach is allied to the conventional method of multivariate statistical process control (MSPC) as it also employs the historical process...

  20. Silicon epitaxy process recipe and tool configuration optimization

    Science.gov (United States)

    Moy, W. H.; Cheong, K. Y.

    2017-07-01

Silicon epitaxy is widely used in semiconductor fabrication due to its ability to produce high-quality, low-cost thin films. Optimized epitaxy process conditions, with respect to both the process recipe and the tool configuration, have been investigated for the maximization of n-type epitaxial production. A standard epitaxy recipe comprises seven main steps, namely purge, ramp, bake, stab, deposition, post and cooling. This project focuses on recipe optimization of the ramp, bake and stab steps; for the tool configuration, the cool-down step has been optimized. The impact on slip, haze, wafer warpage and crystal-originated particles has been investigated.

  1. Optimizing ISOCAM data processing using spatial redundancy

    CERN Document Server

    Miville-Deschênes, M A; Abergel, A; Bernard, J P

    2000-01-01

We present new data processing techniques that correct the main instrumental effects degrading the images obtained by ISOCAM, the camera on board the Infrared Space Observatory (ISO). Our techniques take advantage of the fact that a position on the sky has been observed by several pixels at different times. We use this information (1) to correct the long-term variation of the detector response, (2) to correct memory effects after glitches and point sources, and (3) to refine the deglitching process. Our new method allows the detection of faint extended emission with contrast smaller than 1% of the zodiacal background. The data reduction corrects instrumental effects to the point where the noise in the final map is dominated by readout and photon noise. All raster ISOCAM observations can benefit from the data processing described here. These techniques could also be applied to other raster-type observations (e.g. ISOPHOT or IRAC on SIRTF).

  2. Optimizing the morphological design of discrete-time cellular neural networks

    NARCIS (Netherlands)

    terBrugge, MH; Spaanenburg, L; Jansen, WJ; Nijhuis, JAG

    1996-01-01

    The morphological design of Discrete-Time Cellular Neural Networks (DTCNNs) has been presented in a companion paper [1]. DTCNN templates have been given for the elemental morphological operators. One way to obtain realizations for more complex operators is cascading the DTCNN equivalences of the

  3. Constraint-based Hybrid Cellular Automaton Topology Optimization for Advanced Lightweight Blast Resistant Structure Development

    Science.gov (United States)

    2011-11-01


  4. Reorganization of microtubular cytoskeleton and formation of cellular processes during post-telophase in haemanthus endosperm.

    Science.gov (United States)

    Bajer, A S; Smirnova, E A

    1999-10-01

We followed the time-dependent post-telophase reorganization of the microtubule cytoskeleton on immunostained preparations of endosperm of the higher plant Haemanthus. After completion of mitosis, the phragmoplast continued to reorganize for several hours. This prompted the formation of phragmoplast-like derivatives (secondary and accessory phragmoplasts and a peripheral microtubular ring). Next, elongated cellular protrusions (processes) appeared at the cell periphery. These processes contained long microtubule bundles and disorderly arranged actin filaments. Microtubule converging centers or accessory phragmoplasts were often present at the tips of the processes. Observation in vivo demonstrated that processes were formed at the cell periphery as extensions of lamellipodia- or filopodia-type protrusions that commonly terminated with cytoplasmic blobs. We suggest that processes are derivatives of a peripheral microtubular ring that reorganizes gradually into cellular protrusions. Endosperm processes have several features of neuronal cells, or of animal somatic cells with overexpressed MAPs. Since microtubule-containing processes were never detected shortly after extrusion of the cells from the embryo sac, this course of events might be restricted specifically to extruded endosperm and triggered either by the removal of cells, their placement in a monolayer on an agar substrate, or both. Thus, the post-telophase behavior of endosperm cells offers a novel experimental system for studies of the cytoskeleton in higher plants.

  5. Energy optimization aspects by injection process technology

    Science.gov (United States)

    Tulbure, A.; Ciortea, M.; Hutanu, C.; Farcas, V.

    2016-08-01

In the proposed paper, the authors examine the energy aspects of injection moulding process technology in the automotive industry. Theoretical considerations have been validated by experimental measurements on the manufacturing process for two types of injection moulding machines, hydraulic and electric. Practical measurements have been taken with professional equipment separately for each technological operation: lamination, compression, injection and expansion. For traceability of the results, the following parameters were, whenever possible, maintained: cycle time, product weight and the relative time. The aim of the investigations was to carry out a professional energy audit with accurate identification of losses. Based on the technological diagram for each production cycle, at the end of this contribution some measures to reduce energy consumption are proposed.

  6. Optimization of a Pressure-Treating Process

    Directory of Open Access Journals (Sweden)

    Josean Velez

    2011-01-01

Full Text Available A company that pressure-treats wood wants to minimize its annual cost without using more than 250 days of operation per year. In addition, they want to find the corresponding values of time, batches and cost for each category. We develop an expression in terms of boards per batch to model the total cost of the treatment process. We then take the derivative and use Newton's Method to find the number of boards per batch that minimizes total cost.
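
The article's cost expression is not given in the abstract; a hypothetical model of the same shape (a per-batch setup cost spread over fewer batches as batch size b grows, against a cost that rises with b) shows the derivative-plus-Newton's-Method step:

```python
# Hypothetical annual-cost model (not the article's actual expression):
# with N boards per year, setup cost s per batch, and a cost h that grows
# with batch size b,
#     C(b) = s * (N / b) + h * b
# C'(b) = -s*N/b**2 + h, and Newton's Method solves C'(b) = 0.
N, s, h = 100_000, 400.0, 0.05

def dC(b):   # first derivative of the cost
    return -s * N / b**2 + h

def d2C(b):  # second derivative of the cost
    return 2 * s * N / b**3

def newton(b0, tol=1e-8, max_iter=100):
    """Newton iteration on C'(b) = 0: b <- b - C'(b)/C''(b)."""
    b = b0
    for _ in range(max_iter):
        step = dC(b) / d2C(b)
        b -= step
        if abs(step) < tol:
            break
    return b

b_star = newton(b0=1000.0)
# For this particular model there is a closed form: b* = sqrt(s*N/h),
# which Newton's Method recovers numerically.
```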

  7. A New Optimal Control System Design for Chemical Processes

    Institute of Scientific and Technical Information of China (English)

    丛二丁; 胡明慧; 涂善东; 邵惠鹤

    2013-01-01

Based on frequency response and convex optimization, a novel optimal control system was developed for chemical processes. The feedforward control is designed to improve the tracking performance of closed-loop chemical systems. A parametric model is not required because the system directly utilizes the frequency response of the loop transfer function, which can be measured accurately. In particular, the extremal values of magnitude and phase can be obtained with a constrained quadratic programming optimizer and convex optimization. Simulation examples show the effectiveness of the method. The design method is simple and easily adopted in the chemical industry.

  8. Optimization of underwater wet welding process parameters using neural network

    National Research Council Canada - National Science Library

    Omajene, Joshua Emuejevoke; Martikainen, Jukka; Wu, Huapeng; Kah, Paul

    2014-01-01

.... The soundness of a weld can be predicted from the weld bead geometry. This paper illustrates the application of an artificial neural network approach to the optimization of the welding process parameters and the influence of the water environment...

  9. Optimization of aqueous extraction process to enhance the ...

    African Journals Online (AJOL)

    Kumar Sudhir

    2014-02-12

Feb 12, 2014 ... Aqueous extraction process was optimized to reduce endotoxins from mixed substrate (1:1) for further phytase ... Microorganism and chemicals ... The experimental design output (Table 2) was analyzed ... synthesis.

  10. SOFTWARE OPTIMIZATION OF BUSINESS PROCESS “UNIVERSITY ADMISSION CAMPAIGN”

    Directory of Open Access Journals (Sweden)

    Victor V. Babenko

    2015-01-01

Full Text Available The admission campaign is an important part of the system of core business processes of a university. The admission campaign is analyzed on the basis of different modeling tools. A conceptual basis for a CRM system as information support of the process is proposed; it should be a significant resource for optimizing the business process.

  11. Process sequence optimization for digital microfluidic integration using EWOD technique

    Science.gov (United States)

    Yadav, Supriya; Joyce, Robin; Sharma, Akash Kumar; Sharma, Himani; Sharma, Niti Nipun; Varghese, Soney; Akhtar, Jamil

    2016-04-01

Micro/nano-fluidic MEMS biosensors are devices that detect biomolecules. The emerging micro/nano-fluidic devices provide high throughput and high repeatability with very low response time and reduced device cost as compared to traditional devices. This article presents the experimental details for process sequence optimization of digital microfluidics (DMF) using "electrowetting-on-dielectric" (EWOD). Stress-free thick-film deposition of silicon dioxide using PECVD and the subsequent processes for the EWOD technique have been optimized in this work.

  12. OPTIMAL SIGNAL PROCESSING METHODS IN GPR

    Directory of Open Access Journals (Sweden)

    Saeid Karamzadeh

    2014-01-01

Full Text Available Over the past three decades, many and varied applications of Ground Penetrating Radar (GPR) have taken place in real life. This radar faces important challenges in civil as well as military applications. In this paper, the fundamentals of GPR systems are covered, and three important signal processing methods (Wavelet Transform, Matched Filter and Hilbert-Huang) are compared to each other in order to obtain the most accurate information about objects that are subsurface or behind a wall.
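
Of the three methods compared, the matched filter is the simplest to sketch: correlate the recorded trace with the known pulse template and read the echo delay off the correlation peak. The pulse shape, noise level, and echo offset below are invented for illustration:

```python
import numpy as np

# Minimal matched-filter sketch (illustrative, not a full GPR pipeline):
# locate a known pulse template inside a noisy trace by cross-correlation.
rng = np.random.default_rng(42)

n, offset = 512, 200
t = np.arange(40)
template = np.sin(2 * np.pi * t / 8) * np.hanning(40)   # assumed pulse shape

trace = 0.1 * rng.standard_normal(n)      # background noise
trace[offset:offset + 40] += template     # buried echo at sample 200

# Matched filtering: slide the template along the trace; the correlation
# peak marks the delay of the echo.
correlation = np.correlate(trace, template, mode="valid")
detected_offset = int(np.argmax(correlation))
```

A matched filter maximizes output SNR for a known pulse in white noise, which is why it is a natural baseline against wavelet and Hilbert-Huang methods.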

  13. Improving cellularity and quality of liquid-based cytology slides processed from pancreatobiliary tract brushings.

    Science.gov (United States)

    Campion, Michael B; Kipp, Benjamin R; Humphrey, Sandra K; Zhang, Jun; Clayton, Amy C; Henry, Michael R

    2010-09-01

Cytology has been reported to have suboptimal sensitivity for detecting pancreatobiliary tract cancer in biliary tract specimens, partly as a result of low specimen cellularity and obscuring noncellular components. The goal of this study was to determine if the use of a glacial acetic acid wash prior to processing would increase the cellularity and improve the quality of ThinPrep slides when compared to standard non-gyn ThinPrep processing. Fifty consecutive pancreatobiliary tract specimens containing 20 ml of sample/PreservCyt were divided equally for standard non-gyn ThinPrep (STP) and glacial acetic acid ThinPrep (GATP) processing. A manual drop preparation was also performed on the residual STP specimen to determine the number of cells left in the vial during STP processing. Twenty-six (52%) specimens had more epithelial cell groupings with the GATP methodology, while 19 (38%) had equivalent cellularity with both methods. The STP method produced more epithelial cell groupings in 5 (10%) of the specimens. Of the 26 specimens that had fewer cells with the STP method, 14 (54%) had ≥50 cell groupings on the manual drop slide processed from the residual STP specimen, suggesting that many cells remain in the vial after STP processing. The GATP method was preferred in 25 (50%) of the specimens, the STP method in 5 (10%), while both methodologies provided similar findings in the remaining 20 (40%) of specimens. The data from this study suggest that the GATP method results in more cells being placed on the slide and was preferred over the STP method in a majority of specimens.

  14. [Optimization of the pertussis vaccine production process].

    Science.gov (United States)

    Germán Santiago, J; Zamora, N; de la Rosa, E; Alba Carrión, C; Padrón, P; Hernández, M; Betancourt, M; Moretti, N

    1995-01-01

The production of pertussis vaccine was reevaluated at the Instituto Nacional de Higiene "Rafael Rangel" in order to optimize it in terms of vaccine yield, potency, specific toxicity and efficiency (cost per dose). Four different processes, using two culture media (Cohen-Wheeler and Fermentación Glutamato Prolina-1) and two types of bioreactors (the 25 L Fermentador Caracas and a 450 L industrial fermentor), were compared. Runs were started from freeze-dried strains (134 or 509) and continued until the maximal yield was obtained. It was found that the combination Fermentación Glutamato Prolina-1/industrial fermentor shortened the process to 40 hours while consistently yielding a vaccine of higher potency (7.91 +/- 2.56 IU/human dose) and lower specific toxicity in a mouse bioassay. In addition, the physical aspect of the preparation was rather homogeneous and free of dark aggregates. Most importantly, the biomass yield was more than double that of the Fermentador Caracas with either medium and that of the industrial fermentor with the Cohen-Wheeler medium. Therefore, the cost per dose was substantially decreased.

  15. Optimization Query Process of Mediators Interrogation Based On Combinatorial Storage

    Directory of Open Access Journals (Sweden)

    L. Cherrat

    2013-05-01

    Full Text Available In a distributed environment where a query involves several heterogeneous sources, communication costs must be taken into consideration. In this paper we describe a query optimization approach using a dynamic programming technique for a set of integrated heterogeneous sources. The objective of the optimization is to minimize the total processing time, including load processing, query rewriting and communication costs, to facilitate inter-site communication, and to optimize the time of data transfer from one site to the others. Moreover, the ability to store data in more than one central site provides more flexibility in terms of security/safety and network overload. In contrast to optimizers that consider a restricted search space, the proposed optimizer searches the closed subsets of sources and independency relationships, which may be deep linear or hierarchical trees. In particular, query execution can start its traversal anywhere over any subset and not only from a specific source.
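    The subset dynamic programming described above can be sketched in miniature. The source names, processing costs, and chain-style communication model below are invented illustrations, not the paper's actual cost model:

```python
from itertools import combinations

# Hypothetical per-source processing costs and pairwise communication costs
# (invented numbers; the paper's real cost model is richer).
proc = {"A": 4.0, "B": 2.0, "C": 3.0}
comm = {("A", "B"): 1.0, ("A", "C"): 2.5, ("B", "C"): 0.5}

def comm_cost(x, y):
    return comm.get((x, y)) or comm.get((y, x), 0.0)

def cheapest_plan(sources):
    """Held-Karp-style DP over subsets: best[(visited, last)] = (cost, order).

    Extending a partial plan with a source costs its processing time plus
    communication with the previously visited source (a chain model).
    """
    best = {(frozenset([s]), s): (proc[s], [s]) for s in sources}
    for k in range(2, len(sources) + 1):
        for subset in combinations(sources, k):
            fs = frozenset(subset)
            for last in subset:
                rest = fs - {last}
                best[(fs, last)] = min(
                    (best[(rest, prev)][0] + proc[last] + comm_cost(prev, last),
                     best[(rest, prev)][1] + [last])
                    for prev in rest
                )
    full = frozenset(sources)
    return min(best[(full, s)] for s in sources)

cost, order = cheapest_plan(["A", "B", "C"])  # cheapest chain through all sources
```

The DP explores every subset once, so unlike a greedy left-deep optimizer it can begin the traversal from any source, which mirrors the claim in the abstract.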

  16. Optimality of Poisson Processes Intensity Learning with Gaussian Processes

    NARCIS (Netherlands)

    Kirichenko, A.; van Zanten, H.

    2015-01-01

    In this paper we provide theoretical support for the so-called "Sigmoidal Gaussian Cox Process" approach to learning the intensity of an inhomogeneous Poisson process on a d-dimensional domain. This method was proposed by Adams, Murray and MacKay (ICML, 2009), who developed a tractable computational

  17. Optimality of Poisson Processes Intensity Learning with Gaussian Processes

    NARCIS (Netherlands)

    A. Kirichenko; H. van Zanten

    2015-01-01

    In this paper we provide theoretical support for the so-called "Sigmoidal Gaussian Cox Process" approach to learning the intensity of an inhomogeneous Poisson process on a d-dimensional domain. This method was proposed by Adams, Murray and MacKay (ICML, 2009), who developed a tractable computational

  18. Cellular Automata as a learning process in Architecture and Urban design

    DEFF Research Database (Denmark)

    Jensen, Mads Brath; Foged, Isak Worre

    2014-01-01

    This paper explores the application of cellular automata as a method for investigating the dynamic parameters and interrelationships that constitute urban space. With increasing aspects needed for integration during the architectural and urban design process … an architectural methodological response to this situation is presented through the development of a conceptual computational design system that allows these dynamics to unfold and to be observed for architectural design decision taking. Reflecting on the development and implementation of a cellular automata based design approach in a master-level urban design studio, this paper discusses the strategies for dealing with complexity at an urban scale as well as the pedagogical considerations behind applying computational tools and methods to an urban design education.

  19. When teams shift among processes: insights from simulation and optimization.

    Science.gov (United States)

    Kennedy, Deanna M; McComb, Sara A

    2014-09-01

    This article introduces process shifts to study the temporal interplay among transition and action processes espoused in the recurring phase model proposed by Marks, Mathieu, and Zaccaro (2001). Process shifts are those points in time when teams complete a focal process and change to another process. By using team communication patterns to measure process shifts, this research explores (a) when teams shift among different transition processes and initiate action processes and (b) the potential of different interventions, such as communication directives, to manipulate process shift timing and order and, ultimately, team performance. Virtual experiments are employed to compare data from observed laboratory teams not receiving interventions, simulated teams receiving interventions, and optimal simulated teams generated using genetic algorithm procedures. Our results offer insights about the potential for different interventions to affect team performance. Moreover, certain interventions may promote discussions about key issues (e.g., tactical strategies) and facilitate shifting among transition processes in a manner that emulates optimal simulated teams' communication patterns. Thus, we contribute to theory regarding team processes in 2 important ways. First, we present process shifts as a way to explore the timing of when teams shift from transition to action processes. Second, we use virtual experimentation to identify those interventions with the greatest potential to affect performance by changing when teams shift among processes. Additionally, we employ computational methods including neural networks, simulation, and optimization, thereby demonstrating their applicability in conducting team research.

  20. Image processing to optimize wave energy converters

    Science.gov (United States)

    Bailey, Kyle Marc-Anthony

    The world is turning to renewable energies as a means of ensuring the planet's future and well-being. There have been a few attempts in the past to utilize wave power as a means of generating electricity through the use of Wave Energy Converters (WEC), but only recently have they become a focal point in the renewable energy field. Over the past few years there has been a global drive to advance the efficiency of WEC. Wave power is produced by placing a mechanical device either onshore or offshore that captures the energy within ocean surface waves. This paper seeks to provide a novel and innovative way to estimate ocean wave frequency through the use of image processing. This is achieved by applying a complex modulated lapped orthogonal transform filter bank to satellite images of ocean waves. The complex modulated lapped orthogonal transform filter bank provides an equal subband decomposition of the Nyquist-bounded discrete-time Fourier transform spectrum. The maximum energy of the 2D complex modulated lapped transform subband is used to determine the horizontal and vertical frequency, which subsequently can be used to determine the wave frequency in the direction of the WEC by a simple trigonometric scaling. The robustness of the proposed method is demonstrated by application to simulated and real satellite images where the frequency is known.
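    The peak-picking idea can be illustrated on a synthetic plane-wave image, with a plain 2D FFT standing in for the complex modulated lapped orthogonal transform filter bank; the image size and spatial frequencies are demo assumptions:

```python
import numpy as np

# Synthetic "satellite image": a plane wave with known spatial frequencies
# in cycles per pixel (fx, fy and the image size are demo assumptions).
n, fx, fy = 256, 16 / 256, 8 / 256
yy, xx = np.mgrid[0:n, 0:n]
image = np.sin(2 * np.pi * (fx * xx + fy * yy))

# Locate the dominant spectral peak; a plain 2D FFT stands in here for the
# complex modulated lapped orthogonal transform filter bank of the paper.
spec = np.abs(np.fft.rfft2(image))
spec[0, 0] = 0.0                        # suppress the DC component
i, j = np.unravel_index(np.argmax(spec), spec.shape)
fy_est = abs(np.fft.fftfreq(n)[i])      # vertical frequency (cycles/pixel)
fx_est = j / n                          # horizontal frequency (rfft axis)
wave_freq = np.hypot(fx_est, fy_est)    # magnitude along the wave direction
```

The horizontal and vertical peak coordinates recover the two frequency components, and their Euclidean magnitude gives the wave frequency along the propagation direction, matching the trigonometric scaling mentioned in the abstract.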

  1. Optimal design of upstream processes in biotransformation technologies.

    Science.gov (United States)

    Dheskali, Endrit; Michailidi, Katerina; de Castro, Aline Machado; Koutinas, Apostolis A; Kookos, Ioannis K

    2017-01-01

    In this work a mathematical programming model for the optimal design of the bioreaction section of biotechnological processes is presented. Equations for the estimation of the equipment cost derived from a recent publication by the US National Renewable Energy Laboratory (NREL) are also summarized. The cost-optimal design of process units and the optimal scheduling of their operation can be obtained using the proposed formulation, which has been implemented in software available from the journal web page or the corresponding author. The proposed optimization model can be used to quantify the effects of decisions taken at lab scale on the industrial-scale process economics. It is of paramount importance to note that this can be achieved at an early stage of the development of a biotechnological project. Two case studies are presented that demonstrate the usefulness and potential of the proposed methodology.

  2. Optimization of frying process in food safety

    Directory of Open Access Journals (Sweden)

    Quaglia, G.

    1998-08-01

    Full Text Available The mechanics of frying are fairly simple. Hot oil serves as a heat exchange medium in which heat is transferred to the food being fried. As a result, the heat converts water within the food to steam and melts the fat within the food. The steam and fat then migrate from the interior of the food through the exterior and into the oil. Conversely, some of the frying oil is absorbed into the food being fried. The chemistry occurring in the frying oil and in the food being fried includes a myriad of thermal and oxidative reactions involving lipids, proteins, carbohydrates and minor food constituents. Decomposition products from autoxidation above 100°C, polymerization without oxygen between 200-300°C and thermal oxidation at 200°C can be produced in frying oil, and their amounts are related to different chemical and physical parameters such as temperature, heating time, type of oil used and food being fried, oil turnover rate, management of the oil and, finally, type of equipment used. Different studies have noted that the toxicity of these by-products is due to their chemistry and concentration. Since the prime requirement in food quality is the safety of the products, attainable through preventive analysis of the risks and total control throughout all frying processes, in this work the critical points of particular importance are identified and shown: oil composition, and in particular its antioxidant capacity; proper fryer design; food/oil ratio; good manufacturing practice. Besides, the quality screening has to be directed towards chemical quality evaluation by easy and rapid analysis of the oil (colour, polar compounds, free fatty acids and antioxidant capacity) and of the fried food (panel test and/or consumer test). In conclusion, to maintain high quality in the frying medium, choose efficient equipment, select a fat with desirable flavour and good antioxidant capacity, eliminate crackling as soon and often as possible, choose better components with minimal but …

  3. Optimization of electrodeposition processes for tin coatings

    Science.gov (United States)

    Wen, Shixue

    … operating conditions are 200 A/m2 and 20°C for tin deposition in the investigated electrolyte. It is demonstrated that chronopotentiometry is a very useful and efficient tool to study the deposition process when combined with SEM.

  4. Optimal Heating in Heat-Treatment Process Based on Grey Asynchronous Particle Swarm Optimization

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    To ensure plate heating quality and reduce energy consumption in the heat-treatment process, optimal heating for plates in a roller hearth furnace was investigated and a new strategy for heating procedure optimization was developed. During the solving process, a plate temperature forecast model based on heat transfer mechanics was established to calculate the plate temperature under an assumed heating procedure. In addition, the multi-objective nature of optimal heating was analyzed, and a method composed of asynchronous particle swarm optimization and grey relational analysis was adopted for solving the multi-objective problem. The developed strategy for optimizing heating has been applied in mass production. The results indicate that the absolute plate discharging temperature deviation between measured and target values does not exceed ±8 ℃, and the relative deviation is less than ±0.77%.

  5. Variance-optimal hedging for processes with stationary independent increments

    DEFF Research Database (Denmark)

    Hubalek, Friedrich; Kallsen, J.; Krawczyk, L.

    We determine the variance-optimal hedge when the logarithm of the underlying price follows a process with stationary independent increments in discrete or continuous time. Although the general solution to this problem is known as a backward recursion or backward stochastic differential equation, we show that for this class of processes the optimal endowment and strategy can be expressed more explicitly. The corresponding formulas involve the moment resp. cumulant generating function of the underlying process and a Laplace- or Fourier-type representation of the contingent claim. An example …

  6. Direction for optimization of the training process in junior hockey

    Directory of Open Access Journals (Sweden)

    Kygaevskiy S.A.

    2014-02-01

    Full Text Available Purpose: to consider possible directions for optimization of training activity in youth hockey and to offer practical advice. Material: the study analyzed data from the literature and the latest practical achievements of domestic and foreign authors on player training in youth sports. Results: innovative approaches at the initial stages of training and sports perfection are considered, as well as various directions for optimization of the training process at the initial and preliminary basic training stages of hockey. Examples are given of the training process in North American and European hockey schools. Questions concerning the construction and orientation of the training process at the initial and preliminary basic training stages are addressed. Conclusions: promising directions for optimization of the training process of young hockey players at the initial stages of sports perfection are highlighted.

  7. Reverse electrodialysis : A validated process model for design and optimization

    NARCIS (Netherlands)

    Veerman, J.; Saakes, M.; Metz, S. J.; Harmsen, G. J.

    2011-01-01

    Reverse electrodialysis (RED) is a technology to generate electricity using the entropy of the mixing of sea and river water. A model is made of the RED process and validated experimentally. The model is used to design and optimize the RED process. It predicts very small differences between counter-

  8. Maximizing the efficiency of multienzyme process by stoichiometry optimization.

    Science.gov (United States)

    Dvorak, Pavel; Kurumbang, Nagendra P; Bendl, Jaroslav; Brezovsky, Jan; Prokop, Zbynek; Damborsky, Jiri

    2014-09-05

    Multienzyme processes represent an important area of biocatalysis. Their efficiency can be enhanced by optimization of the stoichiometry of the biocatalysts. Here we present a workflow for maximizing the efficiency of a three-enzyme system catalyzing a five-step chemical conversion. Kinetic models of pathways with wild-type or engineered enzymes were built, and the enzyme stoichiometry of each pathway was optimized. Mathematical modeling and one-pot multienzyme experiments provided detailed insights into pathway dynamics, enabled the selection of a suitable engineered enzyme, and afforded high efficiency while minimizing biocatalyst loadings. Optimizing the stoichiometry in a pathway with an engineered enzyme reduced the total biocatalyst load by an impressive 56 %. Our new workflow represents a broadly applicable strategy for optimizing multienzyme processes.
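    A toy version of the stoichiometry argument can be written down directly, assuming a fully saturated linear pathway where each step runs at kcat_i · e_i (the kcat values and unit enzyme budget are invented for the demo): the flux-maximizing allocation equalizes all step rates.

```python
# Toy linear three-enzyme pathway: each step runs substrate-saturated at
# v_i = kcat_i * e_i, so the steady-state pathway flux is set by the
# slowest step. kcat values and the unit enzyme budget are illustrative
# assumptions, not the paper's kinetic model.
kcat = [12.0, 3.0, 6.0]
budget = 1.0

# Flux-maximizing allocation equalizes all step rates: e_i ~ 1 / kcat_i.
weights = [1.0 / k for k in kcat]
total = sum(weights)
e_opt = [budget * w / total for w in weights]
flux_opt = min(k * e for k, e in zip(kcat, e_opt))

# Even split for comparison: the gain comes purely from stoichiometry.
e_even = [budget / len(kcat)] * len(kcat)
flux_even = min(k * e for k, e in zip(kcat, e_even))
```

Tuning the ratios rather than the total load is exactly the lever the abstract describes; the paper's full treatment replaces the saturation assumption with fitted kinetic models of each enzyme.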

  9. Pavement maintenance optimization model using Markov Decision Processes

    Science.gov (United States)

    Mandiartha, P.; Duffield, C. F.; Razelan, I. S. b. M.; Ismail, A. b. H.

    2017-09-01

    This paper presents an optimization model for the selection of pavement maintenance interventions using the theory of Markov Decision Processes (MDP). There are some particular characteristics of the MDP developed in this paper which distinguish it from other similar studies or optimization models intended for pavement maintenance policy development. These unique characteristics include the direct inclusion of constraints into the formulation of the MDP, the use of an average-cost MDP method, and a policy development process based on the dual linear programming solution. The limited information and discussion available on these matters for stochastic optimization models in road network management motivates this study. This paper uses a data set acquired from the road authority of the state of Victoria, Australia, to test the model, and recommends steps in the computation of the MDP-based stochastic optimization model, leading to the development of an optimum pavement maintenance policy.
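    The flavor of an MDP maintenance policy can be sketched with a three-state toy model. Note the paper uses an average-cost linear programming formulation; the sketch below substitutes simple discounted value iteration, and every transition probability and cost is invented:

```python
import numpy as np

# Toy pavement MDP: states 0=good, 1=fair, 2=poor; actions 0=do nothing,
# 1=maintain. All probabilities and costs are invented; the paper itself
# solves an average-cost LP rather than this discounted value iteration.
P = {
    0: np.array([[0.8, 0.2, 0.0],   # do nothing: the pavement deteriorates
                 [0.0, 0.7, 0.3],
                 [0.0, 0.0, 1.0]]),
    1: np.array([[1.0, 0.0, 0.0],   # maintain: condition is mostly restored
                 [0.9, 0.1, 0.0],
                 [0.7, 0.2, 0.1]]),
}
cost = {0: np.array([0.0, 2.0, 10.0]),  # user cost of worsening condition
        1: np.array([1.0, 4.0, 8.0])}   # intervention cost included
gamma = 0.95                            # discount factor

V = np.zeros(3)
for _ in range(500):                    # value iteration to convergence
    Q = np.stack([cost[a] + gamma * P[a] @ V for a in (0, 1)])
    V = Q.min(axis=0)
policy = Q.argmin(axis=0)               # cheapest long-run action per state
```

Even this toy version reproduces the qualitative result one expects: intervening in poor condition is cheaper in the long run than letting the pavement sit, which is the kind of state-dependent policy the MDP formulation produces at network scale.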

  10. Alternative oxidase pathway optimizes photosynthesis during osmotic and temperature stress by regulating cellular ROS, malate valve and antioxidative systems

    Directory of Open Access Journals (Sweden)

    DINAKAR eCHALLABATHULA

    2016-02-01

    Full Text Available The present study reveals the importance of the alternative oxidase (AOX) pathway in optimizing photosynthesis under osmotic and temperature stress conditions in the mesophyll protoplasts of Pisum sativum. The responses of photosynthesis and respiration were monitored at a saturating light intensity of 1000 µmol m-2 s-1 at 25 °C under a range of sorbitol concentrations from 0.4 M to 1.0 M to induce hyper-osmotic stress, and by varying the temperature of the thermo-jacketed pre-incubation chamber from 25 °C to 10 °C to impose sub-optimal temperature stress. Compared to controls (0.4 M sorbitol and 25 °C), the mesophyll protoplasts showed a remarkable decrease in NaHCO3-dependent O2 evolution (an indicator of photosynthetic carbon assimilation) under both hyper-osmotic (1.0 M sorbitol) and sub-optimal temperature (10 °C) stress conditions, while the decrease in rates of respiratory O2 uptake was marginal. The capacity of the AOX pathway increased significantly, in parallel with increases in intracellular pyruvate and reactive oxygen species (ROS) levels, under both hyper-osmotic stress and sub-optimal temperature stress against the background of saturating light. The ratio of the redox couple (malate/OAA) related to the malate valve increased, in contrast to the ratio of the redox couple (GSH/GSSG) related to the antioxidative system, during hyper-osmotic stress. Nevertheless, the ratio of GSH/GSSG decreased at sub-optimal temperature, while the ratio of malate/OAA showed no visible changes. Also, the redox ratios of pyridine nucleotides increased under hyper-osmotic (NADH/NAD+) and sub-optimal temperature (NADPH/NADP+) stresses, respectively. However, upon restriction of the AOX pathway by salicylhydroxamic acid (SHAM), the observed changes in NaHCO3-dependent O2 evolution, cellular ROS, and the redox ratios of malate/OAA, NAD(P)H/NAD(P)+ and GSH/GSSG were further aggravated under stress conditions, with concomitant modulations in NADP-MDH and antioxidant enzymes. Taken together, the …

  11. Parameters Optimization of Low Carbon Low Alloy Steel Annealing Process

    Institute of Scientific and Technical Information of China (English)

    Maoyu ZHAO; Qianwang CHEN

    2013-01-01

    A suitable match of annealing process parameters is critical for obtaining a fine microstructure. Low carbon low alloy steel (20CrMnTi) was heated for various durations near the Ac temperature to obtain fine pearlite and ferrite grains. Annealing temperature and time were used as independent variables, and material property data were acquired by orthogonal experiment design under an intercritical process followed by a subcritical annealing process (IPSAP). The weights of the plasticity measures (hardness, yield strength, section shrinkage and elongation) of the annealed material were calculated by the analytic hierarchy process, and the process parameters were then optimized by grey system theory. SEM images show that the microstructure of the optimally annealed material consists of smaller lamellar pearlites (ferrite-cementite) and refined ferrites that are distributed uniformly. Tensile fracture surface morphologies of the optimally annealed material show noticeably more and finer dimples, indicating better toughness compared with the other annealed materials. Moreover, tensile tests show that the yield strength of the optimally annealed material decreases appreciably. Thus, the new optimization strategy is accurate and feasible.

  12. Inhomogeneous Poisson point process nucleation: comparison of analytical solution with cellular automata simulation

    Directory of Open Access Journals (Sweden)

    Paulo Rangel Rios

    2009-06-01

    Full Text Available Microstructural evolution in three dimensions of nucleation and growth transformations is simulated by means of cellular automata (CA). In the simulation, nuclei are located in space according to a heterogeneous Poisson point process. The simulation is compared with the exact analytical solution recently obtained by Rios and Villa, supposing that the intensity is a harmonic function of the spatial coordinate. The simulated data give very good agreement with the analytical solution, provided that the correct shape factor for the growing CA grains is used. This good agreement is auspicious because the analytical expressions were derived, and thus are exact, only if the shape of the growing regions is spherical.
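    A 2-D miniature of the simulation idea looks like the following, assuming an intensity that rises linearly along one axis (rather than the harmonic intensity of the paper) and periodic boundaries:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100

# Nucleation: an inhomogeneous Poisson point process whose intensity rises
# linearly along x (invented values; the paper uses a harmonic intensity).
lam = 0.01 * np.tile(np.arange(n), (n, 1)) / n      # expected nuclei per cell
ids = np.arange(n * n).reshape(n, n)
grid = np.where(rng.random((n, n)) < lam, ids, -1)  # -1 = untransformed
assert (grid != -1).sum() > 0                        # at least one nucleus

# Growth: each sweep, grains claim untransformed von Neumann neighbours
# (np.roll gives periodic boundaries; sequential shifts are a simple,
# slightly anisotropic update rule).
while (grid == -1).any():
    for shift, axis in ((1, 0), (-1, 0), (1, 1), (-1, 1)):
        neighbour = np.roll(grid, shift, axis=axis)
        grid = np.where((grid == -1) & (neighbour != -1), neighbour, grid)

n_grains = len(np.unique(grid))                      # one id per grain
```

More nuclei appear on the high-intensity side, so grains there end up smaller; a quantitative comparison with an analytical solution, as in the paper, additionally requires the correct shape factor for the growing CA grains.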

  13. Optimization of CernVM early boot process

    CERN Document Server

    Mazdin, Petra

    2015-01-01

    The CernVM virtual machine is a Linux-based virtual appliance optimized for High Energy Physics experiments. It is used for cloud computing, volunteer computing, and software development by the four large LHC experiments. The goal of this project is profiling and optimizing the boot process of CernVM. A key part was the development of a performance profiler for shell scripts as an extension to the popular BusyBox open source UNIX tool suite. Based on the measurements, costly shell code was replaced by more efficient, custom C programs. The results are compared to the original ones and successful optimization is demonstrated.

  14. Optimal control of three-dimensional steamflooding processes

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Wei; Fred Ramirez, W. (Dept. of Chemical Engineering, Univ. of Colorado, Boulder, CO (United States))

    1994-06-01

    A system science approach using optimal control theory of distributed parameter systems has been developed to determine operating strategies that maximize the economic profitability of the steamflooding processes. Necessary conditions of optimization are established by using the discrete form of calculus of variations and Pontryagin's Maximum Principle. The performance of this approach is investigated through two actual three-dimensional steamflooding projects. The optimization results show this method yields significant improvements over the original operating strategies. These improvements cannot be achieved through traditional design methods

  15. Numerical study on photoresist etching processes based on a cellular automata model

    Institute of Scientific and Technical Information of China (English)

    ZHOU ZaiFa; HUANG QingAn; LI WeiHua; LU Wei

    2007-01-01

    For the three-dimensional (3-D) numerical study of photoresist etching processes, the 2-D dynamic cellular automata (CA) model has been successfully extended to a 3-D dynamic CA model. Only the boundary cells are processed in the 3-D dynamic CA model, and the "if-else" structure in the simulation program is avoided to speed up the simulation. The 3-D dynamic CA model has been found to be stable, fast and accurate for the numerical study of photoresist etching processes. The exposure simulation, post-exposure bake (PEB) simulation and etching simulation are integrated to further investigate the performance of the CA model. Simulation results have been compared with the available experimental results, and the simulations show good agreement with the available experiments.

  16. Numerical study on photoresist etching processes based on a cellular automata model

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    For the three-dimensional (3-D) numerical study of photoresist etching processes, the 2-D dynamic cellular automata (CA) model has been successfully extended to a 3-D dynamic CA model. Only the boundary cells are processed in the 3-D dynamic CA model, and the "if-else" structure in the simulation program is avoided to speed up the simulation. The 3-D dynamic CA model has been found to be stable, fast and accurate for the numerical study of photoresist etching processes. The exposure simulation, post-exposure bake (PEB) simulation and etching simulation are integrated to further investigate the performance of the CA model. Simulation results have been compared with the available experimental results, and the simulations show good agreement with the available experiments.

  17. Optimization of fused deposition modeling process using teaching-learning-based optimization algorithm

    Directory of Open Access Journals (Sweden)

    R. Venkata Rao

    2016-03-01

    Full Text Available The performance of rapid prototyping (RP) processes is often measured in terms of build time, product quality, dimensional accuracy, cost of production, mechanical and tribological properties of the models, and energy consumed in the process. The success of any RP process in terms of these performance measures entails selection of the optimum combination of the influential process parameters. Thus, in this work the single-objective and multi-objective optimization problems of a widely used RP process, namely fused deposition modeling (FDM), are formulated, and the same are solved using the teaching-learning-based optimization (TLBO) algorithm and the non-dominated sorting TLBO (NSTLBO) algorithm, respectively. The results of the TLBO algorithm are compared with those obtained using the genetic algorithm (GA) and the quantum-behaved particle swarm optimization (QPSO) algorithm. The TLBO algorithm showed better performance than the GA and QPSO algorithms. The NSTLBO algorithm proposed to solve the multi-objective optimization problems of the FDM process in this work is a posteriori version of the TLBO algorithm. The NSTLBO algorithm incorporates the non-dominated sorting concept and a crowding distance assignment mechanism to obtain a dense set of Pareto-optimal solutions in a single simulation run. The results of the NSTLBO algorithm are compared with those obtained using the non-dominated sorting genetic algorithm (NSGA-II) and the desirability function approach. The Pareto-optimal set of solutions for each problem is obtained and reported. These Pareto-optimal sets of solutions will help the decision maker in volatile scenarios and are useful for the FDM process.
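    A minimal sketch of the TLBO algorithm itself, minimizing a sphere test function rather than an FDM response model; the population size and iteration budget are arbitrary demo choices:

```python
import random

def tlbo(f, bounds, pop_size=20, iters=100, seed=1):
    """Minimal teaching-learning-based optimization (TLBO) sketch.

    Teacher phase moves learners toward the current best solution; learner
    phase lets learners improve by interacting with a random peer. TLBO has
    no algorithm-specific tuning parameters beyond population and budget.
    """
    rng = random.Random(seed)
    dim = len(bounds)

    def clip(x):
        return [min(max(v, lo), hi) for v, (lo, hi) in zip(x, bounds)]

    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(iters):
        teacher = min(pop, key=f)
        mean = [sum(x[d] for x in pop) / pop_size for d in range(dim)]
        for i in range(pop_size):
            x = pop[i]
            tf = rng.choice((1, 2))              # teaching factor
            cand = clip([x[d] + rng.random() * (teacher[d] - tf * mean[d])
                         for d in range(dim)])
            if f(cand) < f(x):                   # greedy acceptance
                pop[i] = x = cand
            peer = pop[rng.randrange(pop_size)]  # learner phase
            step = 1.0 if f(x) < f(peer) else -1.0
            cand = clip([x[d] + step * rng.random() * (x[d] - peer[d])
                         for d in range(dim)])
            if f(cand) < f(x):
                pop[i] = cand
    return min(pop, key=f)

best = tlbo(lambda x: sum(v * v for v in x), [(-5.0, 5.0)] * 3)
```

For the multi-objective FDM problems the paper layers non-dominated sorting and crowding distance on top of this loop (NSTLBO); the single-objective core shown here is what gets compared against GA and QPSO.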

  18. Dynamical Allocation of Cellular Resources as an Optimal Control Problem: Novel Insights into Microbial Growth Strategies

    Science.gov (United States)

    Giordano, Nils; Mairet, Francis; Gouzé, Jean-Luc

    2016-01-01

    Microbial physiology exhibits growth laws that relate the macromolecular composition of the cell to the growth rate. Recent work has shown that these empirical regularities can be derived from coarse-grained models of resource allocation. While these studies focus on steady-state growth, such conditions are rarely found in natural habitats, where microorganisms are continually challenged by environmental fluctuations. The aim of this paper is to extend the study of microbial growth strategies to dynamical environments, using a self-replicator model. We formulate dynamical growth maximization as an optimal control problem that can be solved using Pontryagin’s Maximum Principle. We compare this theoretical gold standard with different possible implementations of growth control in bacterial cells. We find that simple control strategies enabling growth-rate maximization at steady state are suboptimal for transitions from one growth regime to another, for example when shifting bacterial cells to a medium supporting a higher growth rate. A near-optimal control strategy in dynamical conditions is shown to require information on several, rather than a single physiological variable. Interestingly, this strategy has structural analogies with the regulation of ribosomal protein synthesis by ppGpp in the enterobacterium Escherichia coli. It involves sensing a mismatch between precursor and ribosome concentrations, as well as the adjustment of ribosome synthesis in a switch-like manner. Our results show how the capability of regulatory systems to integrate information about several physiological variables is critical for optimizing growth in a changing environment. PMID:26958858

  19. Dynamical Allocation of Cellular Resources as an Optimal Control Problem: Novel Insights into Microbial Growth Strategies.

    Directory of Open Access Journals (Sweden)

    Nils Giordano

    2016-03-01

    Full Text Available Microbial physiology exhibits growth laws that relate the macromolecular composition of the cell to the growth rate. Recent work has shown that these empirical regularities can be derived from coarse-grained models of resource allocation. While these studies focus on steady-state growth, such conditions are rarely found in natural habitats, where microorganisms are continually challenged by environmental fluctuations. The aim of this paper is to extend the study of microbial growth strategies to dynamical environments, using a self-replicator model. We formulate dynamical growth maximization as an optimal control problem that can be solved using Pontryagin's Maximum Principle. We compare this theoretical gold standard with different possible implementations of growth control in bacterial cells. We find that simple control strategies enabling growth-rate maximization at steady state are suboptimal for transitions from one growth regime to another, for example when shifting bacterial cells to a medium supporting a higher growth rate. A near-optimal control strategy in dynamical conditions is shown to require information on several, rather than a single physiological variable. Interestingly, this strategy has structural analogies with the regulation of ribosomal protein synthesis by ppGpp in the enterobacterium Escherichia coli. It involves sensing a mismatch between precursor and ribosome concentrations, as well as the adjustment of ribosome synthesis in a switch-like manner. Our results show how the capability of regulatory systems to integrate information about several physiological variables is critical for optimizing growth in a changing environment.
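    The trade-off behind growth-rate maximization can be illustrated with a crude, assumption-laden self-replicator simulation (forward Euler integration, invented rate constants, and a constant allocation fraction alpha rather than the paper's optimal control): intermediate allocation to ribosomes outgrows either extreme.

```python
# Crude self-replicator sketch (assumptions throughout): ribosome mass
# fraction r makes protein from precursors p; the remainder m = 1 - r makes
# precursors. A constant fraction alpha of synthesis goes to ribosomes; at
# steady state the growth rate equals the synthesis rate.
def simulate(alpha, e_m=1.0, k=2.0, K=0.5, dt=0.01, steps=20000):
    p, r = 0.1, 0.1                     # initial precursors, ribosome fraction
    syn = 0.0
    for _ in range(steps):              # forward Euler integration
        syn = k * p / (K + p) * r       # protein synthesis (= growth) rate
        m = 1.0 - r
        dp = e_m * m - syn * (1.0 + p)  # production - consumption - dilution
        dr = syn * (alpha - r)          # allocation minus growth dilution
        p += dt * dp
        r += dt * dr
    return syn

# Intermediate allocation beats both extremes, echoing the growth laws.
rates = {a: simulate(a) for a in (0.2, 0.5, 0.8)}
```

Allocating almost everything to ribosomes starves them of precursors, while allocating almost nothing leaves precursors unused; the paper's contribution is characterizing the optimal, time-varying allocation in fluctuating environments via Pontryagin's Maximum Principle, which this constant-alpha sketch does not attempt.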

  20. Multi-Objective Optimization of Squeeze Casting Process using Genetic Algorithm and Particle Swarm Optimization

    Directory of Open Access Journals (Sweden)

    Patel G.C.M.

    2016-09-01

    Full Text Available The near-net-shape manufacturing ability of the squeeze casting process requires setting the process variable combinations at their optimal levels to obtain both aesthetic appearance and internal soundness of the cast parts. The aesthetic and internal soundness of cast parts relate to surface roughness and tensile strength, which can readily put the part into service without the need for costly secondary manufacturing processes (like polishing, shot blasting, plating, heat treatment, etc.). It is difficult to determine the levels of the process variable combinations (that is, pressure duration, squeeze pressure, pouring temperature and die temperature) for extreme values of the responses (that is, surface roughness, yield strength and ultimate tensile strength) due to conflicting requirements. In the present manuscript, three population-based search and optimization methods, namely the genetic algorithm (GA), particle swarm optimization (PSO) and multi-objective particle swarm optimization based on crowding distance (MOPSO-CD), have been used to optimize multiple outputs simultaneously. Further, a validation test has been conducted for the optimal casting conditions suggested by GA, PSO and MOPSO-CD. The results showed that PSO outperformed GA with regard to computation time.

  1. Research on Business Processes Optimization for Agile Manufacturing

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Based on macroscopic and synthetic approaches, especially the information entropy approach, the quantification of the degree of flexibility and the degree of order of business processes is studied. According to the outcome of the above analysis, a conceptual model for optimizing business processes is proposed which supports the construction of dynamic, stable business processes. This research has been applied in project 863/SDDAC-CIMS and has achieved primary benefits.

  2. Optimization of Gas Metal Arc Welding Process Parameters

    Science.gov (United States)

    Kumar, Amit; Khurana, M. K.; Yadav, Pradeep K.

    2016-09-01

This study presents the application of the Taguchi method combined with grey relational analysis to optimize the process parameters of gas metal arc welding (GMAW) of AISI 1020 carbon steel for multiple quality characteristics (bead width, bead height, weld penetration and heat-affected zone). An L9 orthogonal array has been employed for the fabrication of joints. The experiments have been conducted according to combinations of voltage (V), current (A) and welding speed (Ws). The results revealed that welding speed is the most significant process parameter. By analyzing the grey relational grades, the optimal parameters were obtained, and the significant factors were identified using ANOVA. The welding parameters, namely speed, welding current and voltage, have been optimized for AISI 1020 using the GMAW process. To fortify the robustness of the experimental design, a confirmation test was performed at the selected optimal process parameter setting. Observations from this method may be useful for automotive sub-assemblies, shipbuilding and vessel fabricators and operators to obtain optimal welding conditions.
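The grey relational grade computation named above can be sketched as follows: normalize each response (smaller-the-better or larger-the-better), convert deviations from the ideal into grey relational coefficients with distinguishing coefficient zeta = 0.5, and average them per run. The three response vectors are invented for illustration, not the paper's L9 data.

```python
def grey_relational_grades(runs, larger_better, zeta=0.5):
    """runs: one response vector per experiment.
    larger_better: per-response flag; True -> maximize, False -> minimize."""
    n_resp = len(runs[0])
    cols = list(zip(*runs))
    norm = []
    for j in range(n_resp):
        lo, hi = min(cols[j]), max(cols[j])
        span = (hi - lo) or 1.0
        if larger_better[j]:
            norm.append([(v - lo) / span for v in cols[j]])
        else:
            norm.append([(hi - v) / span for v in cols[j]])
    grades = []
    for i in range(len(runs)):
        coeffs = []
        for j in range(n_resp):
            delta = 1.0 - norm[j][i]              # deviation from the ideal (=1)
            coeffs.append(zeta / (delta + zeta))  # grey relational coefficient
        grades.append(sum(coeffs) / n_resp)       # grey relational grade
    return grades

# Hypothetical runs: (bead width, weld penetration, HAZ width) per experiment
data = [(7.2, 3.1, 4.0), (6.8, 3.6, 3.7), (7.9, 2.8, 4.4)]
grades = grey_relational_grades(data, larger_better=[False, True, False])
best_run = max(range(len(grades)), key=lambda i: grades[i])
```

The run with the highest grade is taken as the best compromise across all quality characteristics.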

  3. Numerical Simulation and Optimization of Directional Solidification Process of Single Crystal Superalloy Casting

    Directory of Open Access Journals (Sweden)

    Hang Zhang

    2014-02-01

Full Text Available The rapid development of numerical modeling techniques has led to more accurate results in modeling metal solidification processes. In this study, the cellular automaton-finite difference (CA-FD) method was used to simulate the directional solidification (DS) process of single crystal (SX) superalloy blade samples. Experiments were carried out to validate the simulation results. Meanwhile, an intelligent model based on fuzzy control theory was built to optimize the complicated DS process. Several key parameters, such as the mushy zone width and the temperature difference at the cast-mold interface, were taken as the input variables. The input variables were processed with a multivariable fuzzy rule to obtain the output adjustment of the withdrawal rate v (a key technological parameter). The multivariable fuzzy rule was built based on structural features of the casting (such as the section area), the delay time of the temperature-change response to changes in v, and the professional experience of the operator. The fuzzy control model coupled with the CA-FD method could then be used to optimize v in real time during the manufacturing process. The optimized process was proven to be more flexible and adaptive for a steady, stray-grain-free DS process.

  4. Optimizing a Laser Process for Making Carbon Nanotubes

    Science.gov (United States)

    Arepalli, Sivaram; Nikolaev, Pavel; Holmes, William

    2010-01-01

    A systematic experimental study has been performed to determine the effects of each of the operating conditions in a double-pulse laser ablation process that is used to produce single-wall carbon nanotubes (SWCNTs). The comprehensive data compiled in this study have been analyzed to recommend conditions for optimizing the process and scaling up the process for mass production. The double-pulse laser ablation process for making SWCNTs was developed by Rice University researchers. Of all currently known nanotube-synthesizing processes (arc and chemical vapor deposition), this process yields the greatest proportion of SWCNTs in the product material. The aforementioned process conditions are important for optimizing the production of SWCNTs and scaling up production. Reports of previous research (mostly at Rice University) toward optimization of process conditions mention effects of oven temperature and briefly mention effects of flow conditions, but no systematic, comprehensive study of the effects of process conditions was done prior to the study described here. This was a parametric study, in which several production runs were carried out, changing one operating condition for each run. The study involved variation of a total of nine parameters: the sequence of the laser pulses, pulse-separation time, laser pulse energy density, buffer gas (helium or nitrogen instead of argon), oven temperature, pressure, flow speed, inner diameter of the flow tube, and flow-tube material.

  5. Proteomic characterization of cellular and molecular processes that enable the Nanoarchaeum equitans--Ignicoccus hospitalis relationship.

    Directory of Open Access Journals (Sweden)

    Richard J Giannone

Full Text Available Nanoarchaeum equitans, the only cultured representative of the Nanoarchaeota, is dependent on direct physical contact with its host, the hyperthermophile Ignicoccus hospitalis. The molecular mechanisms that enable this relationship are unknown. Using whole-cell proteomics, differences in the relative abundance of >75% of predicted protein-coding genes from both Archaea were measured to identify the specific response of I. hospitalis to the presence of N. equitans on its surface. A purified N. equitans sample was also analyzed for evidence of interspecies protein transfer. The depth of cellular proteome coverage achieved here is amongst the highest reported for any organism. Based on changes in the proteome under the specific conditions of this study, I. hospitalis reacts to N. equitans by curtailing genetic information processing (replication, transcription) in lieu of intensifying its energetic, protein processing and cellular membrane functions. We found no evidence of significant Ignicoccus biosynthetic enzymes being transported to N. equitans. These results suggest that, under laboratory conditions, N. equitans diverts some of its host's metabolism and cell cycle control to compensate for its own metabolic shortcomings, thus appearing to be entirely dependent on small, transferable metabolites and energetic precursors from I. hospitalis.

  6. Proteomic characterization of cellular and molecular processes that enable the Nanoarchaeum equitans--Ignicoccus hospitalis relationship.

    Science.gov (United States)

    Giannone, Richard J; Huber, Harald; Karpinets, Tatiana; Heimerl, Thomas; Küper, Ulf; Rachel, Reinhard; Keller, Martin; Hettich, Robert L; Podar, Mircea

    2011-01-01

    Nanoarchaeum equitans, the only cultured representative of the Nanoarchaeota, is dependent on direct physical contact with its host, the hyperthermophile Ignicoccus hospitalis. The molecular mechanisms that enable this relationship are unknown. Using whole-cell proteomics, differences in the relative abundance of >75% of predicted protein-coding genes from both Archaea were measured to identify the specific response of I. hospitalis to the presence of N. equitans on its surface. A purified N. equitans sample was also analyzed for evidence of interspecies protein transfer. The depth of cellular proteome coverage achieved here is amongst the highest reported for any organism. Based on changes in the proteome under the specific conditions of this study, I. hospitalis reacts to N. equitans by curtailing genetic information processing (replication, transcription) in lieu of intensifying its energetic, protein processing and cellular membrane functions. We found no evidence of significant Ignicoccus biosynthetic enzymes being transported to N. equitans. These results suggest that, under laboratory conditions, N. equitans diverts some of its host's metabolism and cell cycle control to compensate for its own metabolic shortcomings, thus appearing to be entirely dependent on small, transferable metabolites and energetic precursors from I. hospitalis.

  7. Linking Cellular and Mechanical Processes in Articular Cartilage Lesion Formation: A Mathematical Model.

    Science.gov (United States)

    Kapitanov, Georgi I; Wang, Xiayi; Ayati, Bruce P; Brouillette, Marc J; Martin, James A

    2016-01-01

    Post-traumatic osteoarthritis affects almost 20% of the adult US population. An injurious impact applies a significant amount of physical stress on articular cartilage and can initiate a cascade of biochemical reactions that can lead to the development of osteoarthritis. In our effort to understand the underlying biochemical mechanisms of this debilitating disease, we have constructed a multiscale mathematical model of the process with three components: cellular, chemical, and mechanical. The cellular component describes the different chondrocyte states according to the chemicals these cells release. The chemical component models the change in concentrations of those chemicals. The mechanical component contains a simulation of a blunt impact applied onto a cartilage explant and the resulting strains that initiate the biochemical processes. The scales are modeled through a system of partial-differential equations and solved numerically. The results of the model qualitatively capture the results of laboratory experiments of drop-tower impacts on cartilage explants. The model creates a framework for incorporating explicit mechanics, simulated by finite element analysis, into a theoretical biology framework. The effort is a step toward a complete virtual platform for modeling the development of post-traumatic osteoarthritis, which will be used to inform biomedical researchers on possible non-invasive strategies for mitigating the disease.

  8. Improved Large-Scale Process Cooling Operation through Energy Optimization

    Directory of Open Access Journals (Sweden)

    Kriti Kapoor

    2013-11-01

Full Text Available This paper presents a study based on real plant data collected from chiller plants at the University of Texas at Austin. It highlights the advantages of operating the cooling processes based on an optimal strategy. A multi-component model is developed for the entire cooling process network. The model is used to formulate and solve a multi-period optimal chiller loading problem, posed as a mixed-integer nonlinear programming (MINLP) problem. The results showed that an average energy savings of 8.57% could be achieved using optimal chiller loading as compared to the historical energy consumption data from the plant. The scope of the optimization problem was then expanded by including chilled-water thermal storage in the cooling system, and the effect of optimal thermal energy storage operation on the net electric power consumption of the cooling system was studied. The results include a hypothetical scenario in which the campus purchases electricity at wholesale market prices and an optimal hour-by-hour operating strategy is computed to use the thermal energy storage tank.
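The paper's multi-period MINLP is not reproduced here, but the single-period chiller-loading subproblem it contains can be sketched by brute force: enumerate which chillers run (the integer decision) and grid-search the load split between them (the continuous decision). The capacities and part-load power curves below are invented for illustration.

```python
def total_power(loads, chillers):
    """Electric power (kW) of a loading plan {chiller index: tons served}.
    Each chiller has a hypothetical quadratic part-load curve a + b*f + c*f^2."""
    p = 0.0
    for i, tons in loads.items():
        cap, a, b, c = chillers[i]
        f = tons / cap
        if f > 1.0:
            return float("inf")   # infeasible: chiller overloaded
        p += a + b * f + c * f * f
    return p

def best_loading(demand, chillers, steps=100):
    """Enumerate single chillers and pairs; split the load on a 1% grid."""
    best_p, best_plan = float("inf"), None
    n = len(chillers)
    for i in range(n):                       # one chiller carries it all
        p = total_power({i: demand}, chillers)
        if p < best_p:
            best_p, best_plan = p, {i: demand}
    for i in range(n):                       # two chillers share the load
        for j in range(i + 1, n):
            for k in range(steps + 1):
                share = demand * k / steps
                plan = {i: share, j: demand - share}
                p = total_power(plan, chillers)
                if p < best_p:
                    best_p, best_plan = p, plan
    return best_p, best_plan

# Two hypothetical 500-ton chillers with different efficiency curves
chillers = [(500.0, 40.0, 150.0, 120.0), (500.0, 60.0, 120.0, 90.0)]
power, plan = best_loading(600.0, chillers)
```

A real MINLP solver replaces this enumeration, but the structure (binary on/off plus continuous load split) is the same.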

  9. Optimization of polyetherimide processing parameters for optical interconnect applications

    Science.gov (United States)

    Zhao, Wei; Johnson, Peter; Wall, Christopher

    2015-10-01

    ULTEM® polyetherimide (PEI) resins have been used in opto-electronic markets since the optical properties of these materials enable the design of critical components under tight tolerances. PEI resins are the material of choice for injection molded integrated lens applications due to good dimensional stability, near infrared (IR) optical transparency, low moisture uptake and high heat performance. In most applications, parts must be produced consistently with minimal deviations to insure compatibility throughout the lifetime of the part. With the large number of lenses needed for this market, injection molding has been optimized to maximize the production rate. These optimized parameters for high throughput may or may not translate to an optimized optical performance. In this paper, we evaluate and optimize PEI injection molding processes with a focus on optical property performance. A commonly used commercial grade was studied to determine factors and conditions which contribute to optical transparency, color, and birefringence. Melt temperature, mold temperature, injection speed and cycle time were varied to develop optimization trials and evaluate optical properties. These parameters could be optimized to reduce in-plane birefringence from 0.0148 to 0.0006 in this study. In addition, we have studied an optically smooth, sub-10nm roughness mold to re-evaluate material properties with minimal influence from mold quality and further refine resin and process effects for the best optical performance.

  10. Opportunistic Interference Mitigation Achieves Optimal Degrees-of-Freedom in Cellular Networks

    CERN Document Server

    Jung, Bang Chul; Shin, Won-Yong

    2010-01-01

We introduce an opportunistic interference mitigation (OIM) scheme for cellular networks, where a user scheduling strategy is utilized in uplink $K$-cell environments with time-invariant channel coefficients and base stations (BSs) having $M$ receive antennas. In the OIM scheme, each BS opportunistically selects a set of users who generate the minimum interference to the other BSs. We consider two OIM protocols according to the number $S$ of simultaneously transmitting users per cell: an opportunistic interference nulling (OIN) and an opportunistic interference alignment (OIA). Then, their performance is analyzed in terms of degrees-of-freedom (DoFs). It is shown that $KM$ DoFs are achievable under the OIN protocol with $M$ selected users per cell, while the OIA scheme with $S$ selected users (smaller than $M$) achieves $KS$ DoFs. Assuming that $N$ denotes the total number of users in a cell, we also analyze the scaling condition between system parameters $K$, $M$, $N$, $S$, and the received signal-to-noise ratio su...
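The user-selection step common to both protocols can be sketched as follows: each BS ranks its users by the total interference (leakage) they would generate at the other BSs and keeps the S quietest. Random scalar channel gains stand in for the actual multi-antenna uplink channels, so this is an illustrative assumption, not the paper's exact system model.

```python
import random

def select_users(gains, S):
    """gains[c][u][b]: interference gain from user u in cell c to BS b.
    Each BS c keeps the S users whose leakage to the other BSs is smallest."""
    K = len(gains)
    chosen = []
    for c in range(K):
        leakage = []
        for u, g in enumerate(gains[c]):
            lk = sum(g[b] for b in range(K) if b != c)  # generated interference
            leakage.append((lk, u))
        leakage.sort()
        chosen.append([u for _, u in leakage[:S]])
    return chosen

random.seed(7)
K, N, S = 3, 10, 2   # cells, users per cell, selected users per cell
gains = [[[random.random() for _ in range(K)] for _ in range(N)] for _ in range(K)]
selected = select_users(gains, S)
```

The DoF results in the abstract then hinge on how the leakage of the selected users scales as N grows.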

  11. Signal processing for molecular and cellular biological physics: an emerging field

    Science.gov (United States)

    Little, Max A.; Jones, Nick S.

    2013-01-01

Recent advances in our ability to watch the molecular and cellular processes of life in action—such as atomic force microscopy, optical tweezers and Förster fluorescence resonance energy transfer—raise challenges for digital signal processing (DSP) of the resulting experimental data. This article explores the unique properties of such biophysical time series that set them apart from other signals, such as the prevalence of abrupt jumps and steps, multi-modal distributions and autocorrelated noise. It exposes the problems with classical linear DSP algorithms applied to this kind of data, and describes new nonlinear and non-Gaussian algorithms that are able to extract information that is of direct relevance to biological physicists. It is argued that these new methods applied in this context typify the nascent field of biophysical DSP. Practical experimental examples are supplied. PMID:23277603
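As a concrete example of the nonlinear, jump-aware processing the article argues for, here is a minimal robust step detector based on running medians. It is not one of the article's own algorithms; the window size, threshold and simulated signal are illustrative assumptions.

```python
import random

def detect_steps(x, w=5, thresh=1.0):
    """Flag indices where the median of the next w samples differs from the
    median of the previous w samples by more than `thresh` (robust to noise)."""
    def median(v):
        s = sorted(v)
        return s[len(s) // 2]
    jumps = [i for i in range(w, len(x) - w)
             if abs(median(x[i:i + w]) - median(x[i - w:i])) > thresh]
    merged = []                       # collapse runs of adjacent detections
    for i in jumps:
        if merged and i - merged[-1][-1] <= 1:
            merged[-1].append(i)
        else:
            merged.append([i])
    return [run[len(run) // 2] for run in merged]   # centre of each run

random.seed(3)
signal = [0.0] * 50 + [3.0] * 50                    # one abrupt molecular step
signal = [v + random.gauss(0.0, 0.2) for v in signal]
steps = detect_steps(signal, w=5, thresh=1.5)
```

Unlike a linear low-pass filter, the median comparison preserves the sharp edge while rejecting the noise around it.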

  12. Network-Guided Key Gene Discovery for a Given Cellular Process

    DEFF Research Database (Denmark)

    He, Feng Q; Ollert, Markus

    2017-01-01

Identification of key genes for a given physiological or pathological process is an essential but still very challenging task for the entire biomedical research community. Statistics-based approaches, such as genome-wide association study (GWAS)- or quantitative trait locus (QTL)-related analysis... have already made enormous contributions to identifying key genes associated with a given disease or phenotype, the success of which is however very much dependent on a huge number of samples. Recent advances in network biology, especially network inference directly from genome-scale data... and follow-up network analysis, open up new avenues to predict key genes driving a given biological process or cellular function. Here we review and compare the current approaches to predicting key genes, which have no chance to stand out by classic differential expression analysis, from gene...

  13. Simulation of abrasive water jet cutting process: Part 2. Cellular automata approach

    Science.gov (United States)

    Orbanic, Henri; Junkar, Mihael

    2004-11-01

A new two-dimensional cellular automata (CA) model for simulating the abrasive water jet (AWJ) cutting process is presented. The CA calculates the shape of the cutting front, which can be used as an estimate of the surface quality. The cutting front is formed based on material removal rules and AWJ propagation rules. The material removal rule determines when a particular part of the material will be removed with regard to the energy of the AWJ. The AWJ propagation rule calculates the distribution of AWJ energy through the CA by using a weighted average. Modelling with CA also provides a visual narrative of the movement of the cutting front, which is hard to observe in the real process. The algorithm is fast and has been successfully validated against cutting fronts obtained in cutting experiments on an aluminium alloy.
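A toy version of such a CA can illustrate the two rule types: material removal by an energy threshold, and energy propagation by a weighted average of neighbouring cells. The grid size, decay factor and weights below are assumptions for illustration, not the paper's calibrated values.

```python
def awj_front(width, depth, passes, e0, threshold, decay=0.8):
    """Toy CA: each pass injects jet energy e0 at the top row; the energy
    reaching a cell is a decayed weighted average of the three cells above it;
    a cell is removed once its accumulated energy dose exceeds `threshold`."""
    removed = [[False] * width for _ in range(depth)]
    dose = [[0.0] * width for _ in range(depth)]
    for _ in range(passes):
        energy = [e0] * width                     # energy arriving at row 0
        for r in range(depth):
            for c in range(width):
                dose[r][c] += energy[c]           # material removal rule
                if dose[r][c] >= threshold:
                    removed[r][c] = True
            nxt = []                              # AWJ propagation rule
            for c in range(width):
                left = energy[c - 1] if c > 0 else energy[c]
                right = energy[c + 1] if c < width - 1 else energy[c]
                nxt.append(decay * (0.25 * left + 0.5 * energy[c] + 0.25 * right))
            energy = nxt
    # cut depth per column, i.e. the shape of the cutting front
    return [sum(1 for r in range(depth) if removed[r][c]) for c in range(width)]

front = awj_front(width=9, depth=30, passes=5, e0=1.0, threshold=2.0)
```

With a uniform jet the front is flat; a non-uniform energy profile at the top row would produce the curved fronts observed experimentally.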

  14. Optimal protocol for teleconsultation with a cellular phone for dentoalveolar trauma: an in-vitro study.

    Science.gov (United States)

    Park, Wonse; Lee, Hae-Na; Jeong, Jin-Sun; Kwon, Jung-Hoon; Lee, Grace H; Kim, Kee-Deog

    2012-06-01

Dental trauma is frequently unpredictable. The initial assessment and urgent treatment are essential for dentists to save the patient's teeth. Mobile-phone-assisted teleconsultation and telediagnosis for dental trauma could be an aid when a dentist is not available. In the present in-vitro study, we evaluated the success rate and time to transfer images under various conditions. We analyzed the image quality of cameras built into mobile phones based on their resolution, autofocus, white-balance, and anti-movement functions. The image quality of most built-in cameras was acceptable to perform the initial assessment, with the autofocus function being essential to obtain high-quality images. The transmission failure rate increased markedly when the image size exceeded 500 kB, and the additional text messaging did not improve the success rate or the transmission time. Our optimal protocol could be useful for emergency programs running on mobile phones.

  15. Optimal protocol for teleconsultation with a cellular phone for dentoalveolar trauma: an in-vitro study

    Energy Technology Data Exchange (ETDEWEB)

    Park, Won Se; Lee, Hae Na; Jeong, Jin Sun; Kwon, Jung Hoon; Lee, Grace H; Kim, Kee Dong [College of Dentistry, Yonsei University, Seoul (Korea, Republic of)

    2012-06-15

    Dental trauma is frequently unpredictable. The initial assessment and urgent treatment are essential for dentists to save the patient's teeth. Mobile-phone-assisted teleconsultation and telediagnosis for dental trauma could be an aid when a dentist is not available. In the present in-vitro study, we evaluated the success rate and time to transfer images under various conditions. We analyzed the image quality of cameras built into mobile phones based on their resolution, autofocus, white-balance, and anti-movement functions. The image quality of most built-in cameras was acceptable to perform the initial assessment, with the autofocus function being essential to obtain high-quality images. The transmission failure rate increased markedly when the image size exceeded 500 kB and the additional text messaging did not improve the success rate or the transmission time. Our optimal protocol could be useful for emergency programs running on the mobile phones.

  16. Optimization of the Design of Pre-Signal System Using Improved Cellular Automaton

    Directory of Open Access Journals (Sweden)

    Yan Li

    2014-01-01

    Full Text Available The pre-signal system can improve the efficiency of intersection approach under rational design. One of the main obstacles in optimizing the design of pre-signal system is that driving behaviors in the sorting area cannot be well evaluated. The NaSch model was modified by considering slow probability, turning-deceleration rules, and lane changing rules. It was calibrated with field observed data to explore the interactions among design parameters. The simulation results of the proposed model indicate that the length of sorting area, traffic demand, signal timing, and lane allocation are the most important influence factors. The recommendations of these design parameters are demonstrated. The findings of this paper can be foundations for the design of pre-signal system and show promising improvement in traffic mobility.

  17. Optimization of the design of pre-signal system using improved cellular automaton.

    Science.gov (United States)

    Li, Yan; Li, Ke; Tao, Siran; Wan, Xia; Chen, Kuanmin

    2014-01-01

    The pre-signal system can improve the efficiency of intersection approach under rational design. One of the main obstacles in optimizing the design of pre-signal system is that driving behaviors in the sorting area cannot be well evaluated. The NaSch model was modified by considering slow probability, turning-deceleration rules, and lane changing rules. It was calibrated with field observed data to explore the interactions among design parameters. The simulation results of the proposed model indicate that the length of sorting area, traffic demand, signal timing, and lane allocation are the most important influence factors. The recommendations of these design parameters are demonstrated. The findings of this paper can be foundations for the design of pre-signal system and show promising improvement in traffic mobility.
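The unmodified Nagel-Schreckenberg (NaSch) cellular automaton underlying the study can be sketched as follows; the maximum speed, slowdown probability and density are illustrative, and the paper's turning-deceleration and lane-changing extensions are omitted.

```python
import random

def nasch_step(pos, vel, L, vmax=5, p_slow=0.3):
    """One parallel update of the NaSch CA on a circular road of L cells."""
    n = len(pos)
    order = sorted(range(n), key=lambda i: pos[i])
    new_pos, new_vel = pos[:], vel[:]
    for k, i in enumerate(order):
        ahead = order[(k + 1) % n]
        gap = (pos[ahead] - pos[i] - 1) % L    # empty cells to the next car
        v = min(vel[i] + 1, vmax)              # 1) acceleration
        v = min(v, gap)                        # 2) braking to avoid collision
        if v > 0 and random.random() < p_slow:
            v -= 1                             # 3) random slowdown
        new_vel[i] = v
        new_pos[i] = (pos[i] + v) % L          # 4) movement
    return new_pos, new_vel

random.seed(11)
L, n = 100, 20                                 # road length, number of cars
pos = random.sample(range(L), n)
vel = [0] * n
for _ in range(200):
    pos, vel = nasch_step(pos, vel, L)
```

Calibrating such a model against field data, as the study does, amounts to tuning p_slow and the added rule sets until simulated flows match observed ones.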

  18. System Design Support by Optimization Method Using Stochastic Process

    Science.gov (United States)

    Yoshida, Hiroaki; Yamaguchi, Katsuhito; Ishikawa, Yoshio

We propose a new optimization method based on a stochastic process. The characteristic of this method is that it obtains an approximation of the optimum solution as an expected value. In the numerical calculation, a kind of Monte Carlo method is used to obtain the solution because of the stochastic process. The method can also obtain the probability distribution of the design variables, because the design variables are generated with probability proportional to the evaluation-function value. This probability distribution shows the influence of the design variables on the evaluation-function value, and is information that is very useful for system design. In this paper, it is shown that the proposed method is useful not only for optimization but also for system design. The flight trajectory optimization problem for a hang-glider is shown as an example of the numerical calculation.
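The core idea, generating design variables with probability proportional to the evaluation function and reading off the optimum as an expected value, can be sketched for a one-dimensional maximization. The exponential sharpening with a temperature parameter is an assumed concrete realization, not necessarily the paper's exact weighting.

```python
import math
import random

def stochastic_optimum(f, lo, hi, n=20000, temp=0.05):
    """Monte-Carlo estimate of the maximizer of f on [lo, hi]: sample uniformly,
    weight each sample in proportion to exp(f(x)/temp) (a sharpened version of
    'probability proportional to the evaluation function'), and return the
    weighted mean, i.e. the optimum as an expected value. The weights also
    define the probability distribution of the design variable."""
    random.seed(5)  # for reproducibility
    xs = [random.uniform(lo, hi) for _ in range(n)]
    ws = [math.exp(f(x) / temp) for x in xs]
    total = sum(ws)
    return sum(w * x for w, x in zip(ws, xs)) / total

# Smooth unimodal evaluation function with its maximum at x = 0.3
f = lambda x: -(x - 0.3) ** 2
estimate = stochastic_optimum(f, 0.0, 1.0)
```

The spread of the weighted samples around the estimate is exactly the "influence of the design variable on the evaluation function" that the abstract highlights as a by-product.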

  19. The Process of Optimizing Mechanical Sound Quality in Product Design

    DEFF Research Database (Denmark)

    Eriksen, Kaare; Holst, Thomas

    2011-01-01

The research field concerning the optimization of product sound quality is relatively unexplored, and may be difficult for designers to operate in. To some degree, sound is a highly subjective parameter, which is normally targeted at sound specialists. This paper describes the theoretical and practical background for managing a process of optimizing the mechanical sound quality in a product design by using simple tools and workshops systematically. The process is divided into 4 phases, which clarify the importance of product sound, define perceptive demands identified by users, and, finally, suggest mechanical principles for modification of an existing sound design. The optimized mechanical sound design is followed by tests on users of the product in its use context. The procedure is illustrated by a case study of a computer navigation tool (a computer mouse).

  20. Engineering an improved acellular nerve graft via optimized chemical processing.

    Science.gov (United States)

    Hudson, Terry W; Liu, Stephen Y; Schmidt, Christine E

    2004-01-01

    The long-term goal of our research is to engineer an acellular nerve graft for clinical nerve repair and for use as a model system with which to study nerve-extracellular matrix interactions during nerve regeneration. To develop this model acellular nerve graft we (1) examined the effects of detergents on peripheral nerve tissue, and (2) used that knowledge to create a nerve graft devoid of cells with a well-preserved extracellular matrix. Using histochemistry and Western analysis, the impact of each detergent on cellular and extracellular tissue components was determined. An optimized protocol was created with the detergents Triton X-200, sulfobetaine-16, and sulfobetaine-10. This study represents the most comprehensive examination to date of the effects of detergents on peripheral nerve tissue morphology and protein composition. Also presented is an improved chemical decellularization protocol that preserves the internal structure of native nerve more than the predominant current protocol.

  1. Adjoint-based optimization of a foam EOR process

    NARCIS (Netherlands)

    Namdar Zanganeh, M.; Kraaijevanger, J.F.B.M.; Buurman, H.W.; Jansen, J.D.; Rossen, W.R.

    2012-01-01

We apply adjoint-based optimization to a Surfactant-Alternating-Gas foam process using a linear foam model introducing gradual changes in gas mobility and a nonlinear foam model giving abrupt changes in gas mobility as a function of oil and water saturations and surfactant concentration. For the

  2. Conformal optimal design and processing of extruding die cavity

    Institute of Scientific and Technical Information of China (English)

    齐红元; 陈科山; 杜凤山

    2008-01-01

Aimed at the optimal design and machining of die cavities for the extrusion of special-shaped products, a conformal mapping function is established, by means of trigonometric numerical interpolation over a finite number of interpolation points (even and odd) and conformal mapping theory, that maps the non-circular cross-section of the product onto the unit disk. The extrusion forming problem can thus be reduced to a two-dimensional one and the plastic stream function deduced, and the mathematical model of the die cavity surface is established based on different kinds of vertical curves. By applying the upper-bound principle, the vertical curves and related parameters of the die cavity are optimized. Combining electrical discharge machining (EDM) with numerical control (NC) milling technology, optimal machining of the die cavity can be realized. Taking an ellipse-shaped product as an instance, the optimal analysis and machining of the die cavity, including extrusion experiments, are carried out.

  3. A framework for efficient process development using optimal experimental designs

    NARCIS (Netherlands)

    Ven, P. van de; Bijlsma, S.; Gout, E.; Voort Maarschalk, K. van der; Thissen, U.

    2011-01-01

    Introduction: The aim of this study was to develop and demonstrate a framework assuring efficient process development using fewer experiments than standard experimental designs. Methods: A novel optimality criterion for experimental designs (Iw criterion) is defined that leads to more efficient proc

  4. Optimization of Processing Technology of Compound Dandelion Wine

    OpenAIRE

    Wu Jixuan; Sun Guangren; Cao Xiuli; Han Yuting; Sun Xuesong; Zhang Huan; Zhang Lei; Dang Ataer

    2016-01-01

Exploring dandelion food has been a concern in the food processing and pharmaceutical fields because of its curative effect on high-fat-diet-induced hepatic steatosis and its diuretic activity. Few dandelion foods, including drinks and microencapsulations, have been explored, and unilateral dandelion wine has been studied less because of its bitter flavour. In this paper, to optimize the processing technologies of fermented compound wine from dandelion root, the orthogonal experiment design metho...

  5. Unifying View on Min-Max Fairness, Max-Min Fairness, and Utility Optimization in Cellular Networks

    Directory of Open Access Journals (Sweden)

    Stanczak Slawomir

    2007-01-01

Full Text Available We are concerned with the control of quality of service (QoS) in wireless cellular networks utilizing linear receivers. We investigate the issues of fairness and total performance, which are measured by a utility function in the form of a weighted sum of link QoS. We disprove the common conjecture on the incompatibility of min-max fairness and utility optimality by characterizing network classes in which both goals can be accomplished concurrently. We characterize power and weight allocations achieving min-max fairness and utility optimality and show that they correspond to saddle points of the utility function. Next, we address the problem of the difference between min-max fairness and max-min fairness. We show that in general there is a (fairness) gap between the performance achieved under min-max fairness and under max-min fairness. We characterize the network class for which both performance values coincide. Finally, we characterize the corresponding network subclass, in which both min-max fairness and max-min fairness are achievable by the same power allocation.

  6. Comparison of different optimization and process control procedures

    Directory of Open Access Journals (Sweden)

    Marko Reibenschuh

    2010-10-01

Full Text Available This paper presents a comparison of different optimization methods used for optimizing the cutting conditions during milling, and also covers the use of soft computing techniques in process control procedures. Milling is a cutting procedure that depends on a number of variables, and these variables are interdependent: if one changes, the others change too. PSO and GA algorithms are applied to the CNC milling program to improve the cutting conditions, improve the end finish, reduce tool wear and reduce the stress on the tool, the machine and the machined part. At the end, a summary of past and future research is given.

  7. Hierarchical random cellular neural networks for system-level brain-like signal processing.

    Science.gov (United States)

    Kozma, Robert; Puljic, Marko

    2013-09-01

    Sensory information processing and cognition in brains are modeled using dynamic systems theory. The brain's dynamic state is described by a trajectory evolving in a high-dimensional state space. We introduce a hierarchy of random cellular automata as the mathematical tools to describe the spatio-temporal dynamics of the cortex. The corresponding brain model is called neuropercolation which has distinct advantages compared to traditional models using differential equations, especially in describing spatio-temporal discontinuities in the form of phase transitions. Phase transitions demarcate singularities in brain operations at critical conditions, which are viewed as hallmarks of higher cognition and awareness experience. The introduced Monte-Carlo simulations obtained by parallel computing point to the importance of computer implementations using very large-scale integration (VLSI) and analog platforms.

  8. Cellular Automata Modelling of Photo-Induced Oxidation Processes in Molecularly Doped Polymers

    Directory of Open Access Journals (Sweden)

    David M. Goldie

    2016-11-01

Full Text Available The possibility of employing cellular automata (CA) to model photo-induced oxidation processes in molecularly doped polymers is explored. It is demonstrated that the oxidation dynamics generated using CA models exhibit stretched-exponential behavior. This dynamical characteristic is in general agreement with an alternative analysis conducted using standard rate equations provided the molecular doping levels are sufficiently low to prohibit the presence of safe-sites which are impenetrable to dissolved oxygen. The CA models therefore offer the advantage of exploring the effect of dopant agglomeration which is difficult to assess from standard rate equation solutions. The influence of UV-induced bleaching or darkening upon the resulting oxidation dynamics may also be easily incorporated into the CA models and these optical effects are investigated for various photo-oxidation product scenarios. Output from the CA models is evaluated for experimental photo-oxidation data obtained from a series of hydrazone-doped polymers.
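A minimal CA of this kind might look as follows: dopant sites scattered on a lattice, oxygen walkers entering from the film surface, and an oxidation probability per encounter. All parameters are illustrative assumptions, and no stretched-exponential fit is attempted here.

```python
import random

def photo_oxidation(width=40, depth=40, dopant_frac=0.1, steps=300, p_ox=0.5):
    """Toy CA: dopant molecules occupy random lattice sites; each step one new
    oxygen walker enters at the surface (row 0) and all walkers take a random
    step; a walker landing on an unoxidised dopant site oxidises it with
    probability p_ox and is consumed. Returns the surviving dopant fraction
    after each step."""
    random.seed(2)  # for reproducibility
    dopant = {(r, c) for r in range(depth) for c in range(width)
              if random.random() < dopant_frac}
    total = len(dopant)
    survival, walkers = [], []
    for _ in range(steps):
        walkers.append((0, random.randrange(width)))   # fresh O2 at the surface
        nxt = []
        for r, c in walkers:
            dr, dc = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            r, c = max(0, min(depth - 1, r + dr)), (c + dc) % width
            if (r, c) in dopant and random.random() < p_ox:
                dopant.discard((r, c))                 # oxidation event
            else:
                nxt.append((r, c))
        walkers = nxt
        survival.append(len(dopant) / total)
    return survival

survival = photo_oxidation()
```

Clustering the dopant sites instead of scattering them uniformly is the agglomeration experiment the abstract says CA models make easy.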

  9. The optimization of operating parameters on microalgae upscaling process planning.

    Science.gov (United States)

    Ma, Yu-An; Huang, Hsin-Fu; Yu, Chung-Chyi

    2016-03-01

    The upscaling process planning developed in this study primarily involved optimizing operating parameters, i.e., dilution ratios, during process design. Minimal variable cost was used as the indicator for selecting the optimal combination of dilution ratios. The upper and lower mean confidence intervals obtained from the actual cultured cell density data were used as the final cell density stability indicator after the operating parameters, or dilution ratios, were selected. The process planning method and results were demonstrated through three case studies of batch culture simulation: (1) the final objective cell densities were adjusted, (2) high and low light intensities were used for the intermediate-scale cultures, and (3) the number of culture days was constrained to integer values for the intermediate-scale culture.
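
    Selecting dilution ratios by minimal variable cost can be sketched as a brute-force search over candidate combinations. The growth model, candidate ratios and cost table below are invented placeholders, not values from the study.

```python
from itertools import product

# Hypothetical sketch: choose a dilution ratio for each of three scale-up
# stages so the final cell density target is met at minimal variable cost.
growth_per_stage = 8.0          # assumed fold-increase in density per batch
start_density = 1.0             # arbitrary units
target_density = 50.0
ratios = [2, 4, 5, 8, 10]       # candidate dilution ratios per stage
cost_per_dilution = {2: 1.0, 4: 1.8, 5: 2.1, 8: 3.0, 10: 3.5}

best = None                     # (ratio combination, variable cost)
for combo in product(ratios, repeat=3):
    density = start_density
    cost = 0.0
    for r in combo:
        density = density / r * growth_per_stage   # dilute, then grow
        cost += cost_per_dilution[r]
    if density >= target_density and (best is None or cost < best[1]):
        best = (combo, cost)

print(best)
```

    With these made-up numbers the mildest dilution at every stage is both feasible and cheapest; with a costlier growth model the trade-off becomes nontrivial.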

  10. Computer teaching process optimization strategy analysis of thinking ability

    Directory of Open Access Journals (Sweden)

    Luo Liang

    2016-01-01

    Full Text Available Computer science is a basic course for university students, one that lays a theoretical foundation for subsequent professional learning. In recent years, countries and universities have attached great importance to computer teaching for young college students, with the purpose of improving students' computational thinking ability and, ultimately, promoting their ability to use computational thinking to analyze and solve the problems of daily life. Therefore, this article further discusses and analyzes how to cultivate college students' computational thinking in the process of computer teaching, and explores strategies and methods for promoting this cultivation and optimizing computer teaching.

  11. Multi-objective optimization of process based on resource capability

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    To improve the practicability, suitability and accuracy of the trade-off among the time, cost and quality of a process, a method based on resource capability is introduced. By analyzing the relationship between an activity and its supporting resource, the model trades off time, cost and quality by changing the intensity of labor, the type of supporting resource, or the units of labor of a resource in a given time, according to the type of supporting resource. Comparing this method with the model of unit time cost at different quality levels and with the inter-related linear programming model of time, cost and quality for process optimization shows that this model not only covers the above two models but can also describe conditions that the two models cannot express. The method supports selecting different functions to optimize a process according to the type of its supporting resource.

  12. The role of chemical engineering in process development and optimization.

    Science.gov (United States)

    Dienemann, E; Osifchin, R

    2000-11-01

    This review focuses on the roles that chemical engineers can play in the development, scale-up and optimization of synthetic processes for the production of active pharmaceutical ingredients. This multidisciplinary endeavor involves close collaboration between chemists and chemical engineers and, for successful products, bridging the R&D and manufacturing enterprises. Balancing these disparate elements in the face of ever-mounting competitive pressures to shorten development timelines and ever-tightening regulatory, safety and environmental constraints has become a critical business objective for all pharmaceutical companies. The concept of focusing development resources on selected critical process features as a function of phase within the development cycle is discussed. In addition, several examples of chemical engineering-focused process development and optimization are presented.

  13. Plasma process optimization for N-type doping applications

    Energy Technology Data Exchange (ETDEWEB)

    Raj, Deven; Persing, Harold; Salimian, Siamak; Lacey, Kerry; Qin Shu; Hu, Jeff Y.; McTeer, Allen [Applied Materials, Inc., Varian Semiconductor Business Unit, 35 Dory Road, Gloucester, MA 01930 (United States); Micron Technology, Inc., 8000 S. Federal Way, Boise, ID 83707 (United States)

    2012-11-06

    Plasma doping (PLAD) has been adopted across the implant technology space and into high volume production for both conventional DRAM and NAND doping applications, establishing itself as an alternative to traditional beamline ion implantation. The push for high doping concentration, shallow doping depth, and conformal doping capability expands the need for a PLAD solution that meets such requirements. The unique doping profile and doping characteristics at high dose rates allow PLAD to deliver a high-throughput, differentiated solution for the demands of evolving transistor technology. In the PLAD process, ions are accelerated toward the wafer by a negative bias applied to the wafer. Competing mechanisms inherent in plasma doping, such as deposition, sputtering, and etching, require unique control and process optimization. In this work, we look at the distinctive process tool control and characterization features which enable an optimized doping process using n-type (PH{sub 3} or AsH{sub 3}) chemistries. The data in this paper relate process optimization through plasma chemistry studies to wafer-level results.

  14. Magnetic Resonance Microscopy of Human and Porcine Neurons and Cellular Processes

    Science.gov (United States)

    Flint, Jeremy J.; Hansen, Brian; Portnoy, Sharon; Lee, Choong-Heon; King, Michael A.; Fey, Michael; Vincent, Franck; Stanisz, Greg J; Vestergaard-Poulsen, Peter; Blackband, Stephen J

    2012-01-01

    With its unparalleled ability to safely generate high-contrast images of soft tissues, magnetic resonance imaging (MRI) has remained at the forefront of diagnostic clinical medicine. Unfortunately due to resolution limitations, clinical scans are most useful for detecting macroscopic structural changes associated with a small number of pathologies. Moreover, due to a longstanding inability to directly observe magnetic resonance (MR) signal behavior at the cellular level, such information is poorly characterized and generally must be inferred. With the advent of the MR microscope in 1986 came the ability to measure MR signal properties of theretofore unobservable tissue structures. Recently, further improvements in hardware technology have made it possible to visualize mammalian cellular structure. In the current study, we expand upon previous work by imaging the neuronal cell bodies and processes of human and porcine α-motor neurons. Complementary imaging studies are conducted in pig tissue in order to demonstrate qualitative similarities to human samples. Also, apparent diffusion coefficient (ADC) maps were generated inside porcine α-motor neuron cell bodies and portions of their largest processes (mean = 1.7±0.5 μm2/ms based on 53 pixels) as well as in areas containing a mixture of extracellular space, microvasculature, and neuropil (0.59±0.37 μm2/ms based on 33 pixels). Three-dimensional reconstruction of MR images containing α-motor neurons shows the spatial arrangement of neuronal projections between adjacent cells. Such advancements in imaging portend the ability to construct accurate models of MR signal behavior based on direct observation and measurement of the components which comprise functional tissues. These tools would not only be useful for improving our interpretation of macroscopic MRI performed in the clinic, but they could potentially be used to develop new methods of differential diagnosis to aid in the early detection of a

  15. Optimal design of the satellite constellation arrangement reconfiguration process

    Science.gov (United States)

    Fakoor, Mahdi; Bakhtiari, Majid; Soleymani, Mahshid

    2016-08-01

    In this article, a novel approach is introduced for satellite constellation reconfiguration based on Lambert's theorem. Several critical problems arise in the reconfiguration phase, such as minimization of the overall fuel cost, collision avoidance between the satellites on the final orbital pattern, and the maneuvers necessary for the satellites to be deployed in the desired positions in the target constellation. To implement the reconfiguration of the satellite constellation arrangement at minimal cost, a hybrid Invasive Weed Optimization/Particle Swarm Optimization (IWO/PSO) algorithm is used to design sub-optimal transfer orbits for the satellites in the constellation. The dynamic model of the problem is formulated such that the optimal assignment of satellites to the initial and target orbits and the optimal orbital transfer are combined in one step. Finally, we claim that the presented idea, i.e., coupled non-simultaneous flight of satellites from the initial orbital pattern, leads to minimal cost. The obtained results show that by employing the presented method, the cost of the reconfiguration process is reduced considerably.
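
    The PSO half of the hybrid IWO/PSO search can be sketched in a few lines. The sphere objective below is only a stand-in for the actual transfer-cost function, and all swarm parameters are illustrative, not taken from the paper.

```python
import random

def pso(objective, dim, n_particles=20, iters=100, seed=0):
    """Minimal particle swarm optimization over a continuous search space."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # personal best positions
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    w, c1, c2 = 0.7, 1.5, 1.5                   # inertia, cognitive, social
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Stand-in objective: sum of squares (minimum 0 at the origin).
best, val = pso(lambda x: sum(v * v for v in x), dim=3)
print(val)
```

    In the hybrid scheme, IWO's seed-dispersal step would periodically reseed poor particles around good ones; the PSO velocity update above is unchanged.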

  16. Cellular systems biology profiling applied to cellular models of disease.

    Science.gov (United States)

    Giuliano, Kenneth A; Premkumar, Daniel R; Strock, Christopher J; Johnston, Patricia; Taylor, Lansing

    2009-11-01

    Building cellular models of disease based on the approach of Cellular Systems Biology (CSB) has the potential to improve the process of creating drugs as part of the continuum from early drug discovery through drug development and clinical trials and diagnostics. This paper focuses on the application of CSB to early drug discovery. We discuss the integration of protein-protein interaction biosensors with other multiplexed, functional biomarkers as an example in using CSB to optimize the identification of quality lead series compounds.

  17. Measurement of immunotargeted plasmonic nanoparticles' cellular binding: a key factor in optimizing diagnostic efficacy

    Science.gov (United States)

    Fu, Kun; Sun, Jiantang; Bickford, Lissett R.; Lin, Alex W. H.; Halas, Naomi J.; Yu, Tse-Kuan; Drezek, Rebekah A.

    2008-01-01

    In this study, we use polarized light scattering to study immunotargeted plasmonic nanoparticles which bind to live SK-BR-3 human breast carcinoma cells. Gold nanoparticles can be conjugated to various biomolecules in order to target specific molecular signatures of disease. This specific targeting provides enhanced contrast in scattering-based optical imaging techniques. While there are papers which report the number of antibodies that bind per nanoparticle, there are almost no reports of the key factor which influences diagnostic or therapeutic efficacy using nanoparticles: the number of targeted nanoparticles that bind per cell. To achieve this goal, we have developed a 'negative' method of determining the binding concentration of those antibody/nanoparticle bioconjugates which are targeted specifically to breast cancer cells. Unlike previously reported methods, we collected unbound nanoparticle bioconjugates and measured the light scattering from dilute solutions of these particles so that quantitative binding information can be obtained. By following this process, the interaction effects of adjacent bound nanoparticles on the cell membrane can be avoided simply by measuring the light scattering from the unbound nanoparticles. Specifically, using nanoshells of two different sizes, we compared the binding concentrations of anti-HER2/nanoshell and anti-IgG/nanoshell bioconjugates targeted to HER2-positive SK-BR-3 breast cancer cells. The results indicate that, for anti-HER2/nanoshell bioconjugates, there are approximately 800-1600 nanoshells bound per cell; for anti-IgG/nanoshell bioconjugates, the binding concentration is significantly lower at nearly 100 nanoshells bound per cell. These results are also supported by dark-field microscopy images of the cells labeled with anti-HER2/nanoshell and anti-IgG/nanoshell bioconjugates.

  18. Measurement of immunotargeted plasmonic nanoparticles' cellular binding: a key factor in optimizing diagnostic efficacy

    Energy Technology Data Exchange (ETDEWEB)

    Fu Kun [Department of Bioengineering, Rice University, 6100 Main Street, MS-142, Houston, TX 77005 (United States); Sun Jiantang [Department of Bioengineering, Rice University, 6100 Main Street, MS-142, Houston, TX 77005 (United States); Bickford, Lissett R [Department of Bioengineering, Rice University, 6100 Main Street, MS-142, Houston, TX 77005 (United States); Lin, Alex W H [Department of Bioengineering, Rice University, 6100 Main Street, MS-142, Houston, TX 77005 (United States); Halas, Naomi J [Department of Electrical and Computer Engineering, Rice University, 6100 Main Street, MS-142, Houston, TX 77005 (United States); Yu, T-K [Department of Radiation Oncology, University of Texas, M D Anderson Cancer Center, Box 1202, 1515 Holcombe Boulevard, Houston, TX 77030 (United States); Drezek, Rebekah A [Department of Bioengineering, Rice University, 6100 Main Street, MS-142, Houston, TX 77005 (United States)

    2008-01-30

    In this study, we use polarized light scattering to study immunotargeted plasmonic nanoparticles which bind to live SK-BR-3 human breast carcinoma cells. Gold nanoparticles can be conjugated to various biomolecules in order to target specific molecular signatures of disease. This specific targeting provides enhanced contrast in scattering-based optical imaging techniques. While there are papers which report the number of antibodies that bind per nanoparticle, there are almost no reports of the key factor which influences diagnostic or therapeutic efficacy using nanoparticles: the number of targeted nanoparticles that bind per cell. To achieve this goal, we have developed a 'negative' method of determining the binding concentration of those antibody/nanoparticle bioconjugates which are targeted specifically to breast cancer cells. Unlike previously reported methods, we collected unbound nanoparticle bioconjugates and measured the light scattering from dilute solutions of these particles so that quantitative binding information can be obtained. By following this process, the interaction effects of adjacent bound nanoparticles on the cell membrane can be avoided simply by measuring the light scattering from the unbound nanoparticles. Specifically, using nanoshells of two different sizes, we compared the binding concentrations of anti-HER2/nanoshell and anti-IgG/nanoshell bioconjugates targeted to HER2-positive SK-BR-3 breast cancer cells. The results indicate that, for anti-HER2/nanoshell bioconjugates, there are approximately 800-1600 nanoshells bound per cell; for anti-IgG/nanoshell bioconjugates, the binding concentration is significantly lower at nearly 100 nanoshells bound per cell. These results are also supported by dark-field microscopy images of the cells labeled with anti-HER2/nanoshell and anti-IgG/nanoshell bioconjugates.

  19. Moisture-heating Treatment and Processing Optimization of Parboiled Rice

    Institute of Scientific and Technical Information of China (English)

    Zhao Siming; Xiong Shanbai

    2001-01-01

    The effect of moisture-heat treatment on the quality of parboiled rice and the optimization of its processing were studied. Results indicated that the riboflavin content of parboiled rice is higher, the color is lighter, the aroma is stronger and the head yield is higher when the paddy is soaked with acid and ethanol, cooked at high pressure and then dried at high temperature and high moisture. The optimal processing parameters are: soaking the paddy with citric acid for 2 h, then with 1.5% ethanol for 1.5 h; high-pressure cooking for 30 min; drying at 55% RH and 90 ℃ for 30 min; cooling slowly for 2.5 h; and shelling and milling immediately. The head yield ratio and whole rice ratio were 67.3% and 87.0% respectively; the color of the finished product is light, the rice aroma is strong, and the riboflavin content is 2.47 mg/100 g.

  20. Intelligent Optimization of a Mixed Culture Cultivation Process

    Directory of Open Access Journals (Sweden)

    Petia Koprinkova-Hristova

    2015-04-01

    Full Text Available In the present paper a neural network approach called "Adaptive Critic Design" (ACD) was applied to the optimal tuning of the set point controllers of the three main substrates (sugar, nitrogen source and dissolved oxygen) for a PHB production process. For approximation of the critic and the controllers, a special kind of recurrent neural network called an Echo state network (ESN) was used. Its structure allows fast training, which is of crucial importance in on-line applications. The critic network is trained to minimize the temporal difference error using the Recursive Least Squares method. Two approaches - gradient and heuristic - were exploited for training of the controllers. The comparison is made with respect to the achieved improvement of the utility function subject to optimization, as well as against a known expert strategy for control of the PHB production process.
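
    A minimal echo state network sketch follows: a fixed random reservoir with a linear readout trained online. The readout here uses a simple LMS update as a lighter stand-in for the paper's Recursive Least Squares, and all sizes, scalings and rates are illustrative.

```python
import math
import random

random.seed(0)
N = 30                                   # reservoir size (illustrative)
W_in = [random.uniform(-0.5, 0.5) for _ in range(N)]
W = [[random.uniform(-1, 1) for _ in range(N)] for _ in range(N)]
# Scale recurrent weights so the induced infinity-norm is below 1 -- a crude
# sufficient condition for the echo state property.
scale = 0.9 / max(sum(abs(w) for w in row) for row in W)
W = [[w * scale for w in row] for row in W]

def run_reservoir(inputs):
    """Drive the fixed reservoir and collect its state at every step."""
    x = [0.0] * N
    states = []
    for u in inputs:
        x = [math.tanh(W_in[i] * u + sum(W[i][j] * x[j] for j in range(N)))
             for i in range(N)]
        states.append(x[:])
    return states

# Task: one-step-ahead prediction of a sine wave.
inputs = [math.sin(0.2 * t) for t in range(300)]
targets = inputs[1:] + [0.0]
states = run_reservoir(inputs)

w_out = [0.0] * N
lr = 0.02                                # LMS step size (stand-in for RLS)
for s, y in zip(states[50:250], targets[50:250]):   # skip reservoir warm-up
    pred = sum(wi * si for wi, si in zip(w_out, s))
    err = y - pred
    w_out = [wi + lr * err * si for wi, si in zip(w_out, s)]

test_err = sum((sum(wi * si for wi, si in zip(w_out, s)) - y) ** 2
               for s, y in zip(states[250:299], targets[250:299])) / 49
print(test_err)
```

    Only the readout weights are trained; the reservoir stays fixed, which is what makes ESN training fast enough for the on-line use the paper targets.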

  1. A Generic Methodology for Superstructure Optimization of Different Processing Networks

    DEFF Research Database (Denmark)

    Bertran, Maria-Ona; Frauzem, Rebecca; Zhang, Lei

    A large focus is placed on sustainability and sustainable practices as a result of arising environmental issues. As an element of this, sustainable process synthesis and design becomes important. A generic, systematic methodology is proposed for solving the problem of optimal design of processing networks. In the synthesis stage, the processing alternatives are represented in a superstructure and the associated data is collected and stored in a database. Once a specific process synthesis problem is formulated, the existing superstructure is retrieved. Applications such as biorefineries and carbon dioxide utilization are considered, and the methodology is illustrated through case studies from two of them: the synthesis of biorefinery networks and the synthesis of sustainable carbon dioxide utilization processes.

  2. OPTIMIZATION STUDY IN MANUFACTURING PROCESS FOR PC/ABS BLENDS

    Institute of Scientific and Technical Information of China (English)

    Huang Chenghung; Fung Chinping; Chang Shihhsing; Hwang Jiunren; Doong Jiliang

    2003-01-01

    The optimization of the injection molding process for polycarbonate/acrylonitrile-butadiene-styrene (PC/ABS) blends is studied using the Taguchi method and principal component analysis (PCA). Four controllable process factors are studied at three levels each. An L9 orthogonal array experiment is conducted to determine the optimum process factor/level combination for a single quality characteristic of the mechanical properties. In addition, principal component analysis is employed to transform the correlated mechanical properties into a set of uncorrelated components and to evaluate a comprehensive index for multi-response cases, from which the optimum process factor/level combination for multiple qualities can be determined. Finally, analysis of variance is used to find the most influential injection molding parameter for single- and multiple-quality problems.
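
    The Taguchi part of such an analysis can be sketched as follows. The array is the standard L9(3^4) design, but the replicated response values are invented for illustration, not the PC/ABS data from the paper.

```python
import math

# Standard L9(3^4) orthogonal array: levels (0-2) of four factors A..D,
# one row per trial.
L9 = [
    (0, 0, 0, 0), (0, 1, 1, 1), (0, 2, 2, 2),
    (1, 0, 1, 2), (1, 1, 2, 0), (1, 2, 0, 1),
    (2, 0, 2, 1), (2, 1, 0, 2), (2, 2, 1, 0),
]
# Replicated quality responses per trial (larger is better; made-up data).
y = [(41, 43), (45, 44), (50, 49), (44, 46), (52, 51),
     (47, 45), (49, 50), (53, 54), (48, 47)]

def sn_larger_better(values):
    """Taguchi larger-the-better S/N ratio: -10*log10(mean of 1/y^2)."""
    return -10 * math.log10(sum(1 / v ** 2 for v in values) / len(values))

sn = [sn_larger_better(v) for v in y]

# For each factor, average the S/N over the three trials at each level and
# pick the level with the highest mean S/N.
best_levels = []
for f in range(4):
    means = [sum(sn[i] for i in range(9) if L9[i][f] == lev) / 3
             for lev in range(3)]
    best_levels.append(max(range(3), key=lambda lev: means[lev]))

print(best_levels)   # index of the best level for each factor
```

    The PCA step of the paper would replace `y` with component scores so that correlated responses are combined into one index before the S/N analysis.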

  3. Optimal Component Types for the Design of Construction Processes

    Institute of Scientific and Technical Information of China (English)

    Felix Enge; Wolfgang Huhnt

    2008-01-01

    Within the project preparation phase, experienced professionals manually map design information onto process information with the aim of developing realistic and practical schedules. Unfortunately, the mapping itself is neither part of any underlying data model nor supported by current scheduling tools. As a consequence, the process of setting up the data model for a schedule is still not formally supported. Huhnt and Enge described a modelling technique that addresses the missing linkage between design and process information[1]. The approach makes use of so-called component types. These are template sub-processes that describe the fabrication procedure of typical building components. Decomposing the building into components and assigning a component type to each component allows for formal support during scheduling. Depending on the decomposition of the building into components and the complexity of the involved component types, the specification effort differs. The question of optimal component types arises: which layout of building components and component types results in minimal specification effort? This paper presents a branch and bound algorithm to determine optimal component types. For a given schedule, which has been modelled based on component types, all possible decompositions into sub-processes are determined. During the decomposition process the encountered configurations are compared, and those with minimal specification effort are registered. Theoretical and practical examples are examined and discussed.

  4. Optimizing The DSSC Fabrication Process Using Lean Six Sigma

    Science.gov (United States)

    Fauss, Brian

    Alternative energy technologies must become more cost effective to achieve grid parity with fossil fuels. Dye-sensitized solar cells (DSSCs) are an innovative third-generation photovoltaic technology demonstrating tremendous potential due to recent breakthroughs in fabrication cost. The study here focused on quality improvement measures undertaken to improve the fabrication of DSSCs and to enhance process efficiency and effectiveness. Several quality improvement methods were implemented to optimize the seven-step DSSC fabrication process. Lean Manufacturing's 5S method successfully increased efficiency in all of the processes. Six Sigma's DMAIC methodology was used to identify and eliminate each of the root causes of defects in the critical titanium dioxide deposition process. These optimizations resulted in the following significant improvements in the production process: 1. the fabrication time of the DSSCs was reduced by 54%; 2. fabrication procedures were improved to the extent that all critical defects in the process were eliminated; 3. the yield of functioning DSSCs was increased from 17% to 90%.

  5. Research in Mobile Database Query Optimization and Processing

    Directory of Open Access Journals (Sweden)

    Agustinus Borgy Waluyo

    2005-01-01

    Full Text Available The emergence of mobile computing provides the ability to access information at any time and place. However, as mobile computing environments have inherent limitations of power, storage, asymmetric communication cost, and bandwidth, efficient query processing and minimum query response time are of great interest. This survey groups a variety of query optimization and processing mechanisms in mobile databases into two main categories: (i) query processing strategy and (ii) caching management strategy. Query processing includes both pull and push (broadcast) operations. We further classify push operations into on-demand broadcast and periodic broadcast. On-demand broadcast relates to designing techniques that enable the server to accommodate multiple requests so that they can be processed efficiently. Periodic broadcast corresponds to data dissemination strategies; in this scheme, several techniques that improve query performance by broadcasting data to a population of mobile users are described. A caching management strategy defines a number of methods for maintaining cached data items in clients' local storage, considering critical caching issues such as caching granularity, cache coherence strategy and cache replacement policy. Finally, this survey concludes with several open issues relating to mobile query optimization and processing strategy.

  6. Process analysis and optimization models defining recultivation surface mines

    Directory of Open Access Journals (Sweden)

    Dimitrijević Bojan V.

    2015-01-01

    Full Text Available Surface mines are generally open and very dynamic systems influenced by a large number of technical, economic, environmental and safety factors and constraints in all stages of the life cycle. This paper considers the dynamic coordination of the surface mining phases with the period of reclamation. An analysis of the reclamation of surface mines and of the workflow for managing a recultivation project is also given, from which a principled management model for reclamation is determined. Based on the analysis of the planning process, an optimization model for the recultivation of the surface mine is defined.

  7. Optimized Technology for Residuum Processing in the ARGG Unit

    Institute of Scientific and Technical Information of China (English)

    Pan Luoqi; Yuan Hongxing; Nie Baiqiu

    2006-01-01

    The influence of feedstock properties on the operation of the FCC unit was studied to identify the causes of the deteriorated product distribution associated with the increasingly heavy feedstock of the ARGG unit. In order to maximize the economic benefits of the ARGG unit, a series of measures, including modification of the catalyst formulation, retention of high catalyst activity, application of mixed termination agents to control the reaction temperature, once-through operation, and optimization of the catalyst regeneration technique, were adopted to adapt the ARGG unit to the processing of heavy feedstock with a carbon residue averaging 7%. The heavy oil processing technology has brought about apparent economic benefits.

  8. An optimal velocity for online limb-target regulation processes?

    Science.gov (United States)

    Tremblay, Luc; Crainic, Valentin A; de Grosbois, John; Bhattacharjee, Arindam; Kennedy, Andrew; Hansen, Steve; Welsh, Timothy N

    2017-01-01

    The utilization of visual information for the control of ongoing voluntary limb movements has been investigated for more than a century. Recently, online sensorimotor processes for the control of upper-limb reaches were hypothesized to include a distinct process related to the comparison of limb and target positions (i.e., limb-target regulation processes: Elliott et al. in Psychol Bull 136:1023-1044. doi: 10.1037/a0020958 , 2010). In the current study, this hypothesis was tested by presenting participants with brief windows of vision (20 ms) when the real-time velocity of the reaching limb rose above selected velocity criteria. One experiment tested the perceptual judgments of endpoint bias (i.e., under- vs. over-shoot), and another experiment tested the shifts in endpoint distributions following an imperceptible target jump. Both experiments revealed that limb-target regulation processes take place at an optimal velocity or "sweet spot" between movement onset and peak limb velocity (i.e., 1.0 m/s with the employed movement amplitude and duration). In contrast with pseudo-continuous models of online control (e.g., Elliott et al. in Hum Mov Sci 10:393-418. doi: 10.1016/0167-9457(91)90013-N , 1991), humans likely optimize online limb-target regulation processes by gathering visual information at a rather limited period of time, well in advance of peak limb velocity.

  9. Optimally efficient neural systems for processing spoken language.

    Science.gov (United States)

    Zhuang, Jie; Tyler, Lorraine K; Randall, Billi; Stamatakis, Emmanuel A; Marslen-Wilson, William D

    2014-04-01

    Cognitive models claim that spoken words are recognized by an optimally efficient sequential analysis process. Evidence for this is the finding that nonwords are recognized as soon as they deviate from all real words (Marslen-Wilson 1984), reflecting continuous evaluation of speech inputs against lexical representations. Here, we investigate the brain mechanisms supporting this core aspect of word recognition and examine the processes of competition and selection among multiple word candidates. Based on new behavioral support for optimal efficiency in lexical access from speech, a functional magnetic resonance imaging study showed that words with later nonword points generated increased activation in the left superior and middle temporal gyrus (Brodmann area [BA] 21/22), implicating these regions in dynamic sound-meaning mapping. We investigated competition and selection by manipulating the number of initially activated word candidates (competition) and their later drop-out rate (selection). Increased lexical competition enhanced activity in bilateral ventral inferior frontal gyrus (BA 47/45), while increased lexical selection demands activated bilateral dorsal inferior frontal gyrus (BA 44/45). These findings indicate functional differentiation of the fronto-temporal systems for processing spoken language, with left middle temporal gyrus (MTG) and superior temporal gyrus (STG) involved in mapping sounds to meaning, bilateral ventral inferior frontal gyrus (IFG) engaged in less constrained early competition processing, and bilateral dorsal IFG engaged in later, more fine-grained selection processes.

  10. Manufacturing processes of cellular metals. Part II. Solid route, metals deposition, other processes; Procesos de fabricacion de metales celulares. Parte II: Via solida, deposicion de metales otros procesos

    Energy Technology Data Exchange (ETDEWEB)

    Fernandez, P.; Cruz, L. J.; Coleto, J.

    2009-07-01

    In the first part of this review, cellular metal manufacturing processes following the liquid route were described. In this second part, solid-route processes and metal deposition are described. Likewise, the different kinds of processes in each case are reviewed, with a short description of the main parameters involved and the advantages and drawbacks of each. (Author) 147 refs.

  11. Cellular artificial glowworm swarm optimization algorithm for multiple-choice knapsack problem.

    Institute of Scientific and Technical Information of China (English)

    程魁; 马良; 刘勇

    2013-01-01

    In order to solve the multiple-choice knapsack problem, this paper presents a novel cellular artificial glowworm swarm optimization algorithm based on the principles of cellular automata and the artificial glowworm swarm optimization algorithm. Cells and their neighbors are introduced into the algorithm to maintain the swarm's diversity, and cellular evolution rules are used for local optimization to avoid becoming trapped in local optima. Simulated tests on typical multiple-choice knapsack problems and comparisons with other algorithms show that the algorithm is feasible and effective and has strong global optimization ability.
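
    For small instances, an exact dynamic program over the item groups provides a baseline against which heuristics such as the cellular glowworm algorithm can be checked. The instance below is made up for illustration.

```python
def mckp(groups, capacity):
    """Exact DP for the multiple-choice knapsack problem.

    groups: list of item groups, each a list of (weight, value) pairs;
    exactly one item must be chosen from every group, total weight must not
    exceed capacity, and total value is maximized.
    """
    NEG = float("-inf")
    dp = [NEG] * (capacity + 1)      # dp[c] = best value using weight c
    dp[0] = 0
    for group in groups:
        new = [NEG] * (capacity + 1)
        for c in range(capacity + 1):
            if dp[c] == NEG:
                continue
            for w, v in group:       # extend every reachable state by one
                if c + w <= capacity and dp[c] + v > new[c + w]:
                    new[c + w] = dp[c] + v
        dp = new
    return max(dp)

groups = [
    [(2, 3), (3, 5), (4, 6)],
    [(1, 1), (2, 4)],
    [(3, 4), (4, 7)],
]
print(mckp(groups, capacity=8))
```

    The DP runs in O(capacity × total items) time, which is fine for benchmark instances but motivates swarm heuristics for large, loosely constrained ones.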

  12. Low probability of intercept-based adaptive radar waveform optimization in signal-dependent clutter for joint radar and cellular communication systems

    Science.gov (United States)

    Shi, Chenguang; Salous, Sana; Wang, Fei; Zhou, Jianjiang

    2016-12-01

    In this paper, we investigate the problem of low probability of intercept (LPI)-based adaptive radar waveform optimization in signal-dependent clutter for joint radar and cellular communication systems, where the radar system optimizes the transmitted waveform such that the interference caused to the cellular communication systems is strictly controlled. Assuming that precise knowledge of the target spectra, the power spectral densities (PSDs) of the signal-dependent clutter, the propagation losses of the corresponding channels and the communication signals is available to the radar, three different LPI-based criteria for radar waveform optimization are proposed. Each minimizes the total transmitted power of the radar system by optimizing the multicarrier radar waveform under a predefined signal-to-interference-plus-noise ratio (SINR) constraint and a minimum required capacity for the cellular communication systems. The criteria differ in the way the communication signals scattered off the target are considered in the radar waveform design: (1) as useful energy, (2) as interference, or (3) ignored altogether. The resulting problems are solved analytically, and their solutions represent the optimum power allocation for each subcarrier in the multicarrier radar waveform. We show with numerical results that the LPI performance of the radar system can be significantly improved by exploiting the echoes of the cellular communication signals scattered off the target and received at the radar receiver.
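
    Per-subcarrier power allocation of this kind has the flavour of classic water-filling. The sketch below solves the simpler textbook problem of minimizing total power subject to a sum-rate constraint, not the paper's LPI formulations; gains, noise levels and the rate target are illustrative.

```python
import math

def waterfill(gains, noise, rate_target):
    """Minimum-power water-filling across subcarriers.

    Solves: minimize sum(p_k) s.t. sum(log2(1 + g_k*p_k/n_k)) >= rate_target,
    whose solution is p_k = max(0, mu - n_k/g_k) for a water level mu found
    here by bisection.
    """
    floors = [n / g for g, n in zip(gains, noise)]   # inverse channel quality

    def rate(mu):
        return sum(math.log2(1 + max(0.0, mu - f) / f) for f in floors)

    lo, hi = 0.0, max(floors) + 2 ** rate_target * max(floors)
    for _ in range(80):                 # bisection on the water level
        mid = (lo + hi) / 2
        if rate(mid) < rate_target:
            lo = mid
        else:
            hi = mid
    mu = (lo + hi) / 2
    return [max(0.0, mu - f) for f in floors]

p = waterfill(gains=[1.0, 0.5, 0.25], noise=[0.1, 0.1, 0.1], rate_target=6.0)
print([round(x, 3) for x in p])
```

    Better subcarriers (lower noise-to-gain floors) receive more power; a subcarrier whose floor exceeds the water level gets none.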

  13. A Taguchi approach on optimal process control parameters for HDPE pipe extrusion process

    Science.gov (United States)

    Sharma, G. V. S. S.; Rao, R. Umamaheswara; Rao, P. Srinivasa

    2016-12-01

    High-density polyethylene (HDPE) pipes find versatile applicability for transportation of water, sewage and slurry from one place to another, and hence must withstand substantial pressure from the fluid they carry. The present work entails the optimization of the withstanding pressure of HDPE pipes using the Taguchi technique. The traditional heuristic methodology stresses a trial-and-error approach and relies heavily upon the accumulated experience of the process engineers for determining the optimal process control parameters, which results in less-than-optimal settings. Hence, there arises a need to determine optimal process control parameters for the pipe extrusion process that can ensure robust pipe quality and process reliability. In the proposed optimization strategy, a design of experiments (DoE) is conducted wherein different control parameter combinations are analyzed by considering multiple setting levels of each control parameter. The concept of the signal-to-noise (S/N) ratio is applied, and the optimum values of the process control parameters are obtained as: a pushing-zone temperature of 166 °C, a dimmer speed of 8 rpm and a die-head temperature of 192 °C. A confirmation experimental run was also conducted to verify the analysis, and its values proved to be in agreement with the main experimental findings; the withstanding pressure showed a significant improvement from 0.60 to 1.004 MPa.
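
The larger-the-better signal-to-noise ratio used in Taguchi analyses of a response such as withstanding pressure is S/N = -10·log10(mean(1/y²)). A minimal sketch, with hypothetical replicate measurements:

```python
import math

def sn_larger_is_better(values):
    """Taguchi larger-is-better signal-to-noise ratio:
    S/N = -10 * log10(mean(1 / y_i^2)). Higher is better."""
    return -10.0 * math.log10(sum(1.0 / (y * y) for y in values) / len(values))

# hypothetical replicate measurements of withstanding pressure (MPa)
print(round(sn_larger_is_better([0.98, 1.00, 1.02]), 3))
```

The control-parameter level that maximizes the mean S/N ratio across the orthogonal-array runs is selected as optimal for that factor.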

  14. Cellular cholesterol delivery, intracellular processing and utilization for biosynthesis of steroid hormones

    Directory of Open Access Journals (Sweden)

    Azhar Salman

    2010-06-01

    Full Text Available Abstract Steroid hormones regulate diverse physiological functions such as reproduction, blood salt balance, maintenance of secondary sexual characteristics, response to stress, neuronal function and various metabolic processes. They are synthesized from cholesterol mainly in the adrenal gland and gonads in response to tissue-specific tropic hormones. These steroidogenic tissues are unique in that they require cholesterol not only for membrane biogenesis, maintenance of membrane fluidity and cell signaling, but also as the starting material for the biosynthesis of steroid hormones. It is not surprising, then, that cells of steroidogenic tissues have evolved with multiple pathways to assure the constant supply of cholesterol needed to maintain optimum steroid synthesis. The cholesterol utilized for steroidogenesis is derived from a combination of sources: (1) de novo synthesis in the endoplasmic reticulum (ER); (2) the mobilization of cholesteryl esters (CEs) stored in lipid droplets through cholesteryl ester hydrolase; (3) plasma lipoprotein-derived CEs obtained by either LDL receptor-mediated endocytic and/or SR-BI-mediated selective uptake; and (4) in some cultured cell systems, plasma membrane-associated free cholesterol. Here, we focus on recent insights into the molecules and cellular processes that mediate the uptake of plasma lipoprotein-derived cholesterol, events connected with intracellular cholesterol processing, and the role of crucial proteins that mediate cholesterol transport to mitochondria for its utilization in steroid hormone production. In particular, we discuss the structure and function of SR-BI, the importance of the selective cholesterol transport pathway in providing cholesterol substrate for steroid biosynthesis, and the role of two key proteins, StAR and PBR/TSPO, in facilitating cholesterol delivery to inner mitochondrial membrane sites, where P450scc (CYP11A) is localized and where the conversion of cholesterol to

  15. Optimization of struvite precipitation in synthetic biologically treated swine wastewater - Determination of the optimal process parameters

    OpenAIRE

    Capdevielle, Aurélie; Sýkorová, Eva; Biscans, Béatrice; Béline, Fabrice; Daumer, Marie-Line

    2013-01-01

    International audience; A sustainable way to recover phosphorus (P) in swine wastewater involves a preliminary step of P dissolution followed by the separation of particulate organic matter. The next two steps are firstly the precipitation of struvite crystals done by adding a crystallization reagent (magnesia) and secondly the filtration of the crystals. A design of experiments with five process parameters was set up to optimize the size of the struvite crystals in a synthetic swine wastewat...

  16. Optimization of Sunflower Oil Transesterification Process Using Sodium Methoxide

    Directory of Open Access Journals (Sweden)

    Sara KoohiKamali

    2012-01-01

    Full Text Available In this study, the methanolysis of sunflower oil was investigated to obtain a high methyl ester (biodiesel) content using sodium methoxide. To find the best process conditions, a central composite design (CCD) through response surface methodology (RSM) was employed. The predicted optimal conditions were a reaction time of 60 min, an alcohol-to-oil ratio 25% w/w in excess of the stoichiometric amount and a catalyst content of 0.5% w/w, which led to the highest methyl ester content (100% w/w). The methyl ester content of the mixture from gas chromatography (GC) analysis was compared to that of the optimum point. Results confirmed that there was no significant difference between the fatty acid methyl ester content of sunflower oil produced under the optimized condition and the experimental value (P≥0.05). Furthermore, some fuel specifications of the resultant biodiesel were tested according to American Society for Testing and Materials (ASTM) methods. The outcome showed that the methyl ester mixture produced under the optimized condition met most of the important biodiesel specifications recommended in the ASTM D 6751 requirements. Thus, the sunflower oil methyl esters resulting from this study could be a suitable alternative to petroleum diesel.
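
RSM fits a second-order polynomial to the responses measured at the CCD design points and then optimizes over the fitted surface. A minimal sketch of such a quadratic fit by ordinary least squares (the factors and toy response below are illustrative, not the study's data):

```python
import numpy as np

# Full quadratic response-surface model in two coded factors, fitted by
# ordinary least squares, as RSM does with CCD data. The "measurements"
# are generated from a known noiseless surface so the fit is verifiable.

def quadratic_design_matrix(x1, x2):
    # columns: intercept, x1, x2, x1*x2, x1^2, x2^2
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, 30)          # coded reaction time (hypothetical)
x2 = rng.uniform(-1, 1, 30)          # coded catalyst content (hypothetical)
y = 95 + 3*x1 + 2*x2 + x1*x2 - 4*x1**2 - 3*x2**2   # toy response surface

beta, *_ = np.linalg.lstsq(quadratic_design_matrix(x1, x2), y, rcond=None)
print(np.round(beta, 3))  # recovers [95, 3, 2, 1, -4, -3]
```

With the fitted coefficients in hand, the stationary point of the quadratic surface gives the predicted optimum operating condition.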

  17. Optimization of a semiconductor manufacturing process using a reentrant model

    Directory of Open Access Journals (Sweden)

    Sarah Abuhab Valente

    2015-01-01

    Full Text Available The scope of this work is the simulation of a semiconductor manufacturing model in Arena® software and the subsequent optimization and sensitivity analysis of this model. The process is considered extremely complex given the number of steps, machines, parameters and highly reentrant characteristics, which makes it difficult to reach stability of the production process. The production model used was the Intel Five-Machine Six-Step Mini-fab developed by Karl Kempf (1994). It was programmed in Arena® and optimized with OptQuest®, an add-on. We concluded that varying the number of machines and operators affects cycle time only when a resource is increased by one unit beyond the level obtained in the optimization. The scenario that stood out added one extra unit to the second machine group, yielding a 7.41% reduction in cycle time.

  18. Parallel particle swarm optimization on a graphics processing unit with application to trajectory optimization

    Science.gov (United States)

    Wu, Q.; Xiong, F.; Wang, F.; Xiong, Y.

    2016-10-01

    In order to reduce the computational time, a fully parallel implementation of the particle swarm optimization (PSO) algorithm on a graphics processing unit (GPU) is presented. Instead of being executed on the central processing unit (CPU) sequentially, PSO is executed in parallel via the GPU on the compute unified device architecture (CUDA) platform. The processes of fitness evaluation, updating of velocity and position of all particles are all parallelized and introduced in detail. Comparative studies on the optimization of four benchmark functions and a trajectory optimization problem are conducted by running PSO on the GPU (GPU-PSO) and CPU (CPU-PSO). The impact of design dimension, number of particles and size of the thread-block in the GPU and their interactions on the computational time is investigated. The results show that the computational time of the developed GPU-PSO is much shorter than that of CPU-PSO, with comparable accuracy, which demonstrates the remarkable speed-up capability of GPU-PSO.
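
The per-particle velocity, position and fitness updates that the paper maps onto CUDA threads can be expressed as array operations. A minimal CPU sketch using NumPy vectorization (hyperparameters and the sphere benchmark are illustrative, not the paper's settings):

```python
import numpy as np

# Vectorized particle swarm optimization: every update touches all
# particles at once as an array operation, the same structure GPU-PSO
# distributes across thread blocks.

def pso(fitness, dim=4, n_particles=64, iters=200, seed=1):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))    # positions
    v = np.zeros_like(x)                          # velocities
    pbest, pbest_f = x.copy(), fitness(x)         # personal bests
    g = pbest[np.argmin(pbest_f)].copy()          # global best
    w, c1, c2 = 0.72, 1.49, 1.49                  # common PSO constants
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w*v + c1*r1*(pbest - x) + c2*r2*(g - x)
        x = x + v
        f = fitness(x)                            # all particles evaluated at once
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, pbest_f.min()

sphere = lambda x: (x**2).sum(axis=1)             # benchmark: minimum 0 at origin
best_x, best_f = pso(sphere)
print(best_f)  # typically very close to 0
```

On a GPU the same per-particle arithmetic is assigned to one thread per particle (or per particle-dimension pair), which is why the speed-up grows with swarm size and problem dimension.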

  19. Process Parameters Optimization in Single Point Incremental Forming

    Science.gov (United States)

    Gulati, Vishal; Aryal, Ashmin; Katyal, Puneet; Goswami, Amitesh

    2016-04-01

    This work aims to optimize the formability and surface roughness of parts formed by the single-point incremental forming process for an Aluminium-6063 alloy. The tests are based on Taguchi's L18 orthogonal array, selected on the basis of the degrees of freedom. The tests have been carried out on a vertical machining center (DMC70V) using CAD/CAM software (SolidWorks V5/MasterCAM). Two levels of tool radius and three levels of sheet thickness, step size, tool rotational speed, feed rate and lubrication have been considered as the input process parameters. Wall angle and surface roughness have been considered as the process responses. The influential process parameters for formability and surface roughness have been identified with the help of statistical tools (response table, main effect plot and ANOVA). The parameter with the utmost influence on both formability and surface roughness is lubrication. For formability, lubrication is followed by tool rotational speed, feed rate, sheet thickness, step size and tool radius in descending order of influence; for surface roughness, lubrication is followed by feed rate, step size, tool radius, sheet thickness and tool rotational speed. The predicted optimal values for the wall angle and surface roughness are 88.29° and 1.03225 µm. The confirmation experiments were conducted thrice, and the wall angle and surface roughness were found to be 85.76° and 1.15 µm respectively.

  20. Optimization of magnetite carrier precipitation process for transuranic waste reduction

    Energy Technology Data Exchange (ETDEWEB)

    Slater, S.A.; Chamberlain, D.B.; Aase, S.A.; Babcock, B.D.; Conner, C.; Sedlet, J.; Vandegrift, G.F. [Argonne National Lab., IL (United States). Chemical Technology Div.

    1995-12-31

    Transuranic (TRU) waste that is being generated at Argonne National Laboratory has a TRU activity ranging from 10^2 to 10^7 nCi/g with a wide variety of chemical compositions. Currently, the waste is stored in highly acidic solutions that must be neutralized for intermediate storage. A magnetite carrier precipitation process has been adapted to concentrate TRU isotopes in a noncorrosive solid phase. In this paper, the authors report the results of a series of laboratory tests done to optimize the process. The parameters they optimized included (1) magnetite concentration used to precipitate the TRUs from solution, (2) formation of magnetite (in situ or ex situ), (3) processing pH, and (4) temperature and mixing time of the carrier precipitation. They also studied the effects of anions, cations, and complexing agents in the waste solutions on the carrier precipitation and the effect of magnetite solids loading on the filtration equipment. An overview is given of the planned full-scale process, which will be operated in a glovebox.

  1. Roll levelling semi-analytical model for process optimization

    Science.gov (United States)

    Silvestre, E.; Garcia, D.; Galdos, L.; Saenz de Argandoña, E.; Mendiguren, J.

    2016-08-01

    Roll levelling is a primary manufacturing process used to remove residual stresses and imperfections from metal strips in order to make them suitable for subsequent forming operations. In recent years, the importance of this process has been evidenced by the advent of Ultra High Strength Steels with strengths above 900 MPa. The optimal setting of the machine as well as a robust machine design have become critical for the correct processing of these materials. Finite Element Method (FEM) analysis is the widely used technique for both aspects. However, in this case, FEM simulation times are above the admissible ones for both machine development and process optimization. In the present work, a semi-analytical model based on a discrete bending theory is presented. This model is able to calculate the critical levelling parameters, i.e. force, plastification rate and residual stresses, in a few seconds. First, the semi-analytical model is presented. Next, some experimental industrial cases are analyzed with both the semi-analytical model and the conventional FEM model. Finally, the results and computation times of both methods are compared.

  2. Optimal deployment of resources for maximizing impact in spreading processes.

    Science.gov (United States)

    Lokhov, Andrey Y; Saad, David

    2017-09-26

    The effective use of limited resources for controlling spreading processes on networks is of prime significance in diverse contexts, ranging from the identification of "influential spreaders" for maximizing information dissemination and targeted interventions in regulatory networks, to the development of mitigation policies for infectious diseases and financial contagion in economic systems. Solutions for these optimization tasks that are based purely on topological arguments are not fully satisfactory; in realistic settings, the problem is often characterized by heterogeneous interactions and requires interventions in a dynamic fashion over a finite time window via a restricted set of controllable nodes. The optimal distribution of available resources hence results from an interplay between network topology and spreading dynamics. We show how these problems can be addressed as particular instances of a universal analytical framework based on a scalable dynamic message-passing approach and demonstrate the efficacy of the method on a variety of real-world examples.

  3. Optimal Deployment of Resources for Maximizing Impact in Spreading Processes

    CERN Document Server

    Lokhov, Andrey Y

    2016-01-01

    The effective use of limited resources for controlling spreading processes on networks is of prime significance in diverse contexts, ranging from the identification of "influential spreaders" for maximizing information dissemination and targeted interventions in regulatory networks, to the development of mitigation policies for infectious diseases and financial contagion in economic systems. Solutions for these optimization tasks that are based purely on topological arguments are not fully satisfactory; in realistic settings the problem is often characterized by heterogeneous interactions and requires interventions over a finite time window via a restricted set of controllable nodes. The optimal distribution of available resources hence results from an interplay between network topology and spreading dynamics. We show how these problems can be addressed as particular instances of a universal analytical framework based on a scalable dynamic message-passing approach and demonstrate the efficacy of the method on...

  4. Where should I send it? Optimizing the submission decision process.

    Science.gov (United States)

    Salinas, Santiago; Munch, Stephan B

    2015-01-01

    How do scientists decide where to submit manuscripts? Many factors influence this decision, including prestige, acceptance probability, turnaround time, target audience, fit, and impact factor. Here, we present a framework for evaluating where to submit a manuscript based on the theory of Markov decision processes. We derive two models, one in which an author is trying to optimally maximize citations and another in which that goal is balanced by either minimizing the number of resubmissions or the total time in review. We parameterize the models with data on acceptance probability, submission-to-decision times, and impact factors for 61 ecology journals. We find that submission sequences beginning with Ecology Letters, Ecological Monographs, or PLOS ONE could be optimal depending on the importance given to time to acceptance or number of resubmissions. This analysis provides some guidance on where to submit a manuscript given the individual-specific values assigned to these disparate objectives.
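
The trade-offs in the paper's framework can be illustrated by scoring a fixed submission sequence: with acceptance probability p_i and review time t_i per journal, the paper reaches journal i only if all earlier journals rejected it. A toy sketch (using impact factor as a crude proxy for citations; all numbers are hypothetical, not the study's parameters):

```python
# Expected outcome of a fixed journal submission sequence, in the spirit
# of the record's decision framework (a static sketch, not the full
# Markov decision process).

def sequence_stats(journals):
    """journals: list of (accept_prob, review_months, impact_factor)."""
    reach = 1.0                    # probability the paper is still unaccepted
    exp_citations = exp_time = 0.0
    for p, months, impact in journals:
        exp_time += reach * months           # every reached journal costs review time
        exp_citations += reach * p * impact  # accepted here with probability reach*p
        reach *= 1.0 - p
    return exp_citations, exp_time, reach    # reach = prob. never accepted

# hypothetical sequence: selective journal first, safer journals later
stats = sequence_stats([(0.10, 3, 18.0), (0.25, 4, 6.0), (0.60, 2, 3.0)])
print(tuple(round(s, 4) for s in stats))  # (expected citations, months, unplaced prob.)
```

Reordering the sequence shifts the balance between expected citations and expected time in review, which is precisely the trade-off the paper's Markov decision models optimize.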

  5. Optimal and adaptive methods of processing hydroacoustic signals (review)

    Science.gov (United States)

    Malyshkin, G. S.; Sidel'nikov, G. B.

    2014-09-01

    Different methods of optimal and adaptive processing of hydroacoustic signals under multipath propagation and scattering are considered. Advantages and drawbacks of the classical adaptive (Capon, MUSIC, and Johnson) algorithms and "fast" projection algorithms are analyzed for the case of multipath propagation and scattering of strong signals. The classical optimal approaches to detecting multipath signals are presented. A mechanism of controlled normalization of strong signals is proposed to automatically detect weak signals. The results of simulating the operation of different detection algorithms for a linear equidistant array under multipath propagation and scattering are presented. An automatic detector based on classical or fast projection algorithms is analyzed, which estimates the background using median filtering or the bilateral spatial contrast method.
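
The Capon (MVDR) estimator mentioned among the classical adaptive algorithms forms the spatial spectrum P(θ) = 1/(a(θ)ᴴ R⁻¹ a(θ)) from the sample covariance R of the array snapshots. A minimal sketch for a uniform linear array (the scenario is illustrative, not drawn from the review):

```python
import numpy as np

# Capon spatial spectrum for a uniform linear array with half-wavelength
# spacing: one strong source at 20 degrees plus white noise.

def steering(theta_deg, n_sensors, spacing=0.5):
    k = 2 * np.pi * spacing * np.sin(np.deg2rad(theta_deg))
    return np.exp(1j * k * np.arange(n_sensors))

rng = np.random.default_rng(0)
n, snaps, true_doa = 8, 400, 20.0
a = steering(true_doa, n)
s = rng.standard_normal(snaps) * 3.0                  # strong source amplitude
noise = (rng.standard_normal((n, snaps))
         + 1j * rng.standard_normal((n, snaps))) / np.sqrt(2)
x = np.outer(a, s) + noise                            # array snapshots
R = x @ x.conj().T / snaps                            # sample covariance
Rinv = np.linalg.inv(R)

grid = np.arange(-90, 91)                             # bearing grid, degrees
p = np.array([1.0 / np.real(steering(t, n).conj() @ Rinv @ steering(t, n))
              for t in grid])
print(grid[np.argmax(p)])  # peaks near the true bearing of 20 degrees
```

The review's point about strong signals is visible here: the adaptive weights implied by R⁻¹ suppress the strong arrival everywhere except at its own bearing, which is what complicates the detection of weak signals nearby.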

  6. Moisture-heating Treatment and Processing Optimization of Parboiled Rice

    Institute of Scientific and Technical Information of China (English)

    Zhao Siming; Xiong Shanbai

    2001-01-01

    Abstract: The effect of moisture-heating treatment on the quality of parboiled rice and the optimization of its processing were studied. Results indicated that the riboflavin content of parboiled rice is higher, the color is light, the aroma is strong and the head yield ratio is higher when soaking with acid and ethanol, cooking at high pressure and then drying at high temperature and high moisture. The optimal processing parameters are: soaking paddy with citric acid for 2 h, then with 1.5% ethanol for 1.5 h, high-pressure cooking for 30 min, drying under 55% RH at 90 °C for 30 min, cooling slowly for 2.5 h, and shelling and milling immediately. The head yield ratio and whole rice ratio were 67.3% and 87.0% respectively; the color of the finished product is light, the rice aroma is strong, and the riboflavin content is 2.47 mg/100 g.

  7. Implementation and Optimization of Image Processing Algorithms on Embedded GPU

    Science.gov (United States)

    Singhal, Nitin; Yoo, Jin Woo; Choi, Ho Yeol; Park, In Kyu

    In this paper, we analyze the key factors underlying the implementation, evaluation, and optimization of image processing and computer vision algorithms on embedded GPU using OpenGL ES 2.0 shader model. First, we present the characteristics of the embedded GPU and its inherent advantage when compared to embedded CPU. Additionally, we propose techniques to achieve increased performance with optimized shader design. To show the effectiveness of the proposed techniques, we employ cartoon-style non-photorealistic rendering (NPR), speeded-up robust feature (SURF) detection, and stereo matching as our example algorithms. Performance is evaluated in terms of the execution time and speed-up achieved in comparison with the implementation on embedded CPU.

  8. Graphics Processing Units and High-Dimensional Optimization.

    Science.gov (United States)

    Zhou, Hua; Lange, Kenneth; Suchard, Marc A

    2010-08-01

    This paper discusses the potential of graphics processing units (GPUs) in high-dimensional optimization problems. A single GPU card with hundreds of arithmetic cores can be inserted in a personal computer and dramatically accelerates many statistical algorithms. To exploit these devices fully, optimization algorithms should reduce to multiple parallel tasks, each accessing a limited amount of data. These criteria favor EM and MM algorithms that separate parameters and data. To a lesser extent block relaxation and coordinate descent and ascent also qualify. We demonstrate the utility of GPUs in nonnegative matrix factorization, PET image reconstruction, and multidimensional scaling. Speedups of 100 fold can easily be attained. Over the next decade, GPUs will fundamentally alter the landscape of computational statistics. It is time for more statisticians to get on-board.
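
Nonnegative matrix factorization, one of the paper's GPU examples, illustrates why such algorithms parallelize well: the classical multiplicative updates are dense matrix products plus elementwise operations. A minimal CPU sketch on random toy data (not the paper's implementation):

```python
import numpy as np

# Multiplicative-update NMF (Lee & Seung): V ≈ W @ H with W, H >= 0.
# Every step is a matrix product followed by an elementwise scale --
# exactly the separable, data-parallel structure that maps onto GPUs.

rng = np.random.default_rng(0)
V = rng.random((30, 20))          # toy nonnegative data matrix
r = 5                             # factorization rank
W = rng.random((30, r)) + 0.1
H = rng.random((r, 20)) + 0.1
eps = 1e-9                        # guards against division by zero

err = []
for _ in range(200):
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)
    err.append(np.linalg.norm(V - W @ H))
print(err[0] > err[-1])  # True: reconstruction error decreased
```

Because each entry of W and H is updated independently given the matrix products, a GPU can assign one thread per entry, which is the source of the large speed-ups the paper reports.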

  9. Thickness optimization for lithography process on silicon substrate

    Science.gov (United States)

    Su, Xiaojing; Su, Yajuan; Liu, Yansong; Chen, Fong; Liu, Zhimin; Zhang, Wei; Li, Bifeng; Gao, Tao; Wei, Yayi

    2015-03-01

    With the development of lithography, the requirements on critical dimension (CD) and CD uniformity (CDU) have reached a new level that is harder and harder to achieve. During lithography exposure, reflection occurs at the interface between the photoresist and the substrate. This reflection has a negative impact on CD and CDU control. It is possible to optimize the litho stack and film stack thicknesses for different lithography conditions. With the optimized stack, the total reflectivity for all incident angles at the interface can be controlled to less than 0.5%, ideally 0.1%, which enlarges the process window (PW) in most cases. The theoretical results are verified by experimental results from the foundry, helping the foundry finally achieve mass production.
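
The reflectivity being minimized can be sketched with single-layer thin-film optics at normal incidence: scanning the thickness of an absorbing anti-reflective layer between resist and substrate locates the minimum-reflectance thickness. The refractive indices below are rough illustrative values for a 193 nm stack, not the paper's data:

```python
import numpy as np

# Normal-incidence reflectance of a single absorbing layer (index n2,
# thickness d) between semi-infinite media n1 and n3, via the standard
# Airy formula. Used here to scan for a reflectance-minimizing thickness.

def reflectance(n1, n2, n3, d, wavelength):
    """|r|^2 for layer n2 of thickness d between media n1 and n3."""
    r12 = (n1 - n2) / (n1 + n2)            # Fresnel coefficients
    r23 = (n2 - n3) / (n2 + n3)
    phase = np.exp(2j * (2 * np.pi * n2 * d / wavelength))
    r = (r12 + r23 * phase) / (1 + r12 * r23 * phase)
    return abs(r) ** 2

wl = 193.0                                  # ArF wavelength, nm
n_resist = 1.70 + 0.00j                     # illustrative resist index
n_arc    = 1.80 + 0.50j                     # illustrative absorbing AR layer
n_si     = 0.88 + 2.78j                     # approximate silicon index at 193 nm

d_grid = np.arange(0.0, 120.0, 0.5)         # candidate thicknesses, nm
R = np.array([reflectance(n_resist, n_arc, n_si, d, wl) for d in d_grid])
d_opt = d_grid[np.argmin(R)]
print(f"optimal thickness ~{d_opt:.1f} nm, reflectance {R.min():.4f} (vs {R[0]:.3f} bare)")
```

A full stack optimization as in the paper would extend this to multiple layers and oblique incidence, but the mechanism is the same: choose thicknesses so the partial reflections interfere destructively.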

  10. Where should I send it? Optimizing the submission decision process.

    Directory of Open Access Journals (Sweden)

    Santiago Salinas

    Full Text Available How do scientists decide where to submit manuscripts? Many factors influence this decision, including prestige, acceptance probability, turnaround time, target audience, fit, and impact factor. Here, we present a framework for evaluating where to submit a manuscript based on the theory of Markov decision processes. We derive two models, one in which an author is trying to optimally maximize citations and another in which that goal is balanced by either minimizing the number of resubmissions or the total time in review. We parameterize the models with data on acceptance probability, submission-to-decision times, and impact factors for 61 ecology journals. We find that submission sequences beginning with Ecology Letters, Ecological Monographs, or PLOS ONE could be optimal depending on the importance given to time to acceptance or number of resubmissions. This analysis provides some guidance on where to submit a manuscript given the individual-specific values assigned to these disparate objectives.

  11. A software architecture for multi-cellular system simulations on graphics processing units.

    Science.gov (United States)

    Jeannin-Girardon, Anne; Ballet, Pascal; Rodin, Vincent

    2013-09-01

    The first aim of simulation in a virtual environment is to help biologists gain a better understanding of the simulated system. The cost of such simulation is significantly reduced compared to that of in vivo experimentation. However, the inherent complexity of biological systems makes them hard to simulate on non-parallel architectures: models might be made of sub-models and take several scales into account, and the number of simulated entities may be quite large. Today, graphics cards are used for general-purpose computing, which has been made easier thanks to frameworks like CUDA or OpenCL. Parallelization of models may however not be easy: parallel computer programming skills are often required, and several hardware architectures may be used to execute models. In this paper, we present the software architecture we built in order to implement various models able to simulate multi-cellular systems. This architecture is modular and implements data structures adapted to graphics processing unit architectures. It allows efficient simulation of biological mechanisms.

  12. Near infrared probes for biochemical, cellular, and whole animal analysis of disease processes

    Science.gov (United States)

    Kovar, Joy; Boveia, Vince; Chen, Huaxian; Peng, Xinzhan; Skopp, Rose; Little, Garrick; Draney, Dan; Olive, D. M.

    2009-02-01

    The study of disease processes requires a number of tools for detection of proteins and biomarkers in cell- and animal-based assays. Near infrared (NIR) technologies offer the advantage of high signal without interference from background-producing factors such as tissues, blood, or plastics. NIR fluorescence quenching biochemical assays employing a novel NIR quencher are homogeneous and sensitive. NIR-based immunocytochemical assays offer a means of quantitatively evaluating cell signaling pathways. The technology can be extended to the development of targeted molecular imaging agents for disease analysis in animal models. We describe here model assays for each of these categories. A fluorescence quenching caspase-3 assay was developed employing a novel, broadly applicable quencher dye suitable for use with both visible and NIR dye chemistries. An NIR cell-based assay is described for assessment of phosphorylation of p53 in response to a cellular stimulus. Finally, we describe the development and application of a targeted NIR optical imaging agent for monitoring tumor growth in whole animals. The NIR biochemical and cell-based assays are robust, with Z' factors greater than 0.7. The use of an IRDye® 800CW-labeled cyclic RGD peptide is presented as a model for development and application of targeted imaging agents. NIR technologies are compatible with the complete spectrum of assay needs for disease analysis and therapeutic development.
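
The Z' factor quoted above (>0.7) is the standard plate-assay quality metric, Z' = 1 − 3(σ₊ + σ₋)/|μ₊ − μ₋|. A minimal sketch with made-up control readings:

```python
import statistics

def z_prime(pos, neg):
    """Z'-factor assay quality metric from positive/negative control wells.
    Z' > 0.5 is conventionally considered an excellent assay."""
    separation = abs(statistics.mean(pos) - statistics.mean(neg))
    spread = statistics.stdev(pos) + statistics.stdev(neg)
    return 1 - 3 * spread / separation

# hypothetical plate-control fluorescence readings
pos = [980, 1005, 1010, 995]   # positive controls
neg = [105, 98, 110, 102]      # negative controls
print(round(z_prime(pos, neg), 3))
```

The metric rewards a wide separation between control means relative to their combined variability, which is why low-background NIR detection tends to produce high Z' values.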

  13. Exploring key cellular processes and candidate genes regulating the primary thickening growth of Moso underground shoots.

    Science.gov (United States)

    Wei, Qiang; Jiao, Chen; Guo, Lin; Ding, Yulong; Cao, Junjie; Feng, Jianyuan; Dong, Xiaobo; Mao, Linyong; Sun, Honghe; Yu, Fen; Yang, Guangyao; Shi, Peijian; Ren, Guodong; Fei, Zhangjun

    2017-04-01

    The primary thickening growth of Moso (Phyllostachys edulis) underground shoots largely determines the culm circumference. However, its developmental mechanisms remain largely unknown. Using an integrated anatomy, mathematics and genomics approach, we systematically studied the cellular and molecular mechanisms underlying the growth of Moso underground shoots. We discovered that the growth displayed a spiral pattern and that the pith played an important role in promoting the primary thickening of Moso underground shoots and in driving the evolution of culms of different sizes among bamboo species. Unlike in model plants, the shoot apical meristem (SAM) of Moso is composed of six layers of cells. Comparative transcriptome analysis identified a large number of genes related to vascular tissue formation that were significantly upregulated in a thick-wall variant with a narrow pith cavity, mildly spiral growth, and a flat and enlarged SAM, including genes related to plant hormones and those involved in cell wall development. These results provide a systematic perspective on the primary thickening growth of Moso underground shoots and support a plausible mechanism for the narrow pith cavity, weak spiral growth but increased vascular bundles of the thick-wall Moso. © 2016 The Authors. New Phytologist © 2016 New Phytologist Trust.

  14. Focused Metabolite Profiling for Dissecting Cellular and Molecular Processes of Living Organisms in Space Environments

    Science.gov (United States)

    2008-01-01

    Regulatory control in biological systems is exerted at all levels within the central dogma of biology. Metabolites are the end products of all cellular regulatory processes and reflect the ultimate outcome of potential changes suggested by genomics and proteomics caused by an environmental stimulus or genetic modification. Following on the heels of genomics, transcriptomics, and proteomics, metabolomics has become an inevitable part of complete-system biology because none of the lower "-omics" alone provide direct information about how changes in mRNA or protein are coupled to changes in biological function. The challenges are much greater than those encountered in genomics because of the greater number of metabolites and the greater diversity of their chemical structures and properties. To meet these challenges, much developmental work is needed, including (1) methodologies for unbiased extraction of metabolites and subsequent quantification, (2) algorithms for systematic identification of metabolites, (3) expertise and competency in handling a large amount of information (data set), and (4) integration of metabolomics with other "omics" and data mining (implication of the information). This article reviews the project accomplishments.

  15. Large-scale analysis of expression signatures reveals hidden links among diverse cellular processes

    Directory of Open Access Journals (Sweden)

    Ge Steven X

    2011-05-01

    Full Text Available Abstract Background: Cells must respond to various perturbations using their limited available gene repertoires. In order to study how cells coordinate various responses, we conducted a comprehensive comparison of 1,186 gene expression signatures (gene lists) associated with various genetic and chemical perturbations. Results: We identified 7,419 statistically significant overlaps between various published gene lists. Most (80%) of the overlaps can be represented by a highly connected network, a "molecular signature map," that highlights the correlation of various expression signatures. By dissecting this network, we identified sub-networks that define clusters of gene sets related to common biological processes (cell cycle, immune response, etc.). Examination of these sub-networks has confirmed relationships among various pathways and also generated new hypotheses. For example, our result suggests that glutamine deficiency might suppress cellular growth by inhibiting the MYC pathway. Interestingly, we also observed 1,369 significant overlaps between a set of genes upregulated by a factor X and a set of genes downregulated by a factor Y, suggesting a repressive interaction between the X and Y factors. Conclusions: Our results suggest that molecular-level responses to diverse chemical and genetic perturbations are heavily interconnected in a modular fashion. Also, shared molecular pathways can be identified by comparing newly defined gene expression signatures with databases of previously published gene expression signatures.
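
The significance of an overlap between two gene lists is conventionally scored with a hypergeometric tail probability; a minimal sketch (illustrative list sizes, and not necessarily the exact statistic used in the study):

```python
from math import comb

# Hypergeometric tail probability for the overlap between two gene lists
# drawn from a universe of N genes: P(overlap >= k) when lists of sizes
# n1 and n2 intersect by chance.

def overlap_pvalue(N, n1, n2, k):
    """P(overlap >= k) for random lists of sizes n1, n2 from N genes."""
    total = comb(N, n2)
    return sum(comb(n1, i) * comb(N - n1, n2 - i)
               for i in range(k, min(n1, n2) + 1)) / total

# illustrative: a 15-gene overlap between lists of 300 and 250 genes
# (expected overlap by chance is only 300*250/20000 = 3.75 genes)
print(overlap_pvalue(20000, 300, 250, 15))  # far below typical significance cutoffs
```

Scoring all pairs of signatures this way, with multiple-testing correction, yields the kind of significant-overlap network the abstract describes.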

  16. Model Development and Process Analysis for Lean Cellular Design Planning in Aerospace Assembly and Manufacturing

    Science.gov (United States)

    Hilburn, Monty D.

    Successful lean manufacturing and cellular manufacturing execution relies upon a foundation of leadership commitment and strategic planning built upon solid data and robust analysis. The problem for this study was to create and employ a simple lean transformation planning model and review process that could be used to identify the functional support staff resources required to plan and execute lean manufacturing cells within aerospace assembly and manufacturing sites. The lean planning model was developed using available literature for lean manufacturing kaizen best practices and validated through a Delphi panel of lean experts. The resulting model and a standardized review process were used to assess the state of lean transformation planning at five sites of an international aerospace manufacturing and assembly company. The results of the three-day on-site reviews were compared with baseline plans collected from each of the five sites to determine if there were differences. Plans were analyzed with a focus on three critical areas of lean planning: the number and type of manufacturing cells identified; the number, type, and duration of planned lean and continuous kaizen events; and the quantity and type of functional staffing resources planned to support the kaizen schedule. Summarized data from the baseline and on-site reviews were analyzed with descriptive statistics. ANOVAs and paired t-tests at a 95% significance level were conducted on the means of the data sets to determine if null hypotheses related to cell, kaizen event, and support resources could be rejected. The research found significant differences between lean transformation plans developed by site leadership and plans developed utilizing the structured on-site review process and lean transformation planning model. The null hypothesis that there was no difference between the means of pre-review and on-site cell counts was rejected, as was the null hypothesis that there was no significant difference in kaizen event plans.
These

  17. Selected papers from the Fourth Annual q-bio Conference on Cellular Information Processing.

    Science.gov (United States)

    Nemenman, Ilya; Faeder, James R; Hlavacek, William S; Jiang, Yi; Wall, Michael E; Zilman, Anton

    2011-10-01

    This special issue consists of 11 original papers that elaborate on work presented at the Fourth Annual q-bio Conference on Cellular Information Processing, which was held on the campus of St John's College in Santa Fe, New Mexico, USA, 11-14 August 2010. Now in its fourth year, the q-bio conference has changed considerably over time. It is now well established and a major event in systems biology. The 2010 conference saw attendees from all continents (except Antarctica!) sharing novel results and participating in lively discussions at both the oral and poster sessions. The conference was oversubscribed and grew to 27 contributed talks, 16 poster spotlights and 137 contributed posters. We deliberately decreased the number of invited speakers to 21 to leave more space for contributed presentations, and the attendee feedback confirmed that the choice was a success. Although the q-bio conference has grown and matured, it has remained true to the original goal of being an intimate and dynamic event that brings together modeling, theory and quantitative experimentation for the study of cell regulation and information processing. Funded in part by a grant from NIGMS and by DOE funds through the Los Alamos National Laboratory Directed Research and Development program, the conference has continued to exhibit youth and vigor by attracting (and partially supporting) over 100 undergraduate, graduate and postdoctoral researchers. The associated q-bio summer school, which precedes the conference each year, further emphasizes the development of junior scientists and makes q-bio a singular event in its impact on the future of quantitative biology. In addition to an increased international presence, the conference has notably diversified its demographic representation within the USA, including increased participation from the southeastern corner of the country. One big change in the conference this year is our new publication partner, Physical Biology. Although we are very

  18. Research on Intelligent Optimization Method for Cellular Mobile Communication Network

    Institute of Scientific and Technical Information of China (English)

    杜建凤; 宋俊德

    2001-01-01

    This paper investigates an intelligent network optimization method based on artificial intelligence (AI) techniques. An Intelligent Agent (IA) is designed as the intelligent optimization system for the cellular mobile communication network (IOS-CMCN), and the related intelligent technologies introduced in the system are analyzed. The feasibility and practicability of IOS-CMCN are examined through an application example, offering a novel approach to optimizing cellular mobile communication networks in China.

  19. Process Optimization for Valuable Metal Recovery from Dental Amalgam Residues

    Directory of Open Access Journals (Sweden)

    C.M. Parra–Mesa

    2009-07-01

    Full Text Available In this paper, the methodology used for optimizing leaching in a semi-pilot plant is presented. The leaching process was applied to recover valuable metals from dental amalgam residues. A 2³ factorial design was used to characterize the process in the first stage; in the second stage, a central composite rotational design was used to model the percentage of copper dissolved as a function of nitric acid concentration, leaching time and temperature. The model explained 81% of the response variability, which is considered satisfactory given the complexity of the process kinetics, and it allowed the definition of the operating conditions for best copper recovery, which was 99.15% at a temperature of 55°C, an acid concentration of 30% by weight and a time of 26 hours.
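
    The 2³ factorial characterization described above enumerates every low/high combination of the three factors. A minimal sketch of generating such a design matrix (factor names and levels here are illustrative placeholders, not the paper's values):

```python
from itertools import product

def full_factorial(levels_per_factor):
    """All runs of a two-level full factorial design.

    levels_per_factor maps factor name -> (low, high); returns one
    dict per experimental run.
    """
    names = list(levels_per_factor)
    combos = product(*(levels_per_factor[n] for n in names))
    return [dict(zip(names, c)) for c in combos]

# Illustrative levels only -- not the values used in the paper.
design = full_factorial({
    "HNO3_wt_pct": (20, 30),   # nitric acid concentration, % by weight
    "time_h": (12, 26),        # leaching time, hours
    "temp_C": (40, 55),        # temperature, deg C
})
print(len(design))  # 2**3 = 8 runs
```

The second-stage central composite design would add axial and center points to this cube so the quadratic terms of the response model become estimable.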

  20. Adjusting process count on demand for petascale global optimization

    KAUST Repository

    Sosonkina, Masha

    2013-01-01

    There are many challenges that need to be met before efficient and reliable computation at the petascale is possible. Many scientific and engineering codes running at the petascale are likely to be memory intensive, which makes thrashing a serious problem for many petascale applications. One way to overcome this challenge is to use a dynamic number of processes, so that the total amount of memory available for the computation can be increased on demand. This paper describes modifications made to the massively parallel global optimization code pVTdirect in order to allow for a dynamic number of processes. In particular, the modified version of the code monitors memory use and spawns new processes if the amount of available memory is determined to be insufficient. The primary design challenges are discussed, and performance results are presented and analyzed.
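
    The spawn-on-demand idea described above boils down to a memory capacity check before each new worker is started. A minimal sketch, with hypothetical helpers (`should_spawn`, `plan_spawns`) that are not pVTdirect code:

```python
def should_spawn(available_bytes, per_process_bytes, safety_factor=1.5):
    """Allow a new worker only if free memory covers its estimated
    footprint times a safety margin (illustrative policy, not the
    actual pVTdirect monitoring logic)."""
    return available_bytes >= per_process_bytes * safety_factor

def plan_spawns(available_bytes, per_process_bytes, safety_factor=1.5):
    """Count how many additional workers fit before the check fails."""
    count = 0
    while should_spawn(available_bytes, per_process_bytes, safety_factor):
        available_bytes -= per_process_bytes
        count += 1
    return count

GiB = 2 ** 30
print(should_spawn(4 * GiB, 1 * GiB))  # True: 4 GiB free covers 1.5 GiB
print(plan_spawns(4 * GiB, 1 * GiB))   # 3: a fourth worker would be too tight
```

In an MPI setting the affirmative branch would call something like `MPI_Comm_spawn` to create the new processes; the check itself is the part the paper's monitoring performs dynamically.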

  1. Allogeneic cell therapy bioprocess economics and optimization: downstream processing decisions.

    Science.gov (United States)

    Hassan, Sally; Simaria, Ana S; Varadaraju, Hemanthram; Gupta, Siddharth; Warren, Kim; Farid, Suzanne S

    2015-01-01

    To develop a decisional tool to identify the most cost effective process flowsheets for allogeneic cell therapies across a range of production scales. A bioprocess economics and optimization tool was built to assess competing cell expansion and downstream processing (DSP) technologies. Tangential flow filtration was generally more cost-effective for the lower cells/lot achieved in planar technologies and fluidized bed centrifugation became the only feasible option for handling large bioreactor outputs. DSP bottlenecks were observed at large commercial lot sizes requiring multiple large bioreactors. The DSP contribution to the cost of goods/dose ranged between 20-55%, and 50-80% for planar and bioreactor flowsheets, respectively. This analysis can facilitate early decision-making during process development.
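
    The flowsheet comparison such a tool performs can be sketched as a cost-of-goods-per-dose calculation swept across production scales. All option names, figures and parameters below are illustrative assumptions, not values from the study:

```python
def cost_per_dose(fixed_cost, variable_cost_per_cell, cells_per_lot,
                  cells_per_dose, recovery):
    """Cost of goods per dose for one downstream-processing option."""
    doses = cells_per_lot * recovery / cells_per_dose
    return (fixed_cost + variable_cost_per_cell * cells_per_lot) / doses

def cheapest_flowsheet(options, cells_per_lot, cells_per_dose):
    """Pick the lowest cost-per-dose option at a given production scale."""
    return min(options, key=lambda o: cost_per_dose(
        o["fixed"], o["var"], cells_per_lot, cells_per_dose, o["recovery"]))

# Hypothetical cost structures: TFF cheap to install, FBC scales better.
options = [
    {"name": "TFF", "fixed": 5e4, "var": 1e-7, "recovery": 0.7},
    {"name": "FBC", "fixed": 5e5, "var": 2e-8, "recovery": 0.9},
]
small = cheapest_flowsheet(options, cells_per_lot=1e10, cells_per_dose=1e8)
large = cheapest_flowsheet(options, cells_per_lot=1e13, cells_per_dose=1e8)
print(small["name"], large["name"])  # TFF at small scale, FBC at large
```

The crossover mirrors the abstract's finding: planar-scale lots favor tangential flow filtration, while large bioreactor outputs make fluidized bed centrifugation the feasible choice.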

  2. Optimal design issues of a gas-to-liquid process

    Energy Technology Data Exchange (ETDEWEB)

    Rafiee, Ahmad

    2012-07-01

    Interest in Fischer-Tropsch (FT) synthesis is increasing rapidly due to recent improvements in the technology, the clean-burning fuels (low sulphur, low aromatics) derived from the FT process, and the realization that the process can be used to monetize stranded natural gas resources. The economics of GTL plants depend very much on the natural gas price, so there is a strong incentive to reduce the investment cost and, in addition, a need to improve energy efficiency and carbon efficiency. A model is constructed based on the information available in the open literature and is used to simulate the GTL process with the UNISIM DESIGN process simulator. In the FT reactor with a cobalt-based catalyst, CO2 is inert and will accumulate in the system. Five placements of the CO2 removal unit in the GTL process are evaluated from an economic point of view. For each alternative, the process is optimized with respect to steam-to-carbon ratio, purge ratio of light ends, amount of tail gas recycled to the syngas and FT units, reactor volume, and CO2 recovery. The results show that the carbon and energy efficiencies and the annual net cash flow of the process with or without a CO2 removal unit are not significantly different, so there is not much to gain by removing CO2 from the process. It is optimal to recycle about 97% of the light ends to the process (mainly to the FT unit) to obtain higher conversion of CO and H2 in the reactor. Different syngas configurations in a gas-to-liquid (GTL) plant are studied, including auto-thermal reforming (ATR), combined reforming, and a series arrangement of a gas heated reformer (GHR) and ATR. The Fischer-Tropsch (FT) reactor is based on a cobalt catalyst, and the degrees of freedom are: steam-to-carbon ratio, purge ratio of light ends, amount of tail gas recycled to the synthesis gas (syngas) and Fischer-Tropsch (FT) synthesis units, and reactor volume. The production rate of liquid hydrocarbons is maximized for each syngas configuration. Installing a steam

  3. Optimization of vibratory welding process parameters using response surface methodology

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Pravin Kumar; Kumar, S. Deepak; Patel, D.; Prasad, S. B. [National Institute of Technology Jamshedpur, Jharkhand (India)

    2017-05-15

    The current investigation was carried out to study the effect of a vibratory welding technique on the mechanical properties of 6 mm thick butt-welded mild steel plates. A new vibratory welding setup has been designed and developed which is capable of transferring vibrations at a resonant frequency of 300 Hz into the molten weld pool before it solidifies during the Shielded metal arc welding (SMAW) process. The important process parameters of the vibratory welding technique, namely welding current, welding speed and the frequency of the vibrations induced in the molten weld pool, were optimized using Taguchi analysis and Response surface methodology (RSM). The effects of the process parameters on tensile strength and hardness were evaluated using these optimization techniques. Applying RSM, the effects of the vibratory welding parameters on tensile strength and hardness were obtained through two separate regression equations. Results showed that the most influential factor for the desired tensile strength and hardness is frequency at its resonant value, i.e. 300 Hz. The micro-hardness and microstructures of the vibratory welded joints were studied in detail and compared with those of conventional SMAW joints. Comparatively, a uniform and fine grain structure was found in the vibratory welded joints.
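
    The second-order regression models that RSM fits can be illustrated with a one-factor quadratic least-squares fit via the normal equations. The data below are synthetic, constructed so the response peaks at 300 Hz as in the abstract; this is a sketch, not the paper's regression:

```python
def fit_quadratic(xs, ys):
    """Least-squares fit of y = b0 + b1*x + b2*x^2 via normal equations,
    a one-factor stand-in for RSM's second-order polynomial models."""
    rows = [[1.0, x, x * x] for x in xs]          # design matrix [1, x, x^2]
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    xty = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(3)]
    # Gaussian elimination with partial pivoting on the 3x3 system.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, 3):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, 3):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    beta = [0.0] * 3
    for r in (2, 1, 0):  # back substitution
        beta[r] = (xty[r] - sum(xtx[r][c] * beta[c]
                                for c in range(r + 1, 3))) / xtx[r][r]
    return beta

# Synthetic response peaking at x = 300 (vibration frequency in Hz):
xs = [100, 200, 250, 300, 350, 400]
ys = [-((x - 300) ** 2) / 100 + 500 for x in xs]
b0, b1, b2 = fit_quadratic(xs, ys)
print(round(-b1 / (2 * b2)))  # stationary point of the fitted surface -> 300
```

The stationary point of the fitted polynomial recovers the optimal frequency; a full RSM analysis would do the same with cross terms over current, speed and frequency.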

  4. A Cloud Computing Model for Optimization of Transport Logistics Process

    Directory of Open Access Journals (Sweden)

    Benotmane Zineb

    2017-09-01

    Full Text Available In an increasingly competitive environment, companies must adopt a sound supply chain management policy whose main objective is to increase overall gain by maximizing profits and minimizing costs, including manufacturing costs such as transaction, transport and storage. In this paper, we propose a cloud platform for this logistics chain to support decision making; such decisions must be made to adopt new strategies for cost optimization, and the decision-maker must understand the consequences of each new strategy. The proposed cloud computing platform has a multilayer structure; the latter comprises a set of web services that link applications built on different technologies, enabling data to be sent and received through protocols understandable by everyone. Logistics is a process-oriented business; the platform is used to evaluate logistics process costs, to propose optimal solutions and to evaluate these solutions before their application. As a scenario, we have formulated the problem for the delivery process and proposed a modified bin-packing algorithm to improve vehicle loading.
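
    A plain first-fit-decreasing heuristic illustrates the bin-packing idea behind the vehicle-loading step; the paper's actual modification is not shown, so this is only a generic stand-in with invented load weights:

```python
def first_fit_decreasing(weights, capacity):
    """Pack items into the fewest vehicles a greedy FFD heuristic finds.

    Items are sorted heaviest-first, and each goes into the first
    vehicle with enough remaining capacity.
    """
    bins = []  # each bin is [remaining_capacity, [items...]]
    for w in sorted(weights, reverse=True):
        if w > capacity:
            raise ValueError(f"item {w} exceeds vehicle capacity {capacity}")
        for b in bins:
            if b[0] >= w:        # fits in an existing vehicle
                b[0] -= w
                b[1].append(w)
                break
        else:                    # no vehicle fits: open a new one
            bins.append([capacity - w, [w]])
    return [items for _, items in bins]

loads = first_fit_decreasing([4, 8, 1, 4, 2, 1], capacity=10)
print(len(loads))  # 2 vehicles: [8, 2] and [4, 4, 1, 1]
```

FFD is a classic approximation for this NP-hard problem; a cloud service could expose exactly this evaluation step so alternative loadings can be costed before being applied.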

  5. Multivariate Statistical Process Optimization in the Industrial Production of Enzymes

    DEFF Research Database (Denmark)

    Klimkiewicz, Anna

    In modern biotech production, a massive number of diverse measurements, with a broad diversity in information content and quality, is stored in data historians. The potential of this enormous amount of data is currently under-employed in process optimization efforts. This is a result of the demands and difficulties related to 'recycling' historical data from full-scale manufacturing of industrial enzymes. First, the crucial and tedious step of retrieving the data from the systems is presented. The prerequisites that need to be understood are discussed, such as sensor accuracy and reliability, and aspects related to the actual measuring frequency and non-equidistant retention strategies in data storage. Different regimes of data extraction can be employed, and some might introduce undesirable artifacts in the final analysis results (POSTER II1). Several signal processing techniques are also briefly

  6. A Framework to Design and Optimize Chemical Flooding Processes

    Energy Technology Data Exchange (ETDEWEB)

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2006-08-31

    The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concepts of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objective of Task 1 is to develop three primary modules representing reservoir, chemical, and well data; the modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps that identify the significant variables from each module. The objective of Task 3 is to develop an economic model designed specifically for the chemical processes targeted in this proposal and to interface it with UTCHEM production output. Task 4 covers validation of the framework and simulations of oil reservoirs to screen, design and optimize the chemical processes.

  7. Ultrasound assisted manufacturing of paraffin wax nanoemulsions: process optimization.

    Science.gov (United States)

    Jadhav, A J; Holkar, C R; Karekar, S E; Pinjari, D V; Pandit, A B

    2015-03-01

    This work reports on the process optimization of ultrasound-assisted paraffin-wax-in-water nanoemulsions stabilized by modified sodium dodecyl sulfate (SDS). The work focuses on the optimization of the major emulsification process variables: sonication time, applied power and surfactant concentration. The effects of these variables were investigated on the basis of the mean droplet diameter and the stability of the prepared emulsion. It was found that a stable emulsion with droplet diameters of about 160.9 nm could be formed with a surfactant concentration of 10 mg/ml, treated at 40% of applied power (power density: 0.61 W/ml) for 15 min. Scanning electron microscopy (SEM) was used to study the morphology of the emulsion droplets. The droplets were solid at room temperature, showing bright spots under polarized light and a spherical shape under SEM. The electrophoretic properties of the emulsion droplets showed a negative zeta potential due to the adsorption of the sulfate head groups of the SDS surfactant. For comparison, a paraffin wax emulsion was prepared via the emulsion inversion point method and its intrinsic stability was checked. Visually, that emulsion separated/creamed within 30 min, while the emulsion prepared ultrasonically was stable for more than 3 months. This study shows that the ultrasound-assisted emulsification process can be successfully used for the preparation of stable paraffin wax nanoemulsions.

  8. Optimization of Processing Technology of Compound Dandelion Wine

    Directory of Open Access Journals (Sweden)

    Wu Jixuan

    2016-01-01

    Full Text Available Exploring dandelion foods has been a concern in the food processing and pharmaceutical industries because of dandelion's curative effect on high-fat-diet-induced hepatic steatosis and its diuretic activity. Few dandelion foods, such as drinks and microencapsulated products, have been developed, and single-ingredient dandelion wines have rarely been pursued because of their bitter flavour. In this paper, to optimize the processing technology for a fermented compound wine from dandelion root, an orthogonal experimental design was used to combine dandelion root powder with glutinous rice and schisandra fruit and to optimize the fermentation parameters. Four factors were examined: dandelion content, schisandra content, acidity and sugar content. The acidity was first fixed at 7.0 g/L, and the other three factors were determined through a series of experiments as dandelion 0.55%, schisandra 0.5% and sugar 22%. With a nine-step process of mixing the substrate, stirring with water, cooking the rice, amylase saccharification, pectinase hydrolysis, adjusting the juice, fermenting with yeast, filtering, aging and sterilization, a light yellow wine was obtained with the special flavour of dandelion, schisandra and rice and less bitterness; key indices were determined as 14.7% alcohol and 6.85 g/L acidity. A dandelion fermented compound wine with a pleasant flavour and health-promoting function was thus developed, enriching the range of dandelion foods.

  9. A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES

    Energy Technology Data Exchange (ETDEWEB)

    Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori

    2005-07-01

    The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concepts of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objective of Task 1 is to develop three primary modules representing reservoir, chemical, and well data; the modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps that identify the significant variables from each module. The objective of Task 3 is to develop an economic model designed specifically for the chemical processes targeted in this proposal and to interface it with UTCHEM production output. Task 4 covers validation of the framework and simulations of oil reservoirs to screen, design and optimize the chemical processes.

  10. Modeling, estimation and optimal filtration in signal processing

    CERN Document Server

    Najim, Mohamed

    2010-01-01

    The purpose of this book is to provide graduate students and practitioners with traditional methods and more recent results for model-based approaches in signal processing. Firstly, discrete-time linear models such as AR, MA and ARMA models, their properties and their limitations are introduced; sinusoidal models are also addressed. Secondly, estimation approaches based on least-squares methods and instrumental variable techniques are presented. Finally, the book deals with optimal filters, i.e. Wiener and Kalman filtering, and adaptive filters such as the RLS, the LMS and the
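
    The LMS adaptive filter mentioned above can be sketched in a few lines; here it identifies a known FIR system from clean input/output data. The filter coefficients and step size are arbitrary illustrative values, not taken from the book:

```python
import random

def lms_identify(x, d, taps, mu):
    """Identify an unknown FIR system with the LMS update
    w <- w + mu * e * x_window, where e is the output error."""
    w = [0.0] * taps
    for n in range(taps - 1, len(x)):
        window = x[n - taps + 1:n + 1][::-1]  # newest sample first
        y = sum(wi * xi for wi, xi in zip(w, window))
        e = d[n] - y                          # error vs. desired signal
        w = [wi + mu * e * xi for wi, xi in zip(w, window)]
    return w

random.seed(0)
x = [random.gauss(0, 1) for _ in range(2000)]
h = [0.5, -0.3, 0.2]  # arbitrary "unknown" system to recover
d = [sum(hk * x[n - k] for k, hk in enumerate(h)) if n >= 2 else 0.0
     for n in range(len(x))]

w = lms_identify(x, d, taps=3, mu=0.05)
print([round(wi, 2) for wi in w])  # converges toward [0.5, -0.3, 0.2]
```

With noise-free data and a stable step size, the weights converge to the true system; the RLS algorithm the book also covers trades more computation per sample for faster convergence.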

  11. Learning about Cellular Respiration: An Active Approach Illustrating the Process of Scientific Inquiry.

    Science.gov (United States)

    Johnson, Margaret (Peg)

    1998-01-01

    Details the active-learning approach to teaching cellular respiration in an introductory, one-semester course for nonmajors. Focuses on a laboratory exercise designed to answer the question of what happens to food when eaten. Contains 19 references. (DDR)

  12. MO-B-BRB-00: Optimizing the Treatment Planning Process

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2015-06-15

    The radiotherapy treatment planning process has evolved over the years with innovations in treatment planning, treatment delivery and imaging systems. Treatment modality and simulation technologies are also rapidly improving and affecting the planning process. For example, image-guided radiation therapy has been widely adopted for patient setup, leading to margin reduction and isocenter repositioning after simulation. Stereotactic body radiation therapy (SBRT) and radiosurgery (SRS) have gradually become the standard of care for many treatment sites, which demands a higher throughput for treatment plans even if the number of treatments per day remains the same. Finally, simulation, planning and treatment are traditionally sequential events; however, with emerging adaptive radiotherapy, they are becoming more tightly intertwined, leading to iterative processes. Enhanced planning efficiency is therefore becoming more critical and poses a serious challenge to the treatment planning process; Lean Six Sigma approaches are increasingly being utilized to balance the competing needs for speed and quality. In this symposium we will discuss the treatment planning process and illustrate effective techniques for managing workflow. Topics will include: planning techniques ((a) beam placement, (b) dose optimization, (c) plan evaluation, (d) export to RVS); planning workflow ((a) import images, (b) image fusion, (c) contouring, (d) plan approval, (e) plan check, (f) chart check, (g) sequential and iterative processes); influence of upstream and downstream operations ((a) simulation, (b) immobilization, (c) motion management, (d) QA, (e) IGRT, (f) treatment delivery, (g) SBRT/SRS, (h) adaptive planning); reduction of delay between planning steps with Lean systems due to (a) communication, (b) limited resources, (c) contouring, (d) plan approval, (e) treatment; and optimizing planning processes ((a) contour validation, (b) consistent planning protocol, (c) protocol/template sharing, (d) semi

  13. Strategy optimization for controlled Markov process with descriptive complexity constraint

    Institute of Scientific and Technical Information of China (English)

    JIA QingShan; ZHAO QianChuan

    2009-01-01

    Due to various advantages in storage and implementation, simple strategies are usually preferred over complex strategies when their performances are close. Strategy optimization for controlled Markov processes with a descriptive complexity constraint provides a general framework for many such problems. In this paper, we first show by examples that the descriptive complexity and the performance of a strategy can be independent, and use the F-matrix in the No-Free-Lunch Theorem to show the risk that approximating complex strategies may lead to simple strategies that are unboundedly worse in cardinal performance than the original complex strategies. We then develop a method that handles the descriptive complexity constraint directly, describing simple strategies exactly and only approximating complex strategies during the optimization. The ordinal performance difference between the resulting strategies of this selective approximation method and the global optimum is quantified. Numerical examples on an engine maintenance problem show how this method improves the solution quality. We hope this work sheds some insight on solving general strategy optimization for controlled Markov processes with a descriptive complexity constraint.

  14. Planar Thinned Arrays: Optimization and Subarray Based Adaptive Processing

    Directory of Open Access Journals (Sweden)

    P. Lombardo

    2013-01-01

    Full Text Available A new approach is presented for the optimized design of a planar thinned array; the proposed strategy works with single antenna elements or with small sets of different subarray types, properly located on a planar surface. The optimization approach is based on the maximization of an objective function accounting for side lobe level and considering a fixed number of active elements/subarrays. The proposed technique is suitable for different shapes of the desired output array, allowing the achievement of the desired directivity properties on the corresponding antenna pattern. The use of subarrays with a limited number of different shapes is relevant for industrial production, which would benefit from reduced design and manufacturing costs. The resulting modularity allows scalable antenna designs for different applications. Moreover, subarrays can be arranged in a set of subapertures, each connected to an independent receiving channel. Therefore, adaptive processing techniques could be applied to cope with and mitigate clutter echoes and external electromagnetic interferences. The performance of adaptive techniques with subapertures taken from the optimized thinned array is evaluated against assigned clutter and jamming scenarios and compared to the performance achievable considering a subarray based filled array with the same number of active elements.

  15. Speed optimized influence matrix processing in inverse treatment planning tools

    Energy Technology Data Exchange (ETDEWEB)

    Ziegenhein, Peter; Wilkens, Jan J; Nill, Simeon; Oelfke, Uwe [German Cancer Research Center (DKFZ), Department of Medical Physics in Radiation Oncology, Im Neuenheimer Feld 280, 69120 Heidelberg (Germany); Ludwig, Thomas [University of Heidelberg, Institute of Computer Science, Research Group Parallel and Distributed Systems, Im Neuenheimer Feld 348, 69120 Heidelberg (Germany)], E-mail: p.ziegenhein@dkfz.de, E-mail: u.oelfke@dkfz.de

    2008-05-07

    An optimal plan in modern treatment planning tools is found through an iterative optimization algorithm, which deals with a large amount of patient-related data and a large number of treatment parameters to be optimized. Calculating a good plan is thus a very time-consuming process, which limits the application for patients in clinics and for research activities aiming for more accuracy. A common technique to handle the vast amount of radiation dose data is the concept of the influence matrix (DIJ), which stores the dose contribution of each bixel to the patient in the main memory of the computer. This study revealed that a bottleneck for the optimization time arises from the transfer of the dose data between the memory and the CPU. In this note, we introduce a new method which speeds up the transport of stored dose data to the CPU. As an example we used the DIJ approach as implemented in our treatment planning tool KonRad, developed at the German Cancer Research Center (DKFZ) in Heidelberg. A data cycle reordering method is proposed to take advantage of modern memory hardware. This induces a minimal eviction policy, resulting in memory behaviour that makes the algorithm 2.6 times faster than the naive implementation. Although our method is described for the DIJ approach implemented in KonRad, we believe that any other planning tool which uses a similar approach to store the dose data will also benefit from the described methods. (note)
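
    The benefit of reordering dose data comes from streaming through memory sequentially rather than jumping around. A toy sketch (not KonRad's implementation) shows that reordering sparse DIJ entries is safe because the computed dose is unchanged; only the access pattern differs:

```python
def reorder_for_sequential_access(dij_entries):
    """Sort sparse dose-influence entries (voxel, bixel, dose) so the
    optimization loop streams through memory in index order, a
    simplified stand-in for the note's data-cycle reordering."""
    return sorted(dij_entries, key=lambda e: (e[0], e[1]))

def dose_per_voxel(dij_entries, intensities, n_voxels):
    """Accumulate dose d = DIJ * x over the (re)ordered entries."""
    dose = [0.0] * n_voxels
    for voxel, bixel, dij in dij_entries:
        dose[voxel] += dij * intensities[bixel]
    return dose

scattered = [(2, 0, 0.1), (0, 1, 0.4), (2, 1, 0.2), (0, 0, 0.3)]
ordered = reorder_for_sequential_access(scattered)
# Same physics either way; only the memory access pattern changes:
print(dose_per_voxel(ordered, [1.0, 1.0], 3) ==
      dose_per_voxel(scattered, [1.0, 1.0], 3))  # True
```

On real hardware the sorted layout keeps consecutive reads inside the same cache lines, which is where the reported 2.6-fold speedup comes from.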

  16. Transposon mutagenesis identifies genes and cellular processes driving epithelial-mesenchymal transition in hepatocellular carcinoma

    Science.gov (United States)

    Kodama, Takahiro; Newberg, Justin Y.; Kodama, Michiko; Rangel, Roberto; Yoshihara, Kosuke; Tien, Jean C.; Parsons, Pamela H.; Wu, Hao; Finegold, Milton J.; Copeland, Neal G.; Jenkins, Nancy A.

    2016-01-01

    Epithelial-mesenchymal transition (EMT) is thought to contribute to metastasis and chemoresistance in patients with hepatocellular carcinoma (HCC), leading to their poor prognosis. The genes driving EMT in HCC are not yet fully understood, however. Here, we show that mobilization of Sleeping Beauty (SB) transposons in immortalized mouse hepatoblasts induces mesenchymal liver tumors on transplantation to nude mice. These tumors show significant down-regulation of epithelial markers, along with up-regulation of mesenchymal markers and EMT-related transcription factors (EMT-TFs). Sequencing of transposon insertion sites from tumors identified 233 candidate cancer genes (CCGs) that were enriched for genes and cellular processes driving EMT. Subsequent trunk driver analysis identified 23 CCGs that are predicted to function early in tumorigenesis and whose mutation or alteration in patients with HCC is correlated with poor patient survival. Validation of the top trunk drivers identified in the screen, including MET (MET proto-oncogene, receptor tyrosine kinase), GRB2-associated binding protein 1 (GAB1), HECT, UBA, and WWE domain containing 1 (HUWE1), lysine-specific demethylase 6A (KDM6A), and protein-tyrosine phosphatase, nonreceptor-type 12 (PTPN12), showed that deregulation of these genes activates an EMT program in human HCC cells that enhances tumor cell migration. Finally, deregulation of these genes in human HCC was found to confer sorafenib resistance through apoptotic tolerance and reduced proliferation, consistent with recent studies showing that EMT contributes to the chemoresistance of tumor cells. Our unique cell-based transposon mutagenesis screen appears to be an excellent resource for discovering genes involved in EMT in human HCC and potentially for identifying new drug targets. PMID:27247392

  17. Experimental optimization of a real time fed-batch fermentation process using Markov decision process.

    Science.gov (United States)

    Saucedo, V M; Karim, M N

    1997-07-20

    This article describes a methodology that implements a Markov decision process (MDP) optimization technique in a real-time fed-batch experiment. Biological systems can be better modeled under a stochastic framework, and MDP is shown to be a suitable technique for their optimization. A nonlinear input/output model is used to calculate the transition probabilities, and all elements of the MDP are identified according to physical parameters. Finally, this study compares the results obtained when optimizing ethanol production using the infinite-horizon problem with the total expected discounted reward criterion to previous experimental results aimed at optimizing ethanol production in a recombinant Escherichia coli fed-batch cultivation. (c) 1997 John Wiley & Sons, Inc. Biotechnol Bioeng 55: 317-327, 1997.
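
    The infinite-horizon discounted optimization underlying the MDP approach can be illustrated with textbook value iteration on a toy two-state, two-action process. The transition and reward numbers are invented for illustration and have no connection to the fermentation data:

```python
def value_iteration(P, R, gamma=0.95, tol=1e-9):
    """Solve an infinite-horizon discounted MDP by value iteration.

    P[a][s][s2] is the transition probability and R[a][s] the expected
    reward for taking action a in state s.
    """
    n = len(R[0])
    V = [0.0] * n
    while True:
        V_new = [max(R[a][s] + gamma * sum(P[a][s][s2] * V[s2]
                                           for s2 in range(n))
                     for a in range(len(R)))
                 for s in range(n)]
        if max(abs(a - b) for a, b in zip(V, V_new)) < tol:
            return V_new
        V = V_new

# Two states, two actions (think "feed" vs "hold" in a fed-batch run):
P = [[[0.9, 0.1], [0.2, 0.8]],   # transitions under action 0
     [[0.5, 0.5], [0.0, 1.0]]]   # transitions under action 1
R = [[1.0, 0.0],                 # rewards per state, action 0
     [0.5, 2.0]]                 # rewards per state, action 1
V = value_iteration(P, R)
print(V[1] > V[0])  # True: state 1 can hold its reward of 2 forever
```

In the paper's setting, the transition probabilities come from the nonlinear input/output model and the policy found this way dictates the feeding profile.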

  18. Integration of Advanced Simulation and Visualization for Manufacturing Process Optimization

    Science.gov (United States)

    Zhou, Chenn; Wang, Jichao; Tang, Guangwu; Moreland, John; Fu, Dong; Wu, Bin

    2016-05-01

    The integration of simulation and visualization can provide a cost-effective tool for process optimization, design, scale-up and troubleshooting. The Center for Innovation through Visualization and Simulation (CIVS) at Purdue University Northwest has developed methodologies for such integration, with applications in various manufacturing processes. The methodologies have proven useful for virtual design and virtual training, providing solutions that address issues of energy, environment, productivity, safety, and quality in steel and other industries. In collaboration with its industrial partners, CIVS has provided solutions to companies, saving over US$38 million. CIVS is currently working with the steel industry to establish an industry-led Steel Manufacturing Simulation and Visualization Consortium through the support of a National Institute of Standards and Technology AMTech Planning Grant. The consortium focuses on supporting the development and implementation of simulation and visualization technologies to advance steel manufacturing across the value chain.

  19. Process of Market Strategy Optimization Using Distributed Computing Systems

    Directory of Open Access Journals (Sweden)

    Nowicki Wojciech

    2015-12-01

    Full Text Available If market repeatability is assumed, it is possible, with some real probability, to deduce short-term market changes by making some calculations. An algorithm based on a logically and statistically reasonable scheme for making decisions about opening or closing a position on a market is called an automated strategy. Due to market volatility, all parameters change from time to time, so there is a need to optimize them constantly. This article describes the organization of a team researching market strategies. Individual team members are merged into small groups according to their responsibilities. The team members perform data processing tasks through a cascade organization, providing solutions that speed up work related to the use of remote computing resources. They also work out how to store results in a suitable way, according to the type of task, and facilitate the publication of a large number of results.

  20. Multivariate Statistical Process Optimization in the Industrial Production of Enzymes

    DEFF Research Database (Denmark)

    Klimkiewicz, Anna

    In modern biotech production, a massive number of diverse measurements, with a broad diversity in information content and quality, is stored in data historians. The potential of this enormous amount of data is currently under-employed in process optimization efforts. This is a result of … The ultrafiltration operation is limited by the membrane fouling phenomenon, where the production capacity - monitored as flow through the membrane, or flux - decreases over time. The flux varies considerably from run to run within the same product, and likewise between different products. This variability clearly affects … The study revealed that the less demanding in-line flow cell setup outperformed the on-line arrangement. The former proved satisfactorily robust towards different products (amylases and proteases) and associated processing parameters such as temperature and processing speed. This dissertation work shows …

  1. Numerical Tool Path Optimization for Conventional Sheet Metal Spinning Processes

    Science.gov (United States)

    Rentsch, Benedikt; Manopulo, Niko; Hora, Pavel

    2016-08-01

    To this day, conventional sheet metal spinning processes are designed with a very low degree of automation. They are usually executed by experienced personnel, who actively adjust the tool paths during production. The practically unlimited freedom in designing the tool paths enables the efficient manufacturing of complex geometries on one hand, but is challenging to translate into a standardized procedure on the other. The present study aims to propose a systematic methodology, based on a 3D FEM model combined with a numerical optimization strategy, in order to design tool paths. The accurate numerical modelling of the spinning process is firstly discussed, followed by an analysis of appropriate objective functions and constraints required to obtain a failure free tool path design.

  2. Multimineral optimization processing method based on elemental capture spectroscopy logging

    Institute of Scientific and Technical Information of China (English)

    Feng Zhou; Li Xin-Tong; Wu Hong-Liang; Xia Shou-Ji; Liu Ying-Ming

    2014-01-01

    Calculating the mineral composition is a critical task in log interpretation. Elemental capture spectroscopy (ECS) log provides the weight percentages of twelve common elements, which lays the foundation for the accurate calculation of mineral compositions. Previous processing methods calculated the formation composition via the conversion relation between the formation chemistry and minerals. Thus, their applicability is limited and the method precision is relatively low. In this study, we present a multimineral optimization processing method based on the ECS log. We derived the ECS response equations for calculating the formation composition, then, determined the logging response values for the elements of common minerals using core data and theoretical calculations. Finally, a software module was developed. The results of the new method are consistent with core data and the mean absolute error is less than 10%.
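The inversion step such a method performs can be illustrated as a least-squares solve of response equations, followed by projection onto physical constraints. The element/mineral response matrix below uses rough stoichiometric weight fractions for three minerals, not calibrated ECS log responses:

```python
import numpy as np

# Toy inversion of ECS-style response equations b = A @ x, where A[i, j] is
# the weight fraction of element i in mineral j (rough stoichiometric values
# for illustration only), b the measured elemental weight fractions, and x
# the mineral fractions to recover.
A = np.array([
    # quartz  calcite  illite (approx.)
    [0.467,   0.000,   0.245],   # Si
    [0.000,   0.400,   0.000],   # Ca
    [0.000,   0.000,   0.090],   # Al
])
x_true = np.array([0.6, 0.3, 0.1])   # synthetic "formation" composition
b = A @ x_true                        # noise-free synthetic measurements

# Unconstrained least squares, then a crude projection onto the physical
# constraints (x >= 0, sum x = 1) -- a stand-in for the constrained
# optimization a real multimineral solver would use.
x, *_ = np.linalg.lstsq(A, b, rcond=None)
x = np.clip(x, 0.0, None)
x = x / x.sum()
```

With noise-free data and a well-conditioned response matrix the recovered fractions match the synthetic formation exactly; with real log data the residual between `A @ x` and `b` is what the optimization minimizes.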

  3. Optimizing FORTRAN Programs for Hierarchical Memory Parallel Processing Systems

    Institute of Scientific and Technical Information of China (English)

    金国华; 陈福接

    1993-01-01

    Parallel loops account for the greatest amount of parallelism in numerical programs. Executing nested loops in parallel with low run-time overhead is thus very important for achieving high performance in parallel processing systems. However, in parallel processing systems with caches or local memories in memory hierarchies, a "thrashing problem" may arise whenever data move back and forth between the caches or local memories of different processors. Previous techniques can only deal with the rather simple cases involving one linear function in a perfectly nested loop. In this paper, we present a parallel program optimizing technique called hybrid loop interchange (HLI) for cases with multiple linear functions and loop-carried data dependences in the nested loop. With HLI we can easily eliminate or reduce the thrashing phenomena without reducing the program parallelism.

  4. Ising Processing Units: Potential and Challenges for Discrete Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Coffrin, Carleton James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Nagarajan, Harsha [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Bent, Russell Whitford [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-07-05

    The recent emergence of novel computational devices, such as adiabatic quantum computers, CMOS annealers, and optical parametric oscillators, presents new opportunities for hybrid-optimization algorithms that leverage these kinds of specialized hardware. In this work, we propose the idea of an Ising processing unit as a computational abstraction for these emerging tools. Challenges involved in using and benchmarking these devices are presented, and open-source software tools are proposed to address some of these challenges. The proposed benchmarking tools and methodology are demonstrated by conducting a baseline study of established solution methods on a D-Wave 2X adiabatic quantum computer, one example of a commercially available Ising processing unit.
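For readers unfamiliar with the problem class, a classical simulated-annealing baseline on a tiny Ising instance (spins in {-1, +1}, with an invented frustrated triangle of couplings J and fields h) looks like this; it is a conventional software solver of the kind such devices are benchmarked against, not an IPU itself:

```python
import math
import random

# Ising energy E(s) = sum_{(i,j)} J[i,j] s_i s_j + sum_i h[i] s_i.
# The 3-spin instance below is hand-made purely for illustration.
J = {(0, 1): 1.0, (1, 2): 1.0, (0, 2): 1.0}   # frustrated triangle
h = [0.1, -0.2, 0.0]

def energy(s):
    e = sum(c * s[i] * s[j] for (i, j), c in J.items())
    return e + sum(hi * si for hi, si in zip(h, s))

def anneal(n, steps=5000, t0=2.0, t1=0.01, seed=0):
    rng = random.Random(seed)
    s = [rng.choice((-1, 1)) for _ in range(n)]
    e = energy(s)
    best, best_e = s[:], e
    for k in range(steps):
        t = t0 * (t1 / t0) ** (k / steps)      # geometric cooling schedule
        i = rng.randrange(n)
        s[i] = -s[i]                            # propose a single spin flip
        e_new = energy(s)
        # Metropolis acceptance: always take downhill, sometimes uphill
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            e = e_new
            if e < best_e:
                best, best_e = s[:], e
        else:
            s[i] = -s[i]                        # reject: undo the flip
    return best, best_e

best, best_e = anneal(3)
```

On hardware, the same (J, h) data would be loaded onto the device instead of being swept by this Metropolis loop.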

  5. Analysis and optimization of coagulation and flocculation process

    Science.gov (United States)

    Saritha, V.; Srinivas, N.; Srikanth Vuppala, N. V.

    2015-02-01

    Natural coagulants have been the focus of research of many investigators through the last decade owing to the problems caused by chemical coagulants. Optimization of process parameters is vital for the effectiveness of the coagulation process. In the present study, optimization of parameters such as pH, coagulant dose, and mixing speed was studied using the natural coagulants sago and chitin in comparison with alum. A jar test apparatus was used to perform the coagulation. The results showed that the removal of turbidity was up to 99% by both alum and chitin at lower coagulant doses, i.e., 0.1-0.3 g/L, whereas sago showed a reduction of 70-100% at doses of 0.1 and 0.2 g/L. The optimum pH values observed for sago were 6 and 7, whereas chitin was stable at all pH ranges; both performed best at lower coagulant doses, i.e., 0.1-0.3 g/L, with rapid mixing at 100 rpm for 10 min and slow mixing at 20 rpm for 20 min. Hence, it can be concluded that sago and chitin can be used for treating water even with large seasonal variation in turbidity.

  6. An improved ant colony optimization approach for optimization of process planning.

    Science.gov (United States)

    Wang, JinFeng; Fan, XiaoLiang; Ding, Haimin

    2014-01-01

    Computer-aided process planning (CAPP) is an important interface between computer-aided design (CAD) and computer-aided manufacturing (CAM) in computer-integrated manufacturing environments (CIMs). In this paper, the process planning problem is described based on a weighted graph, and an ant colony optimization (ACO) approach is improved to deal with it effectively. The weighted graph consists of nodes, directed arcs, and undirected arcs, which denote operations, precedence constraints among operations, and the possible paths between operations, respectively. The ant colony traverses the necessary nodes on the graph to achieve the optimal solution with the objective of minimizing total production costs (TPCs). A pheromone updating strategy proposed in this paper, comprising a Global Update Rule and a Local Update Rule, is incorporated into the standard ACO. A simple method of controlling the number of repetitions of the same process plan is designed to avoid local convergence. A case study has been carried out to examine the influence of various ACO parameters on system performance. Extensive comparative experiments have been carried out to validate the feasibility and efficiency of the proposed approach.
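A minimal sketch of the graph-based ACO idea, with a hypothetical four-operation instance (precedence sets and changeover costs invented for illustration) and a best-so-far global pheromone update; the paper's actual rules and cost model are richer:

```python
import random

# Operations, precedence constraints (op -> set of ops that must come first),
# and a changeover-cost matrix: all values invented for this sketch.
ops = [0, 1, 2, 3]
prec = {2: {0}, 3: {1, 2}}
cost = [[0, 2, 5, 9],
        [2, 0, 4, 1],
        [5, 4, 0, 3],
        [9, 1, 3, 0]]

def plan_cost(seq):
    return sum(cost[a][b] for a, b in zip(seq, seq[1:]))

def aco(n_ants=20, n_iter=50, rho=0.1, Q=10.0, seed=1):
    rng = random.Random(seed)
    tau = [[1.0] * len(ops) for _ in ops]          # pheromone on arcs
    best, best_c = None, float('inf')
    for _ in range(n_iter):
        for _ in range(n_ants):
            seq, done, cur = [], set(), None
            while len(seq) < len(ops):
                # only precedence-feasible operations are candidates
                cand = [o for o in ops
                        if o not in done and prec.get(o, set()) <= done]
                if cur is None:
                    nxt = rng.choice(cand)
                else:
                    # favor high pheromone and low changeover cost
                    w = [tau[cur][o] / (1 + cost[cur][o]) for o in cand]
                    nxt = rng.choices(cand, weights=w)[0]
                seq.append(nxt); done.add(nxt); cur = nxt
            c = plan_cost(seq)
            if c < best_c:
                best, best_c = seq, c
        # global update: evaporate, then reinforce the best-so-far plan
        tau = [[(1 - rho) * t for t in row] for row in tau]
        for a, b in zip(best, best[1:]):
            tau[a][b] += Q / best_c
    return best, best_c

best, best_c = aco()
```

For this instance only three sequences are precedence-feasible, so the colony quickly concentrates pheromone on the cheapest one.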

  7. Calculation theory for shrinkage stresses in cellular concrete wall panels in carbonation processes with account of creep

    Directory of Open Access Journals (Sweden)

    Chepurnenko Anton Sergeevich

    2016-12-01

    Full Text Available The comprehensive analysis presented in this article develops a theory for calculating the shrinkage stresses that occur in cellular concrete wall panels due to carbonation of the concrete, taking the creep of the material into account. Analytical dependences characterizing the influence of carbonation on the modulus of elasticity, shrinkage, and creep of autoclaved cellular concrete, as well as the variation of the degree of carbonation across the thickness of the wall panels as a function of time, were obtained. The proposed theory for calculating shrinkage stresses in cellular concrete wall panels, with account of concrete creep, makes it possible to predict the influence of carbonation processes on their crack resistance, and thus to develop measures of a technological and structural nature in order to improve their operational reliability and durability.

  8. Cooling system optimization analysis for hot forming processes

    Science.gov (United States)

    Ghoo, Bonyoung; Umezu, Yasuyoshi; Watanabe, Yuko

    2013-12-01

    Hot forming technology was developed to produce automotive panels having ultra-high tensile strength over 1500 MPa. The elevated temperature provides decreased flow stress and increased ductility, and hot formed products exhibit almost zero springback. This advanced forming technology accelerates the need for numerical simulations with coupled thermal-mechanical formulations. In the present study, 3-dimensional finite element analyses of hot forming processes, including the cooling system, are conducted using JSTAMP/NV and LS-DYNA. Special attention is paid to the optimization of the cooling system using thermo-mechanical finite element analysis, through the influence of various cooling parameters. The presented work shows that an adequate cooling-system function and a microstructural phase-transformation material model, together with a proper set of numerical parameters, can give both efficient and accurate design insight into the hot forming manufacturing process. JSTAMP/NV and LS-DYNA thus form a robust combination for complex hot forming analysis, which requires thermo-mechanical and microstructural material modeling as well as various kinds of process modeling. The new JSTAMP/NV function for the multi-shot manufacturing process shows good capability in cooling-system evaluation, and the advanced LS-DYNA microstructural phase-transformation model gives good evaluation results for martensite amount and Vickers hardness after quenching.

  9. Biopharmaceutical Process Optimization with Simulation and Scheduling Tools

    Directory of Open Access Journals (Sweden)

    Demetri Petrides

    2014-09-01

    Full Text Available Design and assessment activities associated with a biopharmaceutical process are performed at different levels of detail, based on the stage of development that the product is in. Preliminary “back-of-the envelope” assessments are performed early in the development lifecycle, whereas detailed design and evaluation are performed prior to the construction of a new facility. Both the preliminary and detailed design of integrated biopharmaceutical processes can be greatly assisted by the use of process simulators, discrete event simulators or finite capacity scheduling tools. This report describes the use of such tools for bioprocess development, design, and manufacturing. The report is divided into three sections. Section One provides introductory information and explains the purpose of bioprocess simulation. Section Two focuses on the detailed modeling of a single batch bioprocess that represents the manufacturing of a therapeutic monoclonal antibody (MAb. This type of analysis is typically performed by engineers engaged in the development and optimization of such processes. Section Three focuses on production planning and scheduling models for multiproduct plants.

  10. Process optimization of mechano-electrospinning by response surface methodology.

    Science.gov (United States)

    Bu, Ningbin; Huang, YongAn; Duan, Yongqing; Yin, Zhouping

    2014-05-01

    In this paper, mechano-electrospinning (MES) is presented to write polyvinylidene fluoride (PVDF) solution into fibers directly, and the effects of the process parameters on the fiber are investigated experimentally based on response surface methodology. Different fiber widths are obtained by adjusting the individual process parameters (velocity of the substrate, applied voltage, and nozzle-to-substrate distance). Considering the requirements of a continuous jet and a stable Taylor cone, an operating window is selected for investigating the complicated relationship between the process parameters and the width of the fiber using response surface methodology. The experimental results show that the predicted fiber width is in good agreement with the measured width. Based on an analysis of the importance of the terms in the equation, a simple model can be used to predict the width of the fiber; with this model, a large number of calibration experiments can be avoided. Additionally, principles for selecting the process parameters are derived from the optimization, giving a guideline for obtaining the desired fiber in experiments.
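The response-surface step reduces to fitting a second-order polynomial in coded factors by least squares. The sketch below uses a fabricated "true" width surface and a 3 × 3 factorial in two factors only (the actual study varied three parameters and used measured widths):

```python
import numpy as np

# Fabricated second-order response: fiber width as a function of two coded
# factors, e.g. substrate velocity x1 and applied voltage x2. The
# coefficients are invented so that the fit can be checked exactly.
def true_width(x1, x2):
    return 50 + 8*x1 - 5*x2 + 3*x1*x2 + 2*x1**2 + 4*x2**2

# Full 3-level factorial design in coded units (-1, 0, +1)
pts = [(a, b) for a in (-1, 0, 1) for b in (-1, 0, 1)]

# Design matrix for the quadratic model: 1, x1, x2, x1*x2, x1^2, x2^2
X = np.array([[1, x1, x2, x1*x2, x1**2, x2**2] for x1, x2 in pts])
y = np.array([true_width(x1, x2) for x1, x2 in pts])

# Least-squares fit recovers the quadratic coefficients [50, 8, -5, 3, 2, 4]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
```

With real (noisy) measurements the same fit yields estimated coefficients, and insignificant terms are then dropped to obtain the simple predictive model mentioned in the abstract.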

  11. Optimal processes for probabilistic work extraction beyond the second law.

    Science.gov (United States)

    Cavina, Vasco; Mari, Andrea; Giovannetti, Vittorio

    2016-07-05

    According to the second law of thermodynamics, for every transformation performed on a system which is in contact with an environment of fixed temperature, the average extracted work is bounded by the decrease of the free energy of the system. However, in a single realization of a generic process, the extracted work is subject to statistical fluctuations which may allow for probabilistic violations of the previous bound. We are interested in enhancing this effect, i.e. we look for thermodynamic processes that maximize the probability of extracting work above a given arbitrary threshold. For any process obeying the Jarzynski identity, we determine an upper bound for the work extraction probability that depends also on the minimum amount of work that we are willing to extract in case of failure, or on the average work we wish to extract from the system. Then we show that this bound can be saturated within the thermodynamic formalism of quantum discrete processes composed by sequences of unitary quenches and complete thermalizations. We explicitly determine the optimal protocol which is given by two quasi-static isothermal transformations separated by a finite unitary quench.
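As a simpler illustration of why such probabilistic violations must be rare, a Markov-type bound (weaker than the optimized bound derived in the paper) follows directly from the Jarzynski identity. Writing $W_{\mathrm{ext}} = -W$ for the extracted work and $\beta = 1/(k_B T)$:

```latex
\langle e^{-\beta W}\rangle = e^{-\beta \Delta F}
\quad\Longrightarrow\quad
P(W_{\mathrm{ext}} \ge w)
  = P\!\left(e^{\beta W_{\mathrm{ext}}} \ge e^{\beta w}\right)
  \le e^{-\beta w}\,\langle e^{\beta W_{\mathrm{ext}}}\rangle
  = e^{-\beta (w + \Delta F)} .
```

Extracting more than the second-law value $-\Delta F$ is therefore exponentially suppressed in the excess $w + \Delta F$; the bound the authors derive and saturate is sharper because it also accounts for the work obtained in case of failure.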

  13. Optimizing enactment of nursing roles: redesigning care processes and structures

    Directory of Open Access Journals (Sweden)

    Jackson K

    2014-02-01

    Full Text Available Karen Jackson,1 Deborah E White,2 Jeanne Besner,1 Jill M Norris2. 1Health Systems and Workforce Research Unit, Alberta Health Services, Calgary, Alberta, Canada; 2Faculty of Nursing, University of Calgary, Calgary, Alberta, Canada. Background: Effective and efficient use of nursing human resources is critical. The Nursing Role Effectiveness Model conceptualizes nursing practice in terms of key clinical role accountabilities and has the potential to inform redesign efforts. The aims of this study were to develop, implement, and evaluate a job redesign intended to optimize the enactment of registered nurse (RN) clinical role accountabilities. Methods: A job redesign was developed and implemented in a single medical patient care unit, the redesign unit. A mixed-methods design was used to evaluate the job redesign; a second medical patient care unit served as a control unit. Data from administrative databases, observations, interviews, and demographic surveys were collected pre-redesign (November 2005) and post-redesign (October 2007). Results: Several existing unit structures and processes (eg, model of care delivery) influenced RNs' ability to optimally enact their role accountabilities. Redesign efforts were hampered by contextual issues, including organizational alignment, leadership, and timing. Overall, optimized enactment of RN role accountabilities and improvements to patient outcomes did not occur, yet this was predictable, given that the redesign was not successful. Although the results were disappointing, much was learned about job redesign. Conclusion: Potential exists to improve the utilization of nursing providers by situating nurses' work in a clinical role accountability framework and attending to a clear organizational vision and well-articulated strategic plan that is championed by leaders at all levels of the organization. Health care leaders require a clear understanding of nurses' role accountabilities, support in managing change, and …

  14. Predictive Process Optimization for Fracture Ductility in Automotive TRIP Steels

    Science.gov (United States)

    Gong, Jiadong

    In light of the emerging challenges in the automotive industry of meeting new energy-saving and environment-friendly requirements imposed by both the government and the society, the auto makers have been working relentlessly to reduce the weight of automobiles. While steel makers pushed out a variety of novel Advanced High Strength Steels (AHSS) to serve this market with new needs, TRIP (Transformation Induced Plasticity) steels is one of the most promising materials for auto-body due to its exceptional combination of strength and formability. However, current commercial automotive TRIP steels demonstrate relatively low hole-expansion (HE) capability, which is critical in stretch forming of various auto parts. This shortcoming on ductility has been causing fracture issues in the forming process and limits the wider applications of this steel. The kinetic theory of martensitic transformations and associated transformation plasticity is applied to the optimization of transformation stability for enhanced mechanical properties in a class of high strength galvannealed TRIP steel. This research leverages newly developed characterization and simulation capabilities, supporting computational design of high-performance steels exploiting optimized transformation plasticity for desired mechanical behaviors, especially for the hole-expansion ductility. The microstructure of the automotive TRIP sheet steels was investigated, using advanced tomographic characterization including nanoscale Local Electrode Atom Probe (LEAP) microanalysis. The microstructural basis of austenite stability, the austenite carbon concentration in particular, was quantified and correlated with measured fracture ductility through transformation plasticity constitutive laws. Plastic flow stability for enhanced local fracture ductility at high strength is sought to maintain high hole-expansion ductility, through quantifying the optimal stability and the heat-treatment process to achieve it. 

  15. Trans-membrane transport of fluoranthene by Rhodococcus sp. BAP-1 and optimization of uptake process.

    Science.gov (United States)

    Li, Yi; Wang, Hongqi; Hua, Fei; Su, Mengyuan; Zhao, Yicun

    2014-03-01

    The mechanism of transport of (14)C-fluoranthene by Rhodococcus sp. BAP-1, a Gram-positive bacterium isolated from crude oil-polluted soil, was examined. Our findings demonstrated that the mechanism by which fluoranthene travels across the cell membrane in Rhodococcus sp. BAP-1 requires energy. Meanwhile, the transport of fluoranthene involves concurrent catabolism of the (14)C label, leading to the generation of a significant amount of (14)CO2. Combining trans-membrane transport dynamics with response surface methodology, the significant influence of temperature, pH, and salinity on the cellular uptake rate was screened by a Plackett-Burman design. A Box-Behnken design was then employed to optimize and enhance the trans-membrane transport process. The results indicated that the maximum cellular uptake rate of fluoranthene could reach 0.308 μmol·min(-1)·mg(-1) protein (observed) and 0.304 μmol·min(-1)·mg(-1) protein (predicted) when the initial temperature, pH, and salinity were set at 20 °C, 9, and 1%, respectively.

  16. Optimization of extrusion process for production of nutritious pellets

    Directory of Open Access Journals (Sweden)

    Ernesto Aguilar-Palazuelos

    2012-03-01

    Full Text Available A blend of 50% Potato Starch (PS), 35% Quality Protein Maize (QPM), and 15% Soybean Meal (SM) was used in the preparation of expanded pellets utilizing a laboratory extruder with a 1.5 × 20.0 × 100.0 mm die-nozzle. The independent variables analyzed were Barrel Temperature (BT, 75-140 °C) and Feed Moisture (FM, 16-30%). The effect of the extrusion variables was investigated in terms of Expansion Index (EI), apparent density (ApD), Penetration Force (PF), and Specific Mechanical Energy (SME), as well as viscosity profiles, DSC, crystallinity by X-ray diffraction, and Scanning Electron Microscopy (SEM). The PF decreased from 30 to 4 kgf with the increase of both independent variables (BT and FM). SME was affected only by FM and decreased with an increase in this variable. The optimal region showed that the maximum EI was found for BT in the range of 123-140 °C and FM of 27-31%, respectively. The extruded pellets obtained from the optimal processing region were probably not completely degraded, as shown by the structural characterization. Acceptable expanded pellets could be produced using a blend of PS, QPM, and SM by extrusion cooking.

  17. Optimization and Control of Pressure Swing Adsorption Processes Under Uncertainty

    KAUST Repository

    Khajuria, Harish

    2012-03-21

    The real-time periodic performance of a pressure swing adsorption (PSA) system strongly depends on the choice of key decision variables and operational considerations, such as processing steps and column pressure temporal profiles, making its design and operation a challenging task. This work presents a detailed optimization-based approach for simultaneously incorporating PSA design, operational, and control aspects under the effect of time-variant and time-invariant disturbances. It is applied to a two-bed, six-step PSA system represented by a rigorous mathematical model, where the key optimization objective is to maximize the expected H2 recovery while achieving a closed-loop product H2 purity of 99.99% for separating a 70% H2, 30% CH4 feed. The benefits over a sequential design-and-control approach are shown in terms of a closed-loop recovery improvement of more than 3%, while the incorporation of explicit/multiparametric model predictive controllers improves the closed-loop performance. © 2012 American Institute of Chemical Engineers (AIChE).

  18. High Temperature Epoxy Foam: Optimization of Process Parameters

    Directory of Open Access Journals (Sweden)

    Samira El Gazzani

    2016-06-01

    Full Text Available For many years, reduction of fuel consumption has been a major aim in terms of both costs and environmental concerns. One option is to reduce the weight of fuel consumers. For this purpose, the use of a lightweight material based on rigid foams is a relevant choice. This paper deals with a new high-temperature epoxy expanded material as a substitute for phenolic resin, which is classified as potentially mutagenic under the European REACH regulation. The optimization of a thermoset foam depends on two major parameters: the reticulation process and the expansion of the foaming agent. Controlling these two phenomena can lead to a fully expanded and cured material. The rheological behavior of the epoxy resin is studied and the gel time is determined at various temperatures. The expansion of the foaming agent is investigated by thermomechanical analysis. Results are correlated and compared with samples foamed under the same temperature conditions, and the ideal foaming/gelation temperature is then determined. The second part of this research concerns the optimization of the curing cycle of a high-temperature trifunctional epoxy resin. A two-step curing cycle was defined by considering the influence of different curing schedules on the glass transition temperature of the material. The final foamed material has a glass transition temperature of 270 °C.

  19. Study on rolling process optimization of high carbon steel wire

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    The existing problems in the manufacture of SWRH82B high carbon steel wire were discussed by sampling and testing the microstructure and properties of the steel from the workshop. To solve the problems, the experimental parameters for thermal simulation were optimized, and the thermal simulation experiments were carried out on a Gleeble 1500 thermal simulator. The process parameters for manufacture were optimized after analysis of the data, and production trials were performed after the water box in front of the no-twist blocks was reconstructed to control the temperature of the loop layer. The production trials showed that a cooling rate of 10-15 °C/s before phase transformation and about 5 °C/s during phase transformation was reasonable; testing of the finished products after the reconstruction showed that the tensile strength of the wire was increased to 1150-1170 MPa, an increase of 20-30 MPa, and the percentage reduction of section rose to 34%-36%, an increase of 1%-3%.

  20. Process parameter optimization for fly ash brick by Taguchi method

    Energy Technology Data Exchange (ETDEWEB)

    Prabir Kumar Chaulia; Reeta Das [Central Mechanical Engineering Research Institute, Durgapur (India)

    2008-04-15

    This paper presents the results of an experimental investigation carried out to optimize the mix proportions of the fly ash brick by the Taguchi method of parameter design. The experiments have been designed using an L9 orthogonal array with four factors and three levels each. A small quantity of cement has been mixed as binding material. Both the cement and the fly ash used are indicated as binding material, and the water/binder ratio has been considered as one of the control factors. So the effects of water/binder ratio, fly ash, coarse sand, and stone dust on the performance characteristic are analyzed using signal-to-noise ratios and mean response data. According to the results, water/binder ratio and stone dust play the significant role on the compressive strength of the brick. Furthermore, the estimated optimum values of the process parameters correspond to a water/binder ratio of 0.4, fly ash of 39%, coarse sand of 24%, and stone dust of 30%. The mean value of optimal strength is predicted as 166.22 kg·cm⁻² with a tolerance of 10.97 kg·cm⁻². The confirmatory experimental result obtained for the optimum conditions is 160.17 kg·cm⁻². 13 refs., 3 figs., 7 tabs.
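The signal-to-noise analysis uses the standard larger-the-better statistic, since higher compressive strength is always preferred. In the sketch below only the formula is standard; the three repeat strength values per trial are invented:

```python
import math

# Larger-the-better S/N ratio, as used in Taguchi analysis of compressive
# strength: S/N = -10 * log10( mean(1/y_i^2) ).
def sn_larger_better(ys):
    return -10.0 * math.log10(sum(1.0 / y**2 for y in ys) / len(ys))

# Hypothetical repeat strengths (kg/cm^2) for three of the L9 trials;
# these numbers are made up for illustration.
trials = {
    1: [150.2, 148.9, 151.5],
    2: [162.7, 160.1, 161.8],
    3: [139.4, 141.0, 140.2],
}
sn = {k: sn_larger_better(v) for k, v in trials.items()}

# The trial with the highest S/N is the most robust strength setting.
best_trial = max(sn, key=sn.get)
```

In a full analysis the nine S/N values are averaged per factor level to build the mean response table from which the optimum mix (water/binder 0.4, fly ash 39%, etc.) is read off.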

  1. Cellular Automata as a learning process in Architecture and Urban design

    DEFF Research Database (Denmark)

    Jensen, Mads Brath; Foged, Isak Worre

    2014-01-01

    … An architectural methodological response to this situation is presented through the development of a conceptual computational design system that allows these dynamics to unfold and to be observed for architectural design decision making. Reflecting on the development and implementation of a cellular automata based …
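As a concrete reference point for what such a system iterates, one synchronous update step of a classic two-state cellular automaton (Conway-style B3/S23 rules on a toroidal grid) can be written as:

```python
# One synchronous step of a 2D, two-state cellular automaton with
# birth-on-3 / survive-on-2-or-3 rules and wrap-around (toroidal) edges.
def step(grid):
    rows, cols = len(grid), len(grid[0])
    def alive_neighbors(r, c):
        return sum(grid[(r + dr) % rows][(c + dc) % cols]
                   for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                   if (dr, dc) != (0, 0))
    return [[1 if (grid[r][c] and alive_neighbors(r, c) in (2, 3))
             or (not grid[r][c] and alive_neighbors(r, c) == 3) else 0
             for c in range(cols)]
            for r in range(rows)]

# A "blinker" oscillates with period 2: two steps return the start state.
g0 = [[0, 0, 0, 0, 0],
      [0, 0, 1, 0, 0],
      [0, 0, 1, 0, 0],
      [0, 0, 1, 0, 0],
      [0, 0, 0, 0, 0]]
g2 = step(step(g0))
```

An architectural design system would replace the binary states and B3/S23 rule with domain states (program, void, structure) and project-specific transition rules, while keeping the same iterate-and-observe loop.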

  2. Optimizing an immersion ESL curriculum using analytic hierarchy process.

    Science.gov (United States)

    Tang, Hui-Wen Vivian

    2011-11-01

    The main purpose of this study is to fill a substantial knowledge gap regarding reaching a uniform group decision in English curriculum design and planning. A comprehensive content-based course criterion model extracted from existing literature and expert opinions was developed. Analytical hierarchy process (AHP) was used to identify the relative importance of course criteria for the purpose of tailoring an optimal one-week immersion English as a second language (ESL) curriculum for elementary school students in a suburban county of Taiwan. The hierarchy model and AHP analysis utilized in the present study will be useful for resolving several important multi-criteria decision-making issues in planning and evaluating ESL programs. This study also offers valuable insights and provides a basis for further research in customizing ESL curriculum models for different student populations with distinct learning needs, goals, and socioeconomic backgrounds. Copyright © 2011 Elsevier Ltd. All rights reserved.
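The AHP weighting step can be sketched as extracting the principal eigenvector of a pairwise-comparison matrix and checking Saaty's consistency ratio. The 3 × 3 judgment matrix below is made up; only the eigenvector method and the random-consistency indices are standard:

```python
import numpy as np

# Hypothetical pairwise comparisons of three course criteria on Saaty's
# 1-9 scale; A[i, j] = how much more important criterion i is than j,
# with the reciprocal property A[j, i] = 1 / A[i, j].
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

# Priority weights = normalized principal (largest-eigenvalue) eigenvector.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()

# Saaty's consistency check: CI against the random index RI for n = 3;
# judgments are conventionally acceptable when CR < 0.1.
n = A.shape[0]
lam_max = eigvals.real[k]
CI = (lam_max - n) / (n - 1)
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]
CR = CI / RI
```

In the study's setting the same computation is applied at each level of the criterion hierarchy, and the level weights are multiplied down the tree to rank curriculum elements.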

  3. [Optimization of the pedagogical process at the department of otorhinolaryngology].

    Science.gov (United States)

    Korkmazov, M Iu; Zyrianova, K S; Dubinets, I D; Kornova, N V

    2014-01-01

    The objective of the present work was to optimize the educational process with respect to teaching otorhinolaryngology based on the experience accumulated in this sphere with the use of fundamental management components. Special emphasis is laid on the innovative teaching methods employed in the higher education institutions for training clinical otorhinolaryngologists. The analysis of the functional responsibilities of the academic staff in medical institutions and the main components of the educational activity is presented. The basic principles of the quality management system are described. The conclusions made by the authors concern the possibility of standardization and prognostication in the higher education system that can be used for the development of practical guidelines for the academic staff.

  4. Multi-objective process parameter optimization for energy saving in injection molding process

    Institute of Scientific and Technical Information of China (English)

    Ning-yun LU; Gui-xia GONG; Yi YANG; Jian-hua LU

    2012-01-01

    This paper deals with a multi-objective parameter optimization framework for energy saving in the injection molding process. It combines an experimental design by Taguchi's method, a process analysis by analysis of variance (ANOVA), a process modeling algorithm by artificial neural network (ANN), and a multi-objective parameter optimization algorithm by a genetic algorithm (GA)-based lexicographic method. Local and global Pareto analyses show the trade-off between product quality and energy consumption. The implementation of the proposed framework can reduce energy consumption significantly in laboratory-scale tests, and at the same time the product quality can meet the pre-determined requirements.

  5. Novel methodology for casting process optimization using Gaussian process regression and genetic algorithm

    Institute of Scientific and Technical Information of China (English)

    Yao Weixiong; Yang Yi; Zeng Bin

    2009-01-01

    High pressure die casting (HPDC) is a versatile material processing method for mass-production of metal parts with complex geometries, and this method has been widely used in manufacturing various products of excellent dimensional accuracy and productivity. In order to ensure the quality of the components, a number of variables need to be properly set. A novel methodology for high pressure die casting process optimization was developed, validated and applied to the selection of optimal parameters, incorporating design of experiments (DOE), the Gaussian process (GP) regression technique and genetic algorithms (GA). This new approach was applied to process optimization for a cast magnesium alloy notebook shell. After being trained using data generated by PROCAST (FEM-based simulation software), the GP model approximated the simulation well, extracting useful information from the simulation results. With the help of MATLAB, the GP/GA based approach achieved the optimum solution for the die casting process condition settings.
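The GP surrogate idea in this abstract can be sketched in a few lines. The kernel, hyperparameters and the toy "simulation output" below are illustrative stand-ins, not the paper's PROCAST data; only the posterior-mean equation is standard GP regression.

```python
import numpy as np

def rbf_kernel(X1, X2, length=1.0, var=1.0):
    """Squared-exponential (RBF) covariance between two sets of 1-D inputs."""
    d = X1[:, None] - X2[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_predict(X_train, y_train, X_test, length=1.0, var=1.0, noise=1e-6):
    """Posterior mean of a zero-mean GP: k(X*,X) K^-1 y."""
    K = rbf_kernel(X_train, X_train, length, var) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_test, X_train, length, var)
    return K_s @ np.linalg.solve(K, y_train)

# Hypothetical surrogate: quality metric vs. a single casting parameter
X = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.sin(X)                       # stand-in for simulation outputs
mu = gp_predict(X, y, np.array([1.5]))
```

Once trained, such a cheap surrogate can be queried thousands of times inside a GA loop instead of re-running the FEM simulation.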

  6. Optimization of electrocoagulation process for the treatment of landfill leachate

    Science.gov (United States)

    Huda, N.; Raman, A. A.; Ramesh, S.

    2017-06-01

    The main problem of landfill leachate is its diverse composition, comprising persistent organic pollutants (POPs) which must be removed before being discharged into the environment. In this study, the treatment of leachate using electrocoagulation (EC) was investigated. Iron was used as both the anode and the cathode. Response surface methodology was used for experimental design and to study the effects of operational parameters. A Central Composite Design was used to study the effects of initial pH, inter-electrode distance, and electrolyte concentration on color and COD removals. The process could remove up to 84% color and 49.5% COD. The experimental data were fitted to second-order polynomial equations. All three factors were found to significantly affect the color removal. On the other hand, electrolyte concentration was the most significant parameter affecting the COD removal. Numerical optimization was conducted to obtain the optimum process performance. Further work will be conducted towards integrating EC with other wastewater treatment processes such as electro-Fenton.
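The second-order polynomial fit and numerical optimization described here can be illustrated with synthetic data. The factor ranges, peak location and noise level below are invented for the sketch; only the quadratic response-surface form and the stationary-point calculation are the standard RSM machinery.

```python
import numpy as np

def fit_quadratic_surface(x1, x2, y):
    """Least squares for y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2."""
    X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

# Hypothetical data: removal % peaking at pH 7, electrode distance 1 cm
rng = np.random.default_rng(0)
x1 = rng.uniform(4, 10, 30)           # initial pH
x2 = rng.uniform(0.5, 2.0, 30)        # inter-electrode distance (cm)
y = 80 - 2 * (x1 - 7) ** 2 - 10 * (x2 - 1) ** 2 + rng.normal(0, 0.5, 30)

b = fit_quadratic_surface(x1, x2, y)
# Stationary point (candidate optimum) of the fitted surface: solve grad = 0
H = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
opt = np.linalg.solve(H, [-b[1], -b[2]])
```

Here `opt` recovers the planted optimum (pH ≈ 7, distance ≈ 1 cm) from noisy observations, which is exactly what the numerical optimization step of RSM does.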

  7. On the optimization of multitasking process with multiplayer

    Science.gov (United States)

    Zhou, Bin; He, Zhe; Wang, Nianxin; Xi, Zhendong; Li, Yujian; Wang, Bing-Hong

    2015-01-01

    In society, many problems can be understood as a multitasking process with multiplayer (MPM). By choosing different strategies or different orders in processing tasks, an individual will spend a different amount of time to complete all the tasks. Therefore, a good strategy or a good order can help an individual work more efficiently. In this paper, we propose a model to study the optimization problems of MPM. The average time spent on all the tasks by an individual is calculated for each strategy, and we find that the random choice strategy can make an individual spend less time in completing all tasks. The correlation coefficient between the order in which each task is processed by an individual and the corresponding time spent on all the tasks by the individual is also calculated. The internal statistical law relating the order and the corresponding time is then found, which explains why the random choice strategy is better. Finally, we study how the queue length of each task changes over time. These results have theoretical and practical significance for MPM.

  8. Model reduction for dynamic real-time optimization of chemical processes

    NARCIS (Netherlands)

    Van den Berg, J.

    2005-01-01

    The value of models in process industries becomes apparent in practice and literature where numerous successful applications are reported. Process models are being used for optimal plant design, simulation studies, for off-line and online process optimization. For online optimization applications th

  9. Optimizing the processing and presentation of PPCR imaging

    Science.gov (United States)

    Davies, Andrew G.; Cowen, Arnold R.; Parkin, Geoff J. S.; Bury, Robert F.

    1996-03-01

    Photostimulable phosphor computed radiography (CR) is becoming an increasingly popular image acquisition system. The acceptability of this technique, diagnostically, ergonomically and economically, is highly influenced by the method by which the image data are presented to the user. Traditional CR systems utilize an 11" by 14" film hardcopy format, and can place two images per exposure onto this film, which does not correspond to the sizes and presentations provided by conventional techniques. It is also the authors' experience that the image enhancement algorithms provided by traditional CR systems do not provide optimal image presentation. An alternative image enhancement algorithm was developed, along with a number of hardcopy formats, designed to match the requirements of the image reporting process. The new image enhancement algorithm, called dynamic range reduction (DRR), is designed to provide a single presentation per exposure, maintaining the appearance of a conventional radiograph while optimizing the rendition of diagnostically relevant features within the image. The algorithm was developed on a Sun SPARCstation, but later ported to a Philips EasyVisionRAD workstation. Print formats were developed on the EasyVision to improve the acceptability of the CR hardcopy. For example, for mammographic examinations, four mammograms (a cranio-caudal and a medio-lateral view of each breast) are taken for each patient, with all images placed onto a single sheet of 14" by 17" film. The new composite format provides a more suitable image presentation for reporting, and is more economical to produce. It is the use of enhanced image processing and presentation which has enabled all mammography undertaken within the general infirmary to be performed using the CR/EasyVisionRAD DRR/3M 969 combination, without recourse to conventional film/screen mammography.

  10. Optimization of image processing algorithms on mobile platforms

    Science.gov (United States)

    Poudel, Pramod; Shirvaikar, Mukul

    2011-03-01

    This work presents a technique to optimize popular image processing algorithms on mobile platforms such as cell phones, net-books and personal digital assistants (PDAs). The increasing demand for video applications like context-aware computing on mobile embedded systems requires the use of computationally intensive image processing algorithms. The system engineer has a mandate to optimize them so as to meet real-time deadlines. A methodology to take advantage of the asymmetric dual-core processor, which includes an ARM and a DSP core supported by shared memory, is presented with implementation details. The target platform chosen is the popular OMAP 3530 processor for embedded media systems. It has an asymmetric dual-core architecture with an ARM Cortex-A8 and a TMS320C64x Digital Signal Processor (DSP). The development platform was the BeagleBoard with 256 MB of NAND RAM and 256 MB SDRAM memory. The basic image correlation algorithm is chosen for benchmarking as it finds widespread application for various template matching tasks such as face-recognition. The basic algorithm prototypes conform to OpenCV, a popular computer vision library. OpenCV algorithms can be easily ported to the ARM core which runs a popular operating system such as Linux or Windows CE. However, the DSP is architecturally more efficient at handling DFT algorithms. The algorithms are tested on a variety of images and performance results are presented measuring the speedup obtained due to dual-core implementation. A major advantage of this approach is that it allows the ARM processor to perform important real-time tasks, while the DSP addresses performance-hungry algorithms.
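The image-correlation benchmark named above is essentially normalized cross-correlation (NCC) template matching. A minimal, unoptimized reference version (the brute-force loop is exactly what one would hand off to a DSP or replace with an FFT-based variant; the image and template here are synthetic):

```python
import numpy as np

def match_template(image, tmpl):
    """Brute-force normalized cross-correlation; returns ((row, col), best score)."""
    ih, iw = image.shape
    th, tw = tmpl.shape
    t = tmpl - tmpl.mean()
    tnorm = np.sqrt((t ** 2).sum())
    best, best_pos = -2.0, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            w = image[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz ** 2).sum()) * tnorm
            score = (wz * t).sum() / denom if denom > 0 else 0.0
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos, best

# Plant a template inside a random image and recover its position
rng = np.random.default_rng(1)
img = rng.random((40, 40))
tpl = rng.random((8, 8))
img[12:20, 25:33] = tpl
pos, score = match_template(img, tpl)
```

An exact match scores 1.0; in OpenCV this corresponds to `matchTemplate` with the `TM_CCOEFF_NORMED` mode.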

  11. Optimization of the Enzymatic Saccharification Process of Milled Orange Wastes

    Directory of Open Access Journals (Sweden)

    Daniel Velasco

    2017-08-01

    Orange juice production generates a very high quantity of residues (Orange Peel Waste, or OPW, 50–60% of total weight) that can be used for cattle feed, as well as feedstock for the extraction or production of essential oils, pectin, nutraceutics and several monosaccharides by saccharification, inversion and enzyme-aided extraction. As with all solid wastes, simple pretreatments can enhance these processes. In this study, hydrothermal pretreatments and knife milling were analyzed, with enzymatic saccharification at different dry solid contents as the selection test: simple knife milling seemed more appropriate, as no added pretreatment resulted in better final glucose yields. A Taguchi optimization study on dry solid to liquid content and the composition of the enzymatic cocktail was undertaken. The amounts of enzymatic preparations were set to reduce their impact on the economy of the process; however, as expected, the highest amounts resulted in the best yields of glucose and other monomers. Interestingly, the highest solid to liquid content (11.5% on a dry basis) rendered the best yields. Additionally, in search of process economy with high yields, operational conditions were set: medium amounts of hemicellulases, polygalacturonases and β-glucosidases. Finally, fractal kinetic modelling of the results for all products from the saccharification process indicated very high activities resulting in the liberation of glucose, fructose and xylose, and very low activities towards arabinose and galactose. High activity on pectin was also observed but, for all monomers liberated initially at a fast rate, high hindrances appeared during the saccharification process.

  12. Design of a tomato packing system by image processing and optimization processing

    Science.gov (United States)

    Li, K.; Kumazaki, T.; Saigusa, M.

    2016-02-01

    In recent years, with the development of environmental control systems in plant factories, tomato production has rapidly increased in Japan. However, with the decline in the availability of agricultural labor, there is a need to automate grading, sorting and packing operations. In this research, we designed an automatic packing program with which tomato weight can be estimated by image processing and the tomatoes packed in an optimized configuration. The weight was estimated from the pixel area properties after an L*a*b* color model conversion, noise rejection, hole filling and boundary preprocessing. The packing optimization program was designed using a 0-1 knapsack algorithm for dynamic combinatorial optimization.
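The 0-1 knapsack step mentioned here is the textbook dynamic program. The tomato weights and box capacity below are made-up numbers for illustration; using weight as both "weight" and "value" packs a box as full as possible without exceeding capacity.

```python
def knapsack_01(weights, values, capacity):
    """Classic 0-1 knapsack DP; returns (best value, chosen item indices)."""
    n = len(weights)
    # dp[i][c] = best value using the first i items within capacity c
    dp = [[0] * (capacity + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for c in range(capacity + 1):
            dp[i][c] = dp[i - 1][c]
            if weights[i - 1] <= c:
                dp[i][c] = max(dp[i][c],
                               dp[i - 1][c - weights[i - 1]] + values[i - 1])
    # Backtrack to recover which items were chosen
    chosen, c = [], capacity
    for i in range(n, 0, -1):
        if dp[i][c] != dp[i - 1][c]:
            chosen.append(i - 1)
            c -= weights[i - 1]
    return dp[n][capacity], sorted(chosen)

# Hypothetical estimated tomato weights (g) packed into a 1000 g box
w = [320, 410, 280, 150, 390]
best, picked = knapsack_01(w, w, 1000)
```

For this instance the optimum is 990 g using tomatoes 0, 2 and 4; in practice the "values" could instead encode grading scores.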

  13. Novel Optimization Methodology for Welding Process/Consumable Integration

    Energy Technology Data Exchange (ETDEWEB)

    Quintana, Marie A; DebRoy, Tarasankar; Vitek, John; Babu, Suresh

    2006-01-15

    Advanced materials are being developed to improve the energy efficiency of many industries of future including steel, mining, and chemical, as well as, US infrastructures including bridges, pipelines and buildings. Effective deployment of these materials is highly dependent upon the development of arc welding technology. Traditional welding technology development is slow and often involves expensive and time-consuming trial and error experimentation. The reason for this is the lack of useful predictive tools that enable welding technology development to keep pace with the deployment of new materials in various industrial sectors. Literature reviews showed two kinds of modeling activities. Academic and national laboratory efforts focus on developing integrated weld process models by employing the detailed scientific methodologies. However, these models are cumbersome and not easy to use. Therefore, these scientific models have limited application in real-world industrial conditions. On the other hand, industrial users have relied on simple predictive models based on analytical and empirical equations to drive their product development. The scopes of these simple models are limited. In this research, attempts were made to bridge this gap and provide the industry with a computational tool that combines the advantages of both approaches. This research resulted in the development of predictive tools which facilitate the development of optimized welding processes and consumables. The work demonstrated that it is possible to develop hybrid integrated models for relating the weld metal composition and process parameters to the performance of welds. In addition, these tools can be deployed for industrial users through user friendly graphical interface. In principle, the welding industry users can use these modular tools to guide their welding process parameter and consumable composition selection. 
It is hypothesized that by expanding these tools throughout welding industry

  14. Optimal separation of jojoba protein using membrane processes

    Energy Technology Data Exchange (ETDEWEB)

    Nabetani, Hiroshi; Abbott, T.P.; Kleiman, R. [National Center for Agricultural Utilization Research, Peoria, IL (United States)

    1995-05-01

    The efficiency of a pilot-scale membrane system for purifying and concentrating jojoba protein was estimated. In this system, a jojoba extract was first clarified with a microfiltration membrane. The clarified extract was diafiltrated and the protein was purified with an ultrafiltration membrane. Then the protein solution was concentrated with the ultrafiltration membrane. Permeate flux during microfiltration was essentially independent of solids concentration in the feed, in contrast with the permeate flux during ultrafiltration, which was a function of protein concentration. Based on these results, a mathematical model which describes the batchwise concentration process with ultrafiltration membranes was developed. Using this model, the combination of batchwise concentration with diafiltration was optimized, and an industrial-scale process was designed. The effect of ethylenediaminetetraacetic acid (EDTA) on the performance of the membrane system was also investigated. The addition of EDTA increased the concentration of protein in the extract and improved the recovery of protein in the final products. The quality of the final product (color and solubility) was also improved. However, EDTA decreased permeate flux during ultrafiltration.

  15. Assessing and Optimizing Microarchitectural Performance of Event Processing Systems

    Science.gov (United States)

    Mendes, Marcelo R. N.; Bizarro, Pedro; Marques, Paulo

    Event Processing (EP) systems are being progressively used in business-critical applications in domains such as algorithmic trading, supply chain management, production monitoring, or fraud detection. To deal with high throughput and low response time requirements, these EP systems mainly use the CPU-RAM sub-system for data processing. However, as we show here, collected statistics on CPU usage or on CPU-RAM communication reveal that available systems are poorly optimized and grossly waste resources. In this paper we quantify some of these inefficiencies and propose cache-aware algorithms and changes to internal data structures to overcome them. We test the system before and after the changes at both the microarchitecture and application levels and show that: i) the changes improve microarchitecture metrics such as clocks-per-instruction, cache misses or TLB misses; and ii) some of these improvements result in very high application-level gains, such as a 44% improvement on stream-to-table joins with a 6-fold reduction in memory consumption, and an order-of-magnitude increase in throughput for moving aggregation operations.

  16. Optimizing the HIV/AIDS informed consent process in India

    Directory of Open Access Journals (Sweden)

    Shrotri A

    2004-08-01

    Background: While the basic ethical issues regarding consent may be universal to all countries, the consent procedures required by international review boards, which include detailed scientific and legal information, may not be optimal when administered within certain populations. The time and the technicalities of the process itself intimidate individuals in societies where literacy and awareness about medical and legal rights are low. Methods: In this study, we examined pregnant women's understanding of group education and counseling (GEC) about HIV/AIDS provided within an antenatal clinic in Maharashtra, India. We then enhanced the GEC process with the use of culturally appropriate visual aids and assessed the subsequent changes in women's understanding of informed consent issues. Results: We found the use of visual aids during group counseling sessions increased women's overall understanding of key issues regarding informed consent from 38% to 72%. Moreover, if these same visuals were reinforced during individual counseling, women's overall comprehension rose to 96%. Conclusions: This study demonstrates that complex constructs such as informed consent can be conveyed in populations with little education and within busy government hospital settings, and that the standard model may not be sufficient to ensure truly informed consent.

  17. Optimization of Composite Cloud Service Processing with Virtual Machines

    Energy Technology Data Exchange (ETDEWEB)

    Di, Sheng; Kondo, Derrick; Wang, Cho-Li

    2015-06-09

    By leveraging virtual machine (VM) technology, we optimize cloud system performance based on refined resource allocation in processing user requests with composite services. Our contribution is three-fold. (1) We devise a VM resource allocation scheme with minimized processing overhead for task execution. (2) We comprehensively investigate the best-suited task scheduling policy with different design parameters. (3) We also explore the best-suited resource sharing scheme with adjusted divisible resource fractions on running tasks in terms of the proportional-share model (PSM), which can be split into an absolute mode (called AAPSM) and a relative mode (RAPSM). We implement a prototype system over a cluster environment deployed with 56 real VM instances, and summarize valuable experience from our evaluation. When system resources are in short supply, lightest workload first (LWF) is mostly recommended because it can minimize the overall response extension ratio (RER) for both sequential-mode and parallel-mode tasks. In a competitive situation with over-commitment of resources, the best solution is combining LWF with both AAPSM and RAPSM. It outperforms other solutions in the competitive situation by 16+% with respect to the worst-case response time and by 7.4+% with respect to fairness.
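The lightest-workload-first policy and the response extension ratio (RER) can be sketched with a toy non-preemptive, single-server model. The task set is invented; RER here is the ratio of actual response time (queueing plus service) to the ideal no-queueing run time, which matches the intuition in the abstract, though the paper's exact definition may differ.

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    workload: float      # total work units required
    arrival: float = 0.0

def lwf_schedule(tasks, speed=1.0):
    """Serve tasks one at a time, lightest workload first.
    Returns {task name: response extension ratio}."""
    pending = sorted(tasks, key=lambda t: t.workload)   # lightest first
    clock, rer = 0.0, {}
    for t in pending:
        clock = max(clock, t.arrival) + t.workload / speed
        ideal = t.workload / speed                      # run time with no queueing
        rer[t.name] = (clock - t.arrival) / ideal
    return rer

tasks = [Task("A", 8.0), Task("B", 2.0), Task("C", 4.0)]
rer = lwf_schedule(tasks)
avg_rer = sum(rer.values()) / len(rer)
```

Serving light tasks first keeps many short waits instead of a few very long ones, which is why LWF tends to minimize the overall RER.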

  18. XFEL diffraction: developing processing methods to optimize data quality

    Energy Technology Data Exchange (ETDEWEB)

    Sauter, Nicholas K., E-mail: nksauter@lbl.gov [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States)

    2015-01-29

    Bragg spots recorded from a still crystal necessarily give partial measurements of the structure factor intensity. Correction to the full-spot equivalent, relying on both a physical model for crystal disorder and postrefinement of the crystal orientation, improves the electron density map in serial crystallography. Serial crystallography, using either femtosecond X-ray pulses from free-electron laser sources or short synchrotron-radiation exposures, has the potential to reveal metalloprotein structural details while minimizing damage processes. However, deriving a self-consistent set of Bragg intensities from numerous still-crystal exposures remains a difficult problem, with optimal protocols likely to be quite different from those well established for rotation photography. Here several data processing issues unique to serial crystallography are examined. It is found that the limiting resolution differs for each shot, an effect that is likely to be due to both the sample heterogeneity and pulse-to-pulse variation in experimental conditions. Shots with lower resolution limits produce lower-quality models for predicting Bragg spot positions during the integration step. Also, still shots by their nature record only partial measurements of the Bragg intensity. An approximate model that corrects to the full-spot equivalent (with the simplifying assumption that the X-rays are monochromatic) brings the distribution of intensities closer to that expected from an ideal crystal, and improves the sharpness of anomalous difference Fourier peaks indicating metal positions.

  19. Poling process optimization of piezo nano composite PZT/polimer

    Science.gov (United States)

    Ridlo, M. Rosyid; Lestari, Titik; Mardiyanto; Oemry, Achiar

    2013-09-01

    The objective of the poling process is to align the electric dipole directions inside the perovskite crystal of piezoelectric materials. In a simple arrangement, poling is carried out by applying a high electrical potential across the two sides of a piezo material. The more parallel the electric dipoles, the stronger the piezoelectric characteristics. The optimization involved control of temperature, holding time and the electrical voltage. The samples were prepared by the sol-gel method with the precursors tetrabutyl titanate Ti(OC4H9)4, zirconium nitrate Zr(NO3)4·5H2O, Pb(CH3COO)2·3H2O and ethylene glycol solution. The molar ratio was Pb:Zr:Ti = 1.1:0.52:0.48, accounting for Pb losses. The result of the sol-gel process is PZT nanopowder, which was then mixed with PVDF polymer and pressed at 10 MPa and 150 °C into samples 15 mm in diameter. After poling, the piezoelectric constant d33 was measured. The highest d33 = 45 pC/N was found at the poling parameters V = 5 kV/mm, T = 120 °C and holding time = 1 hour.

  20. A new model for anaerobic processes of up-flow anaerobic sludge blanket reactors based on cellular automata

    DEFF Research Database (Denmark)

    Skiadas, Ioannis V.; Ahring, Birgitte Kiær

    2002-01-01

    characteristics and lead to different reactor behaviour. A dynamic mathematical model has been developed for the anaerobic digestion of a glucose based synthetic wastewater in UASB reactors. Cellular automata (CA) theory has been applied to simulate the granule development process. The model takes...... into consideration that granule diameter and granule microbial composition are functions of the reactor operational parameters and is capable of predicting the UASB performance and the layer structure of the granules....
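The cellular-automaton idea behind such granule-development models can be illustrated with a deliberately simplified 2-D toy: cells attach probabilistically to the frontier of an occupied cluster growing from a seed. The grid size, attachment probability and step count are arbitrary; the paper's CA additionally tracks microbial composition and reactor operating parameters, which this sketch omits.

```python
import numpy as np

def grow_granule(radius=25, steps=20, p_attach=0.6, seed=0):
    """Toy 2-D cellular automaton: empty sites neighbouring the cluster
    become occupied with probability p_attach each step, crudely mimicking
    granule growth from a central seed cell."""
    rng = np.random.default_rng(seed)
    g = np.zeros((2 * radius + 1, 2 * radius + 1), dtype=bool)
    g[radius, radius] = True                        # initial seed cell
    for _ in range(steps):
        gi = g.astype(np.int8)
        # count occupied von Neumann neighbours of every site
        nb = (np.roll(gi, 1, 0) + np.roll(gi, -1, 0) +
              np.roll(gi, 1, 1) + np.roll(gi, -1, 1))
        frontier = (~g) & (nb > 0)                  # empty sites touching the cluster
        g |= frontier & (rng.random(g.shape) < p_attach)
    return g

g = grow_granule()
```

Tracking the occupied area over the steps gives an effective granule diameter as a function of "time", the quantity the UASB model couples to reactor operation.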

  1. Macro-cellular silica foams: synthesis during the natural creaming process of an oil-in-water emulsion.

    Science.gov (United States)

    Sen, T; Tiddy, G J T; Casci, J L; Anderson, M W

    2003-09-01

    The room-temperature synthesis of a macro-mesoporous silica material during the natural creaming process of an oil-in-water emulsion is reported. The material has 3-dimensional interconnected macropores with a strut-like structure similar to meso-cellular silica foams with mesoporous walls of worm-hole structure. The material has very high surface area (approximately 800 m2 g(-1)) with narrow mesopore size distribution.

  2. A key to success: optimizing the planning process

    Science.gov (United States)

    Turk, Huseyin; Karakaya, Kamil

    2014-05-01

    operation planning process is analyzed according to a comprehensive approach. The difficulties of planning are identified. Consequently, to optimize the decision-making process of an air operation, a planning process is identified within a virtual command and control structure.

  3. Optimizing Feature Construction Process for Dynamic Aggregation of Relational Attributes

    Directory of Open Access Journals (Sweden)

    Rayner Alfred

    2009-01-01

    Problem statement: The importance of input representation has long been recognized in machine learning. Feature construction is one of the methods used to generate relevant features for learning data. This study addressed the question of whether or not the descriptive accuracy of the DARA algorithm benefits from the feature construction process. In other words, this paper discusses the application of a genetic algorithm to optimize the feature construction process to generate input data for the data summarization method called Dynamic Aggregation of Relational Attributes (DARA). Approach: The DARA algorithm was designed to summarize data stored in non-target tables by clustering them into groups, where multiple records stored in non-target tables correspond to a single record stored in a target table. Here, feature construction methods are applied in order to improve the descriptive accuracy of the DARA algorithm. Since the study addressed the question of whether or not the descriptive accuracy of the DARA algorithm benefits from the feature construction process, the task involved solving the problem of constructing a relevant set of features for the DARA algorithm by using a genetic-based algorithm. Results: It is shown in the experimental results that the quality of summarized data is directly influenced by the methods used to create the patterns that represent records in the (n×p) TF-IDF weighted frequency matrix. The results of the evaluation of the genetic-based feature construction algorithm showed that the data summarization results can be improved by constructing features using the Cluster Entropy (CE) genetic-based feature construction algorithm. Conclusion: This study showed that the data summarization results can be improved by constructing features using the cluster entropy genetic-based feature construction algorithm.
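The cluster entropy that drives the CE fitness function can be computed as the class-label entropy within each cluster, weighted by cluster size; lower is purer. This is a generic formulation for illustration, so the exact weighting used in the paper may differ:

```python
import math
from collections import Counter

def cluster_entropy(cluster_labels, class_labels):
    """Size-weighted average entropy of class labels within each cluster;
    0.0 means every cluster is pure."""
    clusters = {}
    for c, y in zip(cluster_labels, class_labels):
        clusters.setdefault(c, []).append(y)
    n = len(class_labels)
    total = 0.0
    for members in clusters.values():
        counts = Counter(members)
        h = -sum((k / len(members)) * math.log2(k / len(members))
                 for k in counts.values())
        total += (len(members) / n) * h
    return total

# A pure clustering vs. a fully mixed one
pure = cluster_entropy([0, 0, 1, 1], ["a", "a", "b", "b"])   # -> 0.0
mixed = cluster_entropy([0, 0, 1, 1], ["a", "b", "a", "b"])  # -> 1.0
```

A genetic algorithm can then use this value (negated, or inverted) as the fitness of a candidate feature set.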

  4. Tips and step-by-step protocol for the optimization of important factors affecting cellular enzyme-linked immunosorbent assay (CELISA).

    Science.gov (United States)

    Morandini, R; Boeynaems, J M; Wérenne, J; Ghanem, G

    2001-01-01

    CELISA, or cellular enzyme-linked immunosorbent assay, is a powerful and easy to use technique to study cell surface antigens under different stimulations. Nevertheless, some factors must be discussed and optimized prior to reaching a reproducible CELISA. These include the choice of cell density, fixative agent, blocking agent, culture medium, optimal antibody dilutions, and incubation time. In this paper, we first present a short review of some references devoted to CELISA by means of a comparison of these parameters, followed by their description. Then, we describe and study these different parameters using practical examples comparing TNF-induced ICAM-1 expression as an end point, on HBL melanoma and HUVEC. These cell lines were also chosen because they differ in their ability to grow as discontinuous and continuous layers, respectively. Furthermore, we designed a comprehensive flow chart, as well as a complete step-by-step protocol for CELISA optimization.

  5. Process development in the QbD paradigm: Role of process integration in process optimization for production of biotherapeutics.

    Science.gov (United States)

    Rathore, Anurag S; Pathak, Mili; Godara, Avinash

    2016-03-01

    Biotherapeutics have become the focus of the pharmaceutical industry due to their proven effectiveness in managing complex diseases. Downstream processes of these molecules consist of several orthogonal, high resolution unit operations designed so as to be able to separate variants having very similar physicochemical properties. Typical process development involves optimization of the individual unit operations based on Quality by Design principles in order to define the design space within which the process can deliver product that meets the predefined specifications. However, limited efforts are dedicated to understanding the interactions between the unit operations. This paper aims to showcase the importance of understanding these interactions and thereby arrive at operating conditions that are optimal for the overall process. It is demonstrated that these are not necessarily same as those obtained from optimization of the individual unit operations. Purification of Granulocyte Colony Stimulating Factor (G-CSF), a biotherapeutic expressed in E. coli., has been used as a case study. It is evident that the suggested approach results in not only higher yield (91.5 vs. 86.4) but also improved product quality (% RP-HPLC purity of 98.3 vs. 97.5) and process robustness. We think that this paper is very relevant to the present times when the biotech industry is in the midst of implementing Quality by Design towards process development. © 2015 American Institute of Chemical Engineers Biotechnol. Prog., 32:355-362, 2016.

  6. QUALITY OF ACCOUNTING INFORMATION TO OPTIMIZE THE DECISIONAL PROCESS

    Directory of Open Access Journals (Sweden)

    Miculescu Marius Nicolae

    2012-12-01

    This article addresses businesses' need for managers to obtain accounting information that is relevant, reliable, clear and accurate at the lowest cost, in order to optimize decision making. This need derives from the current economic environment. The survival of organizations in a competitive environment, to which they must adapt, is conditioned by obtaining accounting information that is qualitative, opportune and vital, in a short time. This information relates to the patrimony, analytical results, the market (dynamics, dimensions, and structure), and relationships with business partners, competitors and suppliers. We therefore focus more intensely on the quality of accounting information. Defining the quality of accounting information, however, goes beyond the boundaries and features of the accounting communication process, and aims to determine "quality criteria" or "qualitative characteristics" in order to develop a measurement tool. Note that in the reviewed literature it was found that, between accounting normalization and doctrine, the criteria for defining the quality of accounting information are not identical; their selection and ranking differ. Theory and practice also identify the fact that information in itself is worthless; instead, it becomes valuable once it is used in a decisional process. Thus, the economic value of accounting information depends on the earnings obtained after making a decision, diminished by the information cost. To be more specific, it depends on the decision table or decision tree implemented, on the informational cost and on the optimality condition established by the decision maker (due to the fact that producing accounting information implies costs which are often considerable, while profits arise only from shares). The problem of convergence between the content and the users' interpretation of the information also arises, requiring the information to be intelligible. In this case, those who use it, the users, should have sufficient

  7. A MODEL SELECTION PROCEDURE IN MIXTURE-PROCESS EXPERIMENTS FOR INDUSTRIAL PROCESS OPTIMIZATION

    Directory of Open Access Journals (Sweden)

    Márcio Nascimento de Souza Leão

    2015-08-01

    Full Text Available We present a model selection procedure for use in Mixture and Mixture-Process Experiments. Certain combinations of restrictions on the proportions of the mixture components can result in a very constrained experimental region. This results in collinearity among the covariates of the model, which can make it difficult to fit the model using the traditional method based on the significance of the coefficients. For this reason, a model selection methodology based on information criteria will be proposed for process optimization. Two examples are presented to illustrate this model selection procedure.

  8. Distinct roles of specific fatty acids in cellular processes: implications for interpreting and reporting experiments

    OpenAIRE

    Watt, Matthew J.; Hoy, Andrew J.; Muoio, Deborah M.; Coleman, Rosalind A.

    2011-01-01

    Plasma contains a variety of long-chain fatty acids (FAs), such that about 35% are saturated and 65% are unsaturated. There are countless examples that show how different FAs impart specific and unique effects, or even opposing actions, on cellular function. Despite these differing effects, palmitate (C16:0) is regularly used to represent “FAs” in cell based experiments. Although palmitate can be useful to induce and study stress effects in cultured cells, these effects in isolation are not p...

  9. Chip Design Process Optimization Based on Design Quality Assessment

    Science.gov (United States)

    Häusler, Stefan; Blaschke, Jana; Sebeke, Christian; Rosenstiel, Wolfgang; Hahn, Axel

    2010-06-01

    Nowadays, managing product development projects is increasingly challenging. Especially the IC design of ASICs with both analog and digital components (mixed-signal design) is becoming more and more complex, while the time-to-market window narrows at the same time. Still, high quality standards must be fulfilled. Projects and their status are becoming less transparent due to this complexity, which makes the planning and execution of projects rather difficult. Therefore, there is a need for efficient project control. A main challenge is the objective evaluation of the current development status: Are all requirements successfully verified? Are all intermediate goals achieved? Companies often develop special solutions that are not reusable in other projects, which makes the quality measurement process itself less efficient and produces too much overhead. The method proposed in this paper is a contribution to solving these issues. It is applied at a German design house for analog mixed-signal IC design. This paper presents the results of a case study and introduces optimized project scheduling on the basis of quality assessment results.

  10. Performance monitoring and optimization of industrial processes [abstract

    Directory of Open Access Journals (Sweden)

    Sainlez, M.

    2010-01-01

    Full Text Available Data mining refers to extracting useful knowledge from large amounts of data. It is a result of the natural evolution of information technology and the development of recent algorithms. Starting from large databases, the main objective is to find interesting latent patterns. In the end, the quality of a model is assessed by its performance in predicting new observations. Bagging and boosting are general strategies for improving classifier and predictor accuracy. They are examples of ensemble methods, i.e. methods that use a combination of models. The bagging algorithm creates an ensemble of models (by bootstrap sampling) for a learning scheme where each model gives an equally-weighted prediction. In particular, random forests are a combination of tree predictors such that each tree depends on the values of a random vector sampled independently and with the same distribution for all trees in the forest. Internal estimates are also used to measure variable importance. Within the framework of a Kraft pulp mill, we analyze recovery boiler pollutants and steam production. This kind of boiler acts both as a high-pressure steam boiler and as a chemical reactor with reductive and oxidative zones. The steam is used in other mill processes and to run a steam turbine in order to produce electrical energy. Significant opportunities already exist to optimize this production and reduce atmospheric pollutants. Nowadays, random forest modeling is a promising way to achieve that.
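The bagging scheme described above — bootstrap samples, one model per sample, equally-weighted predictions — can be sketched with decision stumps standing in for the trees of a random forest. The step-shaped "boiler variable" data and the stump learner below are illustrative assumptions, not the mill's actual variables:

```python
import random
random.seed(0)

# toy data: a response that steps from 0 to 1 as one boiler variable passes 0.6
xs = [i / 100 for i in range(100)]
ys = [1.0 if x > 0.6 else 0.0 for x in xs]
data = list(zip(xs, ys))

def fit_stump(sample):
    # exhaustive search for the split threshold minimizing squared error
    best = None
    for t in sorted({x for x, _ in sample}):
        left = [y for x, y in sample if x <= t]
        right = [y for x, y in sample if x > t]
        if not left or not right:
            continue
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        err = (sum((y - ml) ** 2 for y in left)
               + sum((y - mr) ** 2 for y in right))
        if best is None or err < best[0]:
            best = (err, t, ml, mr)
    _, t, ml, mr = best
    return lambda x: ml if x <= t else mr

def bagged_predict(models, x):
    # equally-weighted ensemble prediction, as in bagging
    return sum(m(x) for m in models) / len(models)

# each stump is trained on its own bootstrap sample of the data
stumps = [fit_stump([random.choice(data) for _ in data]) for _ in range(25)]
print(round(bagged_predict(stumps, 0.9), 2))  # near 1.0
```

A random forest additionally randomizes the candidate variables at each split; the bootstrap-and-average skeleton is the same.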

  11. Process Optimization with Simulation Modeling in a Manufacturing System

    Directory of Open Access Journals (Sweden)

    Akbel Yildiz

    2011-04-01

    Full Text Available Computer simulation has become an important tool for modeling systems in the last ten years, owing to parallel improvements in computer technology. Companies turn to computer-based system modeling and simulation not to lose any extra income or time to their competitors, but to plan future investments while both sides have the same labor force, resources and technology. This study is an implementation at a machine spare parts manufacturing factory located in a city in Turkey. The purpose of the study is to increase utilization rates and optimize the manufacturing process to decrease production costs by identifying the bottlenecks in the manufacturing system. The ProModel simulation software was therefore used to model the production line of the factory. The production line consists of nineteen work stations and was modeled for the two most manufactured products. Manufacturing in the factory is organized in two-week batch production periods, and the simulation model was run and replicated ten times to obtain results. Statistics, including the existing capacity usage of the work stations along the whole production line, were thus obtained to identify the bottlenecks among the critical work stations and machines. Using the simulation model to create scenarios while varying the system parameters, and taking into account the cycle times of the work stations, the total production quantity, the batch sizes and the shifts of the factory, helped to make suggestions.
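The bottleneck logic behind such a simulation study can be illustrated without a full discrete-event model: the slowest station paces the line, and every other station's utilization follows from it. The five stations and cycle times below are invented for illustration (the actual line has nineteen stations):

```python
# hypothetical cycle times (minutes per part) for a 5-station line
cycle = {"saw": 2.0, "lathe": 3.5, "mill": 5.0, "drill": 1.5, "grind": 2.5}

bottleneck = max(cycle, key=cycle.get)      # slowest station paces the line
takt = cycle[bottleneck]                    # minutes per finished part
utilization = {s: c / takt for s, c in cycle.items()}

print(bottleneck, {s: round(u, 2) for s, u in utilization.items()})
```

A discrete-event model such as the ProModel one adds queues, breakdowns and shift calendars on top of this picture, which is why capacity-usage statistics rather than a simple ratio are needed in practice.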

  12. Process optimization in petrochemical industries; Otimizacao de processo nas industrias petroquimicas

    Energy Technology Data Exchange (ETDEWEB)

    Castro Filho, Paulo Farias; Chachamovitz, Joao Carlos

    1992-12-31

    The most recent and efficient technologies of simulation, modeling and process optimization are presented. Some practical problems are analyzed together with a methodology for applying the optimization technologies. (author) 2 refs., 7 figs., 2 tabs.

  13. Cellular modifications and interventions for the damaged heart

    NARCIS (Netherlands)

    Engels, M.C.

    2016-01-01

    The aim of this thesis was to explore cellular modification processes associated with heart disease, as well as harnessing its potential for treatment and prevention of detrimental electrophysiological consequences of heart disease. For regenerative cell replacement therapies, optimal differentiatio

  14. Optimized process parameters for fabricating metal particles reinforced 5083 Al composite by friction stir processing.

    Science.gov (United States)

    Bauri, Ranjit; Yadav, Devinder; Shyam Kumar, C N; Janaki Ram, G D

    2015-12-01

    Metal matrix composites (MMCs) exhibit improved strength but suffer from low ductility. Metal particles reinforcement can be an alternative to retain the ductility in MMCs (Bauri and Yadav, 2010; Thakur and Gupta, 2007) [1,2]. However, processing such composites by conventional routes is difficult. The data presented here relates to friction stir processing (FSP) that was used to process metal particles reinforced aluminum matrix composites. The data is the processing parameters, rotation and traverse speeds, which were optimized to incorporate Ni particles. A wide range of parameters covering tool rotation speeds from 1000 rpm to 1800 rpm and a range of traverse speeds from 6 mm/min to 24 mm/min were explored in order to get a defect free stir zone and uniform distribution of particles. The right combination of rotation and traverse speed was found from these experiments. Both as-received coarse particles (70 μm) and ball-milled finer particles (10 μm) were incorporated in the Al matrix using the optimized parameters.

  15. Optimized process parameters for fabricating metal particles reinforced 5083 Al composite by friction stir processing

    Directory of Open Access Journals (Sweden)

    Ranjit Bauri

    2015-12-01

    Full Text Available Metal matrix composites (MMCs) exhibit improved strength but suffer from low ductility. Metal particles reinforcement can be an alternative to retain the ductility in MMCs (Bauri and Yadav, 2010; Thakur and Gupta, 2007) [1,2]. However, processing such composites by conventional routes is difficult. The data presented here relates to friction stir processing (FSP) that was used to process metal particles reinforced aluminum matrix composites. The data is the processing parameters, rotation and traverse speeds, which were optimized to incorporate Ni particles. A wide range of parameters covering tool rotation speeds from 1000 rpm to 1800 rpm and a range of traverse speeds from 6 mm/min to 24 mm/min were explored in order to get a defect free stir zone and uniform distribution of particles. The right combination of rotation and traverse speed was found from these experiments. Both as-received coarse particles (70 μm) and ball-milled finer particles (10 μm) were incorporated in the Al matrix using the optimized parameters.

  16. FRIT OPTIMIZATION FOR SLUDGE BATCH PROCESSING AT THE DEFENSE WASTE PROCESSING FACILITY

    Energy Technology Data Exchange (ETDEWEB)

    Fox, K.

    2009-01-28

    The Savannah River National Laboratory (SRNL) Frit Development Team recommends that the Defense Waste Processing Facility (DWPF) utilize Frit 418 for initial processing of high level waste (HLW) Sludge Batch 5 (SB5). The extended SB5 preparation time and need for DWPF feed have necessitated the use of a frit that is already included on the DWPF procurement specification. Frit 418 has been used previously in vitrification of Sludge Batches 3 and 4. Paper study assessments predict that Frit 418 will form an acceptable glass when combined with SB5 over a range of waste loadings (WLs), typically 30-41% based on nominal projected SB5 compositions. Frit 418 has a relatively high degree of robustness with regard to variation in the projected SB5 composition, particularly when the Na2O concentration is varied. The acceptability (chemical durability) and model applicability of the Frit 418-SB5 system will be verified experimentally through a variability study, to be documented separately. Frit 418 has not been designed to provide an optimal melt rate with SB5, but is recommended for initial processing of SB5 until experimental testing to optimize a frit composition for melt rate can be completed. Melt rate performance cannot be predicted at this time and must be determined experimentally. Note that melt rate testing may either identify an improved frit for SB5 processing (one which produces an acceptable glass at a faster rate than Frit 418) or confirm that Frit 418 is the best option.

  17. Integrative analysis of large scale expression profiles reveals core transcriptional response and coordination between multiple cellular processes in a cyanobacterium

    Directory of Open Access Journals (Sweden)

    Bhattacharyya-Pakrasi Maitrayee

    2010-08-01

    Full Text Available Abstract Background Cyanobacteria are the only known prokaryotes capable of oxygenic photosynthesis. They play significant roles in global biogeochemical cycles and carbon sequestration, and have recently been recognized as potential vehicles for production of renewable biofuels. Synechocystis sp. PCC 6803 has been extensively used as a model organism for cyanobacterial studies. DNA microarray studies in Synechocystis have shown varying degrees of transcriptome reprogramming under altered environmental conditions. However, it is not clear from published work how transcriptome reprogramming affects pre-existing networks of fine-tuned cellular processes. Results We have integrated 163 transcriptome data sets generated in response to numerous environmental and genetic perturbations in Synechocystis. Our analyses show that a large number of genes, defined as the core transcriptional response (CTR), are commonly regulated under most perturbations. The CTR contains nearly 12% of Synechocystis genes found on its chromosome. The majority of genes in the CTR are involved in photosynthesis, translation, energy metabolism and stress protection. Our results indicate that a large number of differentially regulated genes identified in most reported studies in Synechocystis under different perturbations are associated with the general stress response. We also find that a majority of genes in the CTR are coregulated with 25 regulatory genes. Some of these regulatory genes have been implicated in cellular responses to oxidative stress, suggesting that reactive oxygen species are involved in the regulation of the CTR. A Bayesian network, based on the regulation of various KEGG pathways determined from the expression patterns of their associated genes, has revealed new insights into the coordination between different cellular processes. Conclusion We provide here the first integrative analysis of transcriptome data sets generated in a cyanobacterium. This

  18. Optimizing process time of laser drilling processes in solar cell manufacturing by coaxial camera control

    Science.gov (United States)

    Jetter, Volker; Gutscher, Simon; Blug, Andreas; Knorz, Annerose; Ahrbeck, Christopher; Nekarda, Jan; Carl, Daniel

    2014-03-01

    In emitter wrap through (EWT) solar cells, laser drilling is used to increase the light sensitive area by removing emitter contacts from the front side of the cell. For a cell area of 156 x 156 mm2, about 24000 via-holes with a diameter of 60 μm have to be drilled into silicon wafers with a thickness of 200 μm. The processing time of 10 to 20 s is determined by the number of laser pulses required for safely opening every hole on the bottom side. Therefore, the largest wafer thickness occurring in a production line defines the processing time. However, wafer thickness varies by roughly +/-20 %. To reduce the processing time, a coaxial camera control system was integrated into the laser scanner. It observes the bottom breakthrough from the front side of the wafer by measuring the process emissions of every single laser pulse. To achieve the frame rates and latency times required by the repetition rate of the laser (10 kHz), a camera based on cellular neural networks (CNN) was used where the images are processed directly on the camera chip by 176 x 144 sensor-processor-elements. One image per laser pulse is processed within 36 μs corresponding to a maximum pulse rate of 25 kHz. The laser is stopped when all of the holes are open on the bottom side. The result is a quality control system in which the processing time of a production line is defined by average instead of maximum wafer thickness.
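The paper's central saving — processing time set by the average rather than the maximum wafer thickness — can be estimated with a small Monte Carlo sketch. The pulses-per-micrometre figure and the uniform ±20 % thickness spread below are assumptions for illustration, not measured values:

```python
import random
random.seed(1)

PULSES_PER_UM = 0.05  # hypothetical pulses needed per micrometre of silicon
# wafer thicknesses: nominal 200 um with roughly +/-20 % variation
wafers = [200 * (1 + random.uniform(-0.2, 0.2)) for _ in range(1000)]

# open-loop drilling: every hole gets enough pulses for the thickest wafer
worst_case = max(wafers) * PULSES_PER_UM

# camera-controlled drilling: the laser stops once each hole breaks through,
# so the pulse budget tracks the actual thickness of each wafer
closed_loop = sum(w * PULSES_PER_UM for w in wafers) / len(wafers)

print(round(worst_case / closed_loop, 2))  # roughly 1.2x fewer pulses on average
```

The ratio is simply max/mean thickness, which is why a ±20 % spread translates into roughly a 20 % processing-time reduction for the controlled process.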

  19. Surface charge and cellular processing of covalently functionalized multiwall carbon nanotubes determine pulmonary toxicity.

    Science.gov (United States)

    Li, Ruibin; Wang, Xiang; Ji, Zhaoxia; Sun, Bingbing; Zhang, Haiyuan; Chang, Chong Hyun; Lin, Sijie; Meng, Huan; Liao, Yu-Pei; Wang, Meiying; Li, Zongxi; Hwang, Angela A; Song, Tze-Bin; Xu, Run; Yang, Yang; Zink, Jeffrey I; Nel, André E; Xia, Tian

    2013-03-26

    Functionalized carbon nanotubes (f-CNTs) are being produced in increased volume because of the ease of dispersion and maintenance of the pristine material physicochemical properties when used in composite materials as well as for other commercial applications. However, the potential adverse effects of f-CNTs have not been quantitatively or systematically explored. In this study, we used a library of covalently functionalized multiwall carbon nanotubes (f-MWCNTs), established from the same starting material, to assess the impact of surface charge in a predictive toxicological model that relates the tubes' pro-inflammatory and pro-fibrogenic effects at cellular level to the development of pulmonary fibrosis. Carboxylate (COOH), polyethylene glycol (PEG), amine (NH2), sidewall amine (sw-NH2), and polyetherimide (PEI)-modified MWCNTs were successfully established from raw or as-prepared (AP-) MWCNTs and comprehensively characterized by TEM, XPS, FTIR, and DLS to obtain information about morphology, length, degree of functionalization, hydrodynamic size, and surface charge. Cellular screening in BEAS-2B and THP-1 cells showed that, compared to AP-MWCNTs, anionic functionalization (COOH and PEG) decreased the production of pro-fibrogenic cytokines and growth factors (including IL-1β, TGF-β1, and PDGF-AA), while neutral and weak cationic functionalization (NH2 and sw-NH2) showed intermediary effects. In contrast, the strongly cationic PEI-functionalized tubes induced robust biological effects. These differences could be attributed to differences in cellular uptake and NLRP3 inflammasome activation, which depends on the propensity toward lysosomal damage and cathepsin B release in macrophages. Moreover, the in vitro hazard ranking was validated by the pro-fibrogenic potential of the tubes in vivo. Compared to pristine MWCNTs, strong cationic PEI-MWCNTs induced significant lung fibrosis, while carboxylation significantly decreased the extent of pulmonary fibrosis. These

  20. Differentiation of cellular processes involved in the induction and maintenance of stimulated neutrophil adherence.

    Science.gov (United States)

    English, D; Gabig, T G

    1986-05-01

    Neutrophil adherence stimulated by phorbol myristate acetate (PMA) was investigated by quantitating the attachment of 51Cr-labeled neutrophils to plastic surfaces and to the endothelium of umbilical veins mounted in compartmentalized Lucite chambers. PMA-induced adherence could be functionally separated into an induction phase requiring cellular metabolism and a Mg++ dependent maintenance phase that was independent of cellular metabolism. Thus, metabolic inhibitors (N-ethylmaleimide, 2-deoxyglucose) blocked adherence when added to neutrophils prior to PMA, but did not cause detachment of cells adhering as a consequence of prior exposure to PMA. PMA failed to induce adherence of neutrophils incubated at low (0.4 degree C) temperature, but temperature reduction, even for prolonged periods, did not cause detachment of adherent cells. Thus, the attractive forces that mediate stimulated adherence persist independently of any sustained metabolic response to the inducing stimulus. However, removal of Mg++ from the media above adherent cells resulted in immediate detachment, indicating that the cation was required for the persistent expression or maintenance of the attractive forces involved. The extent of stimulated adherence correlated well with the extent of degranulation when rates were varied by limiting the incubation time or stimulus concentration. This correlation was not absolute; in the absence of Mg++, PMA induced degranulation normally but failed to enhance adherence. To explain these findings, we investigated the possibility that PMA-stimulated adherence was maintained by Mg++-dependent cellular adherence molecules released during exocytosis. Supernatants of stimulated neutrophils were devoid of adherence-promoting activity, and only weak activity was recovered in supernatants of mechanically disrupted neutrophils. PMA effectively stimulated the tight adherence of degranulated neutrophil cytoplasts to plastic surfaces and did so in the absence of stimulated

  1. Potential and challenges in home care service process optimization : a route optimization approach

    OpenAIRE

    Nakari, Pentti J. E.

    2016-01-01

    Aging of the population is an increasing problem in many countries, including Finland, and it poses a challenge to public services such as home care. Vehicle routing optimization (VRP) type optimization solutions are one possible way to decrease the time required for planning home visits and driving to customer addresses, as well as decreasing transportation costs. Although VRP optimization is widely and successfully applied to commercial and industrial logistics, the home care ...
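A minimal VRP-style heuristic of the kind such planning tools build on is nearest-neighbour routing: from the current position, always drive to the closest unvisited customer. The depot and customer coordinates below are invented, and production solvers add time windows, care-worker skills and multiple vehicles on top of this:

```python
import math

# hypothetical customer addresses as (x, y) coordinates; depot at the origin
stops = [(2, 3), (5, 1), (1, 7), (6, 4), (3, 8)]

def nearest_neighbour_route(depot, stops):
    # greedy heuristic: always drive to the closest unvisited stop
    route, here, todo = [depot], depot, list(stops)
    while todo:
        nxt = min(todo, key=lambda p: math.dist(here, p))
        todo.remove(nxt)
        route.append(nxt)
        here = nxt
    route.append(depot)  # return to base
    return route

def length(route):
    # total driving distance along the route
    return sum(math.dist(a, b) for a, b in zip(route, route[1:]))

route = nearest_neighbour_route((0, 0), stops)
print(route, round(length(route), 1))
```

Even this greedy pass usually beats visiting customers in arrival order, which is the intuition behind applying VRP methods to home-care visit planning.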

  2. Modeling of the inhomogeneity of grain refinement during combined metal forming process by finite element and cellular automata methods

    Energy Technology Data Exchange (ETDEWEB)

    Majta, Janusz; Madej, Łukasz; Svyetlichnyy, Dmytro S.; Perzyński, Konrad; Kwiecień, Marcin, E-mail: mkwiecie@agh.edu.pl; Muszka, Krzysztof

    2016-08-01

    The potential of the discrete cellular automata technique to predict the grain refinement in wires produced using a combined metal forming process is presented and discussed within the paper. The developed combined metal forming process can be treated as one of the Severe Plastic Deformation (SPD) techniques and consists of three different modes of deformation: asymmetric drawing with bending, namely accumulated angular drawing (AAD), wire drawing (WD) and wire flattening (WF). To accurately replicate the complex stress state both at macro and micro scales during subsequent deformations, a two stage modeling approach was used. First, the Finite Element Method (FEM), implemented in the commercial ABAQUS software, was applied to simulate the entire combined forming process at the macro scale level. Then, based on the FEM results, the Cellular Automata (CA) method was applied to simulate grain refinement at the microstructure level. Data transferred between the FEM and CA methods included a set of files with strain tensor components obtained from selected integration points in the macro scale model. As a result of the CA simulation, detailed information on microstructure evolution during severe plastic deformation conditions was obtained, namely: changes of shape and sizes of the modeled representative volume with imposed microstructure, changes of the number of grains, subgrains and dislocation cells, development of the grain boundary angle distribution as well as changes in the pole figures. To evaluate the CA model's predictive capabilities, the results of computer simulation were compared with scanning electron microscopy and electron back scattered diffraction (SEM/EBSD) studies of samples after the AAD+WD+WF process.
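The FEM-to-CA hand-off can be caricatured in a few lines: strain increments (which in the paper come from ABAQUS integration points) accumulate per cell until a cell nucleates a new, refined grain. Everything below — the 1-D lattice, the random strain increments and the critical strain value — is an illustrative assumption, not the authors' model:

```python
import random
random.seed(3)

N_CELLS = 30
CRITICAL_STRAIN = 1.0       # assumed threshold for grain refinement

cells = [0] * N_CELLS       # grain id per cell; one initial grain everywhere
strain = [0.0] * N_CELLS    # accumulated equivalent strain per cell
next_id = 1

for deformation_pass in range(40):
    for i in range(N_CELLS):
        # stand-in for the strain tensor data transferred from the FEM stage
        strain[i] += random.uniform(0.0, 0.05)
        if strain[i] >= CRITICAL_STRAIN:
            cells[i] = next_id   # nucleate a new, refined grain
            next_id += 1
            strain[i] = 0.0

print(len(set(cells)))  # grain count grows as deformation accumulates
```

The real CA additionally tracks grain shapes, subgrains, dislocation cells and boundary misorientations; the point here is only the per-cell accumulate-and-refine loop driven by externally supplied strain.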

  3. Process Optimization of Bismaleimide (BMI) Resin Infused Carbon Fiber Composite

    Science.gov (United States)

    Ehrlich, Joshua W.; Tate, LaNetra C.; Cox, Sarah B.; Taylor, Brian J.; Wright, M. Clara; Caraccio, Anne J.; Sampson, Jeffery W.

    2013-01-01

    Bismaleimide (BMI) resins are an attractive new addition to world-wide composite applications. This type of thermosetting polyimide provides several unique characteristics such as excellent physical property retention at elevated temperatures and in wet environments, constant electrical properties over a vast array of temperature settings, and nonflammability properties as well. This makes BMI a popular choice in advanced composites and electronics applications [1]. Bismaleimide-2 (BMI-2) resin was used to infuse intermediate modulus 7 (IM7) based carbon fiber. Two panel configurations consisting of 4 plies with [+45deg, 90deg]2 and [0deg]4 orientations were fabricated. For tensile testing, a [90deg]4 configuration was tested by rotating the [0deg]4 configuration to lie orthogonal with the load direction of the test fixture. Curing of the BMI-2/IM7 system utilized an optimal infusion process which focused on the integration of the manufacturer-recommended ramp rates, hold times, and cure temperatures. Completion of the cure cycle for the BMI-2/IM7 composite yielded a product with multiple surface voids determined through visual and metallographic observation. Although the curing cycle was the same for the three panel layups, the surface voids that remained within the material post-cure were different in abundance, shape, and size. For tensile testing, the [0deg]4 layup had a 19.9% and 21.7% greater average tensile strain performance compared to the [90deg]4 and [+45deg, 90deg, 90deg,-45deg] layups, respectively, at failure. For tensile stress performance, the [0deg]4 layup had a 5.8% and 34.0% greater average performance than the [90deg]4 and [+45deg, 90deg, 90deg,-45deg] layups.

  4. Beta Cell Formation in vivo Through Cellular Networking, Integration and Processing (CNIP) in Wild Type Adult Mice.

    Science.gov (United States)

    Doiron, Bruno; Hu, Wenchao; DeFronzo, Ralph A

    2016-01-01

    Insulin replacement therapy is essential in type 1 diabetic individuals and is required in ~40-50% of type 2 diabetics during their lifetime. Prior attempts at beta cell regeneration have relied upon pancreatic injury to induce beta cell proliferation, dedifferentiation and activation of the embryonic pathway, or stem cell replacement. We report an alternative method to transform adult non-stem (somatic) cells into pancreatic beta cells. The Cellular Networking, Integration and Processing (CNIP) approach targets cellular mechanisms involved in pancreatic function in the organ's adult state and utilizes a synergistic mechanism that integrates three important levels of cellular regulation to induce beta cell formation: (i) glucose metabolism, (ii) membrane receptor function, and (iii) gene transcription. The aim of the present study was to induce pancreatic beta cell formation in vivo in adult animals without stem cells and without dedifferentiating cells to recapitulate the embryonic pathway as previously published (1-3). Our results employing CNIP demonstrate that: (i) insulin secreting cells can be generated in adult pancreatic tissue in vivo and circumvent the problem of generating endocrine (glucagon and somatostatin) cells that exert deleterious effects on glucose homeostasis, and (ii) long-term normalization of glucose tolerance and insulin secretion can be achieved in a wild type diabetic mouse model. The CNIP cocktail has the potential to be used as a preventative or therapeutic treatment or cure for both type 1 and type 2 diabetes.

  5. Optimal Control of Beer Fermentation Process Using Differential ...

    African Journals Online (AJOL)

    ADOWIE PERE

    process operation. Fermentation processes are increasingly used in industries, laboratories and locally for the … (abstract available only in fragments; Fig. 1: The Model Diagram for the Fermentation Process)

  6. Identification of genes that regulate multiple cellular processes/responses in the context of lipotoxicity to hepatoma cells

    Directory of Open Access Journals (Sweden)

    Yedwabnick Matthew

    2007-10-01

    Full Text Available Abstract Background In order to devise efficient treatments for complex, multi-factorial diseases, it is important to identify the genes which regulate multiple cellular processes. Exposure to elevated levels of free fatty acids (FFAs) and tumor necrosis factor alpha (TNF-α) alters multiple cellular processes, causing lipotoxicity. Intracellular lipid accumulation has been shown to reduce the lipotoxicity of saturated FFA. We hypothesized that the genes which simultaneously regulate lipid accumulation as well as cytotoxicity may provide better targets to counter lipotoxicity of saturated FFA. Results As a model system to test this hypothesis, human hepatoblastoma cells (HepG2) were exposed to elevated physiological levels of FFAs and TNF-α. Triglyceride (TG) accumulation, toxicity and the genomic responses to the treatments were measured. Here, we present a framework to identify such genes in the context of lipotoxicity. The aim of the current study is to identify the genes that could be altered to treat or ameliorate the cellular responses affected by a complex disease rather than to identify the causal genes. Genes that regulate the TG accumulation, cytotoxicity or both were identified by a modified genetic algorithm partial least squares (GA/PLS) analysis. The analyses identified NADH dehydrogenase and mitogen activated protein kinases (MAPKs) as important regulators of both cytotoxicity and lipid accumulation in response to FFA and TNF-α exposure. In agreement with the predictions, inhibiting NADH dehydrogenase and c-Jun N-terminal kinase (JNK) reduced cytotoxicity significantly and increased intracellular TG accumulation. Inhibiting another MAPK pathway, the extracellular signal regulated kinase (ERK), on the other hand, improved the cytotoxicity without changing TG accumulation. Much greater reduction in the toxicity was observed upon inhibiting the NADH dehydrogenase and MAPK (which were identified by the dual-response analysis, than for the

  7. RPE Query Processing and Optimization Techniques for XML Databases

    Institute of Scientific and Technical Information of China (English)

    Guo-Ren Wang; Bing Sun; Jian-Hua Lv; Ge Yu

    2004-01-01

    An extent join to compute path expressions containing parent-children and ancestor-descendant operations and two path expression optimization rules, path-shortening and path-complementing, are presented in this paper. Path-shortening reduces the number of joins by shortening the path while path-complementing optimizes the path execution by using an equivalent complementary path expression to compute the original one. Experimental results show that the algorithms proposed are more efficient than traditional algorithms.
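The path-shortening idea can be illustrated with a region-encoded extent join: each element extent is a list of (start, end) intervals, an ancestor-descendant step is an interval-containment join, and shortening a path such as //book//author//name to a single book-name join removes one join whenever the schema guarantees that every name sits inside an author. The toy extents below are invented, not from the paper:

```python
# region-encoded extents: element name -> list of (start, end) intervals
extents = {
    "book":   [(1, 10), (11, 20)],
    "author": [(4, 7), (14, 17)],
    "name":   [(5, 6), (15, 16)],
}

def extent_join(ancestors, descendants):
    # ancestor-descendant structural join via interval containment
    return [(a, d) for a in ancestors for d in descendants
            if a[0] < d[0] and d[1] < a[1]]

# the full path //book//author//name costs two structural joins...
step1 = [d for _, d in extent_join(extents["book"], extents["author"])]
long_path = [d for _, d in extent_join(step1, extents["name"])]

# ...path-shortening evaluates it with a single book-name join
short_path = [d for _, d in extent_join(extents["book"], extents["name"])]
print(short_path == long_path)  # True under the containment assumption
```

Path-complementing is the dual trick: when a complementary path is cheaper to evaluate, the original result is computed from it instead.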

  8. Cellular differentiation in the process of generation of the eukaryotic cell

    Science.gov (United States)

    Nakamura, Hakobu; Hase, Atsushi

    1990-11-01

    The primitive atmosphere of the earth did not contain oxygen gas (O2) when the proto-cells were successfully generated as the result of chemical evolution and then evolved. Therefore, they first acquired an anaerobic energy metabolism, fermentation. Cellular metabolisms have often been formed by reorganization, combining or recombining pre-existing metabolisms and newly arisen bioreactions. Photosynthetic metabolism in the eukaryotic chloroplast consists of an electron-transfer photosystem and a fermentative reductive pentose phosphate cycle. On the other hand, the O2-respiration of the eukaryotic mitochondrion is made up of the Embden-Meyerhof (EM) pathway and the tricarboxylic acid cycle, which originate from a connection of fermentative metabolisms, and an electron-transfer respiratory chain, which has been derived from the photosystem. These metabolisms are already complete in some evolved prokaryotes, for example the cyanobacterium Chlorogloea fritschii and the aerobic photosynthetic bacteria Rhodospirillum rubrum and Erythrobacter sp. Therefore, it can be reasonably presumed that the eukaryotic chloroplast and mitochondrion were once formed as the result of metabolic (and genetic) differentiation in a highly evolved cyanobacterium. Symbiotic theory has explained the origin of the eukaryotic cell as one in which the mitochondrion and chloroplast have been derived from endosymbionts of an aerobic bacterium and a cyanobacterium, respectively, and has cited as one of its most potent supporting lines of evidence that the amino acid sequences of the photosynthetic and O2-respiratory enzymes show similarities to the corresponding prokaryotic enzymes. However, as will be shown in this discussion, many examples have currently shown that prokaryotic sequences of informative molecules are well conserved not only in the mitochondrial and chloroplast molecules but also in the nuclear molecules.
In fact, the similarities in sequence of informative molecules are preserved well among the organisms not only

  9. Long-Term Calorie Restriction Enhances Cellular Quality-Control Processes in Human Skeletal Muscle

    Directory of Open Access Journals (Sweden)

    Ling Yang

    2016-01-01

    Full Text Available Calorie restriction (CR) retards aging, acts as a hormetic intervention, and increases serum corticosterone and HSP70 expression in rodents. However, less is known regarding the effects of CR on these factors in humans. Serum cortisol and molecular chaperones and autophagic proteins were measured in the skeletal muscle of subjects on CR diets for 3–15 years and in control volunteers. Serum cortisol was higher in the CR group than in age-matched sedentary and endurance athlete groups (15.6 ± 4.6 ng/dl versus 12.3 ± 3.9 ng/dl and 11.2 ± 2.7 ng/dl, respectively; p ≤ 0.001). HSP70, Grp78, beclin-1, and LC3 mRNA and/or protein levels were higher in the skeletal muscle of the CR group compared to controls. Our data indicate that CR in humans is associated with sustained rises in serum cortisol, reduced inflammation, and increases in key molecular chaperones and autophagic mediators involved in cellular protein quality control and removal of dysfunctional proteins and organelles.

  10. Cellular and molecular processes of regeneration, with special emphasis on fish fins.

    Science.gov (United States)

    Nakatani, Yuki; Kawakami, Atsushi; Kudo, Akira

    2007-02-01

    The phenomenon of 'epimorphic regeneration', a complete reformation of lost tissues and organs from adult differentiated cells, has fascinated biologists for many years. While most vertebrate species, including humans, do not have a remarkable ability for regeneration, lower vertebrates such as urodeles and fish have exceptionally high regenerative abilities. In particular, among vertebrate species the teleost fish has a high ability to regenerate a variety of tissues and organs, including scales, muscles, spinal cord and heart. Hence, an understanding of the regeneration mechanism in teleosts will provide an essential knowledge base for rational approaches to tissue and organ regeneration in mammals. In the last decade, small teleost fish such as the zebrafish and medaka have emerged as powerful animal models in which a variety of developmental, genetic and molecular approaches are applicable. In addition, rapid progress in the development of genome resources such as expressed sequence tags and genome sequences has accelerated the speed of the molecular analysis of regeneration. This review summarizes the current status of our understanding of the cellular and molecular basis of regeneration, particularly that regarding fish fins.

  11. Evolution and regulation of cellular periodic processes: a role for paralogues

    DEFF Research Database (Denmark)

    Trachana, Kalliopi; Jensen, Lars Juhl; Bork, Peer

    2010-01-01

    Several cyclic processes take place within a single organism. For example, the cell cycle is coordinated with the 24 h diurnal rhythm in animals and plants, and with the 40 min ultradian rhythm in budding yeast. To examine the evolution of periodic gene expression during these processes, we...

  12. OPTIMIZATION OF PROCESSING PARAMETERS DURING ISM PROCESS OF Ti-15-3

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    Based on the direct finite difference method, a numerical model for simulating the temperature field in the charge during the Induction Skull Melting (ISM) process has been developed. Using the simulation program, the temperature field in the charge of Ti-15V-3Cr-3Sn-3Al (Ti-15-3) has been calculated under various melting powers and charge weights. Furthermore, the relationship between the ultimate temperature in the melt and the melting power and charge weight has been established. On the basis of this relationship, the parameters of the ISM process for Ti-15-3 (the principle is adaptable to other titanium alloys) can be optimized; consequently, considerable manpower and expense would be saved and the quality of the melt would be improved.
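
    The explicit finite-difference scheme named above can be illustrated with a toy 1D version. Everything here is a hypothetical stand-in for the paper's 3D ISM model: the geometry, material constants, the uniform volumetric heating term, and the cold-wall boundary condition are invented for illustration only.

```python
import numpy as np

def simulate_temperature(n=50, dx=0.01, alpha=9e-6, power_density=1e5,
                         rho_cp=2.8e6, steps=2000):
    """Explicit 1D finite-difference sketch of heating a charge.

    Illustrative only: all parameter values are hypothetical, and the
    induction heating is reduced to a uniform volumetric source.
    """
    dt = 0.4 * dx**2 / alpha               # respect stability limit dt <= dx^2/(2*alpha)
    T = np.full(n, 300.0)                  # initial temperature, K
    for _ in range(steps):
        lap = np.zeros(n)
        lap[1:-1] = (T[2:] - 2*T[1:-1] + T[:-2]) / dx**2
        T = T + dt * (alpha * lap + power_density / rho_cp)
        T[0] = T[-1] = 300.0               # water-cooled crucible walls held cold
    return T

T = simulate_temperature()
print(round(T.max(), 1))                   # peak temperature in the charge interior
```

    Sweeping `power_density` over a range of values is the 1D analogue of the paper's parametric study relating melting power to the ultimate melt temperature.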

  13. Metamodeling and Optimization of a Blister Copper Two-Stage Production Process

    Science.gov (United States)

    Jarosz, Piotr; Kusiak, Jan; Małecki, Stanisław; Morkisz, Paweł; Oprocha, Piotr; Pietrucha, Wojciech; Sztangret, Łukasz

    2016-06-01

    It is often difficult to estimate parameters for a two-stage production process of blister copper (containing 99.4 wt.% of Cu metal) as well as those for most industrial processes with high accuracy, which leads to problems related to process modeling and control. The first objective of this study was to model flash smelting and converting of Cu matte stages using three different techniques: artificial neural networks, support vector machines, and random forests, which utilized noisy technological data. Subsequently, more advanced models were applied to optimize the entire process (which was the second goal of this research). The obtained optimal solution was a Pareto-optimal one because the process consisted of two stages, making the optimization problem a multi-criteria one. A sequential optimization strategy was employed, which aimed for optimal control parameters consecutively for both stages. The obtained optimal output parameters for the first smelting stage were used as input parameters for the second converting stage. Finally, a search for another optimal set of control parameters for the second stage of a Kennecott-Outokumpu process was performed. The optimization process was modeled using a Monte-Carlo method, and both modeling parameters and computed optimal solutions are discussed.
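
    The sequential two-stage strategy described above can be sketched as follows. The stage response functions and the quadratic surrogate are hypothetical stand-ins for the plant data and the ANN/SVM/random-forest metamodels used in the study; only the structure (fit a metamodel to noisy data, optimize stage 1, feed its output into stage 2) is retained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stage models: each maps a control setting to an output quality.
def smelting(u):             # stage 1 (flash smelting), "true" response, unknown to us
    return (u - 2.0)**2 + 1.0

def converting(matte, v):    # stage 2 (converting), depends on the stage-1 output
    return (v - matte)**2 + 0.5 * matte

# 1) Build noisy "plant data" and fit a quadratic surrogate for stage 1.
u_data = rng.uniform(0, 4, 60)
y_data = smelting(u_data) + rng.normal(0, 0.1, 60)
coeffs = np.polyfit(u_data, y_data, 2)           # metamodel stand-in

# 2) Sequential strategy: optimize stage 1 on the surrogate first...
u_grid = np.linspace(0, 4, 401)
u_star = u_grid[np.argmin(np.polyval(coeffs, u_grid))]
matte_star = smelting(u_star)                    # stage-1 optimum feeds stage 2

# 3) ...then optimize stage 2 given the stage-1 output.
v_grid = np.linspace(0, 6, 601)
v_star = v_grid[np.argmin(converting(matte_star, v_grid))]
print(round(u_star, 2), round(v_star, 2))
```

    Repeating step 3 for several stage-1 candidates, instead of only the single optimum, is what turns the chained result into a Pareto front for the two-criteria problem.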

  14. Analysis of Process Parameters for Optimization of Plastic Extrusion in Pipe Manufacturing

    Directory of Open Access Journals (Sweden)

    Mr. Sandip S. Gadekar

    2015-05-01

    Full Text Available The objective of this paper is to study the defects in plastic pipe and to optimize the plastic pipe manufacturing process. It is essential to understand the process parameters and the defects in the plastic pipe manufacturing process in order to optimize it. The Taguchi technique is used in this paper for the optimization. Shivraj HY-Tech Drip Irrigation pipe manufacturing company was selected for the research work. This paper specifically addresses optimization of the current process. The experiment was analyzed using the commercial Minitab 16 software, interpretations were made, and optimized factor settings were chosen. After prediction of the result, the quality loss was calculated and compared with that before the implementation of DOE. The research work improves production and quality, and optimizes the process.
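
    The core of a Taguchi analysis is ranking factor levels by signal-to-noise (S/N) ratio. A minimal sketch with made-up responses follows; the two factors, their levels, and the replicate values are invented for illustration and are not the study's data.

```python
import numpy as np

# Hypothetical experiment: 2 factors (temperature, speed) at 2 levels each,
# every run replicated twice; responses are pipe-quality scores (higher = better).
runs = {            # (temp_level, speed_level): replicate responses
    (1, 1): [78, 80],
    (1, 2): [85, 86],
    (2, 1): [70, 72],
    (2, 2): [79, 81],
}

def sn_larger_is_better(ys):
    """Taguchi larger-is-better S/N ratio: -10*log10(mean(1/y^2))."""
    ys = np.asarray(ys, float)
    return -10.0 * np.log10(np.mean(1.0 / ys**2))

sn = {k: sn_larger_is_better(v) for k, v in runs.items()}

# Average S/N at each level of each factor; the best level maximizes S/N.
def level_means(factor_index):
    return {lvl: np.mean([s for k, s in sn.items() if k[factor_index] == lvl])
            for lvl in (1, 2)}

best = tuple(max(level_means(i), key=level_means(i).get) for i in (0, 1))
print(best)         # recommended (temperature level, speed level)
```

    Minitab's Taguchi module automates exactly this level-mean tabulation, plus the quality-loss calculation mentioned in the abstract.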

  15. Diurnal Regulation of Cellular Processes in the Cyanobacterium Synechocystis sp. Strain PCC 6803: Insights from Transcriptomic, Fluxomic, and Physiological Analyses.

    Science.gov (United States)

    Saha, Rajib; Liu, Deng; Hoynes-O'Connor, Allison; Liberton, Michelle; Yu, Jingjie; Bhattacharyya-Pakrasi, Maitrayee; Balassy, Andrea; Zhang, Fuzhong; Moon, Tae Seok; Maranas, Costas D; Pakrasi, Himadri B

    2016-05-03

    Synechocystis sp. strain PCC 6803 is the most widely studied model cyanobacterium, with a well-developed omics level knowledgebase. Like the lifestyles of other cyanobacteria, that of Synechocystis PCC 6803 is tuned to diurnal changes in light intensity. In this study, we analyzed the expression patterns of all of the genes of this cyanobacterium over two consecutive diurnal periods. Using stringent criteria, we determined that the transcript levels of nearly 40% of the genes in Synechocystis PCC 6803 show robust diurnal oscillating behavior, with a majority of the transcripts being upregulated during the early light period. Such transcripts corresponded to a wide array of cellular processes, such as light harvesting, photosynthetic light and dark reactions, and central carbon metabolism. In contrast, transcripts of membrane transporters for transition metals involved in the photosynthetic electron transport chain (e.g., iron, manganese, and copper) were significantly upregulated during the late dark period. Thus, the pattern of global gene expression led to the development of two distinct transcriptional networks of coregulated oscillatory genes. These networks help describe how Synechocystis PCC 6803 regulates its metabolism toward the end of the dark period in anticipation of efficient photosynthesis during the early light period. Furthermore, in silico flux prediction of important cellular processes and experimental measurements of cellular ATP, NADP(H), and glycogen levels showed how this diurnal behavior influences its metabolic characteristics. In particular, NADPH/NADP(+) showed a strong correlation with the majority of the genes whose expression peaks in the light. We conclude that this ratio is a key endogenous determinant of the diurnal behavior of this cyanobacterium. Cyanobacteria are photosynthetic microbes that use energy from sunlight and CO2 as feedstock. Certain cyanobacterial strains are amenable to facile genetic manipulation, thus enabling

  16. Diurnal Regulation of Cellular Processes in the Cyanobacterium Synechocystis sp. Strain PCC 6803: Insights from Transcriptomic, Fluxomic, and Physiological Analyses

    Directory of Open Access Journals (Sweden)

    Rajib Saha

    2016-05-01

    Full Text Available Synechocystis sp. strain PCC 6803 is the most widely studied model cyanobacterium, with a well-developed omics level knowledgebase. Like the lifestyles of other cyanobacteria, that of Synechocystis PCC 6803 is tuned to diurnal changes in light intensity. In this study, we analyzed the expression patterns of all of the genes of this cyanobacterium over two consecutive diurnal periods. Using stringent criteria, we determined that the transcript levels of nearly 40% of the genes in Synechocystis PCC 6803 show robust diurnal oscillating behavior, with a majority of the transcripts being upregulated during the early light period. Such transcripts corresponded to a wide array of cellular processes, such as light harvesting, photosynthetic light and dark reactions, and central carbon metabolism. In contrast, transcripts of membrane transporters for transition metals involved in the photosynthetic electron transport chain (e.g., iron, manganese, and copper) were significantly upregulated during the late dark period. Thus, the pattern of global gene expression led to the development of two distinct transcriptional networks of coregulated oscillatory genes. These networks help describe how Synechocystis PCC 6803 regulates its metabolism toward the end of the dark period in anticipation of efficient photosynthesis during the early light period. Furthermore, in silico flux prediction of important cellular processes and experimental measurements of cellular ATP, NADP(H), and glycogen levels showed how this diurnal behavior influences its metabolic characteristics. In particular, NADPH/NADP+ showed a strong correlation with the majority of the genes whose expression peaks in the light. We conclude that this ratio is a key endogenous determinant of the diurnal behavior of this cyanobacterium.
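
    The correlation analysis underlying the last conclusion can be illustrated with synthetic diurnal time series. The sinusoidal shapes below are invented for illustration, not the measured Synechocystis data: one transcript peaks in the light (in phase with the NADPH/NADP+ ratio) and one peaks in the dark.

```python
import numpy as np

t = np.arange(48)                                  # hours over two diurnal cycles

# Hypothetical series: a redox ratio peaking mid-light, one light-peaking
# and one dark-peaking transcript (illustrative shapes only).
nadph_ratio = 1.0 + 0.5 * np.sin(2 * np.pi * t / 24)
gene_light = 10 + 4 * np.sin(2 * np.pi * t / 24)   # peaks in the light period
gene_dark = 10 - 4 * np.sin(2 * np.pi * t / 24)    # peaks in the dark period

# Pearson correlation of each transcript with the ratio.
r_light = np.corrcoef(nadph_ratio, gene_light)[0, 1]
r_dark = np.corrcoef(nadph_ratio, gene_dark)[0, 1]
print(round(r_light, 2), round(r_dark, 2))         # → 1.0 -1.0
```

    Applied genome-wide, this is the computation that flags NADPH/NADP+ as tracking the light-peaking transcript cluster.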

  17. Analyses of Methods and Algorithms for Modelling and Optimization of Biotechnological Processes

    Directory of Open Access Journals (Sweden)

    Stoyan Stoyanov

    2009-08-01

    Full Text Available A review of the problems in modeling, optimization and control of biotechnological processes and systems is given in this paper. An analysis of existing and some new practical optimization methods for searching for the global optimum, based on various advanced strategies - heuristic, stochastic, genetic and combined - is presented. Methods based on sensitivity theory, and stochastic and mixed strategies for optimization with partial knowledge about kinetic, technical and economic parameters, are discussed. Several approaches to multi-criteria optimization tasks are analyzed. The problems concerning optimal control of biotechnological systems are also discussed.

  18. Signal processing for solar array monitoring, fault detection, and optimization

    CERN Document Server

    Braun, Henry; Spanias, Andreas

    2012-01-01

    Although the solar energy industry has experienced rapid growth recently, high-level management of photovoltaic (PV) arrays has remained an open problem. As sensing and monitoring technology continues to improve, there is an opportunity to deploy sensors in PV arrays in order to improve their management. In this book, we examine the potential role of sensing and monitoring technology in a PV context, focusing on the areas of fault detection, topology optimization, and performance evaluation/data visualization. First, several types of commonly occurring PV array faults are considered and detection algorithms are described. Next, the potential for dynamic optimization of an array's topology is discussed, with a focus on mitigation of fault conditions and optimization of power output under non-fault conditions. Finally, monitoring system design considerations such as type and accuracy of measurements, sampling rate, and communication protocols are considered. It is our hope that the benefits of monitoring presen...

  19. Optimizing RDF Data Cubes for Efficient Processing of Analytical Queries

    DEFF Research Database (Denmark)

    Jakobsen, Kim Ahlstrøm; Andersen, Alex B.; Hose, Katja

    2015-01-01

    In today’s data-driven world, analytical querying, typically based on the data cube concept, is the cornerstone of answering important business questions and making data-driven decisions. Traditionally, the underlying analytical data was mostly internal to the organization and stored in relational data warehouses and data cubes. Today, external data sources are essential for analytics and, as the Semantic Web gains popularity, more and more external sources are available in native RDF. With the recent SPARQL 1.1 standard, performing analytical queries over RDF data sources has finally become feasible. However, unlike their relational counterparts, RDF data cube stores lack optimizations that enable fast querying. In this paper, we present an approach to optimizing RDF data cubes that is based on three novel cube patterns, as well as associated algorithms...

  20. An attempt of reduction of optimization costs of complex industrial processes

    Science.gov (United States)

    Sztangret, Łukasz; Kusiak, Jan

    2017-09-01

    Reduction of the computational costs of optimization of real industrial processes is crucial, because the models of these processes are often complex and demand time-consuming numerical computations. Iterative optimization procedures have to run the simulations many times, and therefore the computational costs of the optimization may be unacceptably high. This is why new optimization methods and strategies that need fewer simulation runs are sought. The paper is focused on the problem of reducing the computational costs of the optimization procedure. The main goal is the presentation of the new, efficient Approximation Based Optimization (ABO) and Modified Approximation Based Optimization (MABO) methods developed by the authors, which allow finding the global minimum in a smaller number of objective function calls. A detailed algorithm of the MABO method as well as the results of tests using several benchmark functions are presented. The efficiency of the MABO method was compared with heuristic methods, and the results show that the MABO method reduces the computational costs and improves the optimization accuracy.
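
    A generic approximation-based optimization loop, in the spirit of (but not identical to) the authors' ABO/MABO methods, can be sketched as: fit a cheap surrogate to the simulations run so far, minimize the surrogate, and spend one new "expensive" evaluation at the predicted optimum. The objective and the quadratic surrogate below are illustrative choices.

```python
import numpy as np

def expensive_objective(x):
    # Stand-in for a costly process simulation.
    return (x - 1.3)**2 + 0.2 * np.sin(5 * x)

# Start from a handful of "simulation runs"...
xs = list(np.linspace(-2.0, 4.0, 5))
ys = [expensive_objective(x) for x in xs]

for _ in range(6):                       # each iteration costs ONE new simulation
    c = np.polyfit(xs, ys, 2)            # cheap quadratic approximation
    if c[0] <= 0:                        # degenerate fit: fall back to best sample
        x_new = xs[int(np.argmin(ys))]
    else:
        x_new = -c[1] / (2 * c[0])       # minimizer of the approximation
    x_new = float(np.clip(x_new, -2.0, 4.0))
    xs.append(x_new)
    ys.append(expensive_objective(x_new))

best = xs[int(np.argmin(ys))]
print(round(best, 2), len(ys))           # near-optimal x after only 11 evaluations
```

    The point of the pattern is the budget: a heuristic optimizer might spend hundreds of objective calls on the same problem, whereas here the surrogate absorbs most of the search effort.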

  1. Reducing residual stresses and deformations in selective laser melting through multi-level multi-scale optimization of cellular scanning strategy

    DEFF Research Database (Denmark)

    Mohanty, Sankhya; Hattel, Jesper Henri

    2016-01-01

    Residual stresses and deformations continue to remain one of the primary challenges towards expanding the scope of selective laser melting as an industrial scale manufacturing process. While process monitoring and feedback-based process control of the process has shown significant potential, there is still a dearth of techniques to tackle the issue. Numerical modelling of the selective laser melting process has thus been an active area of research in the last few years. However, large computational resource requirements have slowed the usage of these models for optimizing the process. In this paper, a calibrated, fast, multiscale thermal model coupled with a 3D finite element mechanical model is used to simulate residual stress formation and deformations during selective laser melting. The resulting reduction in thermal model computation time allows evolutionary algorithm-based optimization of the process...

  2. Rapid Process Optimization: A Novel Process Improvement Methodology to Innovate Health Care Delivery.

    Science.gov (United States)

    Wiler, Jennifer L; Bookman, Kelly; Birznieks, Derek B; Leeret, Robert; Koehler, April; Planck, Shauna; Zane, Richard

    2016-03-26

    Health care systems have utilized various process redesign methodologies to improve care delivery. This article describes the creation of a novel process improvement methodology, Rapid Process Optimization (RPO). This system was used to redesign emergency care delivery within a large academic health care system, which resulted in decreases in: (1) door-to-physician time (Department A: 54 minutes pre vs 12 minutes 1 year post; Department B: 20 minutes pre vs 8 minutes 3 months post), (2) overall length of stay (Department A: 228 vs 184; Department B: 202 vs 192), (3) discharge length of stay (Department A: 216 vs 140; Department B: 179 vs 169), and (4) left-without-being-seen rates (Department A: 5.5% vs 0.0%; Department B: 4.1% vs 0.5%), despite a 47% census increase at Department A (34 391 vs 50 691) and a 4% increase at Department B (8404 vs 8753). The novel RPO process improvement methodology can inform and guide successful care redesign.
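
    As a quick arithmetic check, the quoted census percentages follow directly from the before/after figures:

```python
# Percentage increase in annual census, from the figures quoted above.
before_a, after_a = 34391, 50691
before_b, after_b = 8404, 8753
pct_a = round(100 * (after_a - before_a) / before_a)
pct_b = round(100 * (after_b - before_b) / before_b)
print(pct_a, pct_b)  # → 47 4, matching the reported 47% and 4%
```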

  3. Use of Experimental Design for Peuhl Cheese Process Optimization ...

    African Journals Online (AJOL)

    Devika

    added (7 mL), heating temperature (84.12°C) and heating time (15 min). When these optimal ... conditions constitute one of the major obstacles for ... a two levels factorial design;. • a design of ... balance (Sartorius Gmbh, Göttingen, Germany),.

  4. Process optimization of biodiesel production from wild rapeseed (Brassica campestris

    Directory of Open Access Journals (Sweden)

    Héctor Ramírez

    2012-03-01

    Full Text Available The objectives of this study were to optimize the yield of biodiesel from wild rapeseed oil as a function of the methanol/oil molar ratio, the concentration of the homogeneous catalysts NaOH and KOH, and the temperature and time of transesterification, through response surface methodology, and to determine the physicochemical characteristics of the biodiesel obtained under the optimized conditions. A Plackett and Burman (PB12) design was applied for the screening stage and a rotatable central composite design (DCCR) for the final optimization. The conditions that maximize the yield of biodiesel (77.8%) were obtained at concentrations of 0 to 0.2% NaOH and 0.4 to 0.6% KOH, with a time of 77 to 81 minutes, keeping the methanol/oil molar ratio constant at 6/1 and the temperature at 60 °C. The physicochemical properties of the biodiesel obtained under the optimized conditions meet the technical specifications given by ASTM D6751-07 and EN 14214.

  5. Optimal Input Strategy for Plug and Play Process Control Systems

    DEFF Research Database (Denmark)

    Kragelund, Martin Nygaard; Leth, John-Josef; Wisniewski, Rafal

    2010-01-01

    This paper considers the problem of optimal operation of a plant, which goal is to maintain production at minimum cost. The system considered in this work consists of a joined plant and redundant input systems. It is assumed that each input system contributes to a flow of goods into the joined part...

  6. Automatic generation of optimal business processes from business rules

    NARCIS (Netherlands)

    Steen, B.; Ferreira Pires, Luis; Iacob, Maria Eugenia

    2010-01-01

    In recent years, business process models are increasingly being used as a means for business process improvement. Business rules can be seen as requirements for business processes, in that they describe the constraints that must hold for business processes that implement these business rules.

  7. Parametric optimization of ultrasonic machining process using gravitational search and fireworks algorithms

    Directory of Open Access Journals (Sweden)

    Debkalpa Goswami

    2015-03-01

    Full Text Available Ultrasonic machining (USM) is a mechanical material removal process used to erode holes and cavities in hard or brittle workpieces by using shaped tools, high-frequency mechanical motion and an abrasive slurry. Unlike other non-traditional machining processes, such as laser beam and electrical discharge machining, the USM process does not thermally damage the workpiece or introduce significant levels of residual stress, which is important for survival of materials in service. For having enhanced machining performance and better machined job characteristics, it is often required to determine the optimal control parameter settings of an USM process. The earlier mathematical approaches for parametric optimization of USM processes have mostly yielded near optimal or sub-optimal solutions. In this paper, two almost unexplored non-conventional optimization techniques, i.e. the gravitational search algorithm (GSA) and the fireworks algorithm (FWA), are applied for parametric optimization of USM processes. The optimization performance of these two algorithms is compared with that of other popular population-based algorithms, and the effects of their algorithm parameters on the derived optimal solutions and computational speed are also investigated. It is observed that FWA provides the best optimal results for the considered USM processes.
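
    A minimal gravitational search algorithm, one of the two metaheuristics named above, can be sketched as follows. The sphere objective stands in for the actual USM machining-performance models, and the population size, gravitational constant and its decay schedule are illustrative choices rather than the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(1)

def f(x):                                  # stand-in machining-cost objective (minimize)
    return np.sum(x**2, axis=-1)

n_agents, dim, iters = 20, 2, 60
X = rng.uniform(-5, 5, (n_agents, dim))    # agent positions (control settings)
V = np.zeros_like(X)
best_so_far = np.inf

for t in range(iters):
    fit = f(X)
    best_so_far = min(best_so_far, float(fit.min()))
    # Masses: the best agent gets mass ~1, the worst ~0, then normalize.
    m = (fit.max() - fit) / (fit.max() - fit.min() + 1e-12)
    M = m / (m.sum() + 1e-12)
    G = 100 * np.exp(-5 * t / iters)       # decaying gravitational constant
    acc = np.zeros_like(X)
    for i in range(n_agents):
        diff = X - X[i]
        dist = np.linalg.norm(diff, axis=1, keepdims=True) + 1e-12
        # Randomly weighted pull of every agent on agent i, heavier = stronger.
        acc[i] = np.sum(rng.random((n_agents, 1)) * G * M[:, None] * diff / dist,
                        axis=0)
    V = rng.random(X.shape) * V + acc      # stochastic inertia + gravity
    X = np.clip(X + V, -5, 5)

print(best_so_far)
```

    FWA differs mainly in its variation operator (sparks scattered around good "fireworks" instead of gravitational pulls); the population/selection skeleton is the same.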

  8. A model for Intelligent Random Access Memory architecture (IRAM) cellular automata algorithms on the Associative String Processing machine (ASTRA)

    CERN Document Server

    Rohrbach, F; Vesztergombi, G

    1997-01-01

    In the near future, the computer performance will be completely determined by how long it takes to access memory. There are bottlenecks in memory latency and memory-to-processor interface bandwidth. The IRAM initiative could be the answer by putting the Processor-In-Memory (PIM). Starting from the massively parallel processing concept, one reached a similar conclusion. The MPPC (Massively Parallel Processing Collaboration) project and the 8K processor ASTRA machine (Associative String Test bench for Research & Applications) developed at CERN [kuala] can be regarded as a forerunner of the IRAM concept. The computing power of the ASTRA machine, regarded as an IRAM with 64 one-bit processors on a 64×64 bit-matrix memory chip machine, has been demonstrated by running statistical physics algorithms: one-dimensional stochastic cellular automata, as a simple model for dynamical phase transitions. As a relevant result for physics, the damage spreading of this model has been investigated.
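
    The damage-spreading experiment mentioned above can be sketched on an ordinary CPU: run two replicas of a stochastic one-dimensional cellular automaton from initial states differing in a single cell, drive both with identical random numbers, and track the Hamming distance. The specific update rule (a Domany-Kinzel-like probabilistic rule) and all parameters below are illustrative, not the ones run on ASTRA.

```python
import numpy as np

N, steps, p = 256, 100, 0.8    # lattice size, time steps, activation probability

def step(state, noise, p):
    # A cell becomes 1 with probability p if either neighbour is 1, else 0.
    left, right = np.roll(state, 1), np.roll(state, -1)
    active = (left | right).astype(bool)
    return (active & (noise < p)).astype(np.int8)

rng = np.random.default_rng(7)
a = rng.integers(0, 2, N).astype(np.int8)
b = a.copy()
b[N // 2] ^= 1                  # flip one cell: the initial "damage"

for _ in range(steps):
    noise = rng.random(N)       # identical noise drives both replicas
    a, b = step(a, noise, p), step(b, noise, p)

damage = np.mean(a != b)        # Hamming distance / N
print(damage)
```

    Whether `damage` grows, saturates, or heals to zero as `p` is varied is exactly the dynamical phase transition probed in the abstract; on ASTRA the same bit-parallel update ran across the associative string memory.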

  9. NMDA-R inhibition affects cellular process formation in Tilapia melanocytes; a model for pigmented adrenergic neurons in process formation and retraction.

    Science.gov (United States)

    Ogundele, Olalekan Michael; Okunnuga, Adetokunbo Adedotun; Fabiyi, Temitope Deborah; Olajide, Olayemi Joseph; Akinrinade, Ibukun Dorcas; Adeniyi, Philip Adeyemi; Ojo, Abiodun Ayodele

    2014-06-01

    Parkinson's disease has long been described as a product of dopamine and (or) melanin loss in the substantia nigra (SN). Although most studies have focused on dopaminergic neurons, it is important to consider the role of pigment cells in the etiology of the disease and to create an in vitro live-cell model for studies involving pigmented adrenergic cells of the SN in Parkinsonism. Melanocytes share specific features with pigmented adrenergic neurons, as both cell types are pigmented, contain adrenergic receptors and have cellular processes, although the melanocyte's cellular processes are relatively short and observable only when stimulated appropriately by epinephrine and other factors or molecules. This study employs the manipulation of the N-Methyl-D-Aspartate Receptor (NMDA-R), a major receptor in neuronal development, in the process formation pattern of the melanocyte in order to create a suitable model of cellular process elongation and shortening in pigmented adrenergic cells. NMDA-R is an important glutamate receptor implicated in neurogenesis, neuronal migration, maturation and cell death; thus we investigated the role of NMDA-R potentiation by glutamate/KCN and its inhibition by ketamine in the behavior of fish scale melanocytes in vitro. This is aimed at establishing the regulatory role of NMDA-R in this cell type (melanocytes isolated from Tilapia) in a similar manner to what is observable in mammalian neurons. An in vitro live cell culture was prepared in modified Ringer's solution, following which the cells were treated as follows: Control, Glutamate, Ketamine, Glutamate + Ketamine, KCN + Ketamine and KCN. The culture was maintained for 10 min and the changes were captured in 3D time frames at 0, 5 and 10 min for the control and 5, 7 and 10 min for each of the treatment categories. Glutamate treatment caused formation of short cellular processes localized directly on the cell body while ketamine treatment (inhibition of NMDA-R) facilitated

  10. Modelling the interaction of aeolian and fluvial processes with a combined cellular model of sand dunes and river systems

    Science.gov (United States)

    Liu, Baoli; Coulthard, Tom J.

    2017-09-01

    Aeolian and fluvial processes are important agents for shaping the surface of the Earth, but are largely studied in isolation despite there being many locations where both processes act together and influence each other. Using field data to investigate fluvial-aeolian interactions is, however, hampered by our short length of record and low temporal resolution of observations. Here we use numerical modelling to investigate, for the first time, the interplay between aeolian (sand dunes) and fluvial (river channel) processes. This modelling is carried out by combining two existing cellular models of aeolian and fluvial processes, which requires careful treatment of the different process representations and time stepping used. The result is a fully coupled (in time and space) sand dune - river model. Over a thousand-year simulation the model shows how the migration of sand dunes is readily blocked by rivers, yet aeolian processes can push the channel downwind. Over time cyclic channel avulsions develop, indicating that aeolian action on fluvial systems may play an important part in governing avulsion frequency, and thus alluvial architecture.
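
    One common way to reconcile two models with different natural time steps, sketched below with trivial stand-in process rules, is to sub-cycle the faster (fluvial) model inside each slow (aeolian) step. The state variable, rates, and step sizes here are invented for illustration; the paper's actual coupling scheme may differ.

```python
def run_coupled(aeolian_step, fluvial_step, years, dt_aeolian=1.0, dt_fluvial=0.1):
    """Couple a slow and a fast process model by sub-cycling the fast one."""
    state = {"dune": 1.0}                 # hypothetical dune-sand volume
    t = 0.0
    while t < years:
        for _ in range(round(dt_aeolian / dt_fluvial)):   # fast fluvial sub-steps
            state = fluvial_step(state, dt_fluvial)
        state = aeolian_step(state, dt_aeolian)           # one slow aeolian step
        t += dt_aeolian
    return state

def fluvial(state, dt):                   # river erodes encroaching sand
    return {"dune": max(0.0, state["dune"] - 0.05 * dt)}

def aeolian(state, dt):                   # wind delivers sand toward the channel
    return {"dune": state["dune"] + 0.02 * dt}

final = run_coupled(aeolian, fluvial, years=10)
print(round(final["dune"], 2))            # → 0.7
```

    In the real coupled model the "state" is a 2D grid of elevations shared by both cellular models, but the time-stepping skeleton is the same.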

  11. Cellular processing of the amyloidogenic cystatin C variant of hereditary cerebral hemorrhage with amyloidosis, Icelandic type

    DEFF Research Database (Denmark)

    Benedikz, Eirikur; Merz, G S; Schwenk, V

    1999-01-01

    of an amyloidogenic mutation on the intracellular processing of its protein product. The protein, a mutant of the cysteine protease inhibitor cystatin C, is the amyloid precursor protein in Hereditary Cerebral Hemorrhage with Amyloidosis--Icelandic type (HCHWA-I). The amyloid fibers are composed of mutant cystatin C...

  12. Optimization Of PVDF-TrFE Processing Conditions For The Fabrication Of Organic MEMS Resonators.

    Science.gov (United States)

    Ducrot, Pierre-Henri; Dufour, Isabelle; Ayela, Cédric

    2016-01-21

    This paper reports a systematic optimization of processing conditions of PVDF-TrFE piezoelectric thin films, used as integrated transducers in organic MEMS resonators. Indeed, despite data on electromechanical properties of PVDF found in the literature, optimized processing conditions that lead to these properties remain only partially described. In this work, a rigorous optimization of parameters enabling state-of-the-art piezoelectric properties of PVDF-TrFE thin films has been performed via the evaluation of the actuation performance of MEMS resonators. Conditions such as annealing duration, poling field and poling duration have been optimized and repeatability of the process has been demonstrated.

  13. OPTIMAL PORTFOLIO ON TRACKING THE EXPECTED WEALTH PROCESS WITH LIQUIDITY CONSTRAINTS

    Institute of Scientific and Technical Information of China (English)

    Luo Kui; Wang Guangming; Hu Yijun

    2011-01-01

    In this article, the authors consider the optimal portfolio for tracking the expected wealth process under liquidity constraints. The constrained optimal portfolio is first formulated as minimizing the cumulative variance between the wealth process and the expected wealth process. Then, the dynamic programming methodology is applied to reduce the whole problem to solving the Hamilton-Jacobi-Bellman equation coupled with the liquidity constraint, and the method of Lagrange multipliers is applied to handle the constraint. Finally, a numerical method is proposed to solve the constrained HJB equation and the constrained optimal strategy. In particular, the explicit solution to this optimal problem is derived when there is no liquidity constraint.
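
    In a generic form, with assumed geometric-Brownian wealth dynamics (the article's exact model may differ), the constrained tracking problem and its HJB equation read:

```latex
% Hedged sketch: a generic tracking formulation with assumed dynamics.
\begin{align}
  \min_{\pi(\cdot)}\ & \mathbb{E}\!\int_0^T \big(X_t - \bar{X}_t\big)^2\,\mathrm{d}t,
  \qquad 0 \le \pi_t \le L \ \text{(liquidity constraint)},\\
  \mathrm{d}X_t &= \big[rX_t + \pi_t(\mu - r)\big]\mathrm{d}t
                 + \pi_t\sigma\,\mathrm{d}W_t,
\end{align}
% with value function V(t,x) satisfying the constrained HJB equation
\begin{equation}
  V_t + \min_{0\le\pi\le L}\Big\{\big[rx + \pi(\mu-r)\big]V_x
      + \tfrac{1}{2}\pi^2\sigma^2 V_{xx}\Big\} + \big(x-\bar{X}_t\big)^2 = 0,
  \qquad V(T,x) = 0.
\end{equation}
```

    The Lagrange-multiplier step in the abstract corresponds to relaxing the box constraint on $\pi$ inside the pointwise minimization.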

  14. Biodiesel production from microalgae Spirulina maxima by two step process: Optimization of process variable

    Directory of Open Access Journals (Sweden)

    M.A. Rahman

    2017-04-01

    Full Text Available Biodiesel from a green energy source is gaining tremendous attention for its ecofriendly and economical aspects. In this investigation, a two-step process was developed for the production of biodiesel from the microalga Spirulina maxima, and the best operating conditions for the steps were determined. In the first stage, acid esterification was conducted to lessen the acid value (AV) from 10.66 to 0.51 mg KOH/g of the feedstock, and optimal conditions for maximum esterified oil yield were found at molar ratio 12:1, temperature 60°C, 1% (wt%) H2SO4, and mixing intensity 400 rpm for a reaction time of 90 min. The second-stage alkali transesterification was carried out for maximum biodiesel yield (86.1%), and optimal conditions were found at molar ratio 9:1, temperature 65°C, mixing intensity 600 rpm, and catalyst concentration 0.75% (wt%) KOH for a reaction time of 20 min. The biodiesel was analyzed according to ASTM standards and the results were within the standard limits. These results will help to produce third-generation algal biodiesel from the microalga Spirulina maxima in an efficient manner.

  15. Optimal Control of a Fed-batch Fermentation Process by Neuro-Dynamic Programming

    Directory of Open Access Journals (Sweden)

    Tatiana Ilkova

    2004-10-01

    Full Text Available In this paper a method for optimal control of a fermentation process is presented, based on an approach to optimal control - Neuro-Dynamic Programming. To this end, an approximating neural network is developed, and the solution of the optimization problem is improved by an iterative scheme founded on the Bellman equation. With this approach, computing time and effort are decreased and the quality of the biomass at the end of the process is increased.
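
    The neuro-dynamic-programming idea (approximate the value function, improve it by Bellman backups) can be sketched on a toy fed-batch-style problem. The dynamics, reward, discount factor and quadratic feature approximator below are invented for illustration, standing in for the fermentation model and neural network of the paper.

```python
import numpy as np

# Toy problem: x in [0, 1] is normalized biomass, u is the feed rate.
controls = np.array([0.0, 0.05, 0.1])
gamma = 0.9                                        # discount factor

def step(x, u):
    return np.clip(x + u * (1.0 - x), 0.0, 1.0)    # saturating growth

def reward(x, u):
    return (step(x, u) - x) - 0.2 * u              # growth minus feeding cost

def features(x):                                   # quadratic value approximator
    return np.stack([np.ones_like(x), x, x**2], axis=-1)

grid = np.linspace(0.0, 1.0, 21)
w = np.zeros(3)                                    # value-function weights

for _ in range(30):                                # approximate Bellman iteration
    q = np.stack([reward(grid, u) + gamma * features(step(grid, u)) @ w
                  for u in controls])
    # Fit the approximator to the Bellman-backup targets (least squares
    # here plays the role of the neural-network training step).
    w, *_ = np.linalg.lstsq(features(grid), q.max(axis=0), rcond=None)

q = np.stack([reward(grid, u) + gamma * features(step(grid, u)) @ w
              for u in controls])
policy = controls[np.argmax(q, axis=0)]            # greedy feed-rate policy
print(policy[0], policy[-1])
```

    The greedy policy recovered from the fitted value function feeds at low biomass and stops feeding once growth no longer pays for the feed, which is the qualitative behavior a fed-batch controller should show.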

  16. A Knowledge-reuse Based Intelligent Reasoning Model for Worsted Process Optimization

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Textile process planning is a knowledge-reuse process in nature, which depends on the expert's knowledge and experience. It seems very difficult to build an integral mathematical model to optimize hundreds of processing parameters. In fact, the existing process cases, which were recorded to ensure the ability to trace production steps, can also be used to optimize the process itself. This paper presents a novel knowledge-reuse based hybrid intelligent reasoning model (HIRM) for worsted process optimization. The model architecture and reasoning mechanism are described. An applied case with HIRM is given to demonstrate that the best process decision can be made, and important processing parameters, such as those for the raw material, optimized.

  17. Process Optimization of a Novel Immediate Release Film Coating System using QbD Principles

    National Research Council Canada - National Science Library

    Teckoe, Jason; Mascaro, Tracey; Farrell, Thomas P; Rajabi-Siahboomi, Ali R

    2013-01-01

    This work describes a quality-by-design (QbD) approach to determine the optimal coating process conditions and robust process operating space for an immediate release aqueous film coating system (Opadry® 200...

  18. β-carotene treatment alters the cellular death process in oxidative stress-induced K562 cells.

    Science.gov (United States)

    Akçakaya, Handan; Tok, Sabiha; Dal, Fulya; Cinar, Suzan Adin; Nurten, Rustem

    2017-03-01

    Oxidizing agents (e.g., H2O2) cause structural and functional disruption of molecules by affecting lipids, proteins, and nucleic acids. As a result, cellular mechanisms related to the disrupted macromolecules are affected and cell death is induced. Oxidative damage can be prevented to a certain extent by antioxidants, or the damage can be reversed. In this work, we studied the cellular response against oxidative stress induced by H2O2 and antioxidant-oxidant (β-carotene-H2O2) interactions in terms of time, concentration, and treatment method (pre-, co-, and post-treatment) in K562 cells. We showed that co- or post-treatment with β-carotene did not protect cells from oxidative stress damage; moreover, co- and post-β-carotene-treated oxidative stress-induced cells showed results similar to cells treated with H2O2 alone. However, β-carotene pre-treatment prevented oxidative damage induced by H2O2 at concentrations lower than 1,000 μM, compared with H2O2-only-treated and co- and post-β-carotene-treated oxidative stress-induced cells, in terms of the studied cellular parameters (mitochondrial membrane potential [Δψm], cell cycle, and apoptosis). The protective effect of β-carotene pre-treatment was lost at H2O2 concentrations higher than 1,000 μM (2-10 mM). These findings suggest that β-carotene pre-treatment alters the effects of oxidative damage induced by H2O2 and the cell death processes in K562 cells.

  19. A differential genome-wide transcriptome analysis: impact of cellular copper on complex biological processes like aging and development.

    Directory of Open Access Journals (Sweden)

    Jörg Servos

    Full Text Available The regulation of cellular copper homeostasis is crucial in biology. Impairments lead to severe dysfunctions and are known to affect aging and development. Previously, a loss-of-function mutation in the gene encoding the copper-sensing and copper-regulated transcription factor GRISEA of the filamentous fungus Podospora anserina was reported to lead to cellular copper depletion and a pleiotropic phenotype with hypopigmentation of the mycelium and the ascospores, affected fertility, and a lifespan increased by approximately 60% compared to the wild type. This phenotype is linked to a switch from copper-dependent standard respiration to an alternative respiration, leading to a reduced generation of both reactive oxygen species (ROS) and adenosine triphosphate (ATP). We performed a genome-wide comparative transcriptome analysis of a wild-type strain and the copper-depleted grisea mutant. We unambiguously assigned 9,700 sequences of the transcriptome in both strains to the more than 10,600 predicted and annotated open reading frames of the P. anserina genome, indicating 90% coverage of the transcriptome. 4,752 of the transcripts differed significantly in abundance, with 1,156 transcripts differing at least 3-fold. Selected genes were investigated by qRT-PCR analyses. Apart from this general characterization, we analyzed the data with special emphasis on molecular pathways related to the grisea mutation, taking advantage of the available complete genomic sequence of P. anserina. This analysis verified but also corrected conclusions from earlier data obtained by single-gene analysis, identified new candidate factors of the cellular copper homeostasis system, including target genes of the transcription factor GRISEA, and provides a rich reference source of quantitative data for further detailed investigations. Overall, the present study demonstrates the importance of systems biology approaches also in cases where mutations in single genes are analyzed to

  20. Mechanistic Models for Process Development and Optimization of Fed-batch Fermentation Systems

    DEFF Research Database (Denmark)

    Mears, Lisa; Stocks, Stuart M.; Albæk, Mads O.

    2016-01-01

    into account the oxygen transfer conditions, as well as the evaporation rates of the system. Mechanistic models are valuable tools which are applicable for both process development and optimization. The state estimator described will be a valuable tool for future work as part of control strategy development...... for on-line process control and optimization....

  1. A generic methodology for processing route synthesis and design based on superstructure optimization

    DEFF Research Database (Denmark)

    Bertran, Maria-Ona; Frauzem, Rebecca; Sanchez-Arcilla, Ana Sofia

    2017-01-01

    In this paper, a systematic framework for novel and sustainable synthesis-design of processing routes is presented along with the associated computer-aided methods and tools. In Stage 1, superstructure optimization is used to determine the optimal processing route(s). In Stage 2, the design issue...

  2. Considerations on the Optimal and Efficient Processing of Information-Bearing Signals

    Science.gov (United States)

    Harms, Herbert Andrew

    2013-01-01

    Noise is a fundamental hurdle that impedes the processing of information-bearing signals, specifically the extraction of salient information. Processing that is both optimal and efficient is desired; optimality ensures the extracted information has the highest fidelity allowed by the noise, while efficiency ensures limited resource usage. Optimal…

  3. Stretching the limits of forming processes by robust optimization: A demonstrator

    NARCIS (Netherlands)

    Wiebenga, J.H.; Atzema, E.H.; Atzema, E.H.; van den Boogaard, Antonius H.; Yoon, Jeong Whan; Stoughton, Thomas B.; Rolfe, Bernard; Beynon, John H.; Hodgson, Peter

    2014-01-01

    Robust design of forming processes using numerical simulations is gaining attention throughout the industry. In this work, it is demonstrated how robust optimization can assist in further stretching the limits of metal forming processes. A deterministic and a robust optimization study are performed,

  4. Optimization of enrichment processes of pentachlorophenol (PCP) from water samples

    Institute of Scientific and Technical Information of China (English)

    LI Ping; LIU Jun-xin

    2004-01-01

    The method of enriching PCP (pentachlorophenol) from the aquatic environment by solid phase extraction (SPE) was studied. Several factors affecting the recoveries of PCP, including sample pH, eluting solvent, eluting volume and flow rate of the water sample, were optimized by orthogonal array design (OAD). The optimized results were: sample pH 4; eluting solvent, 100% methanol; eluting solvent volume, 2 ml; and flow rate of water sample, 4 ml/min. A comparison was made between the SPE and liquid-liquid extraction (LLE) methods. The recoveries of PCP were in the range of 87.6%-133.6% and 79%-120.3% for SPE and LLE, respectively. Important advantages of SPE compared with LLE include the short extraction time and reduced consumption of organic solvents. SPE can replace LLE for isolating and concentrating PCP from water samples.

  5. OPTIMIZATION OF THE PROCESS OF DRYING THE FILTRATE DISTILLERY DREGS

    Directory of Open Access Journals (Sweden)

    A. A. Shevtsov

    2013-01-01

    Full Text Available The interactions of various factors affecting the process of drying the filtrate distillery dregs are investigated. Rational conditions for the process of drying the filtrate distillery dregs in a spray dryer are obtained.

  6. Process and Energy Optimization Assessment, Rock Island Arsenal, IL

    Science.gov (United States)

    2004-09-01

    applications and process equipment. Process equipment includes two absorption chillers, two steam-operated forge presses, immersion heaters for the...evaluate the boiler strategy at this facility. The facility is currently in the process of replacing the absorption chillers with electric chillers ...Carbonate Fuel Cells for DoD Applications: Rock Island Arsenal MCFC Engineer Research and Development Center, Construction Engineering Research

  7. Constrained Run-to-Run Optimization for Batch Process Based on Support Vector Regression Model

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    An iterative (run-to-run) optimization method is presented for batch processes under input constraints. It is generally very difficult to acquire an accurate mechanistic model for a batch process. Because support vector machines are powerful for problems characterized by small samples, nonlinearity, high dimension and local minima, support vector regression models were developed for the end-point optimization of batch processes. Since there is no analytical way to find the optimal trajectory, an iterative method is used that exploits the repetitive nature of batch processes to determine the optimal operating policy. The optimization algorithm is proved convergent. Numerical simulation shows that the method can improve process performance through iterations.
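
    The run-to-run idea — using each finished batch's measured end-point to correct the next batch's inputs, subject to input constraints — can be sketched without the SVR model. The quadratic "plant" and the two-point gradient estimate below are hypothetical stand-ins for the real batch process and the regression model:

```python
# Run-to-run (batch-to-batch) optimization sketch: after each batch, a
# secant gradient estimate from the last two runs updates the input,
# projected onto the input constraints. The "plant" is a hypothetical
# stand-in for a real batch end-point measurement.

def plant(u):
    # Unknown true end-point performance, maximized at u = 3.0
    return -(u - 3.0) ** 2

def run_to_run(u0, u1, lo=0.0, hi=5.0, step=0.5, n_batches=60):
    y0, y1 = plant(u0), plant(u1)
    for _ in range(n_batches):
        grad = (y1 - y0) / (u1 - u0) if u1 != u0 else 0.0
        u_next = min(hi, max(lo, u1 + step * grad))  # project onto constraints
        u0, y0 = u1, y1
        u1, y1 = u_next, plant(u_next)
    return u1

u_opt = run_to_run(0.5, 1.0)   # converges toward the optimum u = 3.0
```

    In the method described by the record, the secant estimate would be replaced by optimizing over an SVR model refitted on the accumulating batch data.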

  8. Cellular processing of the amyloidogenic cystatin C variant of hereditary cerebral hemorrhage with amyloidosis, Icelandic type

    DEFF Research Database (Denmark)

    Benedikz, Eirikur; Merz, G S; Schwenk, V

    1999-01-01

    (L68Q) that lacks the first 10 amino acids. We have previously shown that processing of wild-type cystatin C entails formation of a transient intracellular dimer that dissociates prior to secretion, such that extracellular cystatin C is monomeric. We report here that the cystatin C mutation engenders...... several alterations in its intracellular trafficking. It forms a stable intracellular dimer that is partially retained in the endoplasmic reticulum and degraded. The bulk of mutant cystatin C that is secreted does not dissociate and is secreted as an inactive dimer. Thus, formation of the stable mutant...

  9. Evaluation of dynamic behavior forecasting parameters in the process of transition rule induction of unidimensional cellular automata.

    Science.gov (United States)

    Weinert, Wagner Rodrigo; Lopes, Heitor Silvério

    2010-01-01

    The simulation of the dynamics of cellular systems based on cellular automata (CA) can be computationally expensive. This is particularly true when such simulation is part of a rule-induction procedure to find suitable transition rules for the CA. Several efforts have been described in the literature to make this problem more tractable. This work presents a study of the efficiency of dynamic behavior forecasting parameters (DBFPs) used for the induction of transition rules of CA for a specific problem: classification by the majority rule. A total of 8 DBFPs were analyzed for the 31 best-performing rules found in the literature. Some of these DBFPs were highly correlated with each other, meaning they yield the same information. Also, most rules presented values of the DBFPs very close to each other. An evolutionary algorithm, based on gene expression programming, was developed for finding transition rules according to a given pre-established behavior. The simulation of the dynamic behavior of the CA is not used to evaluate candidate transition rules; instead, the average values of the DBFPs are used as reference. Experiments were done using the DBFPs separately and together. In both cases, the best induced transition rules were not acceptable solutions for the desired behavior of the CA. We conclude that, although the DBFPs represent interesting aspects of the dynamic behavior of CAs, the transition-rule induction process still requires simulation of the dynamics and cannot rely only on the DBFPs.
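
    The kind of dynamics simulation whose cost the DBFPs were meant to avoid can be sketched for a 1D binary CA. The radius-1 local majority rule below is an illustrative transition rule for demonstration only, not one of the 31 best-performing rules from the literature:

```python
# Simulating the dynamics of a 1D binary cellular automaton under a
# simple radius-1 local majority rule (illustrative only, not one of
# the best-performing density-classification rules).

def step(cells):
    """One synchronous update on a ring: each cell takes the majority
    of itself and its two nearest neighbours."""
    n = len(cells)
    return [1 if cells[(i - 1) % n] + cells[i] + cells[(i + 1) % n] >= 2 else 0
            for i in range(n)]

def run(cells, steps):
    for _ in range(steps):
        cells = step(cells)
    return cells

# An isolated minority cell is absorbed in a single step.
config = [1, 1, 1, 0, 1, 1, 1]
print(run(config, 1))  # -> [1, 1, 1, 1, 1, 1, 1]
```

    Evaluating a candidate rule this way requires many such runs over many initial configurations, which is exactly the expense that motivates forecasting parameters.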

  10. Processing the Bouguer anomaly map of Biga and the surrounding area by the cellular neural network: application to the southwestern Marmara region

    Science.gov (United States)

    Aydogan, D.

    2007-04-01

    An image processing technique called the cellular neural network (CNN) approach is used in this study to locate geological features giving rise to gravity anomalies, such as faults or the boundary of two geologic zones. CNN is a stochastic image processing technique based on template optimization using the neighborhood relationships of cells. These cells can be characterized by a functional block diagram that is typical of neural network theory. The functionality of a CNN is described in its entirety by a number of small matrices (A, B and I) called the cloning template; the CNN can also be considered a nonlinear convolution of these matrices. This template describes the strength of the nearest-neighbor interconnections in the network. The recurrent perceptron learning algorithm (RPLA) is used in the optimization of the cloning template. The CNN and standard Canny algorithms were first tested on two sets of synthetic gravity data with the aim of checking the reliability of the proposed approach. The CNN method was compared with classical derivative techniques by applying the cross-correlation method (CC) to the same anomaly map, as this latter approach can detect some features that are difficult to identify on Bouguer anomaly maps. The approach was then applied to the Bouguer anomaly map of Biga and its surrounding area in Turkey. Structural features in the area between Bandirma, Biga, Yenice and Gonen in the southwest Marmara region are investigated by applying the CNN and CC to the Bouguer anomaly map. Faults identified by these algorithms are generally in accordance with previously mapped surface faults. These examples show that geologic boundaries can be detected from Bouguer anomaly maps using the cloning template approach. A visual evaluation of the outputs of the CNN and CC approaches is carried out, and the results are compared with each other. This approach provides quantitative solutions based on just a few assumptions, which makes the method more
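
    The "nonlinear convolution" of a cloning template over the image can be sketched as a single feedforward pass: convolve with a control template B, add a bias I, and threshold. The Laplacian-like template below is a hypothetical edge-highlighting choice, not the RPLA-optimized template from the study:

```python
# Single feedforward CNN-style pass: convolve the input with a 3x3
# control template B, add bias, and apply a hard-limiter output.
# The template here is a hypothetical edge-highlighting choice.

def cnn_pass(image, B, bias):
    rows, cols = len(image), len(image[0])
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            acc = bias
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols:
                        acc += B[dr + 1][dc + 1] * image[rr][cc]
            out[r][c] = 1 if acc > 0 else 0   # hard-limiter output
    return out

B = [[-1, -1, -1],
     [-1,  8, -1],
     [-1, -1, -1]]
image = [[0, 0, 0, 0],
         [0, 1, 1, 0],
         [0, 1, 1, 0],
         [0, 0, 0, 0]]
edges = cnn_pass(image, B, bias=0)
```

    In the full CNN model the output also feeds back through the A template over several iterations; the RPLA's job is to learn B, A, and the bias from training pairs.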

  11. AN ASSESSMENT AND OPTIMIZATION OF QUALITY OF STRATEGY PROCESS

    Directory of Open Access Journals (Sweden)

    Snezana Nestic

    2013-12-01

    Full Text Available In order to improve the quality of their processes, companies usually rely on quality management systems and the requirements of ISO 9001:2008. Small and medium-sized companies are faced with a series of challenges in the objectification, evaluation and assessment of the quality of processes. In this paper, the strategy process is decomposed for a typical medium-sized manufacturing company, and indicators for the defined sub-processes, based on the requirements of ISO 9001:2008, are developed. The weights of the sub-processes are calculated using a fuzzy set approach. Finally, a solution based on a genetic algorithm approach is presented and tested on data from 142 manufacturing companies. The presented solution enables assessment of the quality of a strategy process, ranks the indicators and provides a basis for successful improvement of the quality of the strategy process.

  12. Parametric Optimization of Nd:YAG Laser Beam Machining Process Using Artificial Bee Colony Algorithm

    Directory of Open Access Journals (Sweden)

    Rajarshi Mukherjee

    2013-01-01

    Full Text Available The Nd:YAG laser beam machining (LBM) process has great potential to manufacture intricately shaped microproducts with its unique characteristics. In practical applications, such as drilling, grooving, cutting, or scribing, the optimal combination of Nd:YAG LBM process parameters needs to be sought to provide the desired machining performance. Several mathematical techniques, like the Taguchi method, desirability function, grey relational analysis, and genetic algorithms, have already been applied for parametric optimization of Nd:YAG LBM processes, but in most cases, suboptimal or near-optimal solutions have been reached. This paper focuses on the application of the artificial bee colony (ABC) algorithm to determine the optimal Nd:YAG LBM process parameters, considering both single- and multi-objective optimization of the responses. A comparative study with other population-based algorithms, like the genetic algorithm, particle swarm optimization, and the ant colony optimization algorithm, proves the global applicability and acceptability of the ABC algorithm for parametric optimization. In this algorithm, the exchange of information among the onlooker bees minimizes the search iterations for the global optimum and avoids the generation of suboptimal solutions. The results of two-sample paired t-tests also demonstrate its superiority over the other optimization algorithms.
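
    The three phases of the ABC algorithm (employed, onlooker, and scout bees) can be sketched on a toy objective. The test function, bounds, and colony settings below are illustrative, not the Nd:YAG LBM response models from the record:

```python
import random

# Minimal artificial bee colony (ABC) sketch minimizing a test function.
# Objective, bounds and colony settings are illustrative only.

def abc_minimize(f, dim, lo, hi, n_sources=20, limit=30, cycles=200, seed=1):
    rng = random.Random(seed)
    new = lambda: [rng.uniform(lo, hi) for _ in range(dim)]
    sources = [new() for _ in range(n_sources)]
    values = [f(x) for x in sources]
    trials = [0] * n_sources
    fitness = lambda v: 1.0 / (1.0 + v) if v >= 0 else 1.0 - v

    def try_neighbour(i):
        k = rng.randrange(n_sources - 1)
        k = k if k < i else k + 1                  # partner source, k != i
        j = rng.randrange(dim)
        cand = sources[i][:]
        cand[j] += rng.uniform(-1, 1) * (cand[j] - sources[k][j])
        cand[j] = min(hi, max(lo, cand[j]))
        v = f(cand)
        if v < values[i]:                          # greedy selection
            sources[i], values[i], trials[i] = cand, v, 0
        else:
            trials[i] += 1

    for _ in range(cycles):
        for i in range(n_sources):                 # employed bee phase
            try_neighbour(i)
        fits = [fitness(v) for v in values]
        total = sum(fits)
        for _ in range(n_sources):                 # onlooker bee phase
            r, acc, i = rng.uniform(0, total), 0.0, 0
            for i, ft in enumerate(fits):          # roulette-wheel choice
                acc += ft
                if acc >= r:
                    break
            try_neighbour(i)
        worst = max(range(n_sources), key=trials.__getitem__)
        if trials[worst] > limit:                  # scout bee phase
            sources[worst] = new()
            values[worst] = f(sources[worst])
            trials[worst] = 0
    best = min(range(n_sources), key=values.__getitem__)
    return sources[best], values[best]

sphere = lambda x: sum(c * c for c in x)
x_best, f_best = abc_minimize(sphere, dim=2, lo=-5.0, hi=5.0)
```

    The onlooker phase is where the information exchange mentioned in the abstract happens: better food sources attract proportionally more refinement effort.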

  13. Morphology of Filamentous Fungi: Linking Cellular Biology to Process Engineering Using Aspergillus niger

    Science.gov (United States)

    Krull, Rainer; Cordes, Christiana; Horn, Harald; Kampen, Ingo; Kwade, Arno; Neu, Thomas R.; Nörtemann, Bernd

    In various biotechnological processes, filamentous fungi, e.g. Aspergillus niger, are widely applied for the production of high value-added products due to their secretion efficiency. There is, however, a tangled relationship between the morphology of these microorganisms, the transport phenomena and the related productivity. The morphological characteristics vary between freely dispersed mycelia and distinct pellets of aggregated biomass, so the advantages and disadvantages of mycelial versus pellet cultivation have to be balanced carefully. Due to this inadequate understanding of the morphogenesis of filamentous microorganisms, fungal morphology, along with reproducibility of inocula of the same quality, is often a bottleneck of productivity in industrial production. To optimise the production process, it is of great importance to gain a better understanding of the molecular and cell biology of these microorganisms, as well as the approaches of biochemical engineering and particle technology, in particular to characterise the interactions between growth conditions, cell morphology, spore-hyphae interactions and product formation. Advances in particle and image analysis techniques, as well as micromechanical devices, and their applications to fungal cultivations have made available quantitative morphological data on filamentous cells. This chapter presents the ambitious aspects of this line of research, focussing on the control and characterisation of morphology, transport gradients and approaches to understanding the metabolism of filamentous fungi. Based on these data, bottlenecks in the morphogenesis of A. niger within the complex production pathways from gene to product should be identified, and this may improve the production yield.

  14. Morphology of filamentous fungi: linking cellular biology to process engineering using Aspergillus niger.

    Science.gov (United States)

    Krull, Rainer; Cordes, Christiana; Horn, Harald; Kampen, Ingo; Kwade, Arno; Neu, Thomas R; Nörtemann, Bernd

    2010-01-01

    In various biotechnological processes, filamentous fungi, e.g. Aspergillus niger, are widely applied for the production of high value-added products due to their secretion efficiency. There is, however, a tangled relationship between the morphology of these microorganisms, the transport phenomena and the related productivity. The morphological characteristics vary between freely dispersed mycelia and distinct pellets of aggregated biomass, so the advantages and disadvantages of mycelial versus pellet cultivation have to be balanced carefully. Due to this inadequate understanding of the morphogenesis of filamentous microorganisms, fungal morphology, along with reproducibility of inocula of the same quality, is often a bottleneck of productivity in industrial production. To optimise the production process, it is of great importance to gain a better understanding of the molecular and cell biology of these microorganisms, as well as the approaches of biochemical engineering and particle technology, in particular to characterise the interactions between growth conditions, cell morphology, spore-hyphae interactions and product formation. Advances in particle and image analysis techniques, as well as micromechanical devices, and their applications to fungal cultivations have made available quantitative morphological data on filamentous cells. This chapter presents the ambitious aspects of this line of research, focussing on the control and characterisation of morphology, transport gradients and approaches to understanding the metabolism of filamentous fungi. Based on these data, bottlenecks in the morphogenesis of A. niger within the complex production pathways from gene to product should be identified, and this may improve the production yield.

  15. Cellular and molecular mechanisms activating the cell death processes by chalcones: Critical structural effects.

    Science.gov (United States)

    Champelovier, Pierre; Chauchet, Xavier; Hazane-Puch, Florence; Vergnaud, Sabrina; Garrel, Catherine; Laporte, François; Boutonnat, Jean; Boumendjel, Ahcène

    2013-12-01

    Chalcones are naturally occurring compounds with diverse pharmacological activities, derived from the common structure 1,3-diphenylpropenone. The present study aims to better understand the mechanistic pathways triggering the anticancer effects of chalcones and to provide evidence that minor structural differences can lead to important differences in mechanism. We selected two recently investigated chalcones (A and B) and investigated them on glioblastoma cell lines. Chalcone A induced an apoptotic process (type I PCD) via the activation of caspase-3, -8 and -9; it also increased CDK1/cyclin B ratios and decreased the mitochondrial transmembrane potential (ΔΨm). Chalcone B induced an autophagic cell death process (type II PCD), ROS-related but independent of both caspases and protein synthesis. Both chalcones increased Bax/Bcl2 ratios and decreased Ki67 and CD71 antigen expression. The present investigation reveals that despite the close structures of chalcones A and B, significant differences in their mechanisms of action were found.

  16. A cellular automata model for social-learning processes in a classroom context

    Science.gov (United States)

    Bordogna, C. M.; Albano, E. V.

    2002-02-01

    A model for teaching-learning processes that take place in the classroom is proposed and simulated numerically. Recent ideas taken from the fields of sociology, educational psychology, statistical physics and computational science are key ingredients of the model. Results of simulations are consistent with well-established empirical results obtained in classrooms by means of different evaluation tools. It is shown that students engaged in collaborative groupwork reach higher achievements than those attending traditional lectures only. However, in many cases, this difference is subtle and consequently very difficult to be detected using tests. The influence of the number of students forming the collaborative groups on the average knowledge achieved is also studied and discussed.

  17. Cellular Aspects of Shigella Pathogenesis: Focus on the Manipulation of Host Cell Processes.

    Science.gov (United States)

    Killackey, Samuel A; Sorbara, Matthew T; Girardin, Stephen E

    2016-01-01

    Shigella is a Gram-negative bacterium that is responsible for shigellosis. Over the years, the study of Shigella has provided a greater understanding of how the host responds to bacterial infection, and how bacteria have evolved to effectively counter the host defenses. In this review, we provide an update on some of the most recent advances in our understanding of pivotal processes associated with Shigella infection, including invasion into host cells, the metabolic changes that occur within the bacterium and the infected cell, cell-to-cell spread mechanisms, autophagy and membrane trafficking, inflammatory signaling, and cell death. This recent progress sheds new light on the mechanisms underlying Shigella pathogenesis and, more generally, provides a deeper understanding of the complex interplay between host cells and bacterial pathogens.

  18. Integrated and Modular Design of an Optimized Process Architecture

    Directory of Open Access Journals (Sweden)

    Colin Raßfeld

    2013-07-01

    Full Text Available Global economic integration has increased the complexity of business activities, so organizations are forced to become more efficient every day. Process organization is a very useful way of aligning organizational systems with business processes. However, an organization must do more than just focus its attention and efforts on processes; the layout design also has a significant impact on system performance. We contribute to this field by developing a tailored process-oriented organizational structure and a new layout design for the quality assurance function of a leading German automotive manufacturer. The target concept we developed was evaluated by process owners and an IT-based process simulation. Our results provide solid empirical backing, in which the performance and effects are assessed from both a qualitative and a quantitative perspective.

  19. Optimal Process Design of Shrinkage and Sink Marks in Injection Molding

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    The optimal process conditions for an injection-molded polypropylene dustpan were investigated to improve part quality. A fractional factorial experiment was employed to screen the significant factors and main combinations among the numerous process parameters. Then, with consideration of interaction effects, an L27 orthogonal array based on the Taguchi method was used to determine the optimal process conditions. The results indicate that the melt temperature has the most remarkable influence on both the volume shrinkage and sink mark criterion weights, but the optimal process conditions and the order of influence differ for the two criterion weights.

  20. Optimal operation of integrated processes. Studies on heat recovery systems

    Energy Technology Data Exchange (ETDEWEB)

    Glemmestad, Bjoern

    1997-12-31

    Separators, reactors and a heat exchanger network (HEN) for heat recovery are important parts of an integrated plant. This thesis deals with the operation of HENs, in particular optimal operation. The purpose of heat integration is to save energy, but the HEN also introduces new interactions and feedback into the overall plant. A prerequisite for optimisation is that there are extra degrees of freedom left after regulatory control is implemented. It is shown that extra degrees of freedom may not always be utilized for energy optimisation, and a quantitative expression for the degrees of freedom that can be so utilized is presented. A simplified expression that is often valid is also deduced. The thesis presents some improvements and generalisations of a structure-based method that has been proposed earlier. Structural information is used to divide possible manipulations into three categories depending on how each manipulation affects the utility consumption. By means of these categories and two heuristic rules for operability, the possible manipulations are ordered in a priority table. This table is used to determine which manipulation should be preferred and which manipulation should be selected if an active manipulation is saturated. It is shown that the method may correspond to split-range control. A method that uses parametric information in addition to structural information is proposed; in this method, the optimal control structure is found by solving an integer programming problem. The thesis also proposes a method that combines the use of steady-state optimisation and optimal selection of measurements. 86 refs., 46 figs., 8 tabs.

  1. Optimal cognitive transmission exploiting redundancy in the primary ARQ process

    DEFF Research Database (Denmark)

    Michelusi, Nicholo; Simeone, Osvaldo; Levorato, Marco

    2011-01-01

    interference to the reception of the PM at the Primary Receiver (PR) and SR. Such interference may induce retransmissions of the same PM, which plays to the advantage of the secondary user, while at the same time making decoding of the PM more difficult also at the SR and reducing the available margin...... on the given interference constraint at the PR. It is proved that the optimal secondary strategy prioritizes transmissions in the states where the PM is known to the SR, due to the ability of the latter to perform interference mitigation and obtain a larger secondary throughput. Moreover, when the primary...

  2. Energy supply chain optimization of hybrid feedstock processes: a review.

    Science.gov (United States)

    Elia, Josephine A; Floudas, Christodoulos A

    2014-01-01

    The economic, environmental, and social performances of energy systems depend on their geographical locations and the surrounding market infrastructure for feedstocks and energy products. Strategic decisions to locate energy conversion facilities must take all upstream and downstream operations into account, prompting the development of supply chain modeling and optimization methods. This article reviews the contributions of energy supply chain studies that include heat, power, and liquid fuels production. Studies are categorized based on specific features of the mathematical model, highlighting those that address energy supply chain models with and without considerations of multiperiod decisions. Studies that incorporate uncertainties are discussed, and opportunities for future research developments are outlined.

  3. SIMULATION AS A TOOL FOR PROCESS OPTIMIZATION OF LOGISTIC SYSTEMS

    Directory of Open Access Journals (Sweden)

    Radko Popovič

    2015-09-01

    Full Text Available The paper deals with the simulation of production processes using the Tecnomatix Process Simulate module of Siemens software. Tecnomatix Process Simulate is designed for building new or modifying existing production processes. A simulation created in this software allows fast testing of planned changes or improvements to the production processes. On the basis of the simulation, one can envision the future state of the real production system. A 3D simulation can reflect the actual status and conditions of the running system and, after some improvements, it can show a possible configuration of the production system.

  4. A Novel Method for Assessing and Optimizing Software Project Process Based Risk Control

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    A new approach for assessing and optimizing the software project process based on software risk control is presented, which evaluates and optimizes the software project process from the viewpoint of controlling software project risks. A model for optimizing software risk control is given, a discrete optimization algorithm based on dynamic programming is proposed, and an example of using the above method to solve a problem is included in this paper. By turning the old passive post-project control into active, effective pre-action, this new method can greatly increase the probability of success of software projects.
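
    A discrete dynamic-programming step of this kind can be sketched as a 0/1 selection of risk-mitigation actions under a budget, a knapsack-style recurrence. The costs and risk reductions below are hypothetical, not the model from the paper:

```python
# Dynamic-programming sketch for risk-control planning: choose
# mitigation actions under a budget so total risk reduction is maximal
# (0/1 knapsack recurrence; costs and reductions are hypothetical).

def plan_mitigations(costs, reductions, budget):
    n = len(costs)
    best = [[0] * (budget + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for b in range(budget + 1):
            best[i][b] = best[i - 1][b]           # skip action i-1
            if costs[i - 1] <= b:                 # or take it if affordable
                best[i][b] = max(best[i][b],
                                 best[i - 1][b - costs[i - 1]] + reductions[i - 1])
    chosen, b = [], budget                        # backtrack chosen actions
    for i in range(n, 0, -1):
        if best[i][b] != best[i - 1][b]:
            chosen.append(i - 1)
            b -= costs[i - 1]
    return best[n][budget], sorted(chosen)

value, actions = plan_mitigations(costs=[3, 2, 4], reductions=[5, 3, 6], budget=5)
print(value, actions)  # -> 8 [0, 1]
```

    The pre-action flavor of the method corresponds to running such an optimization before the project starts rather than reacting after risks materialize.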

  5. Processing and characterization of multi-cellular monolithic bioceramics for bone regenerative scaffolds

    Science.gov (United States)

    Ari-Wahjoedi, Bambang; Ginta, Turnad Lenggo; Parman, Setyamartana; Abustaman, Mohd Zikri Ahmad

    2014-10-01

    A multicellular monolithic ceramic body is a ceramic material which has many gas or liquid passages partitioned by thin walls throughout the bulk material. There are many currently known advanced industrial applications of multicellular ceramic structures, i.e. as supports for various catalysts, electrode support structures for solid oxide fuel cells, refractories, electric/electronic materials, aerospace vehicle re-entry heat shields and biomaterials for dental as well as orthopaedic implants, to name only a few. Multicellular ceramic bodies are usually made of ceramic phases such as mullite, cordierite, aluminum titanate or pure oxides such as silica, zirconia and alumina. What makes alumina ceramics excellent for the above functions are the intrinsic properties of alumina: it is hard, wear resistant and biocompatible, has excellent dielectric properties, resists strong acid and alkali attack at elevated temperatures, and offers good thermal conductivity as well as high strength and stiffness. In this work the processing technology leading to truly multicellular monolithic alumina ceramic bodies, and their characterization, are reported. A ceramic slip with 66 wt.% solid loading was found to be the optimum impregnant for the polyurethane foam template. A mullitic ceramic composite of alumina-sodium alumino disilicate-leucite-like phases was obtained, with bulk and true densities of 0.852 and 1.241 g cm⁻³ respectively, a pore linear density of ±35 cm⁻¹, linear shrinkage of 7-16% and bulk volume shrinkage of 32 vol.%. The compressive strength and elastic modulus of the bioceramics are ≈0.5-1.0 and ≈20 MPa respectively.

  6. Optimization of frozen wild blueberry vacuum drying process

    Directory of Open Access Journals (Sweden)

    Šumić Zdravko M.

    2015-01-01

    Full Text Available The objective of this research was to optimize the vacuum drying of frozen blueberries in order to preserve health-beneficial phytochemicals, using response surface methodology. The drying was performed in a new design of vacuum dryer. The investigated range of temperature was 46-74°C and of pressure 38-464 mbar. Total solids, total phenolics, vitamin C, anthocyanin content and total color change were used as quality indicators of the dried blueberries. Within the experimental range of the studied variables, optimum conditions of 60°C and 100 mbar were established for vacuum drying of blueberries. Separate validation experiments were conducted at the optimum conditions to verify the predictions and the adequacy of the second-order polynomial models. Under these optimal conditions, the predicted amount of total phenolics was 3.70 mg CAE/100 g dw, vitamin C 59.79 mg/100 g dw, anthocyanin content 2746.33 mg/100 g dw, total solids 89.50% and total color change 88.83. [Project of the Ministry of Science of the Republic of Serbia, no. TR 31044]

  7. Optimization of process condition of nanosilica production by hydrothermal method

    Science.gov (United States)

    Qisti, N.; Indrasti, N. S.; Suprihatin

    2016-11-01

    Bagasse ash has a high silica content and can therefore be used in nanosilica production to increase its value. This study aimed to determine the optimum synthesis time and temperature. The hydrothermal method was used: a simple method with a relatively low reaction temperature that provides good chemical homogeneity. Synthesis times of 8, 10 and 12 hours were tested at a temperature of 150°C, but the results were not as expected. The optimization of synthesis temperature and time therefore used 4 hours at 150°C, based on previous studies. Optimization was conducted using Response Surface Methodology (RSM). A test using a Particle Size Analyzer (PSA) was then performed to obtain particle sizes and polydispersity index (PDI) values. The model predicted a synthesis temperature of 152.67°C and a synthesis time of 6 hours, with a particle size of 276.288 nm and a PDI value of 0.189642. The tests showed an actual particle size of 330.39 nm and a PDI value of 0.3580; the actual and predicted results were not significantly different.

  8. Optimization of Process Variables by Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    Kashipeta Ravinder

    2015-12-01

    Full Text Available In the present study, optimisation of the growth medium for the production of cyclodextrin glucanotransferase (CGTase) was carried out using response surface methodology. Four important parameters, namely starch, yeast extract, K2HPO4 and MgSO4 concentrations, were selected as the independent variables, and the enzyme activity (CGTase activity, U/mL) was the dependent response variable. Each of these independent variables was studied at five different levels as per a central composite design (CCD) in four variables, with a total of 28 experimental runs. The calculated optimal values of the tested variables for maximal CGTase production were: starch, 2.16%; yeast extract, 0.6%; K2HPO4, 0.62%; MgSO4, 0.04%, with a predicted CGTase activity of 150 U/mL. These predicted optimal parameters were tested in the laboratory, and the final CGTase activity obtained, 148.2 U/mL, was very close to the predicted value.
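    The CCD/RSM workflow described above — fit a second-order polynomial model to designed experiments, then locate its stationary point — can be sketched as follows. This is an illustrative two-factor version on synthetic data (a 3×3 factorial grid rather than the study's four-factor CCD), not the paper's analysis:

    ```python
    import numpy as np

    # Synthetic "experiments": responses generated from a known quadratic plus noise,
    # standing in for measured enzyme activities at coded factor levels.
    rng = np.random.default_rng(0)
    X = np.array([[x1, x2] for x1 in (-1, 0, 1) for x2 in (-1, 0, 1)], float)
    true_response = lambda x: 150 - 5 * (x[:, 0] - 0.3) ** 2 - 8 * (x[:, 1] + 0.2) ** 2
    y = true_response(X) + rng.normal(0, 0.1, len(X))

    # Design matrix for the second-order polynomial model:
    # y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
    A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                         X[:, 0] ** 2, X[:, 1] ** 2, X[:, 0] * X[:, 1]])
    b0, b1, b2, b11, b22, b12 = np.linalg.lstsq(A, y, rcond=None)[0]

    # Stationary point of the fitted surface: solve grad = 0 (a 2x2 linear system)
    H = np.array([[2 * b11, b12], [b12, 2 * b22]])
    opt = np.linalg.solve(H, [-b1, -b2])
    ```

    For this synthetic surface the recovered optimum lies near the true maximum at (0.3, −0.2) in coded units; a real study would decode these back to concentrations.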

  9. Rate-optimal Bayesian intensity smoothing for inhomogeneous Poisson processes

    NARCIS (Netherlands)

    E. Belitser; P. Serra; H. van Zanten

    2015-01-01

    We apply nonparametric Bayesian methods to study the problem of estimating the intensity function of an inhomogeneous Poisson process. To motivate our results we start by analyzing count data coming from a call center which we model as a Poisson process. This analysis is carried out using a certain

  10. Oxidative stress affects FET proteins localization and alternative pre-mRNA processing in cellular models of ALS.

    Science.gov (United States)

    Svetoni, Francesca; Caporossi, Daniela; Paronetto, Maria Paola

    2014-10-01

    FUS/TLS, EWS and TAF15 are members of the FET family of DNA- and RNA-binding proteins, involved in multiple steps of DNA and RNA processing and implicated in the regulation of gene expression and cell signaling. All members of the FET family contribute to human pathologies, as they are involved in sarcoma translocations and neurodegenerative diseases. Mutations in the FUS/TLS, EWSR1 and TAF15 genes cause amyotrophic lateral sclerosis (ALS), a fatal human neurodegenerative disease that affects primarily motor neurons and is characterized by the progressive loss of motor neurons and degradation of the neuromuscular junctions. ALS-associated FET mutations cause FET protein relocalization into cytoplasmic aggregates, thus impairing their normal function. Protein aggregation has been suggested as a co-opting factor during disease pathogenesis. Cytoplasmic mislocalization of FET proteins contributes to the formation of cytoplasmic aggregates that may alter RNA processing and initiate motor neuron degeneration. Interestingly, oxidative stress, which is implicated in the pathogenesis of ALS, triggers the accumulation of mutant FUS in cytoplasmic stress granules, where it binds and sequesters wild-type FUS. In order to evaluate the role of FET proteins in ALS and their involvement in the response to oxidative stress, we have developed cellular models of ALS expressing ALS-related FET mutants in neuroblastoma cell lines. Upon treatment with sodium arsenite, cells were analysed by immunofluorescence to monitor the localization of wild-type and mutated FET proteins. Furthermore, we have characterized signal transduction pathways and cell survival upon oxidative stress in our cellular models of ALS. Interestingly, we found that EWS mutant proteins display a different localization from FUS mutants, and neither wild-type nor mutated EWS protein translocates into stress granules upon oxidative stress treatment. 
Collectively, our data provide a new link between the oxidative stress

  11. Effects of phenothiazine-structured compounds on APP processing in Alzheimer's disease cellular model.

    Science.gov (United States)

    Yuksel, Melike; Biberoglu, Kevser; Onder, Seda; Akbulut, K Gonca; Tacal, Ozden

    2017-07-01

    The excess accumulation of amyloid-β (Aβ) peptides, derived from the sequential cleavage of amyloid precursor protein (APP) by secretases, is one of the key toxic events leading to neuronal loss in Alzheimer's disease (AD). Studies have shown that cholinergic activity may also be involved in the regulation of APP metabolism. In the current study, we have investigated the roles of toluidine blue O (TBO) and thionine (TH), newly recognized phenothiazine-derived cholinesterase inhibitors, in the metabolism of APP in Chinese hamster ovary cells stably expressing human APP751 and presenilin 1 (PS70 cells). We assessed the effects of both compounds on the levels of Aβ, soluble APP-α (sAPPα), intracellular APP and β-site APP-cleaving enzyme 1 (BACE1). After treatment of PS70 cells with TBO or TH, neither of which affected cell viability, the levels of secreted Aβ40, Aβ42 and sAPPα were assayed by specific sandwich ELISAs, while APP and BACE1 in cell lysates were analyzed using Western blot. The secreted Aβ40, Aβ42 and sAPPα in TBO- and TH-treated cells were found to be reduced in a dose-dependent manner compared to vehicle-treated cells. The results suggest that TH mitigated the Aβ pathology by lowering APP levels, whereas the reduced Aβ caused by TBO treatment seems to be the outcome of both lower substrate availability and altered amyloidogenic APP processing. Taken together, our results represent the first report demonstrating that TBO and TH can affect amyloid metabolism in vitro. Copyright © 2017 Elsevier B.V. and Société Française de Biochimie et Biologie Moléculaire (SFBBM). All rights reserved.

  12. Processing and characterization of multi-cellular monolithic bioceramics for bone regenerative scaffolds

    Energy Technology Data Exchange (ETDEWEB)

    Ari-Wahjoedi, Bambang, E-mail: bambang-ariwahjoedi@petronas.com.my [Department of Fundamental and Applied Sciences, Universiti Teknologi PETRONAS, Bandar Seri Iskandar, 31750 Tronoh, Perak Darul Ridzuan (Malaysia); Centre for Intelligent Signal and Imaging Research, Universiti Teknologi PETRONAS, Bandar Seri Iskandar (Malaysia); Ginta, Turnad Lenggo [Department of Mechanical Engineering, Universiti Teknologi PETRONAS, Bandar Seri Iskandar, 31750 Tronoh, Perak Darul Ridzuan (Malaysia); Centre for Intelligent Signal and Imaging Research, Universiti Teknologi PETRONAS, Bandar Seri Iskandar, 31750 Tronoh (Malaysia); Parman, Setyamartana [Department of Mechanical Engineering, Universiti Teknologi PETRONAS, Bandar Seri Iskandar, 31750 Tronoh, Perak Darul Ridzuan (Malaysia); Abustaman, Mohd Zikri Ahmad [Kebabangan Petroleum Operating Company Sdn Bhd, Lvl. 52, Tower 2, PETRONAS Twin Towers, KLCC, 50088 Kuala Lumpur (Malaysia)

    2014-10-24

    A multicellular monolithic ceramic body is a ceramic material which has many gas or liquid passages partitioned by thin walls throughout the bulk material. There are many currently known advanced industrial applications of multicellular ceramic structures, i.e. as supports for various catalysts, electrode support structures for solid oxide fuel cells, refractories, electric/electronic materials, aerospace vehicle re-entry heat shields and biomaterials for dental as well as orthopaedic implants, to name only a few. Multicellular ceramic bodies are usually made of ceramic phases such as mullite, cordierite, aluminum titanate or pure oxides such as silica, zirconia and alumina. What makes alumina ceramics excellent for the above functions are the intrinsic properties of alumina: it is hard, wear resistant and biocompatible, has excellent dielectric properties, resists strong acid and alkali attack at elevated temperatures, and offers good thermal conductivity as well as high strength and stiffness. In this work the processing technology leading to truly multicellular monolithic alumina ceramic bodies, and their characterization, are reported. A ceramic slip with 66 wt.% solid loading was found to be the optimum impregnant for the polyurethane foam template. A mullitic ceramic composite of alumina-sodium alumino disilicate-leucite-like phases was obtained, with bulk and true densities of 0.852 and 1.241 g cm⁻³ respectively, a pore linear density of ±35 cm⁻¹, linear shrinkage of 7-16% and bulk volume shrinkage of 32 vol.%. The compressive strength and elastic modulus of the bioceramics are ≈0.5-1.0 and ≈20 MPa respectively.

  13. Importance of the alternative oxidase (AOX) pathway in regulating cellular redox and ROS homeostasis to optimize photosynthesis during restriction of the cytochrome oxidase pathway in Arabidopsis thaliana.

    Science.gov (United States)

    Vishwakarma, Abhaypratap; Tetali, Sarada Devi; Selinski, Jennifer; Scheibe, Renate; Padmasree, Kollipara

    2015-09-01

    The importance of the alternative oxidase (AOX) pathway, particularly AOX1A, in optimizing photosynthesis during de-etiolation, under elevated CO2, low temperature, high light or combined light and drought stress is well documented. In the present study, the role of AOX1A in optimizing photosynthesis was investigated when electron transport through the cytochrome c oxidase (COX) pathway was restricted at complex III. Leaf discs of wild-type (WT) and aox1a knock-out mutants of Arabidopsis thaliana were treated with antimycin A (AA) under growth-light conditions. To identify the impact of AOX1A deficiency in optimizing photosynthesis, respiratory O2 uptake and photosynthesis-related parameters were measured along with changes in redox couples, reactive oxygen species (ROS), lipid peroxidation and expression levels of genes related to respiration, the malate valve and the antioxidative system. In the absence of AA, aox1a knock-out mutants did not show any difference in physiological, biochemical or molecular parameters compared with WT. However, after AA treatment, aox1a plants showed a significant reduction in both respiratory O2 uptake and NaHCO3-dependent O2 evolution. Chlorophyll fluorescence and P700 studies revealed that in contrast to WT, aox1a knock-out plants were incapable of maintaining electron flow in the chloroplastic electron transport chain, and thereby inefficient heat dissipation (low non-photochemical quenching) was observed. Furthermore, aox1a mutants exhibited significant disturbances in cellular redox couples of NAD(P)H and ascorbate (Asc) and consequently accumulation of ROS and malondialdehyde (MDA) content. By contrast, WT plants showed a significant increase in transcript levels of CSD1, CAT1, sAPX, COX15 and AOX1A in contrast to aox1a mutants. These results suggest that AOX1A plays a significant role in sustaining the chloroplastic redox state and energization to optimize photosynthesis by regulating cellular redox homeostasis and ROS

  14. Probabilistic safety assessment and optimal control of hazardous technological systems. A marked point process approach

    Energy Technology Data Exchange (ETDEWEB)

    Holmberg, J. [VTT Automation, Espoo (Finland)

    1997-04-01

    The thesis models risk management as an optimal control problem for a stochastic process. The approach classifies the decisions made by management into three categories according to the control methods of a point process: (1) planned process lifetime, (2) modification of the design, and (3) operational decisions. The approach is used for optimization of plant shutdown criteria and surveillance test strategies of a hypothetical nuclear power plant. 62 refs. The thesis also includes five previous publications by the author.

  15. The optimization of electrocoagulation process for treatment of the textile wastewater by Response surface Methodology (RSM

    Directory of Open Access Journals (Sweden)

    Samaneh Ghodrati

    2014-10-01

    Conclusion: The experimental results indicated that the EC process is an efficient and promising process for the decolorization and COD removal of textile effluents. Under the optimized conditions, the experimental values had a good correlation with the predicted ones, indicating suitability of the model and the success of the RSM in optimizing the conditions of EC process in treating the textile wastewater with maximum removals of color and COD under selected conditions of independent variables.

  16. Process Optimization for Injection Moulding of Passive Microwave Components

    DEFF Research Database (Denmark)

    Scholz, Steffen G.; Mueller, Tobias; Santos Machado, Leonardo

    2016-01-01

    utilizing the injection moulding process. A design of experiment study has been carried out, varying process parameters such as injection speed, holding pressure, and mould and barrel temperature. The replicated parts are characterized by measuring geometric features and the part weight. Using an evaluation...... algorithm for modelling, the influence of different moulding parameters on the final part quality was assessed. First a process model and then a quality model were calculated. The results show that part quality can be controlled by monitoring characteristic numbers....

  17. Optimize APC revenue with process-oriented assessments.

    Science.gov (United States)

    Kovar, Michael S; Lowery, Elizabeth

    2002-04-01

    To maintain or improve revenue streams under the Medicare outpatient prospective payment system (PPS), healthcare financial managers should use a process-oriented approach to assess the effectiveness of revenue capture in departments most affected by the PPS's use of ambulatory patient classifications, which typically include the radiology, cardiology, and emergency departments. Such an assessment should be conducted by a multidisciplinary team with senior management support. The team ideally should include the CFO, COO, and leaders from the departments to be assessed. Such an assessment process should consist of five basic phases: chargemaster/charge-capture analysis, revenue-capture process assessment, claims review, development of implementation strategies, and monitoring.

  18. How landscape heterogeneity frames optimal diffusivity in searching processes.

    Directory of Open Access Journals (Sweden)

    E P Raposo

    2011-11-01

    Full Text Available Theoretical and empirical investigations of search strategies typically have failed to distinguish the distinct roles played by density versus patchiness of resources. It is well known that motility and diffusivity of organisms often increase in environments with low density of resources, but thus far there has been little progress in understanding the specific role of landscape heterogeneity and disorder on random, non-oriented motility. Here we address the general question of how the landscape heterogeneity affects the efficiency of encounter interactions under global constant density of scarce resources. We unveil the key mechanism coupling the landscape structure with optimal search diffusivity. In particular, our main result leads to an empirically testable prediction: enhanced diffusivity (including superdiffusive searches, with shift in the diffusion exponent, favors the success of target encounters in heterogeneous landscapes.

  19. Optimal cognitive transmission exploiting redundancy in the primary ARQ process

    DEFF Research Database (Denmark)

    Michelusi, Nicholo; Simeone, Osvaldo; Levorato, Marco

    2011-01-01

    transmissions to the SU. We investigate secondary transmission policies that take advantage of this redundancy. The basic idea is that, if a Secondary Receiver (SR) learns the Primary Message (PM) in a given primary retransmission, then it can use this knowledge to cancel the primary interference...... in the subsequent slots in case of primary retransmissions, thus achieving a larger secondary throughput. This gives rise to interesting trade-offs in the design of the secondary policy. In fact, on the one hand, a secondary transmission potentially increases the secondary throughput but, on the other, causes...... on the given interference constraint at the PR. It is proved that the optimal secondary strategy prioritizes transmissions in the states where the PM is known to the SR, due to the ability of the latter to perform interference mitigation and obtain a larger secondary throughput. Moreover, when the primary...

  20. Control and optimization system and method for chemical looping processes

    Science.gov (United States)

    Lou, Xinsheng; Joshi, Abhinaya; Lei, Hao

    2015-02-17

    A control system for optimizing a chemical loop system includes one or more sensors for measuring one or more parameters in a chemical loop. The sensors are disposed on or in a conduit positioned in the chemical loop. The sensors generate one or more data signals representative of an amount of solids in the conduit. The control system includes a data acquisition system in communication with the sensors and a controller in communication with the data acquisition system. The data acquisition system receives the data signals and the controller generates the control signals. The controller is in communication with one or more valves positioned in the chemical loop. The valves are configured to regulate a flow of the solids through the chemical loop.

  1. Maximum power point tracking for optimizing energy harvesting process

    Science.gov (United States)

    Akbari, S.; Thang, P. C.; Veselov, D. S.

    2016-10-01

    There has been a growing interest in using energy harvesting techniques to power wireless sensor networks. The motivation for this technology is the sensors' limited operation time, which results from the finite capacity of their batteries, together with the need for a stable power supply in some applications. Energy can be harvested from the sun, wind, vibration, heat, etc. It is reasonable to develop multi-source energy harvesting platforms to increase the amount of harvested energy and to mitigate the intermittent nature of ambient sources. In the context of solar energy harvesting, algorithms can be developed for finding the optimal operating point of solar panels at which maximum power is generated. These algorithms are known as maximum power point tracking techniques. In this article, we review the concept of maximum power point tracking and provide an overview of the research conducted in this area for wireless sensor network applications.
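    A common maximum power point tracking technique of the kind surveyed above is perturb-and-observe: repeatedly perturb the operating voltage and keep moving in whichever direction increases power. A minimal sketch follows, with a made-up concave power curve standing in for a real PV characteristic:

    ```python
    # Illustrative PV panel model: a concave power-vs-voltage curve peaking
    # at 60 W around 17 V (invented numbers, not a real panel datasheet).
    def panel_power(v):
        return max(0.0, -0.5 * (v - 17.0) ** 2 + 60.0)

    def mppt_perturb_observe(v=12.0, step=0.2, iters=200):
        """Perturb-and-observe MPPT: step the voltage, reverse when power drops."""
        p_prev = panel_power(v)
        direction = 1.0
        for _ in range(iters):
            v += direction * step
            p = panel_power(v)
            if p < p_prev:              # power dropped: reverse the perturbation
                direction = -direction
            p_prev = p
        return v
    ```

    Starting from 12 V, the tracker climbs to the maximum power point and then oscillates around it within one perturbation step, which is the characteristic steady-state behaviour (and limitation) of perturb-and-observe.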

  2. New design method for valves internals, to optimize process

    Energy Technology Data Exchange (ETDEWEB)

    Jorge, Leonardo [PDVSA (Venezuela)

    2011-07-01

    In the heavy oil industry, various methods can be used to reduce the viscosity of oil, one of them being the injection of diluent. This method is commonly used in the Orinoco oil belt, but it requires good control of the volume of diluent injected as well as of the gas flow to optimize production; thus flow control valves need to be accurate. A new valve based on a new design method was developed with the aim of being highly reliable, and was then bench tested and compared with other commercially available valves. Results showed better repeatability, accuracy and reliability with lower maintenance for the new method. The use of this valve provides significant savings while delivering the exact amount of fluids; to date, a failure rate of less than 2% has been recorded in the field. The new design method demonstrated impressive performance, and PDVSA has decided to deploy it widely.

  3. Optimization of process variables for the microbial degradation of ...

    African Journals Online (AJOL)

    STORAGESEVER

    2008-07-18

    Jul 18, 2008 ... The optimum process conditions for maximizing phenol degradation (removal) were recognized as ... 1997) and as such waste waters generated from these industrial ... foaming and led to solvent and cell losses. In this work, ...

  4. Optimized Laplacian image sharpening algorithm based on graphic processing unit

    Science.gov (United States)

    Ma, Tinghuai; Li, Lu; Ji, Sai; Wang, Xin; Tian, Yuan; Al-Dhelaan, Abdullah; Al-Rodhaan, Mznah

    2014-12-01

    In classical Laplacian image sharpening, all pixels are processed one by one, which leads to a large amount of computation. Traditional Laplacian sharpening on a CPU is considerably time-consuming, especially for large pictures. In this paper, we propose a parallel implementation of Laplacian sharpening based on the Compute Unified Device Architecture (CUDA), a computing platform for Graphic Processing Units (GPUs), and analyze the impact of picture size on performance as well as the relationship between data transfer time and parallel computing time. Further, according to the different characteristics of the different memory types, an improved scheme is developed which exploits shared memory on the GPU instead of global memory and further increases efficiency. Experimental results prove that the two novel algorithms outperform the traditional sequential method based on OpenCV in terms of computing speed.
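    The per-pixel Laplacian sharpening kernel that the paper parallelizes on CUDA can be expressed as a whole-array operation; below, NumPy vectorization stands in for the GPU threads (an illustration of the technique, not the authors' implementation):

    ```python
    import numpy as np

    def laplacian_sharpen(img, alpha=1.0):
        """Sharpen via the 4-neighbour Laplacian kernel [[0,1,0],[1,-4,1],[0,1,0]]."""
        img = img.astype(np.float64)
        # replicate border pixels so the output keeps the input shape
        padded = np.pad(img, 1, mode="edge")
        # Laplacian: sum of the four neighbours minus 4x the centre pixel
        lap = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
               padded[1:-1, :-2] + padded[1:-1, 2:] - 4.0 * img)
        # subtracting the Laplacian boosts edges; clip back to 8-bit range
        return np.clip(img - alpha * lap, 0, 255)
    ```

    On a flat region the Laplacian is zero and the image is unchanged; near edges the operator increases local contrast, which is the sharpening effect.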

  5. Optimization of Wireless Transceivers under Processing Energy Constraints

    Science.gov (United States)

    Wang, Gaojian; Ascheid, Gerd; Wang, Yanlu; Hanay, Oner; Negra, Renato; Herrmann, Matthias; Wehn, Norbert

    2017-08-01

    Focus of the article is on achieving maximum data rates under a processing energy constraint. For a given amount of processing energy per information bit, the overall power consumption increases with the data rate. When targeting data rates beyond 100 Gb/s, the system's overall power consumption soon exceeds the power which can be dissipated without forced cooling. To achieve a maximum data rate under this power constraint, the processing energy per information bit must be minimized. Therefore, in this article, suitable processing efficient transmission schemes together with energy efficient architectures and their implementations are investigated in a true cross-layer approach. Target use cases are short range wireless transmitters working at carrier frequencies around 60 GHz and bandwidths between 1 GHz and 10 GHz.

  6. Bio-oil Production - Process Optimization and Product Quality

    DEFF Research Database (Denmark)

    Hoffmann, Jessica

    and pharmaceutical products, it will become a high-cost commodity. Therefore it is of great importance to develop a sustainable and marketable process for the conversion of biomass, which is feedstock flexible and energy efficient and offers high conversion efficiency. Only a process like this has the ability......, fossil fuels still accounted for 87% of global and 81% of EU primary energy consumption. In an effort to reduce the carbon footprint of a continued supply of liquid fuels, processes utilizing biomass in general, and lignocellulosic biomass in particular, are being developed to replace their fossil...... to produce a drop-in product that is commercially compatible with conventional fuels and has the capability to endure. Furthermore, liquid biofuels will in future need to be produced in bulk to meet demand; thus, the challenge becomes one of finding the right process with high feedstock flexibility. One...

  7. OPTIMIZATION OF INJECTION MOLDING PROCESS BASED ON NUMERICAL SIMULATIONAND BP NEURAL NETWORKS

    Institute of Scientific and Technical Information of China (English)

    王玉; 邢渊; 阮雪榆

    2001-01-01

    Plastic injection molding is a very complex process, and its process planning has a direct influence on product quality and production efficiency. This paper studied the optimization of the injection molding process by combining numerical simulation with back-propagation (BP) networks. The BP networks are trained on the results of the numerical simulation. The trained BP networks may: (1) shorten the time needed for process planning; (2) optimize process parameters; (3) be employed in on-line quality control; and (4) be integrated with a knowledge-based system (KBS) and case-based reasoning (CBR) to enable intelligent process planning of injection molding.

  8. Airport Logistics : Modeling and Optimizing the Turn-Around Process

    OpenAIRE

    Norin, Anna

    2008-01-01

    The focus of this licentiate thesis is air transportation and especially the logistics at an airport. The concept of airport logistics is investigated based on the following definition: Airport logistics is the planning and control of all resources and information that create a value for the customers utilizing the airport. As a part of the investigation, indicators for airport performance are considered. One of the most complex airport processes is the turn-around process. The turn-around is...

  9. Optimization of Energy and Exergy Consumption in MEG Regeneration Processes

    OpenAIRE

    Billington, Henrik Reymert

    2009-01-01

    Monoethylene glycol (MEG) is commonly used for hydrate inhibition in fields that require continuous injection. Traditional processes for regeneration and reclamation of MEG require significant amounts of heat. Reclamation (salt removal) is usually done by complete evaporation of salty MEG in a flash separator under partial vacuum. Regeneration (water removal) is done by distillation. Heat integration in current processes is limited. The oil and gas industry is heading towards energy systems b...

  10. Influence of processing conditions on strut structure and compressive properties of cellular lattice structures fabricated by selective laser melting

    Energy Technology Data Exchange (ETDEWEB)

    Qiu, Chunlei, E-mail: c.qiu@bham.ac.uk [School of Metallurgy and Materials, University of Birmingham, Edgbaston, Birmingham B15 2TT (United Kingdom); Yue, Sheng [School of Materials, University of Manchester, Manchester M13 9PL (United Kingdom); Research Complex at Harwell, Rutherford Appleton Laboratory, Didcot, Oxfordshire OX11 0FA (United Kingdom); Adkins, Nicholas J.E.; Ward, Mark; Hassanin, Hany [School of Metallurgy and Materials, University of Birmingham, Edgbaston, Birmingham B15 2TT (United Kingdom); Lee, Peter D., E-mail: peter.lee@manchester.ac.uk [School of Materials, University of Manchester, Manchester M13 9PL (United Kingdom); Research Complex at Harwell, Rutherford Appleton Laboratory, Didcot, Oxfordshire OX11 0FA (United Kingdom); Withers, Philip J., E-mail: p.j.withers@manchester.ac.uk [School of Materials, University of Manchester, Manchester M13 9PL (United Kingdom); Research Complex at Harwell, Rutherford Appleton Laboratory, Didcot, Oxfordshire OX11 0FA (United Kingdom); Attallah, Moataz M., E-mail: m.m.attallah@bham.ac.uk [School of Metallurgy and Materials, University of Birmingham, Edgbaston, Birmingham B15 2TT (United Kingdom)

    2015-03-25

    AlSi10Mg cellular lattice structures have been fabricated by selective laser melting (SLM) using a range of laser scanning speeds and powers. The as-fabricated strut size, morphology and internal porosity were investigated using optical microscopy (OM), scanning electron microscopy (SEM) and X-ray microtomography (micro-CT) and correlated to the compressive properties of the structure. Strut diameter was found to increase monotonically with laser power, while the porosity was largest at intermediate powers. Laser scanning speed was found to thicken the struts only at slow rates, while the porosity was largest at intermediate speeds. High-speed imaging showed the melt pool to be larger at high laser powers. Further, the melt pool shape was found to vary cyclically over time, steadily growing before becoming increasingly unstable and irregularly shaped, then abruptly shrinking due to splashing of molten material, after which the process repeated. Upon compressive loading, lattice deformation was homogeneous prior to the peak stress, after which the stress fell sharply due to the creation of a (one strut wide) shear band at around 45° to the compression axis. The specific yield strength, expressed as the yield stress/(yield stress of the aluminium × relative density), is not independent of processing conditions, suggesting that further improvements in properties can be achieved by process optimisation. Lattice struts failed near nodes by a mixture of ductile and brittle fracture.

  11. Optimization of injection molding process for car fender in consideration of energy efficiency and product quality

    Directory of Open Access Journals (Sweden)

    Hong Seok Park

    2014-10-01

    Full Text Available Energy efficiency is an essential consideration in sustainable manufacturing. This study presents a car-fender-based injection molding process optimization that aims to resolve the trade-off between energy consumption and product quality, with the process parameters as the optimization variables. The process is optimized by applying response surface methodology and the non-dominated sorting genetic algorithm II (NSGA-II) to solve the resulting multi-objective optimization problem. To reduce computational cost and time in the problem-solving procedure, a combination of CAE-integration tools is employed. Based on the Pareto diagram, an appropriate solution is selected to obtain the optimal parameters. The optimization results show that the proposed approach can effectively help engineers identify optimal process parameters and achieve competitive advantages in both energy consumption and product quality. In addition, the paper discusses the engineering analysis that can be employed to conduct holistic optimization of the injection molding process in order to increase energy efficiency and product quality.
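    The core step of NSGA-II, which several records in this collection rely on, is non-dominated sorting: extracting the Pareto front from a set of objective vectors. A minimal Python sketch, assuming both objectives are minimized; the (energy use, defect score) pairs are hypothetical illustrations, not data from the study:

```python
def dominates(a, b):
    """True if objective vector a dominates b (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_front(points):
    """Return the Pareto-optimal subset of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical (energy use, defect score) pairs for candidate parameter sets:
cands = [(3.0, 2.0), (2.0, 3.0), (2.5, 2.5), (3.0, 3.0), (4.0, 1.5)]
front = non_dominated_front(cands)  # (3.0, 3.0) is dominated and drops out
```

NSGA-II repeats this sorting generation by generation, using crowding distance to keep the retained solutions spread along the front.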

  12. Optimization of submerged arc welding process parameters using quasi-oppositional based Jaya algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Rao, R. Venkata; Rai, Dhiraj P. [Sardar Vallabhbhai National Institute of Technology, Gujarat (India)

    2017-05-15

    Submerged arc welding (SAW) is characterized as a multi-input process. Selecting the optimum combination of SAW process parameters is a vital task in achieving high weld quality and productivity. The objective of this work is to optimize the SAW process parameters using a simple optimization algorithm which is fast, robust and convenient. Therefore, in this work a very recently proposed optimization algorithm named the Jaya algorithm is applied to solve optimization problems in the SAW process. In addition, a modified version of the Jaya algorithm with opposition-based learning, named the “Quasi-oppositional based Jaya algorithm” (QO-Jaya), is proposed in order to improve the performance of the Jaya algorithm. Three optimization case studies are considered, and the results obtained by the Jaya and QO-Jaya algorithms are compared with those obtained by well-known optimization algorithms such as the genetic algorithm (GA), particle swarm optimization (PSO), the imperialist competitive algorithm (ICA) and teaching-learning based optimization (TLBO).
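    Apart from population size and iteration count, Jaya has no algorithm-specific control parameters: each candidate moves toward the current best solution and away from the current worst. A minimal illustrative sketch, assuming minimization; a sphere function stands in for the SAW objective functions, which are not reproduced here:

```python
import random

def jaya(obj, bounds, pop_size=20, iters=200, seed=1):
    """Minimal Jaya optimizer (minimization): x' = x + r1*(best - |x|)
    - r2*(worst - |x|), with greedy acceptance of improving moves."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(iters):
        fit = [obj(x) for x in pop]
        best = pop[fit.index(min(fit))]
        worst = pop[fit.index(max(fit))]
        for i, x in enumerate(pop):
            cand = [min(max(xj + rng.random() * (bj - abs(xj))
                            - rng.random() * (wj - abs(xj)), lo), hi)
                    for xj, bj, wj, (lo, hi) in zip(x, best, worst, bounds)]
            if obj(cand) < fit[i]:          # keep a move only if it improves
                pop[i] = cand
    return min(pop, key=obj)

sphere = lambda x: sum(v * v for v in x)    # stand-in objective
best = jaya(sphere, [(-10.0, 10.0)] * 2)
```

The QO-Jaya variant additionally evaluates quasi-opposite candidates of the population to speed up convergence.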

  13. Modeling and multi-criteria optimization of an industrial process for continuous lactic acid production.

    Science.gov (United States)

    Mokeddem, Diab; Khellaf, Abdelhafid

    2014-06-01

    The key feature of this paper is the optimization of an industrial process for continuous production of lactic acid. To this end, a two-stage fermentor process integrated with cell recycling has been mathematically modeled and optimized for overall productivity, conversion, and yield simultaneously. The non-dominated sorting genetic algorithm (NSGA-II) was applied to solve the constrained multi-objective optimization problem, as it is capable of finding multiple Pareto-optimal solutions in a single run, thereby avoiding the need to run a single-objective optimization several times. Compared with traditional methods, NSGA-II could find most of the solutions on the true Pareto front, and its application is direct and convenient. The effects of the operating variables on the optimal solutions are discussed in detail. It was observed that the two-stage system, being more efficient, can achieve higher profit at an acceptable compromise among the objectives.

  14. Optimization of a recombinant human growth hormone purification process using quality by design.

    Science.gov (United States)

    Ortiz-Enriquez, Carolina; Romero-Díaz, Alexis de Jesús; Hernández-Moreno, Ana V; Cueto-Rojas, Hugo F; Miranda-Hernández, Mariana P; López-Morales, Carlos A; Pérez, Néstor O; Salazar-Ceballos, Rodolfo; Cruz-García, Norberto; Flores-Ortiz, Luis F; Medina-Rivero, Emilio

    2016-11-16

    This work describes a strategy to optimize the downstream processing of a recombinant human growth hormone (rhGH) by incorporating a quality-by-design approach toward meeting higher quality specifications. The optimized process minimized the presence of impurities and degradation by-products during manufacturing through the establishment of in-process controls. Capillary zone electrophoresis and reversed-phase and size-exclusion chromatography were used as analytical techniques to establish new critical process parameters for the solubilization, capture, and intermediate purification steps, aiming to maintain rhGH quality by complying with pharmacopeial specifications. The results indicated that the implemented improvements allowed the optimization of the specific recovery and purification of rhGH without compromising its quality. In addition, this optimization facilitated the stringent removal of the remaining impurities in further polishing stages, as demonstrated by analysis of the obtained active pharmaceutical ingredient.

  15. Thermodynamic optimization of a Penrose process: an engineers' approach to black hole thermodynamics

    CERN Document Server

    Bravetti, Alessandro; Lopez-Monsalvo, Cesar S

    2015-01-01

    In this work we present a new view on the thermodynamics of black holes introducing effects of irreversibility by employing thermodynamic optimization and finite-time thermodynamics. These questions are of importance both in physics and in engineering, combining standard thermodynamics with optimal control theory in order to find optimal protocols and bounds for realistic processes without assuming anything about the microphysics involved. We find general bounds on the maximum work and the efficiency of thermodynamic processes involving black holes that can be derived exclusively from the knowledge of thermodynamic relations at equilibrium. Since these new bounds consider the finite duration of the processes, they are more realistic and stringent than their reversible counterparts. To illustrate our arguments, we consider in detail the thermodynamic optimization of a Penrose process, i.e. the problem of finding the least dissipative process extracting all the angular momentum from a Kerr black hole in finite ...

  16. A Bayesian optimal design for degradation tests based on the inverse Gaussian process

    Energy Technology Data Exchange (ETDEWEB)

    Peng, Weiwen; Liu, Yu; Li, Yan Feng; Zhu, Shun Peng; Huang, Hong Zhong [University of Electronic Science and Technology of China, Chengdu (China)

    2014-10-15

    The inverse Gaussian process has recently been introduced as an attractive and flexible stochastic process for degradation modeling. This process has been demonstrated to be a valuable complement to models developed on the basis of the Wiener and gamma processes. We investigate the optimal design of degradation tests on the basis of the inverse Gaussian process. In addition to an optimal design with pre-estimated planning values of the model parameters, we also address the issue of uncertainty in the planning values by using the Bayesian method. An average pre-posterior variance of reliability is used as the optimization criterion. A trade-off between sample size and number of degradation observations is investigated in the degradation test planning. The effects of priors on the optimal designs and on the value of prior information are also investigated and quantified. The degradation test planning of a GaAs laser device is performed to demonstrate the proposed method.
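    For intuition, inverse Gaussian degradation paths are easy to simulate: with a linear mean function Λ(t) = μt, independent increments over Δt follow IG(μΔt, λΔt²), so every path is strictly increasing. A sketch using NumPy's Wald (inverse Gaussian) sampler, with illustrative parameter values rather than the GaAs laser estimates:

```python
import numpy as np

def ig_degradation_paths(mu, lam, t_grid, n_paths, seed=0):
    """Simulate inverse Gaussian degradation paths with mean function
    Lambda(t) = mu*t.  Increments over dt are IG(mean=mu*dt, shape=lam*dt**2),
    so paths start at 0 and increase monotonically."""
    rng = np.random.default_rng(seed)
    dt = np.diff(t_grid)
    incs = rng.wald(mu * dt, lam * dt**2, size=(n_paths, dt.size))
    return np.hstack([np.zeros((n_paths, 1)), np.cumsum(incs, axis=1)])

t = np.linspace(0.0, 10.0, 101)
paths = ig_degradation_paths(mu=1.0, lam=4.0, t_grid=t, n_paths=2000)
# Sample mean degradation at t=10 should be close to mu*t = 10.
```

Test planning then amounts to choosing the time grid and number of units so that the (pre-posterior) variance of the reliability estimate is minimized for a given budget.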

  17. Processing investigation and optimization for hybrid thermoplastic composites

    Institute of Scientific and Technical Information of China (English)

    M Tufail

    2007-01-01

    Thermoplastic-based composite materials are suitable for automobile and aerospace applications. The recyclability of thermoplastics and their clean processing further enhance their use. The only limitation encountered in using these materials is their high melt viscosity. Various techniques have been developed to overcome this problem; commingled yarns are one such approach to making proper use of thermoplastics. A major problem observed during the use of a commingled material is de-commingling, wherein the uniform distribution of fiber and thermoplastic yarn is disturbed, affecting the final quality of the composite. The effects of the braiding process on laminate quality were investigated. Flat plaques were produced by braiding the commingled yarn using a 48-carrier braiding machine. The braids (and control woven samples) were subsequently heated and consolidated in a non-isothermal compression molding operation. Prior to the manufacture of the 'best quality' plaques, a series of moldings were produced under different consolidation conditions to study the dependence of properties on the process variables. This enabled a processing window to be established for each material and helped to separate the respective effects of yarn handling, textile processing, and consolidation on laminate properties.

  18. Cellular scanning strategy for selective laser melting: Evolution of optimal grid-based scanning path & parametric approach to thermal homogeneity

    DEFF Research Database (Denmark)

    Mohanty, Sankhya; Tutum, Cem Celal; Hattel, Jesper Henri

    2013-01-01

    Selective laser melting, as a rapid manufacturing technology, is uniquely poised to enforce a paradigm shift in the manufacturing industry by eliminating the gap between job- and batch-production techniques. Products from this process, however, tend to show an increased amount of defects...

  19. Numerical simulation and optimization of clearance in sheet shearing process

    Institute of Scientific and Technical Information of China (English)

    秦泗吉; 李洪波; 彭加耕; 李硕本

    2003-01-01

    An analysis model to simplify the shearing and blanking process was developed. Based on the simplified model, the shearing process was simulated by FEM and analyzed for various clearances. An optimum clearance in the process was determined by a new approach based on the orientation of the maximum shearing stress on the characteristic line linking the two blades, according to the law of crack propagation and experiments. The optimum clearance determined by this method can be used to indicate the range of reasonable clearance. With the new approach, the optimum clearance can be obtained conveniently and accurately even if there is some difference between the selected points, where the initial crack is assumed to originate, and the actual points, where the initial crack really occurs.

  20. Risk Management and Loss Optimization at Design Process of Products

    Directory of Open Access Journals (Sweden)

    Katalin Németh-Erdődi

    2008-06-01

    Full Text Available We would like to introduce a flexible system of design process elements to support the formation and tool selection of an efficient, “lean” product design process. To do this we identify numerical risk factors and introduce a calculation method for optimization that takes into consideration: the effect of design steps on usage characteristics; the time needed by the design elements and the resultant losses; and the effect of design on the success of the implementation process. A generic model was developed for harmonizing and sequencing market and technical activities, with a built-in acceptance phase. The steps of the model can be selected flexibly depending on design goals. The model regards the concurrent character of market, technical and organizing activities, the critical speed of information flow between them, and the control, decision and confirmation points.

  1. Optimization of post combustion carbon capture process-solvent selection

    Directory of Open Access Journals (Sweden)

    Udara S. P. R. Arachchige, Muhammad Mohsin, Morten C. Melaaen

    2012-01-01

    Full Text Available Reducing the main energy requirement of the CO2 capture process, namely the reboiler duty in the stripper section, is important. The present study focused on selecting a better solvent concentration and CO2 lean loading for the CO2 capture process. Flue gases from both coal- and gas-fired power plants were considered in developing capture plants with different efficiencies. Solvent concentration was varied from 25 to 40 w/w% and CO2 lean loading from 0.15 to 0.30 mol CO2/mol MEA, for CO2 removal efficiencies of 70-95 mol%. The optimum specifications for the coal and gas processes, such as MEA concentration, CO2 lean loading, and solvent inlet flow rate, were obtained.

  2. Combining On-Line Characterization Tools with Modern Software Environments for Optimal Operation of Polymerization Processes

    Directory of Open Access Journals (Sweden)

    Navid Ghadipasha

    2016-02-01

    Full Text Available This paper discusses the initial steps towards the formulation and implementation of a generic and flexible model-centric framework for integrated simulation, estimation, optimization and feedback control of polymerization processes. For the first time it combines the powerful capabilities of the automatic continuous on-line monitoring of polymerization (ACOMP) system with a modern simulation, estimation and optimization software environment towards an integrated scheme for the optimal operation of polymerization processes. An initial validation of the framework was performed for modelling and optimization using literature data, illustrating the flexibility of the method under different systems and conditions. Subsequently, the off-line capabilities of the system were fully tested experimentally for model validation, parameter estimation and process optimization using ACOMP data. Experimental results are provided for free radical solution polymerization of methyl methacrylate.

  3. Emission control with route optimization in solid waste collection process: A case study

    Indian Academy of Sciences (India)

    Omer Apaydin; M Talha Gonullu

    2008-04-01

    Solid waste collection is usually carried out with diesel-engined trucks, which emit various pollutants from their exhausts. Route optimization of the collection process is therefore necessary to reduce these emissions. This study was performed in Trabzon City, which has 39 districts; a shortest-path model was used to optimize the solid waste collection/hauling process so as to minimize emissions. Without route optimization, emissions increase because of the extra distance travelled empty. Optimization software was used as the tool, providing Geographical Information System (GIS) elements such as numerical pathways, demographic distribution data, container distribution data and solid waste production data; the thematic container layer comprised 777 points for the entire city. Using the software, the optimized route was compared with the present route. With the optimized routes, route distance and route time decrease by 24.6% and 44.3% on average over the nine routes, respectively. By performing the stationary container collection process with route optimization, it is determined that CO2, NOx, HC, CO and PM emissions will be reduced by 831.4, 12.8, 1.2, 0.4 and 0.7 per route, respectively.
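    The shortest-path computation at the core of such route optimization is classically done with Dijkstra's algorithm. A minimal sketch over a hypothetical miniature road network (not the Trabzon GIS data):

```python
import heapq

def dijkstra(graph, src):
    """Shortest-path distances from src in a weighted digraph given as
    {node: [(neighbour, weight), ...]}."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                      # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

# Hypothetical network: nodes are container points, weights are km of road.
roads = {
    "depot": [("A", 4.0), ("B", 2.0)],
    "B": [("A", 1.0), ("C", 7.0)],
    "A": [("C", 3.0)],
}
dist = dijkstra(roads, "depot")  # depot->A via B is 3 km, depot->C is 6 km
```

Shorter routes translate directly into lower fuel burn, and hence lower exhaust emissions per collection round.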

  4. Optimizing the order processing of customized products using product configuration

    DEFF Research Database (Denmark)

    Hvam, Lars; Bonev, Martin; Denkena, B.

    2011-01-01

    Product configuration based on an integrated modular product structure and product family architecture has been recognized as an effective means for implementing mass customization. In order to evaluate the effects of product configuration on order processing, a study has been conducted by the Department of Management Engineering and Operations Management of the Technical University of Denmark in cooperation with the Institute of Production Engineering and Machine Tools of the Leibniz Universität Hannover. Thereby, a product configuration system has been modelled for a manufacturer of mass customized products, and its benefits for order processing have been evaluated.

  5. Off-line Optimization for Earley-style HPSG Processing

    CERN Document Server

    Minnen, G; Götz, T; Minnen, Guido; Gerdemann, Dale; Goetz, Thilo

    1995-01-01

    A novel approach to HPSG-based natural language processing is described that uses an off-line compiler to automatically prime a declarative grammar for generation or parsing, and inputs the primed grammar to an advanced Earley-style processor. In this way we provide an elegant solution to the problems of empty heads and efficient bidirectional processing, which is illustrated for the special case of HPSG generation. Extensive testing with a large HPSG grammar revealed some important constraints on the form of the grammar.

  6. Optimization of the curing process of a sandwich panel

    Science.gov (United States)

    Phyo Maung, Pyi; Tatarnikov, O.; Malysheva, G.

    2016-10-01

    This study presents finite element modelling and experimental measurements of temperatures during the autoclave curing of the T-50 aircraft wing sandwich panel. The panel consists of upper and lower carbon-fibre-based laminates and an aluminium foil honeycomb. The finite element modelling was performed using the Femap-Nastran product. During processing, the temperature at various points on the surface of the panel was measured using thermocouples. The finite element method simulated the thermal conditions and determined the temperatures in different parts of the panel over a full cycle of the curing process. A comparison of the calculated and experimental data shows that their difference does not exceed 6%.

  7. Thermomechanical processing optimization for 304 austenitic stainless steel using artificial neural network and genetic algorithm

    Science.gov (United States)

    Feng, Wen; Yang, Sen

    2016-12-01

    Thermomechanical processing has an important effect on the grain boundary character distribution, and obtaining the optimal thermomechanical processing parameters is the key to grain boundary engineering. In this study, a genetic algorithm (GA) based on an artificial neural network model was proposed to optimize the thermomechanical processing parameters. In this model, a back-propagation neural network (BPNN) was established to map the relationship between the thermomechanical processing parameters and the fraction of low-Σ CSL boundaries, and a GA integrated with the BPNN (BPNN/GA) was applied to optimize the processing parameters. The optimal thermomechanical processing parameters were verified by experiment. Moreover, the microstructures and the intergranular corrosion resistance of the base material (BM) and of the material produced with the optimal thermomechanical processing parameters (termed GBEM) were studied. Compared to the BM specimen, in the GBEM specimen the fraction of low-Σ CSL boundaries was increased from 56.8 to 77.9%, the random boundary network was interrupted by the low-Σ CSL boundaries, and the intergranular corrosion resistance was improved. The results indicated that the BPNN/GA model is an effective and reliable means of optimizing the thermomechanical processing parameters, improving the intergranular corrosion resistance of 304 austenitic stainless steel.
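    In such a BPNN/GA scheme, the GA searches the processing-parameter space while the trained network serves as a cheap objective function. A minimal real-coded GA sketch; the quadratic `surrogate` and its peak location are hypothetical stand-ins for a trained BPNN, not values from the study:

```python
import random

def ga_maximize(f, bounds, pop=30, gens=100, seed=2):
    """Small real-coded GA: binary-tournament selection, arithmetic
    crossover, clipped Gaussian mutation.  `f` plays the role of the
    trained surrogate model."""
    rng = random.Random(seed)
    P = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop)]
    for _ in range(gens):
        def pick():
            a, b = rng.sample(P, 2)
            return a if f(a) >= f(b) else b
        Q = []
        while len(Q) < pop:
            p1, p2 = pick(), pick()
            w = rng.random()
            child = [w * x + (1 - w) * y for x, y in zip(p1, p2)]
            child = [min(max(x + rng.gauss(0, 0.05), lo), hi)
                     for x, (lo, hi) in zip(child, bounds)]
            Q.append(child)
        P = Q
    return max(P, key=f)

# Stand-in surrogate: predicted low-sigma CSL fraction peaks at a
# hypothetical normalized (temperature, strain) setting of (0.6, 0.3).
surrogate = lambda x: -(x[0] - 0.6) ** 2 - (x[1] - 0.3) ** 2
best = ga_maximize(surrogate, [(0.0, 1.0)] * 2)
```

The recovered optimum is then validated by an actual processing experiment, as in the study.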

  8. Optimization of multi-objective integrated process planning and scheduling problem using a priority based optimization algorithm

    Science.gov (United States)

    Ausaf, Muhammad Farhan; Gao, Liang; Li, Xinyu

    2015-12-01

    To increase the overall performance of modern manufacturing systems, effective integration of the process planning and scheduling functions has been an important area of consideration among researchers. Owing to the complexity of handling process planning and scheduling simultaneously, most research has been limited to solving the integrated process planning and scheduling (IPPS) problem for a single objective function. As there are many conflicting objectives when dealing with process planning and scheduling, real-world problems cannot be fully captured by considering only a single objective for optimization; considering the multi-objective IPPS (MOIPPS) problem is therefore inevitable. Unfortunately, only a handful of research papers are available on solving the MOIPPS problem. In this paper, an optimization algorithm for solving the MOIPPS problem is presented. The proposed algorithm uses a set of dispatching rules coupled with priority assignment to optimize the IPPS problem for various objectives such as makespan, total machine load and total tardiness. A fixed-size external archive coupled with a crowding distance mechanism is used to store and maintain the non-dominated solutions. To compare the results with other algorithms, a C-metric based method has been used. Instances from four recent papers have been solved to demonstrate the effectiveness of the proposed algorithm. The experimental results show that the proposed method is an efficient approach for solving the MOIPPS problem.
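    The crowding distance mechanism mentioned above scores how isolated each archive member is along the front; boundary solutions get infinite distance and are always retained, while tightly clustered interior solutions score low and are pruned first when the archive is full. A standard sketch with illustrative objective vectors:

```python
def crowding_distance(front):
    """NSGA-II style crowding distance for a list of objective vectors."""
    n, m = len(front), len(front[0])
    d = [0.0] * n
    for k in range(m):
        order = sorted(range(n), key=lambda i: front[i][k])
        lo, hi = front[order[0]][k], front[order[-1]][k]
        d[order[0]] = d[order[-1]] = float("inf")   # keep extreme points
        if hi == lo:
            continue
        for j in range(1, n - 1):
            # normalized gap between each point's two neighbours
            d[order[j]] += (front[order[j + 1]][k]
                            - front[order[j - 1]][k]) / (hi - lo)
    return d

# Illustrative two-objective front (both minimized):
front = [(1.0, 5.0), (2.0, 3.0), (2.1, 2.9), (4.0, 1.0)]
d = crowding_distance(front)   # endpoints infinite, interior points finite
```

When the fixed-size archive overflows, the member with the smallest crowding distance is the natural candidate for removal.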

  9. Integrated gasification combined cycle (IGCC) process simulation and optimization

    Energy Technology Data Exchange (ETDEWEB)

    Emun, F.; Gadalla, M.; Majozi, T.; Boer, D. [University of Rovira & Virgili, Tarragona (Spain). Dept. of Chemical Engineering

    2010-03-05

    The integrated gasification combined cycle (IGCC) is an electrical power generation system which offers efficient generation from coal with a lower environmental impact than conventional coal power plants. However, further improving its efficiency, and thereby lowering emissions, is an important task in achieving more sustainable energy production. In this paper, a process simulation tool is proposed for the simulation of IGCC. This tool is used to improve the IGCC's efficiency and environmental performance through an analysis of the operating conditions, together with process integration studies. Pinch analysis principles and process integration insights are then employed to make topological changes to the flowsheet to improve the energy efficiency and minimize operating costs. Process data for the Texaco gasifier and the associated plants (coal preparation, air separation unit, gas cleaning, sulfur recovery, gas turbine, steam turbine and the heat recovery steam generator) are taken as a base case and simulated using Aspen Plus. The results of the parameter analysis and heat integration studies indicate that a thermal efficiency of 45% can be reached, together with a significant decrease in CO2 and SOx emissions; the CO2 and SOx emission levels reached are 698 kg/MWh and 0.15 kg/MWh, respectively. Application of pinch analysis determines energy targets and also identifies potential modifications for further improvement of the overall energy efficiency. The benefits of energy integration and steam production possibilities can further be quantified, and the overall benefits can be translated into minimum operating costs and atmospheric emissions.

  10. Microfluidic chip designs process optimization and dimensional quality control

    DEFF Research Database (Denmark)

    Calaon, Matteo; Hansen, Hans Nørgaard; Tosello, Guido

    2015-01-01

    . Subsequent nickel electroplating was employed to replicate the obtained geometries on the tool, which was used to mold on transparent polymer substrates the functional structures. To assess the critical factors affecting the replication quality throughout the different steps of the proposed process chain...

  11. Structural optimization for materially informed design to robotic production processes

    NARCIS (Netherlands)

    Bier, H.H.; Mostafavi, S.

    2015-01-01

    Hyperbody’s materially informed Design-to-Robotic-Production (D2RP) processes for additive and subtractive manufacturing aim to achieve performative porosity in architecture at various scales. An extended series of D2RP experiments aiming to produce prototypes at 1:1 scale wherein design materiality

  12. Optimal design of an extrusion process for a hinge bracket

    Energy Technology Data Exchange (ETDEWEB)

    Na, Geum Ju; Jang, Myung Geun; Kim, Jong Bong [Seoul National University, Seoul (Korea, Republic of)

    2016-05-15

    This study considers the process design for forming a hinge bracket. A thin hinge bracket is typically produced by bending a sheet panel or welding a hollow bar onto a sheet panel. However, a hinge bracket made by bending or welding does not have sufficient durability under severe operating conditions because of the stress concentration in the bent region or the low corrosion resistance of the welded region. Therefore, this study uses forming to produce the hinge bracket part of a foldable container and to ensure durability under difficult operating conditions. An extrusion process for a T-shaped hinge bracket is studied using finite element analysis. Preliminary analysis shows that a very high forging load would be required to form the bracket by forging; extrusion is therefore considered as a candidate process. Producing the part by extrusion enables many brackets to be made in a single extrusion through successive cutting of the extruded part, thereby reducing the manufacturing cost. The design focuses on reducing the extrusion load and on ensuring shape accuracy. An initial billet is designed to reduce the extrusion load and to obtain a geometrically accurate part. The extruded part frequently bends because of uneven material flow; thus, the extrusion die geometries are designed to obtain straight parts.

  13. Optimization of coagulation-flocculation process for pastas industry ...

    African Journals Online (AJOL)

    Jane

    2011-10-17

    Oct 17, 2011 ... pastas industry effluent using response surface methodology ... Research and practical applications have shown that ... treatment and in some other applications. .... The germination indexes were determined for two vegetable species ..... flocculation process for palm oil mill effluent using response surface.

  14. Estimation and optimization of the performance of polyhedral process networks

    NARCIS (Netherlands)

    Haastregt, Sven Joseph Johannes van

    2013-01-01

    A system-level design methodology such as Daedalus provides designers with a forward synthesis flow for automated design, programming, and implementation of multiprocessor systems-on-chip. Daedalus employs the polyhedral process network model of computation to represent applications. These networks

  15. Optimization of Nonlinear Figure-of-Merits of Integrated Power MOSFETs in Partial SOI Process

    DEFF Research Database (Denmark)

    Fan, Lin; Jørgensen, Ivan Harald Holger; Knott, Arnold

    2016-01-01

    different operating conditions. A systematic analysis of the optimization of these FOMs has not been previously established. The optimization methods are verified on a 100 V power MOSFET implemented in a 0.18 µm partial SOI process. Its FOMs are lowered by 1.3-18.3 times and improved by 22...

  16. Applying Analytical Derivative and Sparse Matrix Techniques to Large-Scale Process Optimization Problems

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The performance of analytical derivative and sparse matrix techniques applied to traditional dense sequential quadratic programming (SQP) is studied, and a strategy utilizing those techniques is presented. Computational results on two typical chemical optimization problems demonstrate significant enhancement in efficiency, which shows this strategy is promising and suitable for large-scale process optimization problems.

  17. Optimization of EDM Process of (Cu-W) EDM Electrodes on Different Progression

    Directory of Open Access Journals (Sweden)

    Arvind Kumar Tiwari

    2014-11-01

    Full Text Available The purpose of this research work is to determine the optimal cutting conditions for the EDM process on different workpiece materials using different compositions of Cu-W tool electrodes. The key cutting factors, such as discharge current, voltage, pulse-on-time, duty cycle, spark gap and flushing pressure, will be optimized.

  18. Optimization process planning using hybrid genetic algorithm and intelligent search for job shop machining.

    Science.gov (United States)

    Salehi, Mojtaba; Bahreininejad, Ardeshir

    2011-08-01

    Optimization of process planning is considered the key technology for computer-aided process planning, which is a rather complex and difficult procedure. A good process plan of a part is built up based on two elements: (1) the optimized sequence of the operations of the part; and (2) the optimized selection of the machine, cutting tool and tool access direction (TAD) for each operation. In the present work, process planning is divided into preliminary planning and secondary/detailed planning. In the preliminary stage, based on the analysis of order and clustering constraints as a compulsive constraint aggregation in operation sequencing, and using an intelligent searching strategy, the feasible sequences are generated. Then, in the detailed planning stage, using a genetic algorithm which prunes the initial feasible sequences, the optimized operation sequence and the optimized selection of the machine, cutting tool and TAD for each operation are obtained, based on optimization constraints as an additive constraint aggregation. The main contribution of this work is the simultaneous optimization of the operation sequence of the part and of the machine, cutting tool and TAD selection for each operation, using the intelligent search and the genetic algorithm together.

  19. Differential search algorithm-based parametric optimization of electrochemical micromachining processes

    Directory of Open Access Journals (Sweden)

    Debkalpa Goswami

    2014-01-01

    Full Text Available Electrochemical micromachining (EMM) appears to be a very promising micromachining process, having a higher machining rate, better precision and control, reliability, flexibility, environmental acceptability, and the capability of machining a wide range of materials. It permits machining of chemically resistant materials, like titanium, copper alloys, superalloys and stainless steel, to be used in biomedical, electronic, micro-electromechanical system and nano-electromechanical system applications. Therefore, the optimal use of an EMM process for achieving an enhanced machining rate and improved profile accuracy demands careful selection of its various machining parameters. Various optimization tools, primarily Derringer's desirability function approach, have been employed by past researchers for deriving the best parametric settings of EMM processes, which inherently lead to sub-optimal or near-optimal solutions. In this paper, an attempt is made to apply an almost new optimization tool, the differential search algorithm (DSA), to the parametric optimization of three EMM processes. A comparative study of optimization performance between DSA, the genetic algorithm and the desirability function approach proves the wide acceptability of DSA as a global optimization tool.
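    Derringer's desirability approach, the baseline the DSA results are compared against, maps each response onto [0, 1] and combines the individual desirabilities by a geometric mean. A small sketch with hypothetical response limits (not the paper's data):

```python
def desirability_smaller(y, y_min, y_max, weight=1.0):
    """Derringer's smaller-the-better desirability: 1 at or below y_min,
    0 at or above y_max, a power curve in between."""
    if y <= y_min:
        return 1.0
    if y >= y_max:
        return 0.0
    return ((y_max - y) / (y_max - y_min)) ** weight

def overall_desirability(ds):
    """Geometric mean of individual desirabilities."""
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

# Hypothetical responses, both to be minimized:
d1 = desirability_smaller(0.8, 0.2, 1.2)    # surface roughness, um
d2 = desirability_smaller(12.0, 5.0, 25.0)  # overcut, um
D = overall_desirability([d1, d2])          # single score to maximize
```

Because any single desirability of zero zeroes the geometric mean, the method penalizes settings that sacrifice one response entirely, which is also why it tends toward compromise (near-optimal) solutions.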

  20. Optimized Enhanced Bioremediation Through 4D Geophysical Monitoring and Autonomous Data Collection, Processing and Analysis

    Science.gov (United States)

    2014-09-01

    (ER-200717) Optimized Enhanced Bioremediation Through 4D Geophysical Monitoring and Autonomous Data Collection, Processing and Analysis.

  1. Optimal Stopping Problems Driven by Lévy Processes and Pasting Principles

    NARCIS (Netherlands)

    Surya, B.A.

    2007-01-01

    Solving optimal stopping problems driven by Lévy processes has been a challenging task with many applications in the modern theory of mathematical finance. Situations in which optimal stopping typically arises include, for example, the problem of finding the arbitrage-free price of the American put option.
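
    For context, the American put mentioned above is the canonical example of this problem class; in standard notation (generic, not copied from the thesis) it reads:

```latex
% Optimal stopping formulation of the perpetual American put under a Lévy model,
% with Lévy process X, discount rate q, strike K, and supremum over stopping times:
V(x) = \sup_{\tau} \, \mathbb{E}_x\!\left[ e^{-q\tau} \left( K - e^{X_\tau} \right)^{+} \right]
```

    The "pasting principles" of the title refer to the continuous- or smooth-pasting conditions used to pin down the optimal stopping boundary.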

  2. Sequential optimization of strip bending process using multiquadric radial basis function surrogate models

    NARCIS (Netherlands)

    Havinga, Gosse Tjipke; van den Boogaard, Antonius H.; Klaseboer, G.

    2013-01-01

    Surrogate models are used within the sequential optimization strategy for forming processes. A sequential improvement (SI) scheme is used to refine the surrogate model in the optimal region. One of the popular surrogate modeling methods for SI is Kriging. However, the global response of Kriging models ...
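
    A multiquadric RBF surrogate of the kind referred to in the title can be sketched as follows (1-D interpolation with a naive dense solve; the sampled "punch force" data are invented for illustration):

```python
import math

def multiquadric_surrogate(xs, ys, c=1.0):
    """Fit a 1-D multiquadric RBF interpolant: phi(r) = sqrt(r^2 + c^2).

    Weights solve sum_j w_j * phi(|x_i - x_j|) = y_i, here via naive
    Gaussian elimination with partial pivoting (fine for small designs).
    """
    n = len(xs)
    phi = lambda r: math.sqrt(r * r + c * c)
    A = [[phi(abs(xs[i] - xs[j])) for j in range(n)] + [ys[i]] for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for row in range(col + 1, n):
            m = A[row][col] / A[col][col]
            for k in range(col, n + 1):
                A[row][k] -= m * A[col][k]
    w = [0.0] * n
    for row in range(n - 1, -1, -1):          # back substitution
        w[row] = (A[row][n] - sum(A[row][k] * w[k] for k in range(row + 1, n))) / A[row][row]
    return lambda x: sum(wj * phi(abs(x - xj)) for wj, xj in zip(w, xs))

# Invented samples of a "punch force" response at five design points.
s = multiquadric_surrogate([0.0, 1.0, 2.0, 3.0, 4.0], [1.0, 0.2, 0.0, 0.3, 1.1])
```

    In a sequential improvement loop, the surrogate is re-fitted after each new simulation and the next design point is chosen in the region the surrogate predicts to be optimal.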

  3. Optimization of Nano-Process Deposition Parameters Based on Gravitational Search Algorithm

    Directory of Open Access Journals (Sweden)

    Norlina Mohd Sabri

    2016-06-01

    Full Text Available This research focuses on the radio frequency (RF) magnetron sputtering process, a physical vapor deposition technique which is widely used in thin film production. This process requires an optimized combination of deposition parameters in order to obtain the desired thin film. The conventional method for optimizing the deposition parameters has been reported to be costly and time consuming due to its trial-and-error nature. Thus, the gravitational search algorithm (GSA) technique is proposed to solve this nano-process parameter optimization problem. In this research, the optimized parameter combination was expected to produce the desired electrical and optical properties of the thin film. The performance of GSA was compared with that of Particle Swarm Optimization (PSO), Genetic Algorithm (GA), Artificial Immune System (AIS) and Ant Colony Optimization (ACO). Based on the overall results, the GSA-optimized parameter combination generated the best electrical properties, and acceptable optical properties, of the thin film compared with the others. This computational experiment is expected to overcome the problem of having to conduct repetitive laboratory experiments to obtain the optimal parameter combination. Based on this initial experiment, the adaptation of GSA to this problem could offer a more efficient and productive way of depositing quality thin film in the fabrication process.
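
    The core mechanics of a gravitational search algorithm (fitness-derived masses, a decaying gravitational constant, velocity updates) can be sketched on a toy quadratic as below; this is a simplified illustration, not the study's actual implementation or sputtering model:

```python
import math
import random

def gsa_minimize(f, bounds, pop_size=20, iters=150, g0=100.0, seed=3):
    """Toy gravitational search: better solutions get larger mass and attract others."""
    rng = random.Random(seed)
    dim = len(bounds)
    X = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    V = [[0.0] * dim for _ in range(pop_size)]
    best_x, best_f = None, float("inf")
    for t in range(iters):
        fit = [f(x) for x in X]
        for x, fx in zip(X, fit):
            if fx < best_f:
                best_x, best_f = x[:], fx
        worst, bst = max(fit), min(fit)
        raw = [(worst - fi) / (worst - bst + 1e-12) for fi in fit]
        M = [r / (sum(raw) + 1e-12) for r in raw]           # normalized masses
        G = g0 * math.exp(-20.0 * t / iters)                # decaying gravity constant
        for i in range(pop_size):
            acc = [0.0] * dim
            for j in range(pop_size):
                if j == i:
                    continue
                dist = math.dist(X[i], X[j]) + 1e-12
                for d in range(dim):
                    acc[d] += rng.random() * G * M[j] * (X[j][d] - X[i][d]) / dist
            for d in range(dim):
                V[i][d] = rng.random() * V[i][d] + acc[d]
                lo, hi = bounds[d]
                X[i][d] = min(max(X[i][d] + V[i][d], lo), hi)
    return best_x, best_f

# Toy stand-in for a thin-film quality objective over two coded parameters.
x_best, f_best = gsa_minimize(lambda p: (p[0] - 1.0) ** 2 + (p[1] - 2.0) ** 2,
                              [(-5, 5), (-5, 5)])
```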

  4. Optimal estimation of the intensity function of a spatial point process

    DEFF Research Database (Denmark)

    Guan, Yongtao; Jalilian, Abdollah; Waagepetersen, Rasmus

    ... easily computable estimating functions. We derive the optimal estimating function in a class of first-order estimating functions. The optimal estimating function depends on the solution of a certain Fredholm integral equation and reduces to the likelihood score in the case of a Poisson process. We discuss...
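
    First-order estimating functions for a point process X observed in a window W with intensity function λ(u;θ) are typically of the following form (standard notation; the paper's precise setup may differ):

```latex
% Unbiased first-order estimating function with weight function f:
e_f(\theta) = \sum_{u \in X \cap W} f(u) \;-\; \int_W f(u)\, \lambda(u;\theta)\, \mathrm{d}u
```

    Choosing the weight f = ∇_θ λ / λ recovers the Poisson likelihood score, consistent with the abstract's remark about the Poisson case.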

  5. Using Minitab Box-Behnken Software to Optimize the Induction Heating Process

    Directory of Open Access Journals (Sweden)

    LEUCA Teodor

    2014-05-01

    Full Text Available This paper presents aspects of finding an optimal dependence between the inductor input parameters (frequency, air gap and current density) and its output parameters (spacing, power density, heating time). It also shows the results of coupling the numerical modeling of the induction heating process, using Flux 2D, with a numerical optimization model, using Minitab software.
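
    A Box-Behnken design for three factors (for example frequency, air gap and current density in coded units) can be generated as sketched below; this shows the generic structure of the design, not Minitab's actual output:

```python
from itertools import combinations

def box_behnken(k, center_runs=3):
    """Box-Behnken design for k factors in coded units -1/0/+1:
    a 2^2 factorial for each pair of factors (all others held at 0),
    plus replicated center points."""
    runs = []
    for i, j in combinations(range(k), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                row = [0] * k
                row[i], row[j] = a, b
                runs.append(row)
    runs += [[0] * k for _ in range(center_runs)]
    return runs

# Three factors: 12 edge-midpoint runs plus 3 center runs = 15 runs.
design = box_behnken(3)
```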

  6. Optimization of business processes in banks through flexible workflow

    Science.gov (United States)

    Postolache, V.

    2017-08-01

    This article describes an integrated business model of a commercial bank. Examples of its components include models of business processes, strategic goals, organizational structure, system architecture, operational and marketing risk models, etc. Practice has shown that the development and implementation of an integrated business model significantly increases a bank's operating efficiency and the quality of its management, and ensures stable organizational and technological development. Considering the evolution of business processes in the banking sector, their common characteristics should be analysed. From the author’s point of view, a business process is a set of activities of a commercial bank whose input is one or more financial and material resources and whose output is a banking product of value to the consumer. Using workflow technology, managing business process efficiency is a matter of managing the integration of resources and the sequence of actions aimed at achieving the goal. In turn, this implies managing the interaction of jobs or functions, synchronizing assignment periods, reducing delays in the transmission of results, etc. Workflow technology is very important for managers at all levels, as they can use it to strengthen control over what is happening in a particular unit, and in the bank as a whole. The manager is able to plan, implement rules, interact within the framework of the company’s procedures, distribute tasks and control their execution, receive alerts on implementation, and obtain statistical data on the effectiveness of operating procedures. Development and active use of an integrated bank business model is one of the key success factors contributing to the long-term and stable development of the bank, increasing the efficiency of employees and business processes, and implementing the ...

  7. The Cellular Processing Capacity Limits the Amounts of Chimeric U7 snRNA Available for Antisense Delivery.

    Science.gov (United States)

    Eckenfelder, Agathe; Tordo, Julie; Babbs, Arran; Davies, Kay E; Goyenvalle, Aurélie; Danos, Olivier

    2012-06-26

    Many genetic diseases are induced by mutations disturbing the maturation of pre-mRNAs, often affecting splicing. Antisense oligoribonucleotides (AONs) have been used to modulate splicing, thereby circumventing the deleterious effects of mutations. Stable delivery of antisense sequences is achieved by linking them to small nuclear RNAs (snRNAs) delivered by viral vectors, as illustrated by studies in which therapeutic exon skipping was obtained in animal models of Duchenne muscular dystrophy (DMD). Yet, clinical translation of these approaches is limited by the amounts of vector to be administered. In this respect, maximizing the amount of snRNA antisense shuttle delivered by the vector is essential. Here, we have used a muscle- and heart-specific enhancer (MHCK) to drive the expression of U7 snRNA shuttles carrying antisense sequences against the human or murine DMD pre-mRNAs. Although antisense delivery and subsequent exon skipping were improved both in tissue culture and in vivo, we observed the formation of additional U7 snRNA by-products following gene transfer. These included aberrantly 3' processed as well as unprocessed species that may arise because of the saturation of the cellular processing capacity. Future efforts to increase the amounts of functional U7 shuttles delivered into a cell will have to take this limitation into account.

  8. Optimization of Memory Management in Image Processing using Pipelining Technique

    Directory of Open Access Journals (Sweden)

    P.S. Ramesh

    2015-02-01

    Full Text Available The quality of an image is affected by various phenomena that consume large amounts of memory, and this needs to be addressed. Memory handling is mainly affected by disorderly arranged pixels in an image, which may lead to salt-and-pepper noise that degrades image quality. The aim of this study is to remove salt-and-pepper noise, which is a crucial problem in image processing. We propose a technique that combines an adaptive mean filtering technique and a wavelet transform technique, based on pipeline processing, to remove intensity spikes from the image; both Otsu’s and CLAHE algorithms are then used to enhance the image. The implemented framework produces good results against salt-and-pepper noise, as measured by PSNR.
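
    A common baseline for removing such intensity spikes is a switching median filter, sketched below: only pixels at the extreme values are treated as noise and replaced by the window median. This illustrates the general idea only; it is not the paper's adaptive-mean/wavelet pipeline:

```python
def remove_salt_pepper(img, w=1):
    """Switching median filter: pixels at the extremes (0 or 255) are taken to be
    salt-and-pepper noise and replaced by the median of a (2w+1)x(2w+1) window;
    all other pixels are left untouched."""
    h, wd = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(wd):
            if img[y][x] not in (0, 255):
                continue
            win = [img[j][i]
                   for j in range(max(0, y - w), min(h, y + w + 1))
                   for i in range(max(0, x - w), min(wd, x + w + 1))]
            win.sort()
            out[y][x] = win[len(win) // 2]
    return out

# A 3x3 toy image with one "salt" (255) and one "pepper" (0) pixel.
noisy = [[120, 255, 118],
         [119, 121, 0],
         [117, 120, 122]]
clean = remove_salt_pepper(noisy)
```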

  9. Process Optimization of Bismaleimide (BMI) Resin Infused Carbon Fiber Composite

    Science.gov (United States)

    Ehrlich, Joshua W.; Tate, LaNetra C.; Cox, Sarah B.; Taylor, Brian J.; Wright, M. Clara; Faughnan, Patrick D.; Batterson, Lawrence M.; Caraccio, Anne J.; Sampson, Jeffery W.

    2013-01-01

    Engineers today are presented with the opportunity to design and build the next generation of space vehicles out of the lightest, strongest, and most durable materials available. Composites offer excellent structural characteristics and outstanding reliability in many forms that will be utilized in future aerospace applications, including the Commercial Crew and Cargo Program and the Orion space capsule. NASA's Composites for Exploration (CoEx) project researches the various methods of manufacturing composite materials of different fiber characteristics while using proven infusion methods of different resin compositions. Development and testing on these different material combinations will provide engineers the opportunity to produce optimal material compounds for multidisciplinary applications. Through the CoEx project, engineers pursue the opportunity to research and develop repair patch procedures for damaged spacecraft. Working in conjunction with Raptor Resins Inc., NASA engineers are utilizing high-flow liquid infusion molding practices to manufacture high-temperature composite parts comprised of intermediate modulus 7 (IM7) carbon fiber material. IM7 is a continuous, high-tensile-strength composite with outstanding structural qualities such as high shear strength, tensile strength and modulus as well as excellent corrosion, creep, and fatigue resistance. IM7 carbon fiber, combined with existing thermoset and thermoplastic resin systems, can provide improvements in material strength reinforcement and deformation-resistant properties for high-temperature applications. Void analysis of the different layups of the IM7 material found the largest total void content in the [+45, 90, 90, -45] composite panel. Tensile and compression testing showed that the highest mechanical strength was found in the [0]4 layup. This paper further investigates the infusion procedure of a low-cost/high-performance BMI resin into an IM7 carbon fiber material and the ...

  10. Multiparameter Optimization and Controlling for Cylindrical Grinding Process

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper breaks with the conventional no-burn constraint and presents an optimization strategy that permits burn to appear during the grinding roughing stage, provided that the burned layer can be removed in the following finishing stage. On the basis of the basic grinding models, the objective function and constraint functions for the multiparameter optimal grinding model are built. By computer simulation, the nonlinear optimal grinding control parameters are obtained, and the actual grinding process ...

  11. Deconvoluting the Friction Stir Weld Process for Optimizing Welds

    Science.gov (United States)

    Schneider, Judy; Nunes, Arthur C.

    2008-01-01

    In the friction stir welding process, the rotating surfaces of the pin and shoulder contact the weld metal and force a rotational flow within the weld metal. Heat, generated by the metal deformation as well as frictional slippage with the contact surface, softens the metal and makes it easier to deform. As in any thermo-mechanical processing of metal, the flow conditions are critical to the quality of the weld. For example, extrusion of metal from under the shoulder of an excessively hot weld may relax local pressure and result in wormhole defects. The trace of the weld joint in the wake of the weld may vary geometrically depending upon the flow streamlines around the tool, with some geometries more vulnerable to loss of strength from joint contamination than others. The material flow path around the tool cannot be seen in real time during the weld. By using analytical "tools" based upon the principles of mathematics and physics, a weld model can be created to compute features that can be observed. By comparing the computed observations with actual data, the weld model can be validated or adjusted to get better agreement. Inputs to the model to predict weld structures and properties include: hot working properties of the metal, pin tool geometry, travel rate, rotation and plunge force. Since metals record their prior hot working history, the hot working conditions imparted during FSW can be quantified by interpreting the final microstructure. Variations in texture and grain size result from variations in the strain accommodated at a given strain rate and temperature. Microstructural data from a variety of FSWs have been correlated with prior marker studies to contribute to our understanding of the FSW process. Once this stage is reached, the weld modeling process can save significant development costs by reducing costly trial-and-error approaches to obtaining quality welds.

  12. Some optimal dividend problems for a surplus process with interest

    Institute of Scientific and Technical Information of China (English)

    YANG Hu; GENG Wen-ting

    2008-01-01

    We derive some results on dividend payments prior to ruin in the classical surplus process with interest. An integro-differential equation, with boundary conditions, satisfied by the expected present value of dividend payments is derived and solved. Furthermore, we derive an integro-differential equation for the moment generating function, through which we analyze the higher moments of the present value of dividend payments. Finally, closed-form expressions for exponential claims are given.

  13. Optimization of Soybean Press Cake Treatments and Processing

    Directory of Open Access Journals (Sweden)

    Dumitru Tucu

    2007-09-01

    Full Text Available This paper presents some results of a systematic study of methods used in soybean press cake treatment and processing. The influence of raw materials on the soybean pressing system and the parameters of the extrusion process are analyzed. Principally, the experiments confirm the influence of heat process parameters on soybean press cake production using classic solutions and microwave energy. The experiments started with the manufacture of soybean press cake under industrial conditions at “S.C. International romoster srl”, Dudestii Vechi, Timis County. For ensuring the best conditions, the experimental stand included an extruder, a system for toasting the soybean press cake, a system for parameter control and a system for water processing. The following possibilities were analyzed: (1) soybean press cake obtained by the classical method without toasting, at an extrusion pressure p1 = 75 kgf cm-2 and flow Q1 = 800 kg h-1; (2) soybean press cake obtained at an extrusion head pressure p2 = 85 kgf cm-2 and flow Q2 = 600 kg h-1; (3) soybean press cake obtained at an extrusion head pressure p3 = 95 kgf cm-2 and flow Q3 = 300 kg h-1. Using this application we tested a new treatment method and studied the special systems which can be applied in industrial practice at “S.C. International romoster srl”, Dudestii Vechi, Timis County. During the tests, the variation of electrical permeability was observed. Differences between the theoretical equations and the practical results of the energy calculations and measurements in the workspace were noticed.

  14. Optimization of the process of plasma ignition of coal

    Energy Technology Data Exchange (ETDEWEB)

    Peregudov, V.S. [Russian Academy of Sciences, Novosibirsk (Russian Federation)

    2009-04-15

    Results are given of experimental and theoretical investigations of plasma ignition of coal as a result of its thermochemical preparation, in application to the processes of firing up a boiler and stabilizing flame combustion. An experimental test bed with a commercial-scale burner is used for determining the conditions of plasma ignition of low-reactivity high-ash anthracite depending on the concentration of coal in the air mixture and the velocity of the latter. The calculations produce an equation, important from the standpoint of practical applications, for determining the energy expenditure for plasma ignition of coal depending on the basic process parameters. The tests reveal the difficulties arising in firing up a boiler with direct delivery of pulverized coal from the mill to the furnace. A scheme is suggested which enables one to reduce the energy expenditure for ignition of coal and improve the reliability of the process of firing up such a boiler. Results are also given of the calculation of plasma thermochemical preparation of coal under conditions of lowered oxygen concentration in the air mixture.

  15. Comparative analysis of cogeneration power plants optimization based on stochastic method using superstructure and process simulator

    Energy Technology Data Exchange (ETDEWEB)

    Araujo, Leonardo Rodrigues de [Instituto Federal do Espirito Santo, Vitoria, ES (Brazil)], E-mail: leoaraujo@ifes.edu.br; Donatelli, Joao Luiz Marcon [Universidade Federal do Espirito Santo (UFES), Vitoria, ES (Brazil)], E-mail: joaoluiz@npd.ufes.br; Silva, Edmar Alino da Cruz [Instituto Tecnologico de Aeronautica (ITA/CTA), Sao Jose dos Campos, SP (Brazil); Azevedo, Joao Luiz F. [Instituto de Aeronautica e Espaco (CTA/IAE/ALA), Sao Jose dos Campos, SP (Brazil)

    2010-07-01

    Thermal systems are essential in facilities such as thermoelectric plants, cogeneration plants, refrigeration systems and air conditioning, among others, in which much of the energy consumed by humanity is processed. In a world with finite natural fuel sources and growing energy demand, issues related to thermal system design, such as cost estimation, design complexity, environmental protection and optimization, are becoming increasingly important, hence the need to understand the mechanisms that degrade energy, to improve the use of energy sources, to reduce environmental impacts, and to reduce project, operation and maintenance costs. In recent years, a consistent development of procedures and techniques for the computational design of thermal systems has occurred. In this context, the fundamental objective of this study is a comparative performance analysis of the structural and parametric optimization of a cogeneration system using two stochastic methods: genetic algorithm and simulated annealing. This work uses a superstructure, modelled in the SimTech IPSEpro process simulator, in which the appropriate design options for the case studied are included. Accordingly, the optimal configuration of the cogeneration system is determined as a consequence of the optimization process, restricted to the configuration options included in the superstructure. The optimization routines are written in MS Excel Visual Basic so as to couple seamlessly with the process simulator. At the end of the optimization process, the optimal system configuration, given the characteristics of each specific problem, is defined. (author)
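
    One of the two stochastic methods compared, simulated annealing, can be sketched as a generic minimizer for a black-box objective (the quadratic below is a toy stand-in for an actual simulator-evaluated cost, not the paper's cogeneration model):

```python
import math
import random

def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.995, iters=2000, seed=7):
    """Minimal simulated annealing: accept worse moves with probability exp(-df/T)."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    best, best_f = x[:], fx
    T = t0
    for _ in range(iters):
        cand = [xi + rng.uniform(-step, step) for xi in x]
        fc = f(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / max(T, 1e-12)):
            x, fx = cand, fc
            if fx < best_f:
                best, best_f = x[:], fx
        T *= cooling                     # geometric cooling schedule
    return best, best_f

# Quadratic stand-in for a simulator-evaluated cost (e.g. fuel cost minus revenue).
x_opt, f_opt = simulated_annealing(lambda p: (p[0] - 2.0) ** 2 + p[1] ** 2, [0.0, 3.0])
```

    The high acceptance rate at large T provides the global exploration that distinguishes the method from a pure local descent; as T cools, the search settles into the best-found basin.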

  16. Multiobjective Optimization in Combinatorial Wind Farms System Integration and Resistive SFCL Using Analytical Hierarchy Process

    DEFF Research Database (Denmark)

    Moghadasi, Amirhasan; Sarwat, Arif; Guerrero, Josep M.

    2016-01-01

    ... on the extreme load reduction is effectively demonstrated. A large WPP has a complicated structure using several components, and the inclusion of the RSFCL makes this layout more problematic for optimal performance of the system. Hence, the most widely used decision-making technique, based on the analytic hierarchy process (AHP), is proposed for the optimal design of the combinatorial RSFCL and 50 MW WPP, computing the three-dimensional alignment of the Pareto front at the end of the optimization run. The numerical simulations verify the effectiveness of the proposed approach using the Pareto optimality concept.
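
    The AHP step reduces to extracting priority weights from a pairwise comparison matrix, e.g. via power iteration toward the principal eigenvector (the 3x3 matrix below is a hypothetical example, not the paper's data):

```python
def ahp_priorities(M, iters=100):
    """Priority weights = principal eigenvector of the pairwise comparison
    matrix M, computed by power iteration and normalized to sum to 1."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [vi / s for vi in v]
    return w

# Hypothetical 3-criteria comparison (e.g. cost, fault-current limiting, load reduction):
# M[i][j] > 1 means criterion i is preferred over criterion j.
M = [[1.0,     3.0,     5.0],
     [1 / 3.0, 1.0,     2.0],
     [1 / 5.0, 1 / 2.0, 1.0]]
w = ahp_priorities(M)
```

    In a full AHP workflow, a consistency ratio would also be computed from the principal eigenvalue to check that the pairwise judgments are not self-contradictory.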

  17. A graph-based ant colony optimization approach for process planning.

    Science.gov (United States)

    Wang, JinFeng; Fan, XiaoLiang; Wan, Shuting

    2014-01-01

    The complex process planning problem is modeled in this paper as a combinatorial optimization problem with constraints. An ant colony optimization (ACO) approach has been developed to deal with the process planning problem by simultaneously considering activities such as sequencing operations, selecting manufacturing resources, and determining setup plans to achieve the optimal process plan. A weighted directed graph is constructed to describe the operations, the precedence constraints between operations, and the possible paths between operation nodes. A representation of a process plan is described based on this weighted directed graph. The ant colony traverses the necessary nodes on the graph to achieve the optimal solution with the objective of minimizing total production cost (TPC). Two cases have been carried out to study the influence of various ACO parameters on system performance. Extensive comparative experiments have been conducted to demonstrate the feasibility and efficiency of the proposed approach.
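
    The graph-based ACO idea can be sketched as ants sampling paths through a weighted directed graph, with pheromone reinforcing cheap complete paths (toy graph and parameters below, not the paper's process-planning graph or TPC model):

```python
import random

def aco_shortest_path(graph, start, goal, n_ants=20, iters=50, seed=5):
    """Toy ACO: ants build paths edge-by-edge; edge desirability combines
    pheromone with inverse cost, and cheap complete paths deposit more pheromone."""
    rng = random.Random(seed)
    tau = {(u, v): 1.0 for u in graph for v in graph[u]}    # pheromone per edge
    best_path, best_cost = None, float("inf")
    for _ in range(iters):
        for _ in range(n_ants):
            node, path, path_cost = start, [start], 0.0
            while node != goal:
                edges = list(graph[node].items())
                weights = [tau[(node, v)] / (1.0 + c) for v, c in edges]
                v, c = rng.choices(edges, weights=weights)[0]
                path.append(v)
                path_cost += c
                node = v
            if path_cost < best_cost:
                best_path, best_cost = path, path_cost
            for u, v in zip(path, path[1:]):
                tau[(u, v)] += 1.0 / (1.0 + path_cost)      # pheromone deposit
        for e in tau:
            tau[e] *= 0.5                                   # evaporation
    return best_path, best_cost

# Toy acyclic operation graph: nodes are states, edge weights are production costs.
graph = {"A": {"B": 2.0, "C": 5.0},
         "B": {"C": 3.0, "D": 2.0},
         "C": {"D": 1.0},
         "D": {}}
path, total = aco_shortest_path(graph, "A", "D")
```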

  18. Real-time economic optimization for a fermentation process using Model Predictive Control

    DEFF Research Database (Denmark)

    Petersen, Lars Norbert; Jørgensen, John Bagterp

    2014-01-01

    Fermentation is a widely used process in the production of many foods, beverages, and pharmaceuticals. The main goal of the control system is to maximize the profit of the fermentation process, and this is also the main goal of this paper. We present a simple dynamic model for a fermentation process and demonstrate its usefulness in economic optimization. The model is formulated as an index-1 differential algebraic equation (DAE), which guarantees conservation of mass and energy in discrete form. The optimization is based on recent advances within Economic Nonlinear Model Predictive Control (E-NMPC), and also utilizes the index-1 DAE model. The E-NMPC uses the single-shooting method and the adjoint method for computation of the optimization gradients. The process constraints are relaxed to soft constraints on the outputs. Finally, we derive the analytical solution to the economic optimization problem ...
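
    The single-shooting idea, in which the decision variables are evaluated by simulating the process model forward and scoring an economic objective, can be illustrated with a toy fed-batch model (hypothetical Monod kinetics and prices, and a crude grid search standing in for the gradient-based E-NMPC solver of the paper):

```python
def simulate_batch(u, T=10.0, dt=0.01):
    """Toy fed-batch fermentation with hypothetical Monod kinetics (illustrative only)."""
    X, S = 0.1, 0.5                       # biomass and substrate concentrations
    mu_max, Ks, Y = 0.4, 0.2, 0.5         # made-up kinetic parameters
    steps = int(T / dt)
    for _ in range(steps):
        mu = mu_max * S / (Ks + S)        # Monod specific growth rate
        dX = mu * X
        dS = u - mu * X / Y               # substrate feed in, consumption out
        X += dt * dX
        S = max(S + dt * dS, 0.0)
    return X

def profit(u, price=10.0, feed_cost=2.0, T=10.0):
    # Economic objective: product revenue minus cost of the substrate fed.
    return price * simulate_batch(u, T) - feed_cost * u * T

# Single shooting by brute force: simulate forward for each candidate feed rate.
u_best = max((k / 100.0 for k in range(101)), key=profit)
```

    An E-NMPC solver would instead compute gradients of this shooting objective (e.g. by the adjoint method) and optimize a full feed-rate trajectory subject to soft output constraints.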

  19. On the optimality equation for average cost Markov control processes with Feller transition probabilities

    Science.gov (United States)

    Jaskiewicz, Anna; Nowak, Andrzej S.

    2006-04-01

    We consider Markov control processes with Borel state space and Feller transition probabilities, satisfying some generalized geometric ergodicity conditions. We provide a new theorem on the existence of a solution to the average cost optimality equation.
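
    The average cost optimality equation referred to has the following standard form (generic notation, not copied from the paper), with optimal average cost ρ, relative value function h, one-stage cost c and transition law q:

```latex
\rho + h(x) = \min_{a \in A(x)} \left\{ c(x,a) + \int_S h(y)\, q(\mathrm{d}y \mid x, a) \right\}
```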

  20. Optimization on start-up process of high-pressure rotor for large power steam turbine

    Directory of Open Access Journals (Sweden)

    Du Qiu-Wan

    2016-01-01

    Full Text Available This paper combines a thermal-structure coupling technique with a pattern search optimization algorithm to establish an optimization system for the start-up process of a turbine unit. First, a finite element model for the thermal-structure coupling calculation is established to accurately analyze the transient temperature field and thermal stress field, which yields the thermal stress distribution during the start-up process. Afterwards, a program for optimizing the rotor start-up process is developed to improve the time allocation of each operating stage of the start-up process, minimizing the maximum equivalent stress of the rotor. The maximum equivalent stress is reduced by 25.7% after optimization, demonstrating a clear effect.
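
    A pattern search of the kind used can be sketched as a compass search over the stage durations: poll the coordinate directions and shrink the step when no poll improves (the quadratic "stress" surrogate below is a made-up stand-in for the finite element evaluation):

```python
def pattern_search(f, x0, step=1.0, tol=1e-4, max_iter=500):
    """Compass-style pattern search: poll +/- steps along each coordinate;
    if no poll improves, halve the step (mesh refinement) and try again."""
    x, fx = list(x0), f(x0)
    it = 0
    while step > tol and it < max_iter:
        improved = False
        for d in range(len(x)):
            for s in (step, -step):
                cand = x[:]
                cand[d] += s
                fc = f(cand)
                if fc < fx:
                    x, fx, improved = cand, fc, True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5
        it += 1
    return x, fx

# Made-up quadratic surrogate for "max equivalent stress" vs. two stage durations.
stress = lambda t: (t[0] - 30.0) ** 2 + 2.0 * (t[1] - 60.0) ** 2
t_opt, s_min = pattern_search(stress, [10.0, 10.0])
```

    Because it needs no gradients, this kind of poll-and-refine search pairs naturally with an expensive black-box finite element model.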