WorldWideScience

Sample records for developing optimal input

  1. Developing optimal input design strategies in cancer systems biology with applications to microfluidic device engineering.

    Science.gov (United States)

    Menolascina, Filippo; Bellomo, Domenico; Maiwald, Thomas; Bevilacqua, Vitoantonio; Ciminelli, Caterina; Paradiso, Angelo; Tommasi, Stefania

    2009-10-15

    Mechanistic models are becoming increasingly popular in Systems Biology; identification and control of models underlying biochemical pathways of interest in oncology is a primary goal in this field. Unfortunately, the scarce availability of data still limits our understanding of the intrinsic characteristics of complex pathologies like cancer: acquiring information for a system understanding of complex reaction networks is time-consuming and expensive. Stimulus response experiments (SRE) have been used to gain a deeper insight into the details of biochemical mechanisms underlying cell life and functioning. Optimisation of the input time-profile, however, remains a major area of research due to the complexity of the problem and its relevance to information retrieval in systems biology-related experiments. We have addressed the problem of quantifying the information associated with an experiment using the Fisher Information Matrix, and we have proposed an optimal experimental design strategy based on an evolutionary algorithm to cope with the problem of information gathering in Systems Biology. On the basis of theoretical results from control systems theory, we have studied the dynamical properties of the signals to be used in cell stimulation. The results of this study have been used to develop a microfluidic device that automates cell stimulation for system identification. We have applied the proposed approach to the Epidermal Growth Factor Receptor pathway and observed that it minimises the parametric uncertainty associated with the identified model. A statistical framework based on Monte-Carlo estimations of the uncertainty ellipsoid confirmed the superiority of optimally designed experiments over canonical inputs. The proposed approach can be easily extended to multiobjective formulations that can also take advantage of identifiability analysis. Moreover, the availability of fully automated...
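The Fisher-information machinery this abstract leans on can be sketched in a few lines. The sketch below uses a toy two-parameter decay model with Gaussian noise (not the paper's EGFR pathway model): it builds the FIM from finite-difference output sensitivities and scores a candidate sampling design by the D-optimality criterion, the log-determinant of the FIM.

```python
import math

def model(theta, t):
    # illustrative response a * exp(-b * t); NOT the paper's pathway model
    a, b = theta
    return a * math.exp(-b * t)

def sensitivities(theta, t, eps=1e-6):
    # finite-difference output sensitivities d(model)/d(theta_i)
    base = model(theta, t)
    out = []
    for i in range(len(theta)):
        bumped = list(theta)
        bumped[i] += eps
        out.append((model(bumped, t) - base) / eps)
    return out

def fisher_information(theta, times, sigma=0.1):
    # Gaussian measurement noise: FIM = (1/sigma^2) * sum_t s(t) s(t)^T
    n = len(theta)
    fim = [[0.0] * n for _ in range(n)]
    for t in times:
        s = sensitivities(theta, t)
        for i in range(n):
            for j in range(n):
                fim[i][j] += s[i] * s[j] / sigma ** 2
    return fim

def log_det_2x2(fim):
    # D-optimality score for a two-parameter model
    return math.log(fim[0][0] * fim[1][1] - fim[0][1] * fim[1][0])

theta = (1.0, 0.5)
dense = [0.5 * k for k in range(1, 9)]   # informative sampling design
sparse = [4.0, 4.5]                      # weakly informative design
print(log_det_2x2(fisher_information(theta, dense)) >
      log_det_2x2(fisher_information(theta, sparse)))   # True
```

An optimizer (evolutionary or otherwise) would search over candidate designs to maximize this log-determinant, which is exactly the "minimise parametric uncertainty" objective in the abstract.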

  2. Simplex-based optimization of numerical and categorical inputs in early bioprocess development: Case studies in HT chromatography.

    Science.gov (United States)

    Konstantinidis, Spyridon; Titchener-Hooker, Nigel; Velayudhan, Ajoy

    2017-08-01

    Bioprocess development studies often involve the investigation of numerical and categorical inputs via the adoption of Design of Experiments (DoE) techniques. An attractive alternative is the deployment of a grid compatible Simplex variant which has been shown to yield optima rapidly and consistently. In this work, the method is combined with dummy variables and it is deployed in three case studies wherein spaces are comprised of both categorical and numerical inputs, a situation intractable by traditional Simplex methods. The first study employs in silico data and lays out the dummy variable methodology. The latter two employ experimental data from chromatography based studies performed with the filter-plate and miniature column High Throughput (HT) techniques. The solute of interest in the former case study was a monoclonal antibody whereas the latter dealt with the separation of a binary system of model proteins. The implemented approach prevented the stranding of the Simplex method at local optima, due to the arbitrary handling of the categorical inputs, and allowed for the concurrent optimization of numerical and categorical, multilevel and/or dichotomous, inputs. The deployment of the Simplex method, combined with dummy variables, was therefore entirely successful in identifying and characterizing global optima in all three case studies. The Simplex-based method was further shown to be of equivalent efficiency to a DoE-based approach, represented here by D-Optimal designs. Such an approach failed, however, to both capture trends and identify optima, and led to poor operating conditions. It is suggested that the Simplex-variant is suited to development activities involving numerical and categorical inputs in early bioprocess development. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
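The dummy-variable device described above can be illustrated with a deliberately crude sketch: each level of a categorical input gets its own continuous dummy coordinate, the level is decoded as the arg-max of those coordinates, and any continuous direct search (a simple greedy coordinate search below, standing in for the paper's grid-compatible Simplex variant) can then optimize numerical and categorical inputs together. The objective and level effects are invented for illustration.

```python
# one numerical input (flow) plus one categorical input (resin type);
# objective and level effects are invented for illustration
RESINS = ["A", "B", "C"]
RESIN_EFFECT = {"A": 0.0, "B": 0.8, "C": 0.3}

def objective(flow, resin):
    # peak at flow = 2.0 with resin "B"
    return -(flow - 2.0) ** 2 + RESIN_EFFECT[resin]

def decode(point):
    # point = [flow, dummy_A, dummy_B, dummy_C]; the categorical level
    # is read off as the arg-max of the dummy coordinates
    dummies = point[1:]
    return point[0], RESINS[dummies.index(max(dummies))]

def evaluate(point):
    return objective(*decode(point))

# crude greedy coordinate search over the extended continuous space,
# standing in for the grid-compatible Simplex variant of the paper
best, step = [0.0, 0.0, 0.0, 0.0], 0.5
while step >= 1e-3:
    improved = False
    for i in range(len(best)):
        for delta in (step, -step):
            cand = list(best)
            cand[i] += delta
            if evaluate(cand) > evaluate(best):
                best, improved = cand, True
    if not improved:
        step /= 2

flow, resin = decode(best)
print(round(flow, 2), resin)   # 2.0 B
```

Because the dummy coordinates move continuously while the decoded level changes discretely, the search can escape an arbitrary initial level assignment, which is the stranding problem the paper attributes to handling categorical inputs naively.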

  3. User input in iterative design for prevention product development: leveraging interdisciplinary methods to optimize effectiveness.

    Science.gov (United States)

    Guthrie, Kate M; Rosen, Rochelle K; Vargas, Sara E; Guillen, Melissa; Steger, Arielle L; Getz, Melissa L; Smith, Kelley A; Ramirez, Jaime J; Kojic, Erna M

    2017-10-01

    The development of HIV-preventive topical vaginal microbicides has been challenged by a lack of sufficient adherence in later stage clinical trials to confidently evaluate effectiveness. This dilemma has highlighted the need to integrate translational research earlier in the drug development process, essentially applying behavioral science to facilitate the advances of basic science with respect to the uptake and use of biomedical prevention technologies. In the last several years, there has been an increasing recognition that the user experience, specifically the sensory experience, as well as the role of meaning-making elicited by those sensations, may play a more substantive role than previously thought. Importantly, the role of the user-their sensory perceptions, their judgements of those experiences, and their willingness to use a product-is critical in product uptake and consistent use post-marketing, ultimately realizing gains in global public health. Specifically, a successful prevention product requires an efficacious drug, an efficient drug delivery system, and an effective user. We present an integrated iterative drug development and user experience evaluation method to illustrate how user-centered formulation design can be iterated from the early stages of preclinical development to leverage the user experience. Integrating the user and their product experiences into the formulation design process may help optimize both the efficiency of drug delivery and the effectiveness of the user.

  4. Optimizing microwave photodetection: input-output theory

    Science.gov (United States)

    Schöndorf, M.; Govia, L. C. G.; Vavilov, M. G.; McDermott, R.; Wilhelm, F. K.

    2018-04-01

    High-fidelity microwave photon counting is an important tool in areas ranging from background radiation analysis in astronomy to circuit quantum electrodynamics architectures for a scalable quantum information processor. In this work we describe a microwave photon counter coupled to a semi-infinite transmission line. We employ input-output theory to examine both a continuously driven transmission line and traveling photon wave packets. Using analytic and numerical methods, we calculate the conditions on the system parameters necessary to optimize measurement and achieve high detection efficiency. From this we derive a general matching condition on the different system rates under which the measurement process is optimal.

  5. Distributed Optimal Consensus Control for Multiagent Systems With Input Delay.

    Science.gov (United States)

    Zhang, Huaipin; Yue, Dong; Zhao, Wei; Hu, Songlin; Dou, Chunxia

    2018-06-01

    This paper addresses the problem of distributed optimal consensus control for a continuous-time heterogeneous linear multiagent system subject to time-varying input delays. First, by discretization and model transformation, the continuous-time input-delayed system is converted into a discrete-time delay-free system. Two delicate performance index functions are defined for these two systems. It is shown that the performance index functions are equivalent and the optimal consensus control problem of the input-delayed system can be cast into that of the delay-free system. Second, by virtue of the Hamilton-Jacobi-Bellman (HJB) equations, an optimal control policy for each agent is designed based on the delay-free system and a novel value iteration algorithm is proposed to learn the solutions to the HJB equations online. The proposed adaptive dynamic programming algorithm is implemented on the basis of a critic-action neural network (NN) structure. Third, it is proved that local consensus errors of the two systems and weight estimation errors of the critic-action NNs are uniformly ultimately bounded while the approximated control policies converge to their target values. Finally, two simulation examples are presented to illustrate the effectiveness of the developed method.

  6. On Optimal Input Design and Model Selection for Communication Channels

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yanyan [ORNL; Djouadi, Seddik M [ORNL; Olama, Mohammed M [ORNL

    2013-01-01

    In this paper, the optimal model (structure) selection and input design which minimize the worst-case identification error for communication systems are provided. The problem is formulated using metric complexity theory in a Hilbert space setting. It is pointed out that model selection and input design can be handled independently. The Kolmogorov n-width is used to characterize the representation error introduced by model selection, while the Gel'fand and time n-widths are used to represent the inherent error introduced by input design. After the model is selected, an optimal input which minimizes the worst-case identification error is shown to exist. In particular, it is proven that the optimal model for reducing the representation error is a Finite Impulse Response (FIR) model, and the optimal input is an impulse at the start of the observation interval. FIR models are widely used in communication systems, for example in Orthogonal Frequency Division Multiplexing (OFDM) systems.
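The headline result, an FIR model driven by an impulse at the start of the observation interval, has a concrete reading: for an FIR channel, the noiseless response to a unit impulse is the tap sequence itself, so the first output samples estimate the model directly. A toy sketch with invented taps:

```python
import random

TAPS = [0.9, 0.5, -0.2, 0.1]   # hypothetical FIR channel h[0..3]

def fir_output(h, u):
    # y[n] = sum_k h[k] * u[n - k]
    return [sum(h[k] * u[n - k] for k in range(len(h)) if 0 <= n - k < len(u))
            for n in range(len(u))]

N = 8
impulse = [1.0] + [0.0] * (N - 1)   # impulse at the start of the window

random.seed(0)
noisy = [y + random.gauss(0.0, 0.01) for y in fir_output(TAPS, impulse)]

# with an impulse input, the first len(TAPS) output samples are direct
# (noisy) estimates of the taps themselves
estimate = noisy[:len(TAPS)]
print([round(h, 1) for h in estimate])   # close to TAPS
```

This is only the simplest illustration of the paper's claim; the worst-case, metric-complexity argument for why the impulse is optimal is, of course, not captured by the sketch.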

  7. On the Nature of the Input in Optimality Theory

    DEFF Research Database (Denmark)

    Heck, Fabian; Müller, Gereon; Vogel, Ralf

    2002-01-01

    The input has two main functions in optimality theory (Prince and Smolensky 1993). First, the input defines the candidate set; in other words, it determines which output candidates compete for optimality, and which do not. Second, the input is referred to by faithfulness constraints that prohibit output candidates from deviating from specifications in the input. Whereas there is general agreement concerning the relevance of the input in phonology, the nature of the input in syntax is notoriously unclear. In this article, we show that the input should not be taken to define syntactic candidate sets … and syntax is due to a basic, irreducible difference between these two components of grammar: syntax is an information-preserving system, phonology is not.

  8. Optimizing Human Input in Social Network Analysis

    Science.gov (United States)

    2018-01-23

    [7] T. L. Lai and H. Robbins, “Asymptotically efficient adaptive allocation rules,” Advances in Applied Mathematics, vol. 6, no. 1, pp. 4–22, 1985.
    [8] W. Whitt, “Heavy traffic limit theorems for queues: a survey,” in Mathematical Methods in Queueing Theory. Springer, 1974, pp. 307–350.
    [9] “Regret lower bounds and optimal algorithms,” in Proceedings of the 2015 ACM SIGMETRICS International Conference on Measurement and Modeling of Computer Systems.

  9. Input and language development in bilingually developing children.

    Science.gov (United States)

    Hoff, Erika; Core, Cynthia

    2013-11-01

    Language skills in young bilingual children are highly varied as a result of the variability in their language experiences, making it difficult for speech-language pathologists to differentiate language disorder from language difference in bilingual children. Understanding the sources of variability in bilingual contexts and the resulting variability in children's skills will help improve language assessment practices by speech-language pathologists. In this article, we review literature on bilingual first language development for children under 5 years of age. We describe the rate of development in single and total language growth, we describe effects of quantity of input and quality of input on growth, and we describe effects of family composition on language input and language growth in bilingual children. We provide recommendations for language assessment of young bilingual children and consider implications for optimizing children's dual language development.

  10. Optimal Input Design for Aircraft Parameter Estimation using Dynamic Programming Principles

    Science.gov (United States)

    Morelli, Eugene A.; Klein, Vladislav

    1990-01-01

    A new technique was developed for designing optimal flight test inputs for aircraft parameter estimation experiments. The principles of dynamic programming were used for the design in the time domain. This approach made it possible to include realistic practical constraints on the input and output variables. A description of the new approach is presented, followed by an example for a multiple input linear model describing the lateral dynamics of a fighter aircraft. The optimal input designs produced by the new technique demonstrated improved quality and expanded capability relative to the conventional multiple input design method.

  11. Input and output constraints affecting irrigation development

    Science.gov (United States)

    Schramm, G.

    1981-05-01

    In many developing countries the expansion of irrigated agriculture is used as a major development tool for bringing about increases in agricultural output, rural economic growth, and income distribution. Apart from constraints imposed by water availability, the major limitations on any acceleration of such programs are usually thought to be costs and financial resources. However, as shown on the basis of empirical data drawn from Mexico, in reality the feasibility and effectiveness of such development programs are even more constrained by the scarcity of specialized physical and human factors on the input side and by market limitations on the output side. On the input side, the limited availability of complementary factors, such as truly functioning credit systems for small-scale farmers or effective agricultural extension services, imposes long-term constraints on development. On the output side, the limited availability, high risk, and relatively slow growth of markets for high-value crops sharply reduce the usually hoped-for and projected profitable crop mix that would warrant the frequently high costs of irrigation investments. Three conclusions are drawn: (1) Factors in limited supply have to be shadow-priced to reflect their high opportunity costs in alternative uses. (2) Re-allocation of financial resources from immediate construction of projects to a longer-term increase in the supply of scarce, highly trained manpower is necessary in order to optimize development over time. (3) Inclusion of high-value, high-income-producing crops in the benefit-cost analysis of new projects is inappropriate if these crops could potentially be grown in already existing projects.

  12. Optimal Input Strategy for Plug and Play Process Control Systems

    DEFF Research Database (Denmark)

    Kragelund, Martin Nygaard; Leth, John-Josef; Wisniewski, Rafal

    2010-01-01

    This paper considers the problem of optimal operation of a plant, whose goal is to maintain production at minimum cost. The system considered in this work consists of a joined plant and redundant input systems. It is assumed that each input system contributes to a flow of goods into the joined pa… the performance of the plant. The results are applied to a coal-fired power plant where an additional new fuel system, gas, becomes available.

  13. On Optimal Input Design for Feed-forward Control

    OpenAIRE

    Hägg, Per; Wahlberg, Bo

    2013-01-01

    This paper considers optimal input design when the intended use of the identified model is to construct a feed-forward controller based on measurable disturbances. The objective is to find a minimum power excitation signal to be used in a system identification experiment, such that the corresponding model-based feed-forward controller guarantees, with a given probability, that the variance of the output signal is within given specifications. To start with, some low order model problems are an...

  14. Workflow Optimization for Tuning Prostheses with High Input Channel

    Science.gov (United States)

    2017-10-01

    AWARD NUMBER: W81XWH-16-1-0767. TITLE: Workflow Optimization for Tuning Prostheses with High Input Channel. PRINCIPAL INVESTIGATOR: Daniel Merrill. Distribution Unlimited. The views, opinions and/or findings contained in this report are those of the author(s) and should not be construed as an official Department … of Specific Aim 1 by driving a commercially available two-DoF wrist and single-DoF hand. The high-level control system will provide analog signals …

  15. Input-output interactions and optimal monetary policy

    DEFF Research Database (Denmark)

    Petrella, Ivan; Santoro, Emiliano

    2011-01-01

    This paper deals with the implications of factor demand linkages for monetary policy design in a two-sector dynamic general equilibrium model. Part of the output of each sector serves as a production input in both sectors, in accordance with a realistic input–output structure. Strategic complementarities induced by factor demand linkages significantly alter the transmission of shocks and amplify the loss of social welfare under optimal monetary policy, compared to what is observed in standard two-sector models. The distinction between value added and gross output that naturally arises in this context is of key importance to explore the welfare properties of the model economy. A flexible inflation targeting regime is close to optimal only if the central bank balances inflation and value added variability. Otherwise, targeting gross output variability entails a substantial increase in the loss…

  16. Distribution Development for STORM Ingestion Input Parameters

    Energy Technology Data Exchange (ETDEWEB)

    Fulton, John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-07-01

    The Sandia-developed Transport of Radioactive Materials (STORM) code suite is used as part of the Radioisotope Power System Launch Safety (RPSLS) program to perform statistical modeling of the consequences of a release of radioactive material given a launch accident. As part of this modeling, STORM samples input parameters from probability distributions, with some parameters treated as constants. This report describes the work done to convert four of these constant inputs (Consumption Rate, Average Crop Yield, Cropland to Landuse Database Ratio, and Crop Uptake Factor) to sampled values. Consumption Rate changed from a constant value of 557.68 kg/yr to a normal distribution with a mean of 102.96 kg/yr and a standard deviation of 2.65 kg/yr. Average Crop Yield changed from a constant value of 3.783 kg edible/m² to a normal distribution with a mean of 3.23 kg edible/m² and a standard deviation of 0.442 kg edible/m². The Cropland to Landuse Database Ratio changed from a constant value of 0.0996 (9.96%) to a normal distribution with a mean of 0.0312 (3.12%) and a standard deviation of 0.00292 (0.29%). Finally, the Crop Uptake Factor changed from a constant value of 6.37e-4 (Bq crop/kg)/(Bq soil/kg) to a lognormal distribution with a geometric mean of 3.38e-4 (Bq crop/kg)/(Bq soil/kg) and a standard deviation value of 3.33 (Bq crop/kg)/(Bq soil/kg).
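The conversions above are straightforward to mimic with standard-library sampling. A sketch, with the caveat that the report's lognormal "standard deviation value of 3.33" is treated here as a geometric standard deviation (an assumption; the report's stated units are ambiguous):

```python
import math
import random

random.seed(42)

def sample_inputs():
    consumption = random.gauss(102.96, 2.65)        # kg/yr
    crop_yield = random.gauss(3.23, 0.442)          # kg edible/m^2
    landuse_ratio = random.gauss(0.0312, 0.00292)   # fraction
    # lognormal with geometric mean 3.38e-4; the 3.33 spread is treated
    # as a geometric standard deviation (an assumption)
    uptake = random.lognormvariate(math.log(3.38e-4), math.log(3.33))
    return consumption, crop_yield, landuse_ratio, uptake

samples = [sample_inputs() for _ in range(10000)]
mean_consumption = sum(s[0] for s in samples) / len(samples)
print(round(mean_consumption, 1))   # close to 102.96
```

`random.lognormvariate(mu, sigma)` takes the mean and standard deviation of the underlying normal, hence the `math.log` calls; a truly negative-valued draw is impossible for the lognormal but the normal inputs would need truncation in a production sampler.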

  17. A policy iteration approach to online optimal control of continuous-time constrained-input systems.

    Science.gov (United States)

    Modares, Hamidreza; Naghibi Sistani, Mohammad-Bagher; Lewis, Frank L

    2013-09-01

    This paper is an effort towards developing an online learning algorithm to find the optimal control solution for continuous-time (CT) systems subject to input constraints. The proposed method is based on the policy iteration (PI) technique which has recently evolved as a major technique for solving optimal control problems. Although a number of online PI algorithms have been developed for CT systems, none of them take into account the input constraints caused by actuator saturation. In practice, however, ignoring these constraints leads to performance degradation or even system instability. In this paper, to deal with the input constraints, a suitable nonquadratic functional is employed to encode the constraints into the optimization formulation. Then, the proposed PI algorithm is implemented on an actor-critic structure to solve the Hamilton-Jacobi-Bellman (HJB) equation associated with this nonquadratic cost functional in an online fashion. That is, two coupled neural network (NN) approximators, namely an actor and a critic are tuned online and simultaneously for approximating the associated HJB solution and computing the optimal control policy. The critic is used to evaluate the cost associated with the current policy, while the actor is used to find an improved policy based on information provided by the critic. Convergence to a close approximation of the HJB solution as well as stability of the proposed feedback control law are shown. Simulation results of the proposed method on a nonlinear CT system illustrate the effectiveness of the proposed approach. Copyright © 2013 ISA. All rights reserved.
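A standard choice for the "suitable nonquadratic functional" in this constrained-input literature (used in earlier actor-critic work, though the record does not state the paper's exact form) is W(u) = 2∫₀ᵘ λR tanh⁻¹(v/λ) dv for a saturation bound |u| ≤ λ. Its gradient diverges at the bound, which is what forces the optimal control from the HJB stationarity condition into the bounded form u* = λ tanh(·). A numeric sketch:

```python
import math

LAM = 1.0   # saturation bound |u| <= LAM (illustrative value)
R = 1.0     # control weight

def penalty(u):
    # W(u) = 2*R*LAM * [u*atanh(u/LAM) + (LAM/2)*ln(1 - (u/LAM)^2)],
    # the closed form of 2*R*LAM * integral_0^u atanh(v/LAM) dv;
    # behaves like R*u^2 for small u
    return 2.0 * R * LAM * (u * math.atanh(u / LAM)
                            + (LAM / 2.0) * math.log(1.0 - (u / LAM) ** 2))

def grad(u):
    # dW/du = 2*R*LAM*atanh(u/LAM) -> infinity as |u| -> LAM, which is
    # what bounds the optimal control at u* = LAM*tanh(...)
    return 2.0 * R * LAM * math.atanh(u / LAM)

for u in (0.1, 0.5, 0.9, 0.99):
    print(round(u, 2), round(penalty(u), 4), round(grad(u), 3))
```

Near u = 0 the penalty matches the usual quadratic R·u², so the formulation reduces to standard LQR-style costs when the constraint is inactive.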

  18. Optimization of Quantum-state-preserving Frequency Conversion by Changing the Input Signal

    DEFF Research Database (Denmark)

    Andersen, Lasse Mejling; Reddy, D. V.; McKinstrie, C. J.

    We optimize frequency conversion based on four-wave mixing by using the input modes of the system. We find a 10-25 % higher conversion efficiency relative to a pump-shaped input signal.

  19. Bi-Objective Optimal Control Modification Adaptive Control for Systems with Input Uncertainty

    Science.gov (United States)

    Nguyen, Nhan T.

    2012-01-01

    This paper presents a new model-reference adaptive control method based on a bi-objective optimal control formulation for systems with input uncertainty. A parallel predictor model is constructed to relate the predictor error to the estimation error of the control effectiveness matrix. In this work, we develop an optimal control modification adaptive control approach that seeks to minimize a bi-objective linear quadratic cost function of both the tracking error norm and predictor error norm simultaneously. The resulting adaptive laws for the parametric uncertainty and control effectiveness uncertainty are dependent on both the tracking error and predictor error, while the adaptive laws for the feedback gain and command feedforward gain are only dependent on the tracking error. The optimal control modification term provides robustness to the adaptive laws naturally from the optimal control framework. Simulations demonstrate the effectiveness of the proposed adaptive control approach.

  20. Optimal control of LQR for discrete time-varying systems with input delays

    Science.gov (United States)

    Yin, Yue-Zhu; Yang, Zhong-Lian; Yin, Zhi-Xiang; Xu, Feng

    2018-04-01

    In this work, we consider the optimal control problem of linear quadratic regulation for discrete time-variant systems with single input and multiple input delays. An innovative and simple method to derive the optimal controller is given. The studied problem is first equivalently converted into a problem subject to a constraint condition. Last, with the established duality, the problem is transformed into a static mathematical optimisation problem without input delays. The optimal control input solution to minimise performance index function is derived by solving this optimisation problem with two methods. A numerical simulation example is carried out and its results show that our two approaches are both feasible and very effective.
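Once an input-delayed system has been converted to a delay-free one, the optimal gain comes from the standard backward Riccati recursion (for an input delay d one would first augment the state with the last d inputs). A scalar delay-free sketch, with system and weights invented for illustration rather than taken from the paper:

```python
# scalar discrete-time LQR: x[k+1] = a*x[k] + b*u[k], stage cost
# q*x^2 + r*u^2; values are illustrative, not from the paper
a, b, q, r = 1.2, 1.0, 1.0, 1.0
N = 50

p = q            # terminal condition P_N = Q
gains = []
for _ in range(N):
    k_gain = (b * p * a) / (r + b * p * b)   # K = (R + B'PB)^-1 B'PA
    p = q + a * p * a - a * p * b * k_gain   # Riccati backward step
    gains.append(k_gain)

# the recursion converges to the stationary gain; the closed loop
# a - b*K is then stable even though a = 1.2 is unstable open loop
print(abs(a - b * gains[-1]) < 1.0)   # True
```

The time-varying case in the paper simply carries (a, b, q, r) as sequences through the same backward pass instead of constants.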

  21. Reinforcement learning for adaptive optimal control of unknown continuous-time nonlinear systems with input constraints

    Science.gov (United States)

    Yang, Xiong; Liu, Derong; Wang, Ding

    2014-03-01

    In this paper, an adaptive reinforcement learning-based solution is developed for the infinite-horizon optimal control problem of constrained-input continuous-time nonlinear systems in the presence of nonlinearities with unknown structures. Two different types of neural networks (NNs) are employed to approximate the Hamilton-Jacobi-Bellman equation. That is, a recurrent NN is constructed to identify the unknown dynamical system, and two feedforward NNs are used as the actor and the critic to approximate the optimal control and the optimal cost, respectively. Based on this framework, the action NN and the critic NN are tuned simultaneously, without the requirement for the knowledge of system drift dynamics. Moreover, by using Lyapunov's direct method, the weights of the action NN and the critic NN are guaranteed to be uniformly ultimately bounded, while keeping the closed-loop system stable. To demonstrate the effectiveness of the present approach, simulation results are presented.

  22. Optimizing Input/Output Using Adaptive File System Policies

    Science.gov (United States)

    Madhyastha, Tara M.; Elford, Christopher L.; Reed, Daniel A.

    1996-01-01

    Parallel input/output characterization studies and experiments with flexible resource management algorithms indicate that adaptivity is crucial to file system performance. In this paper we propose an automatic technique for selecting and refining file system policies based on application access patterns and execution environment. An automatic classification framework allows the file system to select appropriate caching and pre-fetching policies, while performance sensors provide feedback used to tune policy parameters for specific system environments. To illustrate the potential performance improvements possible using adaptive file system policies, we present results from experiments involving classification-based and performance-based steering.
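The classify-then-steer loop described above can be caricatured in a few lines: label the recent access pattern, map the label to a caching/prefetching policy, and let a performance sensor tune a policy parameter (here a prefetch depth). All names, thresholds, and policies below are invented placeholders, not the paper's actual framework:

```python
def classify(offsets):
    # label the recent access pattern from successive request offsets
    strides = [b - a for a, b in zip(offsets, offsets[1:])]
    if strides and all(s == strides[0] for s in strides) and strides[0] > 0:
        return "sequential"
    return "random"

def choose_policy(pattern):
    # hypothetical mapping from pattern class to file-system policy
    return {"sequential": "readahead", "random": "demand-only"}[pattern]

def tune_depth(depth, hit_rate):
    # performance-sensor feedback: deepen prefetch while it pays off,
    # back off when the cache hit rate degrades
    if hit_rate > 0.9:
        return min(depth * 2, 64)
    if hit_rate < 0.5:
        return max(depth // 2, 1)
    return depth

print(choose_policy(classify([0, 4096, 8192, 12288])))   # readahead
```

The paper's framework does this classification automatically per application access pattern; the feedback loop on `tune_depth` mirrors its performance sensors refining policy parameters for a specific environment.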

  23. Identifying Inputs to Leadership Development within an Interdisciplinary Leadership Minor

    Science.gov (United States)

    McKim, Aaron J.; Sorensen, Tyson J.; Velez, Jonathan J.

    2015-01-01

    Researchers conducted a qualitative analysis of students' experiences while enrolled in an interdisciplinary leadership minor with the intent to determine programmatic inputs that spur leadership development. Based on students' reflections, three domains of programmatic inputs for leadership development within the minor were identified. These…

  24. The optimal input optical pulse shape for the self-phase modulation based chirp generator

    Science.gov (United States)

    Zachinyaev, Yuriy; Rumyantsev, Konstantin

    2018-04-01

    This work aims to obtain the optimal shape of the input optical pulse for the proper functioning of the self-phase-modulation-based chirp generator, allowing high values of chirp frequency deviation to be achieved. During the research, the structure of a device based on the self-phase modulation effect was analyzed. The influence of the input optical pulse shape of the transmitting optical module on the chirp frequency deviation was studied. The relationship between the frequency deviation of the generated chirp and frequency linearity was also estimated for three options for implementing the pulse shape. The results of this research contribute to the theory of radio processors based on fiber-optic structures and can be used in radar, secure communications, geolocation, and tomography.

  25. Optimal input shaping for Fisher identifiability of control-oriented lithium-ion battery models

    Science.gov (United States)

    Rothenberger, Michael J.

    This dissertation examines the fundamental challenge of optimally shaping input trajectories to maximize parameter identifiability of control-oriented lithium-ion battery models. Identifiability is a property from information theory that determines the solvability of parameter estimation for mathematical models using input-output measurements. This dissertation creates a framework that exploits the Fisher information metric to quantify the level of battery parameter identifiability, optimizes this metric through input shaping, and facilitates faster and more accurate estimation. The popularity of lithium-ion batteries is growing significantly in the energy storage domain, especially for stationary and transportation applications. While these cells have excellent power and energy densities, they are plagued with safety and lifespan concerns. These concerns are often resolved in the industry through conservative current and voltage operating limits, which reduce the overall performance and still lack robustness in detecting catastrophic failure modes. New advances in automotive battery management systems mitigate these challenges through the incorporation of model-based control to increase performance, safety, and lifespan. To achieve these goals, model-based control requires accurate parameterization of the battery model. While many groups in the literature study a variety of methods to perform battery parameter estimation, a fundamental issue of poor parameter identifiability remains apparent for lithium-ion battery models. This fundamental challenge of battery identifiability is studied extensively in the literature, and some groups are even approaching the problem of improving the ability to estimate the model parameters. The first approach is to add additional sensors to the battery to gain more information that is used for estimation. The other main approach is to shape the input trajectories to increase the amount of information that can be gained from input…

  26. Adaptive optimal control of unknown constrained-input systems using policy iteration and neural networks.

    Science.gov (United States)

    Modares, Hamidreza; Lewis, Frank L; Naghibi-Sistani, Mohammad-Bagher

    2013-10-01

    This paper presents an online policy iteration (PI) algorithm to learn the continuous-time optimal control solution for unknown constrained-input systems. The proposed PI algorithm is implemented on an actor-critic structure where two neural networks (NNs) are tuned online and simultaneously to generate the optimal bounded control policy. The requirement of complete knowledge of the system dynamics is obviated by employing a novel NN identifier in conjunction with the actor and critic NNs. It is shown how the identifier weights estimation error affects the convergence of the critic NN. A novel learning rule is developed to guarantee that the identifier weights converge to small neighborhoods of their ideal values exponentially fast. To provide an easy-to-check persistence of excitation condition, the experience replay technique is used. That is, recorded past experiences are used simultaneously with current data for the adaptation of the identifier weights. Stability of the whole system consisting of the actor, critic, system state, and system identifier is guaranteed while all three networks undergo adaptation. Convergence to a near-optimal control law is also shown. The effectiveness of the proposed method is illustrated with a simulation example.

  7. Optimally decoding the input rate from an observation of the interspike intervals

    Energy Technology Data Exchange (ETDEWEB)

    Feng Jianfeng [COGS, University of Sussex at Brighton (United Kingdom) and Computational Neuroscience Laboratory, Babraham Institute, Cambridge (United Kingdom)]. E-mail: jf218@cam.ac.uk

    2001-09-21

    A neuron receives extensive inhibitory and excitatory inputs. What is the ratio r between these two types of input at which the neuron can most accurately read out the input information (rate)? We explore this issue under the assumption that the neuron is an ideal observer, decoding the input information at the Cramer-Rao inequality bound. It is found that, in general, adding certain amounts of inhibitory input to a neuron improves its capability of accurately decoding the input information. By calculating the Fisher information of an integrate-and-fire neuron, we determine the optimal ratio r for decoding the input information from an observation of the efferent interspike intervals. Surprisingly, the Fisher information can be zero for certain values of the ratio, seemingly implying that it is impossible to read out the encoded information at these values. By analysing the maximum likelihood estimate of the input information, it is concluded that the input information is in fact most easily estimated at the points where the Fisher information vanishes. (author)
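The Fisher information of an observed interspike interval can be computed numerically from any parametric ISI density. As a toy stand-in (the integrate-and-fire ISI density in the paper has no simple closed form), the sketch below uses exponential ISIs, for which the analytic value 1/λ² is available as a check:

```python
import numpy as np

# Toy stand-in: exponential ISI density p(t; lam) = lam*exp(-lam*t).
# The same numerical recipe applies to any parametric ISI model.
def log_lik(lam, t):
    return np.log(lam) - lam * t

def fisher_info(lam, h=1e-4, n_grid=200000, t_max=50.0):
    # FI(lam) = E[(d/dlam log p)^2], score by central difference,
    # expectation by a simple Riemann sum over the ISI density
    t = np.linspace(1e-6, t_max, n_grid)
    dt = t[1] - t[0]
    score = (log_lik(lam + h, t) - log_lik(lam - h, t)) / (2 * h)
    p = lam * np.exp(-lam * t)
    return float(np.sum(score**2 * p) * dt)

lam = 2.0
fi = fisher_info(lam)    # analytic value for exponential ISIs is 1/lam**2
```

Scanning such a numerical FI over a model parameter (here the rate; in the paper the inhibition/excitation ratio) is how the vanishing points of the Fisher information can be located.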

  8. On the input distribution and optimal beamforming for the MISO VLC wiretap channel

    KAUST Repository

    Arfaoui, Mohamed Amine; Rezki, Zouheir; Ghrayeb, Ali; Alouini, Mohamed-Slim

    2017-01-01

    We investigate in this paper the achievable secrecy rate of the multiple-input single-output (MISO) visible light communication (VLC) Gaussian wiretap channel with a single user and a single eavesdropper. We consider the cases where the location of the eavesdropper is known or unknown to the transmitter. In the former case, we derive the optimal beamforming in closed form, subject to constrained inputs. In the latter case, we apply robust beamforming. Furthermore, we study the achievable secrecy rate when the input follows the truncated generalized normal (TGN) distribution. We present several examples which demonstrate the substantial improvements in the secrecy rates achieved by the proposed techniques.

  10. Consideration of Optimal Input on Semi-Active Shock Control System

    Science.gov (United States)

    Kawashima, Takeshi

    In press working, unidirectional transmission of mechanical energy is desired in order to maximize the life of the dies. To realize this transmission, the author has developed a shock control system based on the sliding mode control technique. The controller makes the collision-receiving object deform plastically in an effective manner by adjusting the force of an actuator inserted between the colliding objects, while the deformation of the colliding object is held to the necessary minimum. However, the actuator has to generate a large force corresponding to the impulsive force, and the development of such an actuator is a formidable challenge. The author has therefore proposed a semi-active shock control system in which the impulsive force is adjusted by a brake mechanism, although this system exhibits inferior performance. The author has also designed an actuator using a friction device for semi-active shock control, and proposed an active seatbelt system as an application; its effectiveness has been confirmed by numerical simulation and a model experiment. In this study, the optimal deformation profile of the colliding object is examined theoretically for the case in which the collision-receiving object is perfectly plastic and the colliding object is perfectly elastic. As a result, the optimal input condition is obtained such that the ratio of the maximum deformation of the collision-receiving object to the maximum deformation of the colliding object is maximized. Additionally, the energy balance is examined.

  11. Optimal Control of a PEM Fuel Cell for the Inputs Minimization

    Directory of Open Access Journals (Sweden)

    José de Jesús Rubio

    2014-01-01

    Full Text Available The trajectory tracking problem of a proton exchange membrane (PEM) fuel cell is considered. To solve this problem, an optimal controller is proposed. The optimal technique aims to drive the system states to the desired trajectories while minimizing the inputs. The proposed controller uses the Hamilton-Jacobi-Bellman method, in which the associated Riccati equation is treated as an adaptive function. The effectiveness of the proposed technique is verified by two simulations.

  12. Development of MIDAS/SMR Input Deck for SMART

    International Nuclear Information System (INIS)

    Cho, S. W.; Oh, H. K.; Lee, J. M.; Lee, J. H.; Yoo, K. J.; Kwun, S. K.; Hur, H.

    2010-01-01

    The objective of this study is to develop a basic MIDAS/SMR code input deck for severe accident analysis by simulating the steady state of the SMART plant. SMART is an integral reactor developed by KAERI. For the assessment of reactor safety and severe accident management strategy, it is necessary to simulate severe accidents using the MIDAS/SMR code, which is being developed by KAERI. The input deck of the MIDAS/SMR code for the SMART plant is prepared so that users who are not familiar with the code can simulate severe accident sequences. A steady state is obtained and the results are compared with design values. The input deck will be improved through the simulation of DBAs and severe accidents, after which it can be used as a base input deck to simulate severe accident scenarios. Source terms and hydrogen generation can be analyzed through the simulation of severe accidents. The information gained from these analyses is expected to be helpful in developing the severe accident management strategy.

  13. Using Random Forests to Select Optimal Input Variables for Short-Term Wind Speed Forecasting Models

    Directory of Open Access Journals (Sweden)

    Hui Wang

    2017-10-01

    Full Text Available Achieving relatively high-accuracy short-term wind speed forecasts is a precondition for the construction and grid-connected operation of wind power forecasting systems for wind farms. Currently, most research focuses on the structure of forecasting models and does not consider the selection of input variables, which can have significant impacts on forecasting performance. This paper presents an input variable selection method for wind speed forecasting models. Candidate input variables for various leading periods are selected, and random forests (RF) is employed to evaluate the importance of all variables as features. The feature subset with the best evaluation performance is selected as the optimal feature set. Then, a kernel-based extreme learning machine is constructed to evaluate the performance of the input variable selection based on RF. The results of the case study show that by removing uncorrelated and redundant features, RF effectively extracts the most strongly correlated set of features from the candidate input variables. By finding the optimal feature combination to represent the original information, RF simplifies the structure of the wind speed forecasting model, shortens the required training time, and substantially improves the model's accuracy and generalization ability, demonstrating that the input variables selected by RF are effective.
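The importance-based selection step can be sketched with scikit-learn on synthetic data (the series, the number of candidates, and the mean-importance threshold are illustrative assumptions, not the paper's wind data or exact procedure):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500
# Synthetic stand-in for candidate inputs at various leading periods:
# two informative series (columns 0, 1) and three pure-noise series.
X = rng.normal(size=(n, 5))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.1 * rng.normal(size=n)

# Random forest importance scores for every candidate feature
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
importance = rf.feature_importances_
# Keep features above the mean importance as the selected subset
selected = np.where(importance > importance.mean())[0]
```

In the paper the surviving subset is then fed to a kernel-based extreme learning machine; here any downstream regressor trained on `X[:, selected]` would play that role.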

  14. Optimal testing input sets for reduced diagnosis time of nuclear power plant digital electronic circuits

    International Nuclear Information System (INIS)

    Kim, D.S.; Seong, P.H.

    1994-01-01

    This paper describes the optimal testing input sets required for fault diagnosis of nuclear power plant digital electronic circuits. For complicated systems such as very large scale integration (VLSI) circuits, nuclear power plants (NPPs), and aircraft, testing is the major factor in system maintenance. In particular, diagnosis time grows quickly with the complexity of the component. In this research, to reduce diagnosis time, the authors derived the optimal testing sets, i.e., the minimal testing sets required for detecting a failure and for locating the failed component. Among many conventional methods, the technique presented by Hayes fits this approach to testing-set generation best. However, that method has two disadvantages: (a) it considers only simple networks, and (b) it indicates only whether the system is in a failed state, without providing a way to locate the failed component. The authors have therefore derived optimal testing input sets that resolve these problems while preserving the method's advantages. When the optimal testing sets were applied to the automatic fault diagnosis system (AFDS), which incorporates an advanced artificial intelligence fault diagnosis technique, fault diagnosis using the optimal testing sets tested the digital electronic circuits much faster than testing with exhaustive input sets; when applied to the Universal (UV) Card, a digital input/output card of a nuclear power plant solid state protection system, the testing time was reduced by a factor of up to about 100.
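The notion of a minimal testing set that both detects a failure and locates the failed component can be sketched with a toy fault-response table (the circuit states and outputs below are hypothetical, not the UV Card's actual responses): a test subset is sufficient when every fault state, including the fault-free one, produces a distinct response signature.

```python
from itertools import combinations

# Hypothetical response table: responses[s][t] is the circuit output under
# test input t when the circuit is in state s ('ok' = fault-free).
responses = {
    'ok': (0, 1, 1, 0),
    'f1': (1, 1, 1, 0),
    'f2': (0, 0, 1, 0),
    'f3': (1, 1, 0, 0),
}
tests = range(4)

def distinguishes(subset):
    # The subset locates the failed component iff all states yield
    # pairwise-distinct signatures on the chosen tests.
    sigs = [tuple(responses[s][t] for t in subset) for s in responses]
    return len(set(sigs)) == len(sigs)

# Smallest distinguishing subset, found by searching sizes in order
optimal = next(
    s for k in range(1, 5) for s in combinations(tests, k) if distinguishes(s)
)
```

Note that test 3 is useless here (every state returns 0), which is exactly the kind of input an exhaustive testing set would waste time on and an optimal set discards.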

  15. Second-order Optimality Conditions for Optimal Control of the Primitive Equations of the Ocean with Periodic Inputs

    International Nuclear Information System (INIS)

    Tachim Medjo, T.

    2011-01-01

    We investigate in this article Pontryagin's maximum principle for the control problem associated with the primitive equations (PEs) of the ocean with periodic inputs. We also derive a second-order sufficient condition for optimality. This work is closely related to Wang (SIAM J. Control Optim. 41(2):583-606, 2002) and He (Acta Math. Sci. Ser. B Engl. Ed. 26(4):729-734, 2006), in which the authors proved similar results for three-dimensional Navier-Stokes (NS) systems.

  16. Development and operation of K-URT data input system

    International Nuclear Information System (INIS)

    Kim, Yun Jae; Myoung, Noh Hoon; Kim, Jong Hyun; Han, Jae Jun

    2010-05-01

    Activities for TSPA (Total System Performance Assessment) of the permanent disposal of high-level radioactive waste include the production of input data, safety assessment using those data, the licensing procedure, and others. These activities are performed in 5 steps: (1) adequate planning, (2) controlled execution, (3) complete documentation, (4) thorough review, and (5) independent oversight. For confidence building, it is very important to record and manage the materials obtained from research work transparently. To document the disposal research work from the planning stage to the data management stage, KAERI developed CYPRUS, a CYber R and D Platform for Radwaste disposal in Underground System, with a QA (Quality Assurance) system. In CYPRUS, the QA system supports the other functions, such as data management and project management. This report analyzes the structure of CYPRUS and proposes accumulating qualified data, providing a convenient application, and promoting access to and use of CYPRUS as a future-oriented system.

  17. Input price risk and optimal timing of energy investment: choice between fossil- and biofuels

    Energy Technology Data Exchange (ETDEWEB)

    Murto, Pauli; Nese, Gjermund

    2002-05-01

    We consider energy investment when a choice has to be made between fossil-fuel and biomass-fired production technologies. A dynamic model is presented to illustrate the effect of different degrees of input price uncertainty on the choice of technology and the timing of the investment. It is shown that when the choice of technology is irreversible, it may be optimal to postpone the investment even if it would otherwise be optimal to invest in one or both of the plant types. We provide a numerical example based on cost estimates for two different power plant types. (author)
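The postponement result can be illustrated with a stylized two-period example (all numbers hypothetical, not the paper's cost estimates): investing now has positive expected value, yet waiting until the input price uncertainty resolves is worth more, because the irreversible investment can then be skipped in the unfavorable state.

```python
# Stylized real-options example of the value of waiting.
invest_cost = 100.0
p_up = 0.5                                # probability the fuel price jumps up
plant_value = {'up': 80.0, 'down': 160.0}  # plant value after the price moves

# Invest now: locked in before the price uncertainty resolves
npv_now = p_up * plant_value['up'] + (1 - p_up) * plant_value['down'] - invest_cost
# Wait one period: invest only in the favorable (price-down) state
npv_wait = (1 - p_up) * (plant_value['down'] - invest_cost)
wait_is_optimal = npv_wait > npv_now
```

Here `npv_now` is 20 and `npv_wait` is 30, so postponement is optimal even though immediate investment has a strictly positive expected value, mirroring the paper's qualitative finding.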

  19. Improved quality of input data for maintenance optimization using expert judgment

    International Nuclear Information System (INIS)

    Oien, Knut

    1998-01-01

    Most maintenance optimization models need an estimate of the so-called 'naked' failure rate function as input. In practice it is very difficult to estimate the 'naked' failure rate, because overhauls and other preventive maintenance actions tend to 'corrupt' the recorded lifelengths. The purpose of this paper is to stress the importance of utilizing the knowledge of maintenance engineers, i.e., expert judgment, in addition to recorded equipment lifelengths, in order to get credible input data. We have shown that without utilizing expert judgment, the estimated mean time to failure may be strongly biased, often by a factor of 2-3, depending on the life distribution that is assumed. We recommend including a simple question about the mean remaining lifelength on the work-order forms. By this approach the knowledge of maintenance engineers may be incorporated in a simple and cost-effective way
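The bias argument can be made concrete with a small numeric sketch (illustrative lifelengths, not the paper's data): treating maintenance-censored lifelengths as failure times underestimates the mean time to failure, and the engineer's judged mean remaining life at each overhaul, collected via the proposed work-order question, corrects it.

```python
# Five recorded lifelengths (years), each ended by a preventive overhaul,
# i.e. right-censored rather than ending in failure, plus the maintenance
# engineer's judged mean remaining life at each overhaul.
recorded = [2.0, 3.0, 2.5, 4.0, 3.5]
expert_remaining = [5.0, 4.0, 4.5, 3.0, 3.5]

# Naive estimate: censored times treated as if they were failure times
naive_mttf = sum(recorded) / len(recorded)
# Expert-corrected estimate: censored time plus judged remaining life
corrected_mttf = sum(t + r for t, r in zip(recorded, expert_remaining)) / len(recorded)
bias_factor = corrected_mttf / naive_mttf
```

With these numbers the naive estimate is biased low by a factor of about 2.3, in line with the factor of 2-3 reported in the abstract.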

  20. Full-order optimal compensators for flow control: the multiple inputs case

    Science.gov (United States)

    Semeraro, Onofrio; Pralits, Jan O.

    2018-03-01

    Flow control has been the subject of numerous experimental and theoretical works. We analyze full-order, optimal controllers for large dynamical systems in the presence of multiple actuators and sensors. The full-order controllers do not require any preliminary model reduction or low-order approximation: this feature allows us to assess the optimal performance of an actuated flow without relying on any estimation process or further hypotheses on the disturbances. We start from the original technique proposed by Bewley et al. (Meccanica 51(12):2997-3014, 2016. https://doi.org/10.1007/s11012-016-0547-3), the adjoint of the direct-adjoint (ADA) algorithm. The algorithm is iterative and allows bypassing the solution of the algebraic Riccati equation associated with the optimal control problem, which is typically infeasible for large systems. In this numerical work, we extend the ADA iteration into a more general framework that includes the design of controllers with multiple, coupled inputs and robust controllers (H_{∞} methods). First, we demonstrate our results by showing the analytical equivalence between the full Riccati solutions and the ADA approximations in the multiple-inputs case. In the second part of the article, we analyze the performance of the algorithm in terms of convergence of the solution, comparing it with analogous techniques. We find excellent scalability with the number of inputs (actuators), making the method a viable way to perform full-order control design in complex settings. Finally, the applicability of the algorithm to fluid mechanics problems is shown using the linearized Kuramoto-Sivashinsky equation and the Kármán vortex street past a two-dimensional cylinder.

  1. Evolving a Method to Capture Science Stakeholder Inputs to Optimize Instrument, Payload, and Program Design

    Science.gov (United States)

    Clark, P. E.; Rilee, M. L.; Curtis, S. A.; Bailin, S.

    2012-03-01

    We are developing Frontier, a highly adaptable, stably reconfigurable, web-accessible intelligent decision engine capable of optimizing design as well as simulating the operation of complex systems in response to evolving needs and environments.

  2. Design optimization of radial flux permanent magnet wind generator for highest annual energy input and lower magnet volumes

    Energy Technology Data Exchange (ETDEWEB)

    Faiz, J.; Rajabi-Sebdani, M.; Ebrahimi, B. M. (Univ. of Tehran, Tehran (Iran)); Khan, M. A. (Univ. of Cape Town, Cape Town (South Africa))

    2008-07-01

    This paper presents a multi-objective optimization method to maximize the annual energy input (AEI) and minimize the permanent magnet (PM) volume in use. For this purpose, an analytical model of the machine is utilized. The effects of the generator specifications on the annual energy input and PM volume are then investigated. Permanent magnet synchronous generator (PMSG) parameters and dimensions are then optimized using a genetic algorithm with an appropriate objective function. The results show an enhancement in PMSG performance. Finally, a 2D time-stepping finite element method (2D TSFE) is used to verify the analytical results. Comparison of the results validates the optimization method.

  3. Optimization model of peach production relevant to input energies – Yield function in Chaharmahal va Bakhtiari province, Iran

    International Nuclear Information System (INIS)

    Ghatrehsamani, Shirin; Ebrahimi, Rahim; Kazi, Salim Newaz; Badarudin Badry, Ahmad; Sadeghinezhad, Emad

    2016-01-01

    The aim of this study was to determine the amount of input–output energy used in peach production and to develop an optimal model of production in Chaharmahal va Bakhtiari province, Iran. Data were collected from 100 producers by administering a questionnaire in face-to-face interviews. Farms were selected based on random sampling method. Results revealed that the total energy of production is 47,951.52 MJ/ha and the highest share of energy consumption belongs to chemical fertilizers (35.37%). Consumption of direct energy was 47.4% while indirect energy was 52.6%. Also, Total energy consumption was divided into two groups; renewable and non-renewable (19.2% and 80.8% respectively). Energy use efficiency, Energy productivity, Specific energy and Net energy were calculated as 0.433, 0.228 (kg/MJ), 4.38 (MJ/kg) and −27,161.722 (MJ/ha), respectively. According to the negative sign for Net energy, if special strategy is used, energy dismiss will decrease and negative effect of some parameters could be omitted. In the present case the amount is indicating decimate of production energy. In addition, energy efficiency was not high enough. Some of the input energies were applied to machinery, chemical fertilizer, water irrigation and electricity which had significant effect on increasing production and MPP (marginal physical productivity) was determined for variables. This parameter was positive for energy groups namely; machinery, diesel fuel, chemical fertilizer, water irrigation and electricity while it was negative for other kind of energy such as chemical pesticides and human labor. Finally, there is a need to pursue a new policy to force producers to undertake energy-efficient practices to establish sustainable production systems without disrupting the natural resources. In addition, extension activities are needed to improve the efficiency of energy consumption and to sustain the natural resources. - Highlights: • Replacing non-renewable energy with renewable

  4. Optimizing production with energy and GHG emission constraints in Greece: An input-output analysis

    International Nuclear Information System (INIS)

    Hristu-Varsakelis, D.; Karagianni, S.; Pempetzoglou, M.; Sfetsos, A.

    2010-01-01

    Under its Kyoto and EU obligations, Greece has committed to a greenhouse gas (GHG) emissions increase of at most 25% compared to 1990 levels, to be achieved during the period 2008-2012. Although this restriction was initially regarded as being realistic, information derived from GHG emissions inventories shows that an increase of approximately 28% has already taken place between 1990 and 2005, highlighting the need for immediate action. This paper explores the reallocation of production in Greece, on a sector-by-sector basis, in order to meet overall demand constraints and GHG emissions targets. We pose a constrained optimization problem, taking into account the Greek environmental input-output matrix for 2005, the amount of utilized energy and pollution reduction options. We examine two scenarios, limiting fluctuations in sectoral production to at most 10% and 15%, respectively, compared to baseline (2005) values. Our results indicate that (i) GHG emissions can be reduced significantly with relatively limited effects on GVP growth rates, and that (ii) greater cutbacks in GHG emissions can be achieved as more flexible production scenarios are allowed.
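The sectoral reallocation can be sketched as a linear program (a toy three-sector example with made-up coefficients, not the Greek 2005 input-output table): minimize total emissions subject to meeting baseline aggregate demand while holding each sector within ±10% of its baseline output.

```python
import numpy as np
from scipy.optimize import linprog

x0 = np.array([100.0, 80.0, 60.0])      # baseline sectoral production
e = np.array([0.9, 0.5, 0.2])           # emissions intensity per unit output
demand = x0.sum()                        # keep total production at baseline

res = linprog(
    c=e,                                     # minimize total GHG emissions
    A_ub=np.array([[-1.0, -1.0, -1.0]]),     # -sum(x) <= -demand
    b_ub=np.array([-demand]),
    bounds=list(zip(0.9 * x0, 1.1 * x0)),    # ±10% fluctuation limits
)
x_opt = res.x
saving = e @ x0 - e @ x_opt              # emissions cut at unchanged output
```

The optimizer shifts output from the dirtiest sector toward the cleanest one until the fluctuation bounds bind, mirroring the paper's finding that meaningful GHG reductions are possible with limited changes in total production.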

  5. Fertilizing growth: Agricultural inputs and their effects in economic development.

    Science.gov (United States)

    McArthur, John W; McCord, Gordon C

    2017-07-01

    This paper estimates the role of agronomic inputs in cereal yield improvements and the consequences for countries' processes of structural change. The results suggest a clear role for fertilizer, modern seeds and water in boosting yields. We then test for respective empirical links between agricultural yields and economic growth, labor share in agriculture and non-agricultural value added per worker. The identification strategy includes a novel instrumental variable that exploits the unique economic geography of fertilizer production and transport costs to countries' agricultural heartlands. We estimate that a half ton increase in staple yields generates a 14 to 19 percent higher GDP per capita and a 4.6 to 5.6 percentage point lower labor share in agriculture five years later. The results suggest a strong role for agricultural productivity as a driver of structural change.

  6. Development of a Controller for the Input Power of a Peripheral Interfacing Controller Using Another Microcontroller

    International Nuclear Information System (INIS)

    Syirrazie Che Soh; Harzawardi Hashim; Nor Arymaswati Abdullah; Nur Aira Abdul Rahman; Mohd Ashhar Khalid

    2011-01-01

    A controller for the input power of a Peripheral Interfacing Controller (PIC) was developed using another microcontroller. This paper discusses the switching techniques, implemented with suitable electronic devices, used to develop the controller, which makes it possible to control the input power of a PIC and thereby expand its interfacing capacity and control. This allows the PIC to acquire input signals from, and control output signals to, electronic and electromechanical devices and instruments, as well as software, across a wide range of applications. (author)

  7. Developing optimized prioritizing road maintenance

    Directory of Open Access Journals (Sweden)

    Ewadh Hussein Ali

    2018-01-01

    Full Text Available Increased demand for efficient maintenance of the existing roadway system requires optimal use of the allocated funds. The paper demonstrates optimized methods for prioritizing maintenance projects. A selected zone of the roadway system in Kerbala city serves as the study area to demonstrate the application of the developed prioritization process. The pavement management system PAVER, integrated with GIS, is used to estimate and display the pavement condition index (PCI) and thereby establish a maintenance priority. In addition to simple ranking by the PCI produced by PAVER, the paper introduces a PCI measure for each section of roadway, and ranking by multiple measures elicited from expert knowledge, via a predesigned questionnaire, of the measures that affect prioritization and their respective weights. The resulting maintenance priority index (MPI) incorporates the cost of the proposed maintenance, the easiness of the proposed maintenance, average daily traffic and the functional classification of the roadway, in addition to the PCI. Further, incremental benefit-cost ratio (BCR) ranking provides an optimized process accounting for both the benefit and the cost of maintenance. The paper presents an efficient display of the layout and ranking for the selected zone based on the MPI and the incremental BCR method. Although the two developed methods produce different priority layouts, a statistical test shows no significant difference among the rankings of all prioritization methods.

  8. Optimization modeling of U.S. renewable electricity deployment using local input variables

    Science.gov (United States)

    Bernstein, Adam

    For the past five years, state Renewable Portfolio Standard (RPS) laws have been a primary driver of renewable electricity (RE) deployments in the United States. However, four key trends currently developing: (i) lower natural gas prices, (ii) slower growth in electricity demand, (iii) challenges of system balancing intermittent RE within the U.S. transmission regions, and (iv) fewer economical sites for RE development, may limit the efficacy of RPS laws over the remainder of the current RPS statutes' lifetime. An outsized proportion of U.S. RE build occurs in a small number of favorable locations, increasing the effects of these variables on marginal RE capacity additions. A state-by-state analysis is necessary to study the U.S. electric sector and to generate technology-specific generation forecasts. We used LP optimization modeling similar to the National Renewable Energy Laboratory (NREL) Renewable Energy Development System (ReEDS) to forecast RE deployment across the 8 U.S. states with the largest electricity load, and found state-level RE projections to Year 2031 significantly lower than those implied in the Energy Information Administration (EIA) 2013 Annual Energy Outlook forecast. Additionally, the majority of states do not achieve their RPS targets in our forecast. Combined with the tendency of prior research and RE forecasts to focus on larger national and global scale models, we posit that further bottom-up state and local analysis is needed for more accurate policy assessment, forecasting, and ongoing revision of variables as parameter values evolve through time. Current optimization software eliminates much of the need for algorithm coding and programming, allowing for rapid model construction and updating across many customized state and local RE parameters.
Further, our results can be tested against the empirical outcomes that will be observed over the coming years, and the forecast deviation from the actuals can be attributed to discrete parameter

  9. Imported Input Varieties and Product Innovation : Evidence from Five Developing Countries

    NARCIS (Netherlands)

    Bos, Marijke; Vannoorenberghe, Gonzague

    We examine how access to imported intermediate inputs affects firm-level product innovation in five developing countries. We combine trade data with survey data on innovation and develop a method to determine whether new inputs were essential for the product innovation. We find evidence that the

  10. Summary of FY-1978 consultation input for Scenario Methodology Development

    International Nuclear Information System (INIS)

    Scott, B.L.; Benson, G.L.; Craig, R.A.; Harwell, M.A.

    1979-11-01

    The Scenario Methodology Development task is concerned with evaluating the geologic system surrounding an underground repository and describing the phenomena (volcanic, seismic, meteorite, hydrologic, tectonic, climate, etc.) which could perturb the system and possibly cause loss of repository integrity. This document includes 14 individual papers. Separate abstracts were prepared for all 14 papers

  11. Development of the GUI environments of MIDAS code for convenient input and output processing

    International Nuclear Information System (INIS)

    Kim, K. L.; Kim, D. H.

    2003-01-01

    MIDAS is being developed at KAERI as an integrated severe accident analysis code with easy model modification and addition, achieved by restructuring the data transfer scheme. In this paper, the input file management system, IEDIT, and the graphic simulation system, SATS, are presented as the MIDAS input and output GUI systems. These two systems would form the basis of the MIDAS GUI system for input and output processing, and they are expected to be useful tools for severe accident analysis and simulation.

  12. Categorical Inputs, Sensitivity Analysis, Optimization and Importance Tempering with tgp Version 2, an R Package for Treed Gaussian Process Models

    Directory of Open Access Journals (Sweden)

    Robert B. Gramacy

    2010-02-01

    Full Text Available This document describes the new features in version 2.x of the tgp package for R, implementing treed Gaussian process (GP) models. The topics covered include methods for dealing with categorical inputs and excluding inputs from the tree or GP part of the model; fully Bayesian sensitivity analysis for inputs/covariates; sequential optimization of black-box functions; and a new Monte Carlo method for inference in multi-modal posterior distributions that combines simulated tempering and importance sampling. These additions extend the functionality of tgp across all models in the hierarchy: from Bayesian linear models, to classification and regression trees (CART), to treed Gaussian processes with jumps to the limiting linear model. It is assumed that the reader is familiar with the baseline functionality of the package, outlined in the first vignette (Gramacy 2007).

  13. Dynamic PET of human liver inflammation: impact of kinetic modeling with optimization-derived dual-blood input function.

    Science.gov (United States)

    Wang, Guobao; Corwin, Michael T; Olson, Kristin A; Badawi, Ramsey D; Sarkar, Souvik

    2018-05-30

    The hallmark of nonalcoholic steatohepatitis is hepatocellular inflammation and injury in the setting of hepatic steatosis. Recent work has indicated that dynamic 18F-FDG PET with kinetic modeling has the potential to assess hepatic inflammation noninvasively, whereas static FDG-PET has not shown promise. Because the liver has a dual blood supply, kinetic modeling of dynamic liver PET data is challenging in human studies. The objective of this study is to evaluate and identify a dual-input kinetic modeling approach for dynamic FDG-PET of human liver inflammation. Fourteen patients with nonalcoholic fatty liver disease were included in the study. Each patient underwent a one-hour dynamic FDG-PET/CT scan and had a liver biopsy within six weeks. Three models were tested for kinetic analysis: the traditional two-tissue compartmental model with an image-derived single-blood input function (SBIF), a model with a population-based dual-blood input function (DBIF), and a modified model with an optimization-derived DBIF obtained through a joint estimation framework. The three models were compared using the Akaike information criterion (AIC), the F test and a histopathologic inflammation reference. The results showed that the optimization-derived DBIF model improved the fitting of liver time activity curves and achieved lower AIC values and higher F values than the SBIF and population-based DBIF models in all patients. The optimization-derived model significantly increased FDG K1 estimates by 101% and 27% as compared with the traditional SBIF and population-based DBIF. K1 from the optimization-derived model was significantly associated with histopathologic grades of liver inflammation, while the other two models did not reach statistical significance. In conclusion, modeling of the DBIF is critical for kinetic analysis of dynamic liver FDG-PET data in human studies. The optimization-derived DBIF model is more appropriate than the SBIF and population-based DBIF models for dynamic FDG-PET of liver inflammation.
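Model selection by AIC, as used here, can be sketched with hypothetical fit results (the residual sums of squares and parameter counts below are made up for illustration; they are not the study's values): a model with more parameters wins only if its fit improves enough to pay the 2k penalty.

```python
import numpy as np

def aic(rss, n, k):
    # Gaussian-noise AIC up to an additive constant: n*ln(RSS/n) + 2k
    return n * np.log(rss / n) + 2 * k

# Hypothetical fit results for one liver time-activity curve.
n = 49                                   # frames in a one-hour dynamic scan
fits = {
    'SBIF':            {'rss': 12.0, 'k': 5},
    'population DBIF': {'rss': 9.0,  'k': 5},
    'optimized DBIF':  {'rss': 5.0,  'k': 7},
}
aics = {m: aic(f['rss'], n, f['k']) for m, f in fits.items()}
best = min(aics, key=aics.get)           # lowest AIC wins
```

With these illustrative numbers the optimization-derived DBIF model attains the lowest AIC despite its two extra parameters, which is the pattern the study reports across all fourteen patients.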

  14. Semidefinite Relaxation-Based Optimization of Multiple-Input Wireless Power Transfer Systems

    Science.gov (United States)

    Lang, Hans-Dieter; Sarris, Costas D.

    2017-11-01

    An optimization procedure for multi-transmitter (MISO) wireless power transfer (WPT) systems based on tight semidefinite relaxation (SDR) is presented. This method ensures physical realizability of MISO WPT systems designed via convex optimization, a robust, semi-analytical and intuitive route to optimizing such systems. To that end, the nonconvex constraints requiring that power be fed into, rather than drawn from, the system via all transmitter ports are incorporated in a convex semidefinite relaxation, which is efficiently and reliably solvable by dedicated algorithms. A test of the solution then confirms that this modified problem is equivalent (tight relaxation) to the original (nonconvex) one and that the true global optimum has been found. This is a clear advantage over global optimization methods (e.g. genetic algorithms), where convergence to the true global optimum can be neither ensured nor tested. Discussions of numerical results yielded by both the closed-form expressions and the refined technique illustrate the importance and practicability of the new method. It is shown that this technique offers a rigorous optimization framework for a broad range of current and emerging WPT applications.
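
    The "test of the solution" that confirms tightness amounts to checking that the matrix returned by the SDP solver is numerically rank one, in which case the feeding vector can be recovered from its dominant eigenpair. A sketch with numpy, where a constructed rank-one matrix stands in for an actual solver output:

```python
import numpy as np

def is_tight_rank_one(X, tol=1e-8):
    """Check whether a PSD matrix X is numerically rank one (i.e. the
    semidefinite relaxation is tight) and recover the rank-one factor."""
    w, V = np.linalg.eigh(X)             # eigenvalues in ascending order
    tight = w[:-1].max() <= tol * w[-1]  # all but the largest eigenvalue ~ 0
    x = np.sqrt(w[-1]) * V[:, -1]        # rank-one factor: X ~ x x^T
    return tight, x

# Stand-in for an SDP solver result: a genuinely rank-one PSD matrix.
x_true = np.array([1.0, -0.5, 2.0])      # e.g. transmitter port excitations
X = np.outer(x_true, x_true)

tight, x = is_tight_rank_one(X)
assert tight
# Recovery is only defined up to a global sign flip:
assert np.allclose(x, x_true) or np.allclose(x, -x_true)
```

    If the check fails, the relaxation is not tight for that instance and the recovered vector is not guaranteed to be feasible for the original nonconvex problem.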

  15. Development of the RETRAN input model for Ulchin 3/4 visual system analyzer

    International Nuclear Information System (INIS)

    Lee, S. W.; Kim, K. D.; Lee, Y. J.; Lee, W. J.; Chung, B. D.; Jeong, J. J.; Hwang, M. K.

    2004-01-01

    As a part of the Long-Term Nuclear R and D program, KAERI has developed the so-called Visual System Analyzer (ViSA) based on best-estimate codes. The MARS and RETRAN codes are used as the best-estimate codes for ViSA. Of these two codes, the RETRAN code is used for realistic analysis of non-LOCA transients and small-break loss-of-coolant accidents with break sizes of less than 3 inches in diameter. It was therefore necessary to develop a RETRAN input model for the Ulchin 3/4 plants (KSNP). In recognition of this, the RETRAN input model for the Ulchin 3/4 plants has been developed. This report includes the input model requirements and the calculation note for the input data generation (see the Appendix). In order to confirm the validity of the input data, calculations were performed for a steady state at the 100% power operation condition, an inadvertent reactor trip, and an RCP trip. The results of the steady-state calculation agree well with the design data. The results of the other transient calculations appear reasonable and consistent with those of other best-estimate calculations. Therefore, the RETRAN input data can be used as a base input deck for the RETRAN transient analyzer for Ulchin 3/4. Moreover, it was found that the Core Protection Calculator (CPC) module, which was modified by the Korea Electric Power Research Institute (KEPRI), is well adapted to ViSA

  16. Development of NUPREP PC Version and Input Structures for NUCIRC Single Channel Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Churl; Jun, Ji Su; Park, Joo Hwan

    2007-12-15

    The input file for the steady-state thermal-hydraulic code NUCIRC consists of common channel input data and specific channel input data in the case of a single channel analysis. Even when all the data are ready for single channel analyses of the 380 channels, composing an input file by hand-editing takes a long time and requires enormous effort. The automatic pre-processor for this tedious job is the NUPREP code. In this study, a NUPREP PC version has been developed from the source listing in the program manual of NUCIRC-MOD2.000, which is imported in the form of an execution file. In this procedure, errors found in PC execution and lost statements were fixed accordingly. It is confirmed that the developed NUPREP code correctly produces the input file for the CANDU-6 single channel analysis. Additionally, the NUCIRC input structure and data format are summarized for a single channel analysis, and the input CARDs required for the creep information of aged channels are listed.

  17. Development of NUPREP PC Version and Input Structures for NUCIRC Single Channel Analyses

    International Nuclear Information System (INIS)

    Yoon, Churl; Jun, Ji Su; Park, Joo Hwan

    2007-12-01

    The input file for the steady-state thermal-hydraulic code NUCIRC consists of common channel input data and specific channel input data in the case of a single channel analysis. Even when all the data are ready for single channel analyses of the 380 channels, composing an input file by hand-editing takes a long time and requires enormous effort. The automatic pre-processor for this tedious job is the NUPREP code. In this study, a NUPREP PC version has been developed from the source listing in the program manual of NUCIRC-MOD2.000, which is imported in the form of an execution file. In this procedure, errors found in PC execution and lost statements were fixed accordingly. It is confirmed that the developed NUPREP code correctly produces the input file for the CANDU-6 single channel analysis. Additionally, the NUCIRC input structure and data format are summarized for a single channel analysis, and the input CARDs required for the creep information of aged channels are listed

  18. Development of an Input Model to MELCOR 1.8.5 for the Ringhals 3 PWR

    International Nuclear Information System (INIS)

    Nilsson, Lars

    2004-12-01

    An input file for the severe accident code MELCOR 1.8.5 has been developed for the Swedish pressurized water reactor Ringhals 3. The aim was to produce a file that can be used for calculations of various postulated severe accident scenarios, although the first application is specifically to cases involving large hydrogen production. The input file is rather detailed, with individual modelling of all three cooling loops. The report describes the basis for the Ringhals 3 model and the input preparation step by step, and is illustrated by nodalization schemes of the different plant systems. The present version of the report is restricted to the fundamental MELCOR input preparation, and therefore most of the figures for Ringhals 3 measurements and operating parameters are excluded here. These are given in another, complete version of the report, for limited distribution, which includes tables of pertinent data for all components. That version contains appendices with a complete listing of the input files, as well as tables of data compiled from a RELAP5 file that was a major basis for the MELCOR input for the cooling loops. The input was tested in steady-state calculations in order to simulate the initial conditions at the current nominal operating conditions in Ringhals 3 at 2775 MW thermal power. The results of the steady-state calculations are presented in the report. Calculations with the MELCOR model will then be carried out for certain accident sequences, for comparison with results from earlier MAAP4 calculations. That work will be reported separately

  19. Input vector optimization of feed-forward neural networks for fitting ab initio potential-energy databases

    Science.gov (United States)

    Malshe, M.; Raff, L. M.; Hagan, M.; Bukkapatnam, S.; Komanduri, R.

    2010-05-01

    to permit error minimization with respect to n as well as the weights and biases of the NN, the optimum powers were all found to lie in the range of 1.625-2.38 for the four systems studied. No statistically significant increase in fitting accuracy was achieved for vinyl bromide when a different value of n was employed and optimized for each bond type. The rate of change in the fitting error with n is found to be very small when n is near its optimum value. Consequently, good fitting accuracy can be achieved by employing a value of n in the middle of the above range. The use of interparticle distances as elements of the input vector, rather than the Z-matrix variables employed in the electronic structure calculations, is found to reduce the rms fitting errors by factors of 8.86 and 1.67 for Si5 and vinyl bromide, respectively. If the interparticle distances are replaced with input elements of the form Rij^(-n) with n optimized, further reductions in the rms error by factors of 1.31 to 2.83 are obtained for the four systems investigated. A major advantage of using this procedure to increase NN fitting accuracy, rather than increasing the number of neurons or the size of the database, is that the required increase in computational effort is very small.
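
    The input transformation the abstract describes, replacing raw interparticle distances Rij with Rij^(-n) for an optimized exponent n, is a one-line feature map. A minimal sketch; the distance values are arbitrary, and the default exponent is simply chosen from the middle of the reported optimal range:

```python
import numpy as np

def nn_input_vector(distances, n=2.0):
    """Map interparticle distances R_ij to R_ij**(-n) features.
    The abstract reports optimal n in roughly 1.625-2.38, so a value
    near the middle of that range is used as the default here."""
    r = np.asarray(distances, dtype=float)
    return r ** (-n)

# Three interparticle distances (arbitrary units):
features = nn_input_vector([1.0, 2.0, 4.0], n=2.0)
print(features)  # [1.0, 0.25, 0.0625]
```

    The inverse-power map weights short distances most heavily, which matches where potential-energy surfaces vary fastest; that is one plausible reading of why it improves the fit over raw distances.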

  20. Dependence of Computational Models on Input Dimension: Tractability of Approximation and Optimization Tasks

    Czech Academy of Sciences Publication Activity Database

    Kainen, P.C.; Kůrková, Věra; Sanguineti, M.

    2012-01-01

    Roč. 58, č. 2 (2012), s. 1203-1214 ISSN 0018-9448 R&D Projects: GA MŠk(CZ) ME10023; GA ČR GA201/08/1744; GA ČR GAP202/11/1368 Grant - others:CNR-AV ČR(CZ-IT) Project 2010–2012 Complexity of Neural -Network and Kernel Computational Models Institutional research plan: CEZ:AV0Z10300504 Keywords : dictionary-based computational models * high-dimensional approximation and optimization * model complexity * polynomial upper bounds Subject RIV: IN - Informatics, Computer Science Impact factor: 2.621, year: 2012

  1. Development and validation of gui based input file generation code for relap

    International Nuclear Information System (INIS)

    Anwar, M.M.; Khan, A.A.; Chughati, I.R.; Chaudri, K.S.; Inyat, M.H.; Hayat, T.

    2009-01-01

    The Reactor Excursion and Leak Analysis Program (RELAP) is a widely accepted computer code for thermal-hydraulic modeling of nuclear power plants. It calculates thermal-hydraulic transients in water-cooled nuclear reactors by solving approximations to the one-dimensional, two-phase equations of hydraulics in an arbitrarily connected system of nodes. However, the preparation of the input file and the subsequent analysis of results in this code is a tedious task. A Graphical User Interface (GUI) for preparation of the RELAP-5 input file has been developed, and the GUI-generated input file has been validated. The GUI is developed in Microsoft Visual Studio using Visual C# as the programming language. The nodalization diagram is drawn graphically, and the program contains various component forms, along with the starting data form, which are launched for property assignment to generate the input file cards. The GUI provides an Open/Save function to store and recall the nodalization diagram along with the components' properties. The GUI-generated input file was validated for several case studies, and individual component cards were compared with the originally required format. The generated RELAP input file was found to be consistent with the requirements of RELAP. The GUI provides a useful platform for simulating complex hydrodynamic problems efficiently with RELAP. (author)
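
    The core of such a GUI's back end, turning a component's property form into numbered input cards, can be sketched as follows. The card layout, component name, and field values here are purely illustrative and do not reproduce the actual RELAP5 card format:

```python
def make_component_card(card_number, component_name, component_type, fields):
    """Render one illustrative input card: a header line with the card
    number, name, and type, followed by consecutively numbered data
    entries (hypothetical layout, not the real RELAP5 format)."""
    lines = [f"{card_number}  {component_name}  {component_type}"]
    for offset, value in enumerate(fields, start=1):
        lines.append(f"{card_number + offset}  {value}")
    return "\n".join(lines)

# Hypothetical pipe component with three properties (nodes, area, length):
card = make_component_card(1100000, "downcomer", "pipe", fields=[8, 0.25, 1.2])
print(card)
```

    A GUI built this way keeps the component properties in ordinary data structures (the dialogs edit them, Open/Save serializes them) and regenerates the fixed-format text only when the deck is exported.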

  2. Development of the MARS input model for Kori nuclear units 1 transient analyzer

    International Nuclear Information System (INIS)

    Hwang, M.; Kim, K. D.; Lee, S. W.; Lee, Y. J.; Lee, W. J.; Chung, B. D.; Jeong, J. J.

    2004-11-01

    KAERI has been developing the 'NSSS transient analyzer' based on best-estimate codes for the Kori Nuclear Unit 1 plant. The MARS and RETRAN codes have been used as the best-estimate codes for the NSSS transient analyzer. Among these codes, the MARS code is adopted for realistic analysis of small- and large-break loss-of-coolant accidents with break sizes greater than 2 inches in diameter. It was therefore necessary to develop a MARS input model for the Kori Nuclear Unit 1 plant. This report includes the input model (hydrodynamic component and heat structure models) requirements and the calculation note for the MARS input data generation for the Kori Nuclear Unit 1 plant analyzer (see the Appendix). In order to confirm the validity of the input data, we performed calculations for a steady state at the 100% power operation condition and for a double-ended cold leg break LOCA. The results of the steady-state calculation agree well with the design data. The results of the LOCA calculation appear reasonable and consistent with those of other best-estimate calculations. Therefore, the MARS input data can be used as a base input deck for the MARS transient analyzer for Kori Nuclear Unit 1

  3. Adaptive near-optimal neuro controller for continuous-time nonaffine nonlinear systems with constrained input.

    Science.gov (United States)

    Esfandiari, Kasra; Abdollahi, Farzaneh; Talebi, Heidar Ali

    2017-09-01

    In this paper, an identifier-critic structure is introduced to find an online near-optimal controller for continuous-time nonaffine nonlinear systems with a saturated control signal. By employing two Neural Networks (NNs), the solution of the Hamilton-Jacobi-Bellman (HJB) equation associated with the cost function is derived without requiring a priori knowledge of the system dynamics. Weights of the identifier and critic NNs are tuned online and simultaneously, such that unknown terms are approximated accurately and the control signal is kept between the saturation bounds. The convergence of the NNs' weights, the identification error, and the system states is guaranteed using Lyapunov's direct method. Finally, simulations are performed on two nonlinear systems to confirm the effectiveness of the proposed control strategy. Copyright © 2017 Elsevier Ltd. All rights reserved.
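
    A standard device for keeping a learned control signal between saturation bounds, as this abstract describes, is to pass the raw controller output through a smooth squashing function. The sketch below uses the common tanh construction as an illustration; it is not claimed to be the exact mechanism of the cited paper:

```python
import math

def saturated_control(v, u_max):
    """Squash a raw controller output v into [-u_max, u_max] with tanh,
    a common construction in constrained-input optimal control."""
    return u_max * math.tanh(v / u_max)

u_max = 2.0
for v in (-100.0, 0.0, 0.5, 100.0):
    u = saturated_control(v, u_max)
    assert -u_max <= u <= u_max  # bound respected for any raw input
```

    Because tanh is smooth and monotone, the bound holds for arbitrarily large NN outputs while the map stays differentiable, which is what makes it compatible with gradient-based weight tuning and Lyapunov analysis.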

  4. Development of a Math Input Interface with Flick Operation for Mobile Devices

    Science.gov (United States)

    Nakamura, Yasuyuki; Nakahara, Takahiro

    2016-01-01

    Developing online test environments for e-learning for mobile devices will be useful to increase drill practice opportunities. In order to provide a drill practice environment for calculus using an online math test system, such as STACK, we develop a flickable math input interface that can be easily used on mobile devices. The number of taps…

  5. The Use of Input-Output Control System Analysis for Sustainable Development of Multivariable Environmental Systems

    Science.gov (United States)

    Koliopoulos, T. C.; Koliopoulou, G.

    2007-10-01

    We present an input-output solution for simulating the associated behavior and optimized physical needs of an environmental system. The simulations and numerical analysis determined the accurate boundary loads and areas that were required to interact for the proper physical operation of a complicated environmental system. A case study was conducted to simulate the optimum balance of an environmental system based on an artificially intelligent, multi-interacting input-output numerical scheme. The numerical results focused on probable further environmental management techniques, with the objective of minimizing risks and the associated environmental impact to protect public health and the quality of the environment. Our conclusions allowed us to minimize the associated risks, focusing on probable emergency cases, to protect the surrounding anthropogenic or natural environment. Therefore, the lining magnitude could be determined for any useful associated technical works to support the environmental system under examination, taking into account its particular boundary necessities and constraints.

  6. Development of an Input Model to MELCOR 1.8.5 for the Oskarshamn 3 BWR

    Energy Technology Data Exchange (ETDEWEB)

    Nilsson, Lars [Lentek, Nykoeping (Sweden)

    2006-05-15

    An input model has been prepared for the code MELCOR 1.8.5 for the Swedish Oskarshamn 3 boiling water reactor (O3). This report describes the modelling work and the various files which comprise the input deck. Input data are mainly based on original drawings and system descriptions made available by courtesy of OKG AB. Comparisons and checks of some primary system data were made against an O3 input file for the SCDAP/RELAP5 code that was used in the SARA project. Useful information was also obtained from the FSAR (Final Safety Analysis Report) for O3 and the SKI report '2003 Stoerningshandboken BWR'. The input models the O3 reactor in its current state, with an operating power of 3300 MW{sub th}. One aim of this work is that the MELCOR input could also be used for power upgrading studies. All fuel assemblies are thus assumed to consist of the new Westinghouse-Atom SVEA-96 Optima2 fuel. MELCOR is a severe accident code developed by Sandia National Laboratory under contract from the U.S. Nuclear Regulatory Commission (NRC). MELCOR is a successor to the STCP (Source Term Code Package) and thus has a long evolutionary history. The input described here is adapted to version 1.8.5, the latest available when the work began. It was released in the year 2000, but a new version, 1.8.6, was distributed recently. Conversion to the new version is recommended. (During the writing of this report yet another code version, MELCOR 2.0, was announced for release shortly.) In version 1.8.5 there is an option to describe the accident progression in the lower plenum and the melt-through of the reactor vessel bottom in more detail by use of the Bottom Head (BH) package, developed by Oak Ridge National Laboratory especially for BWRs. This is in addition to the ordinary MELCOR COR package. Since problems arose when running with the BH input, two versions of the O3 input deck were produced: a NONBH and a BH deck. The BH package is no longer a separate package in the new 1

  7. On optimal development and becoming an optimiser

    NARCIS (Netherlands)

    de Ruyter, D.J.

    2012-01-01

    The article aims to provide a justification for the claim that optimal development and becoming an optimiser are educational ideals that parents should pursue in raising their children. Optimal development is conceptualised as enabling children to grow into flourishing persons, that is, persons who

  8. User input verification and test driven development in the NJOY21 nuclear data processing code

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Conlin, Jeremy Lloyd [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); McCartney, Austin Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-21

    Before physically meaningful data can be used in nuclear simulation codes, the data must be interpreted and manipulated by a nuclear data processing code so as to extract the relevant quantities (e.g. cross sections and angular distributions). Perhaps the most popular and widely trusted of these processing codes is NJOY, which has been developed and improved over the course of 10 major releases since its creation at Los Alamos National Laboratory in the mid-1970s. The current phase of NJOY development is the creation of NJOY21, which will be a vast improvement over its predecessor, NJOY2016. Designed to be fast, intuitive, accessible, and capable of handling both established and modern formats of nuclear data, NJOY21 will address many issues that NJOY users face, while remaining functional for those who prefer the existing format. Although early in its development, NJOY21 already provides validation to check user input. By providing rapid and helpful responses to users while they write input files, NJOY21 will prove more intuitive and easier to use than any of its predecessors. Furthermore, during its development, NJOY21 is subject to regular testing, such that its test coverage must strictly increase with the addition of any production code. This thorough testing will allow developers and NJOY users to establish confidence in NJOY21 as it gains functionality. This document serves as a discussion of the current state of input checking and testing practices in NJOY21.
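
    The test-driven input-checking practice the report describes can be illustrated with a tiny validator written alongside its own tests. The card structure and the allowed temperature range below are hypothetical stand-ins, not actual NJOY21 input rules:

```python
def validate_card(card):
    """Validate one hypothetical input card of the form
    {'name': str, 'temperature': float}; return a list of error
    messages (an empty list means the card is valid)."""
    errors = []
    if not card.get("name"):
        errors.append("card is missing a name")
    temp = card.get("temperature")
    if temp is None:
        errors.append("temperature is required")
    elif not 0.0 < temp < 1.0e4:
        errors.append(f"temperature {temp} K outside (0, 1e4)")
    return errors

# Tests written together with the validator, in the TDD spirit described:
assert validate_card({"name": "broadr", "temperature": 293.6}) == []
assert validate_card({"name": "broadr"}) == ["temperature is required"]
assert any("outside" in e for e in
           validate_card({"name": "broadr", "temperature": -1.0}))
```

    Returning a list of messages rather than raising on the first problem is what enables the "rapid and helpful responses" behavior: the user sees every issue in the card at once.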

  9. F-18 High Alpha Research Vehicle (HARV) parameter identification flight test maneuvers for optimal input design validation and lateral control effectiveness

    Science.gov (United States)

    Morelli, Eugene A.

    1995-01-01

    Flight test maneuvers are specified for the F-18 High Alpha Research Vehicle (HARV). The maneuvers were designed for open-loop parameter identification purposes, specifically for optimal input design validation at 5 degrees angle of attack, identification of individual strake effectiveness at 40 and 50 degrees angle of attack, and study of lateral dynamics and lateral control effectiveness at 40 and 50 degrees angle of attack. Each maneuver is to be realized by applying square wave inputs to specific control effectors using the On-Board Excitation System (OBES). Maneuver descriptions and complete specifications of the time/amplitude points that define each input are included, along with plots of the input time histories.
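
    Realizing a maneuver from time/amplitude points, as described, amounts to a zero-order hold: the input takes the amplitude of the most recent switching time. A minimal sketch; the doublet times and amplitudes below are invented for illustration, not taken from the report:

```python
def square_wave(time_points, amplitudes, t):
    """Zero-order-hold square wave: return the amplitude attached to the
    most recent switching time at or before t (0 before the first switch)."""
    value = 0.0
    for t_k, a_k in zip(time_points, amplitudes):
        if t >= t_k:
            value = a_k
        else:
            break
    return value

# Hypothetical doublet on one control effector: +5 deg, then -5 deg, then off.
times = [0.0, 1.0, 2.0]
amps = [5.0, -5.0, 0.0]
assert square_wave(times, amps, 0.5) == 5.0
assert square_wave(times, amps, 1.5) == -5.0
assert square_wave(times, amps, 2.5) == 0.0
```

    Specifying maneuvers this way keeps the excitation open loop and exactly reproducible, which is what parameter identification from flight data requires.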

  10. Study and development of an input coupler for the future TESLA collider

    International Nuclear Information System (INIS)

    Dupery, C.

    1996-01-01

    The TESLA (TeV Superconducting Linear Accelerator) collider uses a high-frequency cavity resonator input coupler. Technical constraints (thermal, mechanical, electrical, vacuum, and multipactor discharge phenomena) shape the development of this coupler. In order to solve these problems, studies have been performed at the French Atomic Energy Commission (CEA) and are presented in this paper

  11. Predictors of Morphosyntactic Growth in Typically Developing Toddlers: Contributions of Parent Input and Child Sex

    Science.gov (United States)

    Hadley, Pamela A.; Rispoli, Matthew; Fitzgerald, Colleen; Bahnsen, Alison

    2011-01-01

    Purpose: Theories of morphosyntactic development must account for between-child differences in morphosyntactic growth rates. This study extends Legate and Yang's (2007) theoretically motivated cross-linguistic approach to determine if variation in properties of parent input accounts for differences in the growth of tense productivity. Method:…

  12. Solving Optimization Problems via Vortex Optimization Algorithm and Cognitive Development Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Ahmet Demir

    2017-01-01

    Full Text Available In fields which require finding the most appropriate value, optimization has become a vital approach for obtaining effective solutions. With the use of optimization techniques, many different fields of modern life have found solutions to their real-world problems. In this context, classical optimization techniques have enjoyed considerable popularity. After a while, however, more advanced optimization problems required the use of more effective techniques. At this point, computer science took an important role in providing software-related techniques to improve the associated literature. Today, intelligent optimization techniques based on Artificial Intelligence are widely used for optimization problems. The objective of this paper is to provide a comparative study on the employment of classical optimization solutions and Artificial Intelligence solutions, enabling readers to form an idea of the potential of intelligent optimization techniques. To this end, two recently developed intelligent optimization algorithms, the Vortex Optimization Algorithm (VOA) and the Cognitive Development Optimization Algorithm (CoDOA), have been used to solve some multidisciplinary optimization problems provided in the source book Thomas' Calculus, 11th Edition, and the obtained results have been compared with classical optimization solutions.
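
    The kind of comparison the paper performs can be illustrated on a single-variable calculus problem: minimizing f(x) = (x - 3)^2, where the classical answer comes from setting f'(x) = 2(x - 3) = 0. The simple random search below is only a generic stand-in for a stochastic metaheuristic; the update rules of VOA and CoDOA themselves are not reproduced here:

```python
import random

def f(x):
    return (x - 3.0) ** 2  # classical optimum: f'(x) = 2(x - 3) = 0, so x = 3

def random_search(f, lo, hi, iters=5000, seed=0):
    """Very simple stochastic optimizer: keep the best of many
    uniform samples from the search interval."""
    rng = random.Random(seed)
    best_x = lo
    for _ in range(iters):
        x = rng.uniform(lo, hi)
        if f(x) < f(best_x):
            best_x = x
    return best_x

x_star = random_search(f, -10.0, 10.0)
assert abs(x_star - 3.0) < 0.1  # agrees with the analytic solution
```

    On smooth problems with a closed-form derivative the classical route is exact and cheaper; the value of stochastic methods shows up when the objective is nondifferentiable, multimodal, or only available as a black box.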

  13. An alternative methodology for optimizing the data input for cadastral systems

    Directory of Open Access Journals (Sweden)

    Guilherme H. B. de Souza

    2004-07-01

    Full Text Available Nowadays, most city governments that have started the process of implementing a GIS use two types of database: one for the cadastral system and another for the GIS. A lot of time is therefore spent updating both databases. The cadastral system database is usually prioritized, since tax collection is necessary to fund city expenditure. Manipulating these data for different purposes within a single database is extremely effective, because the data can be updated more easily and duplication is avoided. This paper presents a set of applications developed with the main objective of integrating the cadastral database, implemented for calculating property tax and other taxes, with a relational database that may be used for analysis in strategic planning. The importance of this work lies in the possibility of making systematized information available more efficiently and at lower cost. In addition, the methodology developed will provide better conditions for periodic cadastral updating, achieved through the use of an optical reading device for the cadastral information reports. Such a procedure will naturally reduce costs, making the operation feasible. Updating the cadastral information at short intervals increases the reliability of the system, providing more confidence in the information made available, and broadens the opportunity to offer services through the internet.

  14. Development of the MARS input model for Ulchin 1/2 transient analyzer

    International Nuclear Information System (INIS)

    Jeong, J. J.; Kim, K. D.; Lee, S. W.; Lee, Y. J.; Chung, B. D.; Hwang, M.

    2003-03-01

    KAERI has been developing the NSSS transient analyzer based on best-estimate codes for the Ulchin 1/2 plants. The MARS and RETRAN codes are used as the best-estimate codes for the NSSS transient analyzer. Of the two codes, the MARS code is to be used for realistic analysis of small- and large-break loss-of-coolant accidents with break sizes greater than 2 inches in diameter. This report includes the input model requirements and the calculation note for the Ulchin 1/2 MARS input data generation (see the Appendix). In order to confirm the validity of the input data, we performed calculations for a steady state at the 100% power operation condition and for a double-ended cold leg break LOCA. The results of the steady-state calculation agree well with the design data. The results of the LOCA calculation appear reasonable and consistent with those of other best-estimate calculations. Therefore, the MARS input data can be used as a base input deck for the MARS transient analyzer for Ulchin 1/2

  15. Development of the MARS input model for Ulchin 3/4 transient analyzer

    International Nuclear Information System (INIS)

    Jeong, J. J.; Kim, K. D.; Lee, S. W.; Lee, Y. J.; Lee, W. J.; Chung, B. D.; Hwang, M. G.

    2003-12-01

    KAERI has been developing the NSSS transient analyzer based on best-estimate codes. The MARS and RETRAN codes are adopted as the best-estimate codes for the NSSS transient analyzer. Of these two codes, the MARS code is to be used for realistic analysis of small- and large-break loss-of-coolant accidents with break sizes greater than 2 inches in diameter. This report includes the MARS input model requirements and the calculation note for the MARS input data generation (see the Appendix) for the Ulchin 3/4 plant analyzer. In order to confirm the validity of the input data, we performed calculations for a steady state at the 100% power operation condition and for a double-ended cold leg break LOCA. The results of the steady-state calculation agree well with the design data. The results of the LOCA calculation appear reasonable and consistent with those of other best-estimate calculations. Therefore, the MARS input data can be used as a base input deck for the MARS transient analyzer for Ulchin 3/4

  16. Study and development of a generalised input-output system for data base management systems

    International Nuclear Information System (INIS)

    Zidi, Noureddine

    1975-01-01

    This thesis reports a study aimed at designing and developing software for the management and execution of all input-output actions of database management systems. This software is also an interface between database management systems and the various operating systems. After a recall of the general characteristics of database management systems, the author presents the previously developed GRISBI system (rational management of information stored in an integrated database) and describes the difficulties faced in adapting this system to the new access method (VSAM, Virtual Storage Access Method). This led to the search for a more general solution, the development of which is presented in the second part of this thesis: environment of the generalised input-output system, architecture, and internal specifications. The last part presents flowcharts and statements of the various routines [fr

  17. DEVELOPING A SMARTPHONE APPLICATION TO IMPROVE CARE AND OUTCOMES IN ADOLESCENT ARTHRITIS THROUGH PATIENT INPUT

    Directory of Open Access Journals (Sweden)

    Alice Ran Cai

    2015-09-01

    identify five major themes that informed the development of the app. The first was Monitoring Information, which included JIA-related symptoms, mood and stress, exercise, missed medications, medication side-effects, and completing the health assessment questionnaire. The second was Setting Reminders for medications and appointments. The third theme was Education and Support, such as including practical advice and links to social support groups. The fourth theme concerned Motivating Factors for Using the App, such as providing feedback on personal input and having a rewards system. The last theme related to the Design of the App, such as how to make it visually appealing and easy to navigate. Qualitative feedback from CYP and HCPs during phase 2 indicated that the app is acceptable, comprehensive, interesting, and useful. Conclusions: The current study employed a qualitative, user-centered approach to develop an acceptable and developmentally appropriate smartphone application that can benefit YP with JIA. The current qualitative data showed that complementing traditional therapies with new mobile technology may be an affordable and effective method of improving patients' understanding of their condition and helping YP become more independent in their own healthcare. Collecting more frequent and accurate data using the app may also improve treatments and interactions between patients and HCPs, which optimizes health and wellbeing.

  18. Alternative input medium development for wheelchair user with severe spinal cord injury

    Science.gov (United States)

    Ihsan, Izzat Aqmar; Tomari, Razali; Zakaria, Wan Nurshazwani Wan; Othman, Nurmiza

    2017-09-01

    Quadriplegia or tetraplegia patients have restricted movement of all four limbs and the torso, caused by severe spinal cord injury. Undoubtedly, these patients face difficulties when operating a powered electric wheelchair, since they are unable to control the wheelchair by means of a standard joystick. Owing to the total loss of both sensory and motor function of the four limbs and torso, an alternative input medium for the wheelchair is developed to assist the user in operating the wheelchair. In this framework, the direction of the wheelchair movement is determined by the user's conscious intent through a brain control interface (BCI) based on electroencephalogram (EEG) signals. A laser range finder (LRF) is used to perceive environment information for determining a safe distance around the wheelchair. A local path planning algorithm is developed to provide a navigation planner that, together with the user's input, prevents collisions during control operation.
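
    The safety layer described, vetoing a BCI-issued motion command when the laser range finder reports an obstacle inside a clearance threshold, can be sketched as follows. The command names, the clearance value, and the gating of only the forward direction are illustrative assumptions, not details from the paper:

```python
def safe_command(bci_command, lrf_ranges, min_clearance=0.6):
    """Pass a BCI direction command through only if every LRF range
    reading (metres) exceeds the clearance; otherwise command a stop.
    Only forward motion is gated in this simplified sketch."""
    if bci_command == "forward" and min(lrf_ranges) < min_clearance:
        return "stop"
    return bci_command

assert safe_command("forward", [2.1, 1.8, 0.4]) == "stop"     # obstacle ahead
assert safe_command("forward", [2.1, 1.8, 1.5]) == "forward"  # path clear
assert safe_command("left", [0.4, 2.0, 2.0]) == "left"        # turn not gated here
```

    Keeping the veto outside the BCI decoder means a misclassified EEG intent can never drive the chair into an obstacle: the range data, not the brain signal, has the final say on forward motion.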

  19. China’s language input system in the digital age affects children’s reading development

    OpenAIRE

    Tan, Li Hai; Xu, Min; Chang, Chun Qi; Siok, Wai Ting

    2012-01-01

    Written Chinese as a logographic system was developed over 3,000 y ago. Historically, Chinese children have learned to read by learning to associate the visuo-graphic properties of Chinese characters with lexical meaning, typically through handwriting. In recent years, however, many Chinese children have learned to use electronic communication devices based on the pinyin input method, which associates phonemes and English letters with characters. When children use pinyin to key in letters, th...

  20. Development of Input/Output System for the Reactor Transient Analysis System (RETAS)

    International Nuclear Information System (INIS)

    Suh, Jae Seung; Kang, Doo Hyuk; Cho, Yeon Sik; Ahn, Seung Hoon; Cho, Yong Jin

    2009-01-01

    A Korea Institute of Nuclear Safety Reactor Transient Analysis System (KINS-RETAS) aims at providing a realistic prediction of core and RCS response to potential or actual event scenarios in Korean nuclear power plants (NPPs). The thermal hydraulic system code MARS is the pivot code of the RETAS, and is used to predict thermal hydraulic (TH) behaviors in the core and associated systems. MARS alone can be applied to many types of transients, but is sometimes coupled with other codes developed for different objectives. Many tools have been developed to aid users in preparing input and displaying the transient information and output data. Graphical user interfaces (GUIs) that help prepare input decks include SNAP (Gitnick, 1998) and VISA (K.D. Kim, 2007); display aids include eFAST (KINS, 2007). The tools listed above are graphical interfaces. The input deck builders allow the user to create a functional diagram of the plant, pictorially on the screen. The functional diagram, when annotated with control volume and junction numbers, is a nodalization diagram. Data required for an input deck are entered for volumes and junctions through a mouse-driven menu and pop-up dialogs; after the information is complete, an input deck is generated. Display GUIs show data from MARS calculations, either during or after the transient. The RETAS requires the user to first generate a set of 'input', two-dimensional pictures of the plant on which some of the data are displayed either numerically or with a color map. The RETAS can generate XY-plots of the data. Time histories of plant conditions can be seen via the plots or through the RETAS's replay mode. The user input was combined with design input from MARS developers and experts from both the GUI and ergonomics fields. A partial list of capabilities follows. - 3D display for neutronics. - Easier method (less user time and effort) to generate 'input' for the 3D displays. - Detailed view of data at volume or

  1. Development of Input/Output System for the Reactor Transient Analysis System (RETAS)

    Energy Technology Data Exchange (ETDEWEB)

    Suh, Jae Seung; Kang, Doo Hyuk; Cho, Yeon Sik [ENESYS, Daejeon (Korea, Republic of); Ahn, Seung Hoon; Cho, Yong Jin [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2009-05-15

    A Korea Institute of Nuclear Safety Reactor Transient Analysis System (KINS-RETAS) aims at providing a realistic prediction of core and RCS response to potential or actual event scenarios in Korean nuclear power plants (NPPs). The thermal hydraulic system code MARS is the pivot code of the RETAS, and is used to predict thermal hydraulic (TH) behaviors in the core and associated systems. MARS alone can be applied to many types of transients, but is sometimes coupled with other codes developed for different objectives. Many tools have been developed to aid users in preparing input and displaying the transient information and output data. Graphical user interfaces (GUIs) that help prepare input decks include SNAP (Gitnick, 1998) and VISA (K.D. Kim, 2007); display aids include eFAST (KINS, 2007). The tools listed above are graphical interfaces. The input deck builders allow the user to create a functional diagram of the plant, pictorially on the screen. The functional diagram, when annotated with control volume and junction numbers, is a nodalization diagram. Data required for an input deck are entered for volumes and junctions through a mouse-driven menu and pop-up dialogs; after the information is complete, an input deck is generated. Display GUIs show data from MARS calculations, either during or after the transient. The RETAS requires the user to first generate a set of 'input', two-dimensional pictures of the plant on which some of the data are displayed either numerically or with a color map. The RETAS can generate XY-plots of the data. Time histories of plant conditions can be seen via the plots or through the RETAS's replay mode. The user input was combined with design input from MARS developers and experts from both the GUI and ergonomics fields. A partial list of capabilities follows. - 3D display for neutronics. - Easier method (less user time and effort) to generate 'input' for the 3D displays. - Detailed view

  2. Westinghouse corporate development of a decision software program for Radiological Evaluation Decision Input (REDI)

    International Nuclear Information System (INIS)

    Bush, T.S.

    1995-01-01

    In December 1992, the Department of Energy (DOE) implemented the DOE Radiological Control Manual (RCM). Westinghouse Idaho Nuclear Company, Inc. (WINCO) submitted an implementation plan showing how compliance with the manual would be achieved. This implementation plan was approved by DOE in November 1992. Although WINCO had already been working under a similar Westinghouse RCM, the DOE RCM brought some new and challenging requirements. One such requirement was that of having procedure writers and job planners create the radiological input in work control procedures. Until this time, that information was being provided by radiological engineering or a radiation safety representative. As a result of this requirement, Westinghouse developed the Radiological Evaluation Decision Input (REDI) program

  3. Westinghouse corporate development of a decision software program for Radiological Evaluation Decision Input (REDI)

    Energy Technology Data Exchange (ETDEWEB)

    Bush, T.S. [Westinghouse Idaho Nuclear Co., Inc., Idaho Falls, ID (United States)

    1995-03-01

    In December 1992, the Department of Energy (DOE) implemented the DOE Radiological Control Manual (RCM). Westinghouse Idaho Nuclear Company, Inc. (WINCO) submitted an implementation plan showing how compliance with the manual would be achieved. This implementation plan was approved by DOE in November 1992. Although WINCO had already been working under a similar Westinghouse RCM, the DOE RCM brought some new and challenging requirements. One such requirement was that of having procedure writers and job planners create the radiological input in work control procedures. Until this time, that information was being provided by radiological engineering or a radiation safety representative. As a result of this requirement, Westinghouse developed the Radiological Evaluation Decision Input (REDI) program.

  4. Application and optimization of input parameter spaces in mass flow modelling: a case study with r.randomwalk and r.ranger

    Science.gov (United States)

    Krenn, Julia; Zangerl, Christian; Mergili, Martin

    2017-04-01

    r.randomwalk is a GIS-based, multi-functional, conceptual open source model application for forward and backward analyses of the propagation of mass flows. It relies on a set of empirically derived, uncertain input parameters. In contrast to many other tools, r.randomwalk accepts input parameter ranges (or, in the case of two or more parameters, spaces) in order to directly account for these uncertainties. Parameter spaces offer a way to move away from discrete input values, which in most cases are likely to be off target. r.randomwalk automatically performs multiple calculations with various parameter combinations in a given parameter space, resulting in the impact indicator index (III), which denotes the fraction of parameter value combinations predicting an impact on a given pixel. Still, there is a need to constrain the parameter space used for a certain process type or magnitude prior to performing forward calculations. This can be done by optimizing the parameter space in terms of bringing the model results in line with well-documented past events. As most existing parameter optimization algorithms are designed for discrete values rather than for ranges or spaces, the necessity for a new and innovative technique arises. The present study aims at developing such a technique and at applying it to derive guiding parameter spaces for the forward calculation of rock avalanches through back-calculation of multiple events. In order to automate the work flow we have designed r.ranger, an optimization and sensitivity analysis tool for parameter spaces which can be directly coupled to r.randomwalk. With r.ranger we apply a nested approach where the total value range of each parameter is divided into various levels of subranges. All possible combinations of subranges of all parameters are tested for the performance of the associated pattern of III. Performance indicators are the area under the ROC curve (AUROC) and the factor of conservativeness (FoC). This
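
    The subrange splitting and the AUROC performance indicator can be illustrated in a few lines; this is a hedged Python analogue of the idea, not the actual GRASS GIS implementation of r.ranger:

```python
import itertools

def auroc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney rank-sum identity:
    the fraction of (positive, negative) pairs ranked correctly."""
    pos = [s for l, s in zip(labels, scores) if l]
    neg = [s for l, s in zip(labels, scores) if not l]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def subranges(lo, hi, n):
    """Split the total value range [lo, hi] of one parameter into n equal
    subranges, returned as (lo_i, hi_i) pairs."""
    step = (hi - lo) / n
    return [(lo + i * step, lo + (i + 1) * step) for i in range(n)]

# All combinations of subranges of two parameters, as enumerated by the
# nested approach (toy ranges and level counts):
combos = list(itertools.product(subranges(0.0, 1.0, 4), subranges(0.0, 0.5, 2)))
```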

  5. Development of the interface system of master-slave manipulator and external input device on the graphic simulator

    International Nuclear Information System (INIS)

    Song, T. J.; Lee, J. Y.; Kim, S. H.; Yoon, J. S.

    2002-01-01

    The master-slave manipulator is generally used as a remote handling device in the hot cell, in which high-level radioactive materials such as spent fuels are handled. To analyze the motion of the remote handling device and to simulate the remote handling operation tasks in the hot cell, a 3D graphic simulator in which the master-slave manipulator has been installed was established. Also, an interface program for an external input device with 6 DOF (degrees of freedom) was developed and connected to the graphic simulator with LLTI (Low Level Tele-operation Interface), which provides a uniquely optimized, high-speed, bidirectional communication interface to one or more systems and processes

  6. Optimal Control Development System for Electrical Drives

    Directory of Open Access Journals (Sweden)

    Marian GAICEANU

    2008-08-01

    Full Text Available In this paper the optimal electrical drive development system is presented. It covers both electrical drive types: DC and AC. In order to implement the optimal control for the AC drive system, an Altivar 71 inverter, a Frato magnetic particle brake (as load), a three-phase induction machine, and a dSpace 1104 controller have been used. The on-line solution of the matrix Riccati differential equation (MRDE) is computed by the dSpace 1104 controller, based on the corresponding feedback signals, generating the optimal speed reference for the AC drive system. The optimal speed reference is tracked by the Altivar 71 inverter, leading to energy reduction in the AC drive. Both the classical control (consisting of rotor field oriented control with PI controllers) and the optimal one have been implemented by designing an adequate ControlDesk interface. The three-phase induction machine (IM) is controlled at constant flux. Therefore, the linear dynamic mathematical model of the IM has been obtained. The optimal control law provides transient regimes with minimal energy consumption. The obtained solution by integration of the MRDE is oriented towards numerical implementation, by using a zero order hold. The development system is very useful for researchers, doctoral students, or expert training in electrical drives. The experimental results are shown.
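
    As a sketch of the on-line MRDE integration, the scalar analogue of the Riccati equation can be stepped to steady state with Euler integration; the coefficients below are illustrative, and this is not the dSpace implementation:

```python
def solve_scalar_riccati(a, b, q, r, dt=1e-3, steps=20000):
    """Integrate the scalar analogue of the matrix Riccati differential
    equation, dP/dtau = 2*a*P - (b**2/r)*P**2 + q, to its steady state.
    The optimal LQR feedback gain is then k = b*P/r, i.e. u = -k*x."""
    P = 0.0
    for _ in range(steps):
        P += dt * (2 * a * P - (b * b / r) * P * P + q)
    return P
```

For a stable plant a = -1, b = q = r = 1, the steady state solves P^2 + 2P - 1 = 0, i.e. P = sqrt(2) - 1.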

  7. Development of GPT-based optimization algorithm

    International Nuclear Information System (INIS)

    White, J.R.; Chapman, D.M.; Biswas, D.

    1985-01-01

    The University of Lowell and Westinghouse Electric Corporation are involved in a joint effort to evaluate the potential benefits of generalized/depletion perturbation theory (GPT/DPT) methods for a variety of light water reactor (LWR) physics applications. One part of that work has focused on the development of a GPT-based optimization algorithm for the overall design, analysis, and optimization of LWR reload cores. The use of GPT sensitivity data in formulating the fuel management optimization problem is conceptually straightforward; it is the actual execution of the concept that is challenging. Thus, the purpose of this paper is to address some of the major difficulties, to outline our approach to these problems, and to present some illustrative examples of an efficient GPT-based optimization scheme

  8. Development of a compact and cost effective multi-input digital signal processing system

    Science.gov (United States)

    Darvish-Molla, Sahar; Chin, Kenrick; Prestwich, William V.; Byun, Soo Hyun

    2018-01-01

    A prototype digital signal processing (DSP) system was developed using a microcontroller interfaced with a 12-bit sampling ADC, which offers a considerably inexpensive solution for processing multiple detectors with high throughput. After digitization of the incoming pulses, in order to maximize the output counting rate, a simple algorithm was employed for pulse height analysis. Moreover, an algorithm aiming at real-time pulse pile-up deconvolution was implemented. The system was tested using a NaI(Tl) detector in comparison with a traditional analogue system and a commercial digital system for a variety of count rates. The performance of the prototype system was consistently superior to the analogue and the commercial digital systems up to an input count rate of 61 kcps, while it was slightly inferior to the commercial digital system but still superior to the analogue system at higher input rates. Considering overall cost, size and flexibility, this custom-made multi-input digital signal processing system (MMI-DSP) was the best reliable choice for 2D microdosimetric data collection, or for any measurement in which simultaneous multi-data collection is required.
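
    The pulse-height analysis step can be sketched as a threshold-crossing peak search over the digitized samples; this is an assumed minimal algorithm, not the authors' firmware:

```python
def pulse_heights(samples, threshold):
    """Simple pulse-height analysis on a stream of ADC samples: each
    contiguous excursion above `threshold` is one pulse, and its height
    is the local maximum of that excursion."""
    heights, current = [], None
    for s in samples:
        if s > threshold:
            current = s if current is None else max(current, s)
        elif current is not None:
            heights.append(current)   # pulse ended: record its peak
            current = None
    if current is not None:           # pulse still open at end of buffer
        heights.append(current)
    return heights
```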

  9. Patient input into the development and enhancement of ED discharge instructions: a focus group study.

    Science.gov (United States)

    Buckley, Barbara A; McCarthy, Danielle M; Forth, Victoria E; Tanabe, Paula; Schmidt, Michael J; Adams, James G; Engel, Kirsten G

    2013-11-01

    Previous research indicates that patients have difficulty understanding ED discharge instructions; these findings have important implications for adherence and outcomes. The objective of this study was to obtain direct patient input to inform specific revisions to discharge documents created through a literacy-guided approach and to identify common themes within patient feedback that can serve as a framework for the creation of discharge documents in the future. Based on extensive literature review and input from ED providers, subspecialists, and health literacy and communication experts, discharge instructions were created for 5 common ED diagnoses. Participants were recruited from a federally qualified health center to participate in a series of 5 focus group sessions. Demographic information was obtained and a Rapid Estimate of Adult Literacy in Medicine (REALM) assessment was performed. During each of the 1-hour focus group sessions, participants reviewed discharge instructions for 1 of 5 diagnoses. Participants were asked to provide input into the content, organization, and presentation of the documents. Using qualitative techniques, latent and manifest content analysis was performed to code for emergent themes across all 5 diagnoses. Fifty-seven percent of participants were female and the average age was 32 years. The average REALM score was 57.3. Through qualitative analysis, 8 emergent themes were identified from the focus groups. Patient input provides meaningful guidance in the development of diagnosis-specific discharge instructions. Several themes and patterns were identified, with broad significance for the design of ED discharge instructions. Copyright © 2013 Emergency Nurses Association. Published by Mosby, Inc. All rights reserved.

  10. Design and development of cell queuing, processing, and scheduling modules for the iPOINT input-buffered ATM testbed

    Science.gov (United States)

    Duan, Haoran

    1997-12-01

    heuristic strategy that leads to 'socially optimal' solutions, yielding a maximum number of contention-free cells being scheduled. A novel mixed digital-analog circuit has been designed to implement the MUCS core functionality. The MUCS circuit maps the cell scheduling computation to the capacitor charging and discharging procedures that are conducted fully in parallel. The design has a uniform circuit structure, low interconnect counts, and low chip I/O counts. Using 2 μm CMOS technology, the design operates on a 100 MHz clock and finds a near-optimal solution within a linear processing time. The circuit has been verified at the transistor level by HSPICE simulation. During this research, a five-port IQ-based optoelectronic iPOINT ATM switch has been developed and demonstrated. It has been fully functional with an aggregate throughput of 800 Mb/s. The second-generation IQ-based switch is currently under development. Equipped with iiQueue modules and MUCS module, the new switch system will deliver a multi-gigabit aggregate throughput, eliminate HOL blocking, provide per-VC QoS, and achieve near-100% link bandwidth utilization. Complete documentation of input modules and trunk module for the existing testbed, and complete documentation of 3DQ, iiQueue, and MUCS for the second-generation testbed are given in this dissertation.
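
    The contention-free scheduling idea can be illustrated by a greedy software analogue (the actual MUCS is a mixed digital-analog circuit): each input port and each output port is granted at most one cell per slot.

```python
def schedule_cells(requests):
    """Greedy heuristic for contention-free cell scheduling in an
    input-buffered switch. `requests` is a list of (input_port,
    output_port) pairs; a grant is issued only if neither port has
    already been matched in this slot."""
    used_in, used_out, grants = set(), set(), []
    for i, o in requests:
        if i not in used_in and o not in used_out:
            grants.append((i, o))
            used_in.add(i)
            used_out.add(o)
    return grants
```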

  11. Development of algorithm for depreciation costs allocation in dynamic input-output industrial enterprise model

    Directory of Open Access Journals (Sweden)

    Keller Alevtina

    2017-01-01

    Full Text Available The article considers the issue of allocation of depreciation costs in the dynamic input-output model of an industrial enterprise. Accounting for depreciation costs in such a model improves the policy of fixed assets management. It is particularly relevant to develop the algorithm for the allocation of depreciation costs in the construction of a dynamic input-output model of an industrial enterprise, since such enterprises have a significant amount of fixed assets. Provided the adequacy conditions of such an algorithm are met, it allows: evaluating the appropriateness of investments in fixed assets, and studying the final financial results of an industrial enterprise depending on management decisions in the depreciation policy. It is necessary to note that the model in question is always degenerate for the enterprise. This is caused by the presence of zero rows in the matrix of capital expenditures in the lines of structural elements unable to generate fixed assets (some of the service units, households, corporate consumers). The paper presents the algorithm for the allocation of depreciation costs for the model. This algorithm was developed by the authors and served as the basis for further development of the flowchart for subsequent implementation in software. The construction of such an algorithm and its use for dynamic input-output models of industrial enterprises is motivated by the internationally accepted effectiveness of input-output models for national and regional economic systems. This is what allows us to consider that the solutions discussed in the article are of interest to economists of various industrial enterprises.
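
    A minimal sketch of such an allocation, assuming a capital-stock matrix with asset-producing rows and asset-using columns (an illustrative structure, not the authors' algorithm); rows of zeros, as for units that generate no fixed assets, are handled naturally:

```python
def allocate_depreciation(capital, rates):
    """Allocate depreciation in a toy dynamic input-output setting.
    capital[i][j] is the stock of fixed assets produced by sector i and
    used by sector j; rates[i] is the depreciation rate of sector i's
    asset type. Returns the depreciation charge borne by each user
    sector j."""
    n = len(capital[0])
    charges = [0.0] * n
    for i, row in enumerate(capital):
        for j, stock in enumerate(row):
            charges[j] += rates[i] * stock
    return charges
```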

  12. Development of a simulation program to study error propagation in the reprocessing input accountancy measurements

    International Nuclear Information System (INIS)

    Sanfilippo, L.

    1987-01-01

    A physical model and a computer program have been developed to simulate all the measurement operations involved in the Isotopic Dilution Analysis technique currently applied in the Volume-Concentration method for Reprocessing Input Accountancy, together with their errors or uncertainties. The simulator can readily solve a number of problems related to the measurement activities of the plant operator and the inspector. The program, written in Fortran 77, is based on a particular Monte Carlo technique named 'Random Sampling'; a full description of the code is reported
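
    The 'Random Sampling' idea can be sketched by propagating volume and concentration measurement errors through the mass computation; the uncertainty magnitudes below are illustrative, not plant values:

```python
import random

def propagate(volume, vol_rel_err, conc, conc_rel_err, n=100_000, seed=1):
    """Monte Carlo ('Random Sampling') propagation of measurement
    uncertainties for a mass = volume x concentration accountancy
    result. Returns the sample mean and standard deviation of the mass."""
    rng = random.Random(seed)
    masses = [rng.gauss(volume, volume * vol_rel_err) *
              rng.gauss(conc, conc * conc_rel_err) for _ in range(n)]
    mean = sum(masses) / n
    var = sum((m - mean) ** 2 for m in masses) / (n - 1)
    return mean, var ** 0.5
```

With 1% relative error on each factor, the combined relative error is close to sqrt(2) x 1%, about 1.41%.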

  13. Development and Optimization of controlled drug release ...

    African Journals Online (AJOL)

    The aim of this study is to develop and optimize an osmotically controlled drug delivery system of diclofenac sodium. Osmotically controlled oral drug delivery systems utilize osmotic pressure for controlled delivery of active drugs. Drug delivery from these systems, to a large extent, is independent of the physiological factors ...

  14. The Czech longitudinal study of optimal development

    Czech Academy of Sciences Publication Activity Database

    Kebza, V.; Šolcová, Iva; Kodl, M.; Kernová, V.

    2012-01-01

    Roč. 47, Suppl. 1 (2012), s. 266-266 ISSN 0020-7594. [International Congress of Psychology /30./. 22.07.2012-27.07.2012, Cape Town] R&D Projects: GA ČR GAP407/10/2410 Institutional support: RVO:68081740 Keywords : optimal development * Prague longitudinal study Subject RIV: AN - Psychology

  15. Optimization and Development of Swellable Controlled Porosity ...

    African Journals Online (AJOL)

    Purpose: To develop swellable controlled porosity osmotic pump tablet of theophylline and to define the formulation and process variables responsible for drug release by applying statistical optimization technique. Methods: Formulations were prepared based on Taguchi Orthogonal Array design and Fraction Factorial ...

  16. Development of Input Function Measurement System for Small Animal PET Study

    International Nuclear Information System (INIS)

    Kim, Jong Guk; Kim, Byung Su; Kim, Jin Su

    2010-01-01

    For quantitative measurement of radioactivity concentration in tissue and a validated tracer kinetic model, a highly sensitive detection system is required for blood sampling. Accurate measurement of time activity curves (TACs) of labeled compounds in blood (plasma) enables quantitative information to be provided on biological parameters of interest in local tissue. In particular, the development of new tracers for PET imaging requires knowledge of the kinetics of the tracer in the body and in arterial blood and plasma. The conventional approach to obtaining an input function is to sample arterial blood sequentially by hand as a function of time. Several continuous blood sampling systems have been developed and used in the nuclear medicine research field to overcome the limited temporal resolution of sampling by the conventional method. In this work, we developed a highly sensitive GSO detector with a unique geometric design for small animal blood activity measurement

  17. Development of an Input Suite for an Orthotropic Composite Material Model

    Science.gov (United States)

    Hoffarth, Canio; Shyamsunder, Loukham; Khaled, Bilal; Rajan, Subramaniam; Goldberg, Robert K.; Carney, Kelly S.; Dubois, Paul; Blankenhorn, Gunther

    2017-01-01

    An orthotropic three-dimensional material model suitable for use in modeling impact tests has been developed that has three major components: elastic and inelastic deformations, damage, and failure. The material model has been implemented as MAT213 into a special version of LS-DYNA and uses tabulated data obtained from experiments. The prominent features of the constitutive model are illustrated using a widely used aerospace composite, the T800S3900-2B[P2352W-19] BMS8-276 Rev-H unitape fiber-resin unidirectional composite. The input for the deformation model consists of experimental data from 12 distinct experiments at a known temperature and strain rate: tension and compression along all three principal directions, shear in all three principal planes, and off-axis tension or compression tests in all three principal planes, along with other material constants. There are additional inputs associated with the damage and failure models. The steps in using this model are illustrated: composite characterization tests, verification tests, and a validation test. The results show that the developed and implemented model is stable and yields acceptably accurate results.
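
    Tabulated experimental input of this kind is typically evaluated by piecewise-linear interpolation; a minimal sketch with invented toy data, not MAT213's actual lookup code:

```python
def lookup(table, strain):
    """Piecewise-linear interpolation in a tabulated (strain, stress)
    curve, in the spirit of tabulated material-model input. Values
    outside the table are clamped to the end points."""
    pts = sorted(table)
    if strain <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if strain <= x1:
            return y0 + (y1 - y0) * (strain - x0) / (x1 - x0)
    return pts[-1][1]
```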

  18. Development of a 6 DOF force-reflecting master input device

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Ji Sup; Yoon, Ho Sik

    1999-05-01

    The teleoperator is a very effective tool for various tasks of nuclear application in that it can reduce the operators' exposure to radiation. For the utmost performance of the teleoperator, the force reflection capability is essential. This capability represents a function of transmitting the contact force of the teleoperator with the object to the human operator. With this function the human operator in the remote area can effectively guide the motion of the teleoperator so that it can follow a safety-guaranteed path. In this research a fully force-reflectible input device (6 axis) is developed. To develop the force reflecting device, the state of the art is surveyed. Based on this survey, the 6 DOF manipulator which controls a power manipulator is fabricated and its performance is investigated. Also, various force reflection algorithms are analyzed and an enhanced algorithm is proposed. (author). 18 refs., 4 tabs., 26 figs.
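
    A common form of force reflection combines the scaled slave contact force with a master-slave position-error spring term; this single-axis sketch is a generic textbook law with illustrative gains, not necessarily the enhanced algorithm proposed by the authors:

```python
def reflected_force(master_pos, slave_pos, contact_force, kp=50.0, alpha=0.5):
    """Force fed back to the operator on one axis of a force-reflecting
    master: the slave's contact force scaled by `alpha`, plus a spring
    term (gain `kp`) pulling the master toward the slave position."""
    return alpha * contact_force + kp * (slave_pos - master_pos)
```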

  20. Development of a 6 DOF force-reflecting master input device

    International Nuclear Information System (INIS)

    Yoon, Ji Sup; Yoon, Ho Sik

    1999-05-01

    The teleoperator is a very effective tool for various tasks of nuclear application in that it can reduce the operators' exposure to radiation. For the utmost performance of the teleoperator, the force reflection capability is essential. This capability represents a function of transmitting the contact force of the teleoperator with the object to the human operator. With this function the human operator in the remote area can effectively guide the motion of the teleoperator so that it can follow a safety-guaranteed path. In this research a fully force-reflectible input device (6 axis) is developed. To develop the force reflecting device, the state of the art is surveyed. Based on this survey, the 6 DOF manipulator which controls a power manipulator is fabricated and its performance is investigated. Also, various force reflection algorithms are analyzed and an enhanced algorithm is proposed. (author). 18 refs., 4 tabs., 26 figs

  1. New hybrid multivariate analysis approach to optimize multiple response surfaces considering correlations in both inputs and outputs

    OpenAIRE

    Hejazi, Taha Hossein; Amirkabir University of Technology - Iran; Seyyed-Esfahani, Mirmehdi; Amirkabir University of Technology - Iran; Ramezani, Majid; Amirkabir University of Technology - Iran

    2014-01-01

    Quality control in industrial and service systems requires the correct setting of input factors by which the outputs result at minimum cost with desirable characteristics. There are often more than one input and output in such systems. Response surface methodology in its multiple variable forms is one of the most applied methods to estimate and improve the quality characteristics of products with respect to control factors. When there is some degree of correlation among the variables, the exi...

  2. Tourism and Economic Development in Romania: Input-Output Analysis Perspective

    Directory of Open Access Journals (Sweden)

    MARIUS SURUGIU

    2010-12-01

    Full Text Available Tourism provides many opportunities for sustainable economic development. At the local level, through its triggering effect it can represent a factor of economic recovery, by putting to good use the local material and human potential. As a predominantly final-demand branch, tourism exerts a large impact on the national economy through the vector of final demand, for which the possible and/or desirable variant for the future is an economic-social demand that must be satisfied by variants of total output. Using the input-output model (IO model), a comparison was made between the matrix of direct technical coefficients (aij) and that of the total requirement coefficients (bij), with the assistance of which the direct and propagated effects of this activity were determined through the indicators defining the dimensions of the national economy.
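
    The relation between the direct technical coefficients aij and the total requirement coefficients bij = (I - A)^-1, which capture direct plus propagated effects, can be shown for a 2x2 case (toy coefficients, not the Romanian IO table):

```python
def total_requirements(a):
    """Total requirement (Leontief inverse) coefficients B = (I - A)^-1
    for a 2x2 matrix A of direct technical coefficients, computed with
    the closed-form 2x2 inverse."""
    (a11, a12), (a21, a22) = a
    det = (1 - a11) * (1 - a22) - a12 * a21
    return [[(1 - a22) / det, a12 / det],
            [a21 / det, (1 - a11) / det]]
```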

  3. Development of Monte Carlo input code for proton, alpha and heavy ion microdosimetric track structure simulations

    International Nuclear Information System (INIS)

    Douglass, M.; Bezak, E.

    2010-01-01

    Full text: Radiobiology science is important for cancer treatment as it improves our understanding of radiation induced cell death. Monte Carlo simulations play a crucial role in developing improved knowledge of cellular processes. By modelling the cell response to radiation damage and verifying with experimental data, understanding of cell death through direct radiation hits and bystander effects can be obtained. A Monte Carlo input code was developed using 'Geant4' to simulate cellular level radiation interactions. A physics list which enables physically accurate interactions of heavy ions at energies below 100 eV was implemented. A simple biological cell model was also implemented. Each cell consists of three concentric spheres representing the nucleus, cytoplasm and the membrane. This will enable all critical cell death channels to be investigated (i.e. membrane damage, nucleus/DNA). The current simulation has the ability to predict the positions of ionization events within the individual cell components on a 1 micron scale. We have developed a Geant4 simulation for investigation of radiation damage to cells on a sub-cellular scale (∼1 micron). This code currently gives the positions of the ionisation events within the individual components of the cell, enabling a more complete picture of cell death to be developed. The next stage will include expansion of the code to utilise a non-regular cell lattice. (author)
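
    Assigning an ionization position to a compartment of the three-concentric-sphere cell model reduces to a radius test; the radii below are illustrative assumptions, not the paper's cell dimensions:

```python
def classify(event, nucleus_r=3.0, cytoplasm_r=9.0, membrane_r=10.0):
    """Classify an ionization event at (x, y, z), in microns from the
    cell centre, into a compartment of a three-concentric-sphere cell
    model: nucleus, cytoplasm, membrane, or outside the cell."""
    x, y, z = event
    r = (x * x + y * y + z * z) ** 0.5
    if r <= nucleus_r:
        return "nucleus"
    if r <= cytoplasm_r:
        return "cytoplasm"
    if r <= membrane_r:
        return "membrane"
    return "outside"
```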

  4. Robust input design for nonlinear dynamic modeling of AUV.

    Science.gov (United States)

    Nouri, Nowrouz Mohammad; Valadi, Mehrdad

    2017-09-01

    Input design has a dominant role in developing the dynamic model of autonomous underwater vehicles (AUVs) through system identification. Optimal input design is the process of generating informative inputs that can be used to generate a good quality dynamic model of AUVs. In a problem of optimal input design, the desired input signal depends on the unknown system which is intended to be identified. In this paper, an input design approach which is robust to uncertainties in model parameters is used. The Bayesian robust design strategy is applied to design input signals for dynamic modeling of AUVs. The employed approach can design multiple inputs and apply constraints on an AUV system's inputs and outputs. Particle swarm optimization (PSO) is employed to solve the constrained robust optimization problem. The presented algorithm is used for designing the input signals for an AUV, and the estimate obtained by robust input design is compared with that of the optimal input design. According to the results, the proposed input design can satisfy both robustness of constraints and optimality. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
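
    A minimal PSO sketch of the kind used for such optimization problems (box constraints only; the inertia and acceleration coefficients are common textbook values, not the paper's settings):

```python
import random

def pso(f, bounds, n_particles=20, iters=200, seed=0):
    """Minimal particle swarm optimization: minimize f over the box
    `bounds` (one (lo, hi) pair per dimension) and return the best
    position found."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # personal bests
    gbest = min(pbest, key=f)[:]                # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rng.random() * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest
```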

  5. Forecasting the development of regional economy on the basis of input — output tables

    Directory of Open Access Journals (Sweden)

    Yury Konstantinovich Mashunin

    2014-06-01

    Full Text Available The article presents a practical technology for forecasting the development of a regional economy, including the statement of the problem, the construction of a mathematical model, and its implementation. To construct the model, standard statistical data for the previous period (2011), built on the basis of the "input — output" table, are used. A unit of output of final demand resulting from investments is added. As a result, a model of the regional economy is obtained in the form of a vector mathematical programming problem that takes into account the investment processes in the region. Its purpose is to maximize the production of final demand in the region (across all industries) within the constraints of the input-output balance, investments, resource costs and capacities. To solve the vector linear programming problem, methods based on the principle of criteria normalization and the guaranteed result are used. The dynamic vector problem is solved over a specified number of years. Factors taking into account the rates of growth of gross volumes (resources), final demand and investment in every sector of the region are introduced. Numerical implementation of the forecast is shown in a test case modeling the economy of Primorsky Krai, covering fifteen branches over a three-year period in accordance with the requirements of the Budget Code. Results of the solution include the major economic indicators for the region: gross output, gross regional product (GRP), investments (including a breakdown by industry), as well as payroll taxes and others. All these economic indicators are the basis for the formation of budget revenues in the region.
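The balance-constrained maximization of final demand described above can be sketched as a small linear program on the Leontief relation y = (I − A)x. The three-sector coefficient matrix and the capacity limits below are invented for illustration, and `scipy.optimize.linprog` is an assumed tool, not the authors' solver:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical 3-sector technical-coefficients matrix A (from an input-output table).
A = np.array([[0.2, 0.3, 0.1],
              [0.1, 0.1, 0.3],
              [0.2, 0.1, 0.2]])
cap = np.array([100.0, 80.0, 120.0])  # assumed sector capacities (gross output)
I = np.eye(3)

# Maximize total final demand 1'(I-A)x  s.t.  (I-A)x >= 0  and  0 <= x <= cap.
res = linprog(c=-(I - A).T @ np.ones(3),       # linprog minimizes, so negate
              A_ub=-(I - A), b_ub=np.zeros(3),  # -(I-A)x <= 0  <=>  y >= 0
              bounds=list(zip(np.zeros(3), cap)))
x = res.x               # optimal gross outputs by sector
y = (I - A) @ x         # resulting final demand by sector
```

The full vector problem in the article layers several criteria (GRP, investment, taxes) on top of a balance like this one and resolves them via criteria normalization and the guaranteed result; the sketch shows only the single-criterion core.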

  6. Effects of Textual Enhancement and Input Enrichment on L2 Development

    Science.gov (United States)

    Rassaei, Ehsan

    2015-01-01

    Research on second language (L2) acquisition has recently sought to include formal instruction into second and foreign language classrooms in a more unobtrusive and implicit manner. Textual enhancement and input enrichment are two techniques which are aimed at drawing learners' attention to specific linguistic features in input and at the same…

  7. Going beyond Input Quantity: "Wh"-Questions Matter for Toddlers' Language and Cognitive Development

    Science.gov (United States)

    Rowe, Meredith L.; Leech, Kathryn A.; Cabrera, Natasha

    2017-01-01

    There are clear associations between the overall quantity of input children are exposed to and their vocabulary acquisition. However, by uncovering specific features of the input that matter, we can better understand the mechanisms involved in vocabulary learning. We examine whether exposure to "wh"-questions, a challenging quality of…

  8. The input of geomorphology to oil-related developments in Shetland and Northeast Scotland

    International Nuclear Information System (INIS)

    Ritchie, W.

    1991-01-01

    In essence, the input of coastal geomorphology to most oil-related developments at the coastline has been descriptive environmental classification. The uses to which this information has been put are twofold: (1) as background reconnaissance data that are prepared in advance of a development, such as the exploitation of a nearshore drilling lease or a pipeline landfall, and (2) as a basic element in oil spill contingency mapping. A more specialized use of geomorphology has been environmental management advice relating to the construction, restoration, and operation of large-diameter oil and gas pipeline landfalls - all of which make their approach in Northeast Scotland through beach and dune complexes. The techniques consist of traditional morphological mapping considering form, aspect, materials, energy, and estimations of contemporary processes. Implicit in this mapping is the recognition of vulnerability which, in turn, relates closely to habitat recognition. Time is rarely available for process-type measurements. There is also a dependence on existing maps, aerial photographs, and reports. The survey may be done on foot, from boats, fixed-wing aircraft, or helicopters. Airborne video is increasingly being used as a supplementary means of data acquisition. Vertical airborne video used with an image-processing and G.I.S. system shows great potential and has been used experimentally for pipeline route selection

  9. Trajectory of Sewerage System Development Optimization

    Science.gov (United States)

    Chupin, R. V.; Mayzel, I. V.; Chupin, V. R.

    2017-11-01

    The transition to market relations has determined a new technology for our country to manage the development of urban engineering systems. This technology has shifted to the municipal level and can, by and large, be presented in two stages. The first is the development of a scheme for the development of the water supply and sanitation system; the second is the implementation of this scheme on the basis of investment programs of utilities. In the investment programs, financial support is provided for the development and reconstruction of water disposal systems through the investment component in the tariff, connection fees for newly commissioned capital construction projects, targeted financing under selected state and municipal programs, and loans and credits. Financial resources for the development of sewerage systems are limited, and the problem arises of their rational distribution between the construction of new water disposal facilities and the reconstruction of existing ones. The paper suggests a methodology for developing options for the development of sewerage systems and selecting the best of them by the life-cycle cost criterion, taking into account the limited investments in their construction, together with models and methods of analysis for optimizing their reconstruction and development with allowance for reliability and seismic resistance.

  10. Development a computer codes to couple PWR-GALE output and PC-CREAM input

    Science.gov (United States)

    Kuntjoro, S.; Budi Setiawan, M.; Nursinta Adi, W.; Deswandri; Sunaryo, G. R.

    2018-02-01

    Radionuclide dispersion analysis is an important part of reactor safety analysis. From this analysis, the doses received by radiation workers and communities around a nuclear reactor can be obtained. The radionuclide dispersion analysis under normal operating conditions is carried out using the PC-CREAM code, which requires input data such as the source term and the population distribution. These input data are derived from the output of another program, PWR-GALE, and from population distribution data written in a specific format. Compiling inputs for the PC-CREAM program manually requires high accuracy, as it involves large amounts of data in specific formats, and manual compilation often introduces errors. To minimize such errors, this work created a coupling program between the PWR-GALE output and the PC-CREAM input, together with a program for writing the population distribution data in the required format. Programming was done in the Python language, which has the advantages of being multiplatform, object-oriented and interactive. The result of this work is software for coupling the source-term data and for writing the population distribution data, so that input to the PC-CREAM program can be prepared easily and in the desired format, avoiding formatting errors.
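A minimal sketch of such a coupling step in Python follows. The two-column source-term layout parsed here and the fixed-width output fields are hypothetical stand-ins, since the real PWR-GALE and PC-CREAM file formats are not given in the abstract:

```python
import io

def parse_source_term(gale_output):
    """Parse 'nuclide  activity' pairs from a PWR-GALE-style text output.
    The two-column layout is a hypothetical stand-in for the real format."""
    terms = {}
    for line in gale_output.splitlines():
        parts = line.split()
        if len(parts) == 2:
            nuclide, activity = parts
            terms[nuclide] = float(activity)
    return terms

def write_pc_cream_input(terms, stream):
    """Write source terms in a fixed-width layout (field widths are illustrative)."""
    for nuclide, activity in sorted(terms.items()):
        stream.write(f"{nuclide:<8s}{activity:>12.4E}\n")

gale_text = "CS-137  3.2E+05\nI-131   1.1E+06\n"
buf = io.StringIO()
write_pc_cream_input(parse_source_term(gale_text), buf)
```

The value of automating even a conversion this simple is exactly what the abstract argues: a program applies the format rules identically every time, where manual transcription of large tables does not.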

  11. Characteristic features of determining the labor input and estimated cost of the development and manufacture of equipment

    Science.gov (United States)

    Kurmanaliyev, T. I.; Breslavets, A. V.

    1974-01-01

    The difficulties in obtaining exact calculation data for the labor input and estimated cost are noted. A method of calculating the labor cost of design work using provisional normative indexes for individual types of operations is proposed. Values of certain coefficients recommended for use in practical calculations of the labor input for the development of new scientific equipment for space research are presented.

  12. The long-term development of the energy input in transportation, 1970-2020

    Energy Technology Data Exchange (ETDEWEB)

    Meiren, P B [E.F.C.E.E., Mechelen (Belgium)

    1996-12-01

    This paper is a - modest - statistical and economic analysis of the energy input in the transportation sector over the past twenty-five years (1970 - 1995) and an attempt at looking ahead over the next twenty-five years (1995 - 2020). After World War II, passenger cars and trucks became the means of transportation par excellence and are still the main vehicles for moving both people and freight. Energy input statistics were born. Let us see what they teach us. (EG)

  13. Boosting antibody developability through rational sequence optimization.

    Science.gov (United States)

    Seeliger, Daniel; Schulz, Patrick; Litzenburger, Tobias; Spitz, Julia; Hoerer, Stefan; Blech, Michaela; Enenkel, Barbara; Studts, Joey M; Garidel, Patrick; Karow, Anne R

    2015-01-01

    The application of monoclonal antibodies as commercial therapeutics poses substantial demands on the stability and properties of an antibody. Therapeutic molecules that exhibit favorable properties increase the success rate in development. However, it is not yet fully understood how the protein sequence of an antibody translates into favorable in vitro molecule properties. In this work, computational design strategies based on heuristic sequence analysis were used to systematically modify an antibody that exhibited a tendency to precipitate in vitro. The resulting series of closely related antibodies showed improved stability as assessed by biophysical methods and long-term stability experiments. Notably, expression levels also improved in comparison with the wild-type candidate. The methods employed to optimize the protein sequences, as well as the biophysical data used to determine the effect on stability under conditions commonly used in the formulation of therapeutic proteins, are described. Together, the experimental and computational data led to consistent conclusions regarding the effect of the introduced mutations. Our approach exemplifies how computational methods can be used to guide antibody optimization for increased stability.

  14. The development of optimization protocol in SRS

    International Nuclear Information System (INIS)

    Oh, S. J.; Suh, T. S.; Lee, H. K.; Choe, B. Y.

    2002-01-01

    In an operation of stereotactic radiosurgery (SRS), a high dose must be delivered to the target region while the normal tissue region must be spared. A dose distribution that conforms to the target region satisfies this purpose. This is achieved by using databases built from a simple patient model simulating the brain and the tumor region. The objective of this research is to develop a brain model with tumor based on pseudo coordinates and a systematic optimization protocol, and to construct a database (DB) of beam parameters such as the position and number of isocenters and the collimator size. The normal tissue region of the patient can be spared by using the DB in an SRS operation

  15. The development of optimization protocol in SRS

    Energy Technology Data Exchange (ETDEWEB)

    Oh, S. J.; Suh, T. S.; Lee, H. K.; Choe, B. Y. [The Catholic Univ., of Korea, Seoul (Korea, Republic of)

    2002-07-01

    In an operation of stereotactic radiosurgery (SRS), a high dose must be delivered to the target region while the normal tissue region must be spared. A dose distribution that conforms to the target region satisfies this purpose. This is achieved by using databases built from a simple patient model simulating the brain and the tumor region. The objective of this research is to develop a brain model with tumor based on pseudo coordinates and a systematic optimization protocol, and to construct a database (DB) of beam parameters such as the position and number of isocenters and the collimator size. The normal tissue region of the patient can be spared by using the DB in an SRS operation.

  16. Providing disabled persons in developing countries access to computer games through a novel gaming input device

    CSIR Research Space (South Africa)

    Smith, Andrew C

    2011-01-01

    Full Text Available A novel input device is presented for use with a personal computer by persons with physical disabilities who would otherwise not be able to enjoy computer gaming. This device is simple to manufacture and low in cost. A gaming application...

  17. The Effects of Input Enhancement and Recasts on the Development of Second Language Pragmatic Competence

    Science.gov (United States)

    Nguyen, Minh Thi Thuy; Pham, Hanh Thi; Pham, Tam Minh

    2017-01-01

    This study investigates the combined effects of input enhancement and recasts on a group of Vietnamese EFL learners' performance of constructive criticism during peer review activities. Particularly, the study attempts to find out whether the instruction works for different aspects of pragmatic learning, including the learners' sociopragmatic and…

  18. 77 FR 16116 - Public Input on the Development and Potential Issuance of Treasury Floating Rate Notes

    Science.gov (United States)

    2012-03-19

    ... to the issuance of this type of debt. Treasury has not made a decision to issue floating rate notes... as part of this Request for Information will serve as valuable input into this decision. DATES... structures of debt issuance is consistent with Treasury's mission of financing the government at the lowest...

  19. Optimizing development projects in mature basins

    Energy Technology Data Exchange (ETDEWEB)

    Swan, P.J. [BP Exploration, Aberdeen (United Kingdom)

    1995-08-01

    BP Exploration wishes to grow its gas business substantially, and the Southern North Sea area is expected to be a significant contributor to this growth. The Southern North Sea gas basin is characterised by a relatively large number of small prospects and discoveries lying within the catchment areas of existing pipeline systems serving larger fields currently in production. This growth will be achieved through expansion of production from existing large mature fields and new satellite developments connected to existing pipeline systems. Significant modification to existing infrastructure will be required to bring the new production on stream. The low materiality of many of these new developments means that, based on current cost paradigms, they are sub-economic or do not offer returns commensurate with the risk. Also, implementation based on classical approaches tends to be resource-intensive in terms of key skills. Critical areas of concern in delivering growth objectives therefore relate to the management of cost, implementation time and the productivity of key human resources. The general approach adopted in pursuit of high performance includes a number of features: innovative approaches to the service industries; simplification of equipment; streamlining of methodologies; application of novel technology; alignment of the motivation of all contributors to overall objectives; and shifting the paradigm of risk. BP believes that this approach is a major breakthrough in extending and expanding the life of its assets in the Southern North Sea and is representative of the trend of optimization in the extended life of the Basin in general.

  20. Development and optimization of a diode laser for photodynamic therapy.

    Science.gov (United States)

    Lim, Hyun Soo

    2011-01-01

    This study demonstrated the development of a laser system for cancer treatment with photodynamic therapy (PDT) based on a 635 nm laser diode. In order to optimize efficacy in PDT, the ideal laser system should deliver homogeneous, nondivergent light energy with a variable spot size and a specific wavelength at a stable output power. We developed a digital laser beam controller using the constant current method to protect the laser diode resonator from current spikes, other fluctuations, and electrical faults. To improve the PDT effects, the laser system should deliver stable laser energy in continuous wave (CW), burst mode and super burst mode, with variable irradiation times depending on the tumor type and condition. The experimental results showed that the diode laser system described herein was eminently suitable for PDT. The laser beam was homogeneous and nondivergent, and the output power increased stably and linearly from 10 mW to 1500 mW with increasing input current. Variation between the set and delivered output was less than 7%. The diode laser system developed by the author for use in PDT was compact, user-friendly, and delivered a stable and easily adjustable output power at a specific wavelength and user-set emission modes.

  1. A software development for establishing optimal production lots and its application in academic and business environments

    Directory of Open Access Journals (Sweden)

    Javier Valencia Mendez

    2014-09-01

    Full Text Available The recent global economic downturn has increased an already perceived need in organizations for cost savings. To cope with such need, companies can opt for different strategies. This paper focuses on optimizing processes and, more specifically, determining the optimal production lot. To determine the optimal lot of a specific production process, a new software was developed that not only incorporates various productive and logistical elements in its calculations but also affords users a practical way to manage the large number of input parameters required to determine the optimal batch. The developed software has not only been validated by several companies, both Spanish and Mexican, which achieved significant savings, but has also been used as a teaching tool in universities, with highly satisfactory results from the point of view of student learning. A special contribution of this work is that the developed tool can be sent to the interested reader free of charge upon request.
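As a baseline for the kind of calculation such software performs, the classic economic production quantity (EPQ) formula is sketched below. The paper's tool incorporates many more productive and logistical parameters, so this is only the textbook core, with invented numbers:

```python
from math import sqrt

def optimal_production_lot(demand, setup_cost, holding_cost, prod_rate):
    """Classic economic production quantity (EPQ):
    Q* = sqrt(2*D*S / (H * (1 - D/P))).
    A textbook baseline, not the multi-parameter model of the paper's software."""
    return sqrt(2 * demand * setup_cost /
                (holding_cost * (1 - demand / prod_rate)))

# Hypothetical annual demand, setup cost per lot, holding cost per unit-year,
# and annual production rate.
q = optimal_production_lot(demand=12000, setup_cost=400, holding_cost=3.0,
                           prod_rate=20000)
```

The (1 − D/P) correction distinguishes the EPQ from the simpler EOQ: because production and consumption overlap, inventory never reaches the full lot size, lowering holding cost and enlarging the optimal lot.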

  2. Development of a Low Input and sustainable Switchgrass Feedstock Production System Utilizing Beneficial Bacterial Endophytes

    Energy Technology Data Exchange (ETDEWEB)

    Mei, Chuansheng [IALR; Nowak, Jerzy [VPISU; Seiler, John [VPISU

    2014-10-24

    Switchgrass represents a promising feedstock crop for US energy sustainability. However, its broad utilization for bioenergy requires improvements in biomass yield and stress tolerance. In this DOE-funded project, we have been working on harnessing beneficial bacterial endophytes to enhance switchgrass performance and to develop a low-input feedstock production system for marginal lands that does not compete with the production of food crops. We have demonstrated that one of the most promising plant growth-promoting bacterial endophytes, Burkholderia phytofirmans strain PsJN, is able to colonize roots and significantly promote growth of switchgrass cv. Alamo under in vitro, growth chamber, greenhouse, as well as field conditions. Furthermore, PsJN bacterization improved growth and development of switchgrass seedlings, significantly stimulated root and shoot growth and tiller number in the field, and enhanced biomass accumulation on both poor (p<0.001) and rich (p<0.05) soils, with more effective stimulation of plant growth in low-fertility soil. Plant physiology measurements showed that PsJN-inoculated Alamo had consistently lower transpiration, lower stomatal conductance, and higher water use efficiency under greenhouse conditions. These physiological changes may contribute significantly to the recorded growth enhancement. PsJN inoculation rapidly results in an increase in photosynthetic rates, which contributes to the advanced growth and development. Some evidence suggests that this initial growth advantage decreases with time when resources are not limited, such as in greenhouse studies. Additionally, better drought resistance and drought hardening were observed in PsJN-inoculated switchgrass. Using the DOE-funded switchgrass EST microarray, in collaboration with the Genomics Core Facility at the Noble Foundation, we have determined gene expression profile changes in both responsive switchgrass cv. Alamo and non-responsive cv. Cave-in-Rock (CR) following Ps

  3. Look Who's Talking: Speech Style and Social Context in Language Input to Infants Are Linked to Concurrent and Future Speech Development

    Science.gov (United States)

    Ramírez-Esparza, Nairán; García-Sierra, Adrián; Kuhl, Patricia K.

    2014-01-01

    Language input is necessary for language learning, yet little is known about whether, in natural environments, the speech style and social context of language input to children impacts language development. In the present study we investigated the relationship between language input and language development, examining both the style of parental…

  4. Sequential estimation of intrinsic activity and synaptic input in single neurons by particle filtering with optimal importance density

    Science.gov (United States)

    Closas, Pau; Guillamon, Antoni

    2017-12-01

    This paper deals with the problem of inferring the signals and parameters that cause neural activity to occur. The ultimate challenge being to unveil the brain's connectivity, here we focus on a microscopic vision of the problem, where single neurons (potentially connected to a network of peers) are at the core of our study. The sole observations available are noisy, sampled voltage traces obtained from intracellular recordings. We design algorithms and inference methods using the tools provided by stochastic filtering that allow a probabilistic interpretation and treatment of the problem. Using particle filtering, we are able to reconstruct traces of voltages and estimate the time course of auxiliary variables. By extending the algorithm through PMCMC methodology, we are able to estimate hidden physiological parameters as well, such as intrinsic conductances or reversal potentials. Last, but not least, the method is applied to estimate synaptic conductances arriving at a target cell, thus reconstructing the synaptic excitatory/inhibitory input traces. Notably, the performance of these estimations achieves the theoretical lower bounds even in spiking regimes.
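A bootstrap particle filter of the kind described can be sketched on a toy leaky-integrator "voltage" model. The dynamics and noise levels below are invented for illustration and stand in for the full conductance-based model and optimal importance density of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy state-space model (a stand-in for full neuron dynamics):
#   v_t = a*v_{t-1} + b + process noise;   y_t = v_t + measurement noise.
a, b, q, r, T, N = 0.95, 0.5, 0.05, 0.3, 200, 500
v = np.zeros(T)
for t in range(1, T):
    v[t] = a * v[t - 1] + b + rng.normal(0, np.sqrt(q))
y = v + rng.normal(0, np.sqrt(r), T)  # noisy sampled "voltage trace"

# Bootstrap particle filter: propagate, weight by likelihood, resample.
particles = rng.normal(0, 1, N)
est = np.zeros(T)
for t in range(T):
    particles = a * particles + b + rng.normal(0, np.sqrt(q), N)
    w = np.exp(-0.5 * (y[t] - particles) ** 2 / r)  # Gaussian likelihood
    w /= w.sum()
    est[t] = w @ particles                          # posterior-mean estimate
    particles = particles[rng.choice(N, N, p=w)]    # multinomial resampling
```

The paper's contribution lies in replacing the blind propagation step above with an optimal importance density, and in wrapping the filter in PMCMC to estimate static parameters such as conductances; the sketch shows only the plain bootstrap baseline.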

  5. EPICS Input/Output Controller (IOC) application developer's guide. APS Release 3.12

    International Nuclear Information System (INIS)

    Kraimer, M.R.

    1994-11-01

    This document describes the core software that resides in an Input/Output Controller (IOC), one of the major components of EPICS. The basic components are: (OPI) Operator Interface, a UNIX-based workstation which can run various EPICS tools; (IOC) Input/Output Controller, a VME/VXI-based chassis containing a Motorola 68xxx processor, various I/O modules, and VME modules that provide access to other I/O buses such as GPIB; and (LAN) Local Area Network, the communication network which allows the IOCs and OPIs to communicate. EPICS provides a software component, Channel Access, which provides network-transparent communication between a Channel Access client and an arbitrary number of Channel Access servers

  6. Computer-Mediated Input, Output and Feedback in the Development of L2 Word Recognition from Speech

    Science.gov (United States)

    Matthews, Joshua; Cheng, Junyu; O'Toole, John Mitchell

    2015-01-01

    This paper reports on the impact of computer-mediated input, output and feedback on the development of second language (L2) word recognition from speech (WRS). A quasi-experimental pre-test/treatment/post-test research design was used involving three intact tertiary level English as a Second Language (ESL) classes. Classes were either assigned to…

  7. The Impact of Bimodal Bilingual Parental Input on the Communication and Language Development of a Young Deaf Child

    Science.gov (United States)

    Levesque, Elizabeth; Brown, P. Margaret; Wigglesworth, Gillian

    2014-01-01

    This study explores the impact of bimodal bilingual parental input on the communication and language development of a young deaf child. The participants in this case study were a severe-to-profoundly deaf boy and his hearing parents, who were enrolled in a bilingual (English and Australian Sign Language) home-based early intervention programme. The…

  8. Synaptic inputs compete during rapid formation of the calyx of Held: a new model system for neural development.

    Science.gov (United States)

    Holcomb, Paul S; Hoffpauir, Brian K; Hoyson, Mitchell C; Jackson, Dakota R; Deerinck, Thomas J; Marrs, Glenn S; Dehoff, Marlin; Wu, Jonathan; Ellisman, Mark H; Spirou, George A

    2013-08-07

    Hallmark features of neural circuit development include early exuberant innervation followed by competition and pruning to mature innervation topography. Several neural systems, including the neuromuscular junction and climbing fiber innervation of Purkinje cells, are models to study neural development in part because they establish a recognizable endpoint of monoinnervation of their targets and because the presynaptic terminals are large and easily monitored. We demonstrate here that calyx of Held (CH) innervation of its target, which forms a key element of auditory brainstem binaural circuitry, exhibits all of these characteristics. To investigate CH development, we made the first application of serial block-face scanning electron microscopy to neural development with fine temporal resolution and thereby accomplished the first time series for 3D ultrastructural analysis of neural circuit formation. This approach revealed a growth spurt of added apposed surface area (ASA) of >200 μm²/d centered on a single age at postnatal day 3 in mice and an initial rapid phase of growth and competition that resolved to monoinnervation in two-thirds of cells within 3 d. This rapid growth occurred in parallel with an increase in action potential threshold, which may mediate selection of the strongest input as the winning competitor. ASAs of competing inputs were segregated on the cell body surface. These data suggest mechanisms to select "winning" inputs by regional reinforcement of postsynaptic membrane to mediate size and strength of competing synaptic inputs.

  9. The Effectiveness of Visual Input Enhancement on the Noticing and L2 Development of the Spanish Past Tense

    Science.gov (United States)

    Loewen, Shawn; Inceoglu, Solène

    2016-01-01

    Textual manipulation is a common pedagogic tool used to emphasize specific features of a second language (L2) text, thereby facilitating noticing and, ideally, second language development. Visual input enhancement has been used to investigate the effects of highlighting specific grammatical structures in a text. The current study uses a…

  10. A special role for binocular visual input during development and as a component of occlusion therapy for treatment of amblyopia.

    Science.gov (United States)

    Mitchell, Donald E

    2008-01-01

    To review work on animal models of deprivation amblyopia that points to a special role for binocular visual input in the development of spatial vision and as a component of occlusion (patching) therapy for amblyopia. The studies reviewed employ behavioural methods to measure the effects of various early experiential manipulations on the development of the visual acuity of the two eyes. Short periods of concordant binocular input, if continuous, can offset much longer daily periods of monocular deprivation to allow the development of normal visual acuity in both eyes. It appears that the visual system does not weigh all visual input equally in terms of its ability to impact on the development of vision but instead places greater weight on concordant binocular exposure. Experimental models of patching therapy for amblyopia imposed on animals in which amblyopia had been induced by a prior period of early monocular deprivation, indicate that the benefits of patching therapy may be only temporary and decline rapidly after patching is discontinued. However, when combined with critical amounts of binocular visual input each day, the benefits of patching can be both heightened and made permanent. Taken together with demonstrations of retained binocular connections in the visual cortex of monocularly deprived animals, a strong argument is made for inclusion of specific training of stereoscopic vision for part of the daily periods of binocular exposure that should be incorporated as part of any patching protocol for amblyopia.

  11. Developing optimal input design strategies in cancer systems biology with applications to microfluidic device engineering

    NARCIS (Netherlands)

    Menolascina, F.; Bellomo, D.; Maiwald, T.; Bevilacqua, V.; Ciminelli, C.; Paradiso, A.; Tommasi, S.

    2009-01-01

    Background: Mechanistic models are becoming more and more popular in Systems Biology; identification and control of models underlying biochemical pathways of interest in oncology is a primary goal in this field. Unfortunately the scarce availability of data still limits our understanding of the

  12. BRAIN Journal - Solving Optimization Problems via Vortex Optimization Algorithm and Cognitive Development Optimization Algorithm

    OpenAIRE

    Ahmet Demir; Utku Kose

    2016-01-01

    ABSTRACT In the fields which require finding the most appropriate value, optimization became a vital approach to employ effective solutions. With the use of optimization techniques, many different fields in the modern life have found solutions to their real-world based problems. In this context, classical optimization techniques have had an important popularity. But after a while, more advanced optimization problems required the use of more effective techniques. At this point, Computer Sc...

  13. Solving Optimization Problems via Vortex Optimization Algorithm and Cognitive Development Optimization Algorithm

    OpenAIRE

    Ahmet Demir; Utku Kose

    2017-01-01

    In the fields which require finding the most appropriate value, optimization became a vital approach to employ effective solutions. With the use of optimization techniques, many different fields in the modern life have found solutions to their real-world based problems. In this context, classical optimization techniques have had an important popularity. But after a while, more advanced optimization problems required the use of more effective techniques. At this point, Computer Science took an...

  14. Big data optimization recent developments and challenges

    CERN Document Server

    2016-01-01

    The main objective of this book is to provide the necessary background for working with big data by introducing novel optimization algorithms and codes capable of working in the big data setting, as well as applications in big data optimization, for both interested academics and practitioners, and to benefit society, industry, academia, and government. Presenting applications in a variety of industries, this book will be useful for researchers aiming to analyse large-scale data. Several optimization algorithms for big data, including convergent parallel algorithms, the limited memory bundle algorithm, the diagonal bundle method, network analytics, and many more, are explored in this book.

  15. Procedure for developing biological input for the design, location, or modification of water-intake structures

    Energy Technology Data Exchange (ETDEWEB)

    Neitzel, D.A.; McKenzie, D.H.

    1981-12-01

    To minimize adverse impact on aquatic ecosystems resulting from the operation of water intake structures, design engineers must have relevant information on the behavior, physiology and ecology of local fish and shellfish. Identification of stimulus/response relationships and the environmental factors that influence them is the first step in incorporating biological information in the design, location or modification of water intake structures. A procedure is presented in this document for providing biological input to engineers who are designing, locating or modifying a water intake structure. The authors discuss sources of stimuli at water intakes, historical approaches in assessing potential/actual impact and review biological information needed for intake design.

  16. Development of High Heat Input Welding Offshore Steel as Normalized Condition

    Science.gov (United States)

    Deng, Wei; Qin, Xiaomei

    The heavy plate used in offshore structures is an important strategic product. In recent years, there has been increasing demand for heavy shipbuilding steel plate with excellent weldability under high heat input welding. During the welding thermal cycle, the microstructure of the heat affected zone (HAZ) is damaged, which markedly reduces the toughness of the HAZ. How to improve HAZ toughness has therefore been a key subject in steel research. Oxide metallurgy is considered an effective way to improve HAZ toughness, because fine particles that remain stable at high temperature can be used to retard grain growth. High strength steel plate that satisfies low temperature specifications has been applied to offshore structures. Excellent properties of the plates and welded joints were obtained by oxide metallurgy technology and the latest controlled rolling and accelerated cooling technology using Ultra-Fast Cooling (an on-line accelerated cooling system). The 355 MPa-grade high strength steel plates in the normalized condition were obtained; the steels have excellent weldability at heat inputs of 79-287 kJ/cm, and the nil ductility transition (NDT) temperature was -70 °C, which can satisfy the construction of offshore structures in cold regions.

  17. Developing Common Set of Weights with Considering Nondiscretionary Inputs and Using Ideal Point Method

    Directory of Open Access Journals (Sweden)

    Reza Kiani Mavi

    2013-01-01

    Data envelopment analysis (DEA) is used to evaluate the performance of decision making units (DMUs) with multiple inputs and outputs in a homogeneous group. The acquired relative efficiency score for each DMU lies between zero and one, and a number of units may share an efficiency score of one. DEA successfully divides the units into efficient and inefficient DMUs and provides a ranking for the inefficient DMUs, but it gives no further information for discriminating among the efficient ones. One popular method for evaluating and ranking DMUs is the common set of weights (CSW) method. We develop a CSW model that accounts for nondiscretionary inputs, which are beyond the control of the DMUs, using the ideal point method. The main idea of this approach is to minimize the distance between each evaluated DMU and the ideal DMU (ideal point). Using an empirical example, we test the proposed model by applying it to data from 20 bank branches and ranking their efficient units.
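    The ideal-point idea described in this record can be sketched as a small numerical program. The formulation below is a generic simplification with toy data (it omits the nondiscretionary-input treatment and is not the authors' exact model): a single common set of weights is sought that brings every DMU's efficiency as close as possible to the ideal point of 1, with each efficiency capped at 1.

```python
import numpy as np
from scipy.optimize import minimize

# Toy data: 4 DMUs, 2 inputs (X) and 1 output (Y).
X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 2.0], [5.0, 4.0]])
Y = np.array([[10.0], [8.0], [9.0], [7.0]])
n_dmu, m = X.shape
s = Y.shape[1]
eps = 1e-6  # keep all weights strictly positive

def efficiencies(w):
    v, u = w[:m], w[m:]              # common input/output weights
    return (Y @ u) / (X @ v)

# Ideal point: every DMU fully efficient (efficiency = 1). Minimize the
# squared distance of all DMUs to that point under one common set of
# weights, subject to each efficiency not exceeding 1.
def objective(w):
    return float(np.sum((1.0 - efficiencies(w)) ** 2))

constraints = [{"type": "ineq", "fun": lambda w: 1.0 - efficiencies(w)}]
res = minimize(objective, np.ones(m + s), method="SLSQP",
               bounds=[(eps, None)] * (m + s), constraints=constraints)

eff = efficiencies(res.x)
ranking = list(np.argsort(-eff))     # one ranking for all DMUs
```

    Ranking every DMU with the one resulting weight vector is what distinguishes the CSW approach from standard DEA, where each DMU chooses its own most favorable weights.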

  18. Development of tools for optimization of HWC

    International Nuclear Information System (INIS)

    Wikmark, Gunnar; Lundgren, Klas; Wijkstroem, Hjalmar; Pein, Katarina; Ullberg, Mats

    2004-06-01

    An ECP model for the Swedish Boiling Water Reactors (BWRs) was developed in a previous project sponsored by the Swedish Nuclear Power Inspectorate. The present work is an extension of that effort. The model work has been extended in three ways. Some potential problem areas of the ECP sub-model have been treated in full detail. A comprehensive calibration data set has been assembled from plant data and from laboratory and in-plant experiments. The model has been fitted to the calibration data set and the model parameters adjusted. The work on the ECP sub-model has demonstrated that the generalised Butler-Volmer equation, as previously used, adequately describes the electrochemistry. Thus, there is no need to treat the system surface oxides as semiconductors or to take double layer effects into account. The existence of a pseudo potential for the reaction of oxygen on stainless steel is confirmed. The concentration dependence and temperature dependence of the exchange current densities are still unclear, and an experimental investigation of these is therefore desirable. An interesting alternative to a conventional experimental set-up is to combine modelling with simpler and more easily controlled experiments. In addition to a calibration data set, the survey of plant data has also led to an improved understanding of the necessary parameters of an ECP model. Thus, variations of the H2 injection rate at constant reactor power level and constant recirculation flow rate were traced to variations of the relative power level of the fuel elements in the core periphery. The power level in the core periphery determines the dose rate in the downcomer and controls the recombination reaction that is fundamental to Hydrogen Water Chemistry (HWC). To accurately model ECP as a function of hydrogen injection rate and other plant parameters, the relative power level of the core periphery is a necessary model parameter that has to be regularly updated from core management codes.
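    The Butler-Volmer relation referred to above links current density to overpotential and sits at the kinetic core of any ECP model. A minimal sketch of its standard (non-generalised) form, with purely illustrative parameter values:

```python
import math

F = 96485.0   # Faraday constant, C/mol
R = 8.314     # gas constant, J/(mol K)
T = 561.0     # roughly BWR coolant temperature, K (illustrative)

def butler_volmer(eta, j0=1e-3, alpha_a=0.5, alpha_c=0.5, n=1):
    """Net current density (A/m^2) at overpotential eta (V) for a single
    electrode reaction with exchange current density j0."""
    f = n * F / (R * T)
    return j0 * (math.exp(alpha_a * f * eta) - math.exp(-alpha_c * f * eta))
```

    An ECP (mixed potential) estimate would then follow by finding the potential at which the anodic and cathodic currents of all reactions on the surface sum to zero; the generalised form referred to in the abstract extends this kinetic core.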

  19. Developing a workable public input process for aesthetics and recreational needs during hydropower licensing

    International Nuclear Information System (INIS)

    Howe, D.; Stimac, M.

    1993-01-01

    Aesthetics and recreation are becoming increasingly important issues during hydropower licensing. A variety of regulations and legislation mandate the protection of instream flows for aesthetic and recreational resources, which has provided impetus for determining the effects of instream flows on recreation and aesthetics. A public survey designed for a proposed small hydropower project, located in a heavily used recreation area, attempts to determine aesthetic and recreational preferences for instream flows. The major components in designing the survey are discussed. The public input process is still underway; however, preliminary results indicate that lower flows in the river are generally preferred by visitors to the area, likely because of the types of users and the recreation activities performed.

  20. Development, nutritional evaluation and optimization of instant ...

    African Journals Online (AJOL)

    In this study, different instant porridges were formulated from broken fractions of rice blended with bambara groundnut flour through extrusion cooking. Response Surface Methodology (RSM) and Central Composite Rotatable Design (CCRD) were used to optimize the production variables. The objective was to locate the ...

  1. Recent developments in cooperative control and optimization

    CERN Document Server

    Murphey, Robert; Pardalos, Panos

    2004-01-01

    Over the past several years, cooperative control and optimization has unquestionably been established as one of the most important areas of research in the military sciences. Even so, cooperative control and optimization transcends the military in its scope, having become quite relevant to a broad class of systems with many exciting commercial applications. One reason for all the excitement is that research has been so incredibly diverse, spanning many scientific and engineering disciplines. This latest volume in the Cooperative Systems book series clearly illustrates this trend towards diversity and creative thought. And no wonder: cooperative systems are among the hardest systems control science has endeavored to study, hence creative approaches to modeling, analysis, and synthesis are a must! The definition of cooperation itself is a slippery issue. As you will see in this and previous volumes, cooperation has been cast into many different roles and therefore has assumed many diverse meanings. P...

  2. Optimization Models and Methods Developed at the Energy Systems Institute

    OpenAIRE

    N.I. Voropai; V.I. Zorkaltsev

    2013-01-01

    The paper briefly presents some optimization models of energy system operation and expansion that have been created at the Energy Systems Institute of the Siberian Branch of the Russian Academy of Sciences. Consideration is given to optimization models of energy development in Russia, a software package intended for analysis of power system reliability, and a model of flow distribution in hydraulic systems. A general idea of the optimization methods developed at the Energy Systems Institute...

  3. Recent developments of discrete material optimization of laminated composite structures

    DEFF Research Database (Denmark)

    Lund, Erik; Sørensen, Rene

    2015-01-01

    This work gives a quick summary of recent developments of the Discrete Material Optimization approach for structural optimization of laminated composite structures. This approach can be seen as a multi-material topology optimization approach for selecting the best ply material and number of plies in a laminated composite structure. The conceptual combinatorial design problem is relaxed to a continuous problem such that well-established gradient based optimization techniques can be applied, and the optimization problem is solved on the basis of interpolation schemes with penalization...

  4. Development of inhibitory synaptic inputs on layer 2/3 pyramidal neurons in the rat medial prefrontal cortex

    KAUST Repository

    Virtanen, Mari A.; Lacoh, Claudia Marvine; Fiumelli, Hubert; Kosel, Markus; Tyagarajan, Shiva; de Roo, Mathias; Vutskits, Laszlo

    2018-01-01

    Inhibitory control of pyramidal neurons plays a major role in governing excitability in the brain. While spatial mapping of inhibitory inputs onto pyramidal neurons would provide important structural data on neuronal signaling, studying their distribution at the single cell level is difficult due to the lack of easily identifiable anatomical proxies. Here, we describe an approach where in utero electroporation of a plasmid encoding fluorescently tagged gephyrin into the precursors of pyramidal cells, along with iontophoretic injection of Lucifer Yellow, can reliably and specifically detect GABAergic synapses on the dendritic arbour of single pyramidal neurons. Using this technique and focusing on the basal dendritic arbour of layer 2/3 pyramidal cells of the medial prefrontal cortex, we demonstrate an intense development of GABAergic inputs onto these cells between postnatal days 10 and 20. While the spatial distribution of gephyrin clusters was not affected by the distance from the cell body at postnatal day 10, we found that distal dendritic segments appeared to have a higher gephyrin density at later developmental stages. We also show a transient increase around postnatal day 20 in the percentage of spines carrying a gephyrin cluster, indicative of innervation by a GABAergic terminal. Since the precise spatial arrangement of synaptic inputs is an important determinant of neuronal responses, we believe that the method described in this work may allow a better understanding of how inhibition is established alongside excitation, and serve as a basis for further modelling studies focusing on the geometry of dendritic inhibition during development.

  6. Adjoint-based Mesh Optimization Method: The Development and Application for Nuclear Fuel Analysis

    International Nuclear Information System (INIS)

    Son, Seongmin; Lee, Jeong Ik

    2016-01-01

    In this research, a method for optimizing the mesh distribution is proposed. The proposed method uses an adjoint-based optimization method (adjoint method). The optimized result is obtained by applying this meshing technique to the existing code input deck and is compared to the results produced by uniform meshing. Numerical solutions are calculated from an in-house 1D Finite Difference Method code, neglecting axial conduction. The fuel radial nodes were optimized first to best match the Fuel Centerline Temperature (FCT); this was followed by optimizing the axial nodes to best match the Peak Cladding Temperature (PCT). After obtaining the optimized radial and axial nodes, the nodalization was implemented in the system analysis code and transient analyses were performed to assess the performance of the optimized nodalization. The adjoint-based mesh optimization method developed in this study is applied to MARS-KS, a nuclear system analysis code. Results show that the newly established method yields better results than uniform meshing from a numerical point of view. It is again stressed that a mesh optimized for the steady state can also give better numerical results during a transient analysis.
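    The core idea, concentrating mesh nodes where they most reduce the error in a target quantity, can be illustrated with a toy example. The sketch below uses a brute-force scan of a grading exponent instead of a true adjoint gradient, and the radial temperature profile is an illustrative stand-in, not MARS-KS output:

```python
import numpy as np

# Illustrative fine-solution stand-in: a radial temperature profile with
# a steep gradient near the outer edge (r = 1).
def T_ref(r):
    return 1000.0 - 400.0 * r**4

def mesh_error(beta, n=8):
    """Worst-case interpolation error of an (n+1)-node graded mesh
    r_i = (i/n)**beta against the reference profile."""
    r_nodes = (np.arange(n + 1) / n) ** beta
    r_fine = np.linspace(0.0, 1.0, 1001)
    T_coarse = np.interp(r_fine, r_nodes, T_ref(r_nodes))
    return float(np.max(np.abs(T_coarse - T_ref(r_fine))))

# Scan the grading exponent; beta = 1 is the uniform mesh, beta < 1
# clusters nodes toward r = 1, where the profile changes fastest.
betas = np.linspace(0.4, 1.0, 61)
errors = [mesh_error(b) for b in betas]
best_beta = float(betas[int(np.argmin(errors))])
```

    For this profile a graded mesh beats the uniform one with the same node count, which is the same trade the adjoint method exploits, only with the search direction supplied by adjoint sensitivities rather than a parameter scan.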

  7. Input frequency and lexical variability in phonological development: a survival analysis of word-initial cluster production.

    Science.gov (United States)

    Ota, Mitsuhiko; Green, Sam J

    2013-06-01

    Although it has often been hypothesized that children learn to produce new sound patterns first in frequently heard words, the available evidence in support of this claim is inconclusive. To re-examine this question, we conducted a survival analysis of word-initial consonant clusters produced by three children in the Providence Corpus (0;11-4;0). The analysis took account of several lexical factors in addition to lexical input frequency, including the age of first production, production frequency, neighborhood density and number of phonemes. The results showed that lexical input frequency was a significant predictor of the age at which the accuracy level of cluster production in each word first reached 80%. The magnitude of the frequency effect differed across cluster types. Our findings indicate that some of the between-word variance found in the development of sound production can indeed be attributed to the frequency of words in the child's ambient language.
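    The survival-analysis setup can be sketched as follows: the "event" is a word first reaching the 80% accuracy criterion, and words are grouped by input frequency. The estimator below is a bare Kaplan-Meier curve on invented ages, without the censoring and covariates the actual study handled:

```python
from collections import Counter

def kaplan_meier(event_ages):
    """Kaplan-Meier curve for fully observed event ages (no censoring):
    P(word has not yet reached the accuracy criterion by a given age)."""
    at_risk = len(event_ages)
    surv, curve = 1.0, []
    for age, d in sorted(Counter(event_ages).items()):
        surv *= 1.0 - d / at_risk
        curve.append((age, surv))
        at_risk -= d
    return curve

# Invented ages (months) at which each word first hit 80% accuracy.
high_freq_words = [20, 22, 22, 24, 25, 26, 28]
low_freq_words = [24, 27, 29, 30, 32, 33, 36]

km_high = kaplan_meier(high_freq_words)
km_low = kaplan_meier(low_freq_words)
# The high-frequency curve drops below 50% at a younger age: frequent
# words reach the criterion earlier, mirroring the reported effect.
```

    A regression-style survival model (e.g. Cox) would then quantify the frequency effect while adjusting for the other lexical factors.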

  8. Prospects of supervising service development as the tool of input quality control

    Science.gov (United States)

    Sizov, A.; Tretyakov, K.; Boyarko, G.; Shenderova, I.; Ostranitsyn, I.

    2016-09-01

    Supervising services have gained a foothold in the domestic oilfield services market of the Russian Federation. Despite the rapid growth of the supervising services market, there is still a definite demand for further development of this domain. The authors consider the implementation of supervising in the Russian oil and gas industry, as well as possible paths for its improvement and development.

  9. Comparative evaluation of various optimization methods and the development of an optimization code system SCOOP

    International Nuclear Information System (INIS)

    Suzuki, Tadakazu

    1979-11-01

    Thirty-two programs for linear and nonlinear optimization problems, with or without constraints, have been developed or incorporated, and their stability, convergence and efficiency have been examined. On the basis of these evaluations, the first version of the optimization code system SCOOP-I has been completed. SCOOP-I is designed to be an efficient, reliable, useful and also flexible system for general applications. The system enables one to find the global optimum for a wide class of problems by selecting the most appropriate optimization method built into it. (author)

  10. The Development and Microstructure Analysis of High Strength Steel Plate NVE36 for Large Heat Input Welding

    Science.gov (United States)

    Peng, Zhang; Liangfa, Xie; Ming, Wei; Jianli, Li

    In the shipbuilding industry, the welding efficiency of ship plate not only has a great effect on the construction cost of the ship, but also affects the construction speed and determines the delivery cycle. Steel plate suitable for large heat input welding has therefore been extensively developed. In this paper, the composition of the steel was designed along a micro-alloyed route, with small amounts of Nb and Ti and a large amount of Mn; the C content and the carbon equivalent were also kept at a low level. Oxide metallurgy technology was used during the smelting of the steel. The TMCP rolling technology was controlled at a low rolling temperature, and ultra-fast cooling technology was used to control the transformation of the microstructure. The microstructure of the steel plate was controlled to be a mixed microstructure of low carbon bainite and ferrite. A large number of oxide particles dispersed in the microstructure of the steel had a positive effect on the mechanical properties and welding performance. The mechanical properties of the steel plate were excellent, and the value of longitudinal Akv at -60 °C is more than 200 J. The toughness of the WM and HAZ were excellent after the steel plate was welded with a large heat input of 100-250 kJ/cm. The steel plate produced as described above can meet the requirements of large heat input welding.

  11. Optimization of space system development resources

    Science.gov (United States)

    Kosmann, William J.; Sarkani, Shahram; Mazzuchi, Thomas

    2013-06-01

    NASA has had a decades-long problem with cost growth during the development of space science missions. Numerous agency-sponsored studies have produced average mission level cost growths ranging from 23% to 77%. A new study of 26 historical NASA Science instrument set developments using expert judgment to reallocate key development resources has an average cost growth of 73.77%. Twice in history, a barter-based mechanism has been used to reallocate key development resources during instrument development. The mean instrument set development cost growth was -1.55%. Performing a bivariate inference on the means of these two distributions, there is statistical evidence to support the claim that using a barter-based mechanism to reallocate key instrument development resources will result in a lower expected cost growth than using the expert judgment approach. Agent-based discrete event simulation is the natural way to model a trade environment. A NetLogo agent-based barter-based simulation of science instrument development was created. The agent-based model was validated against the Cassini historical example, as the starting and ending instrument development conditions are available. The resulting validated agent-based barter-based science instrument resource reallocation simulation was used to perform 300 instrument development simulations, using barter to reallocate development resources. The mean cost growth was -3.365%. A bivariate inference on the means was performed to determine that additional significant statistical evidence exists to support a claim that using barter-based resource reallocation will result in lower expected cost growth, with respect to the historical expert judgment approach. Barter-based key development resource reallocation should work on spacecraft development as well as it has worked on instrument development. A new study of 28 historical NASA science spacecraft developments has an average cost growth of 46.04%. As barter-based key
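    The barter mechanism can be sketched as a simple agent-based loop (in Python rather than NetLogo; the agents, resources and trading rule are illustrative, not the validated Cassini model):

```python
import random

random.seed(7)

class Team:
    """An instrument team holding reserves of two development resources."""
    def __init__(self, name):
        self.name = name
        self.reserve = {"mass": random.randint(1, 10),
                        "power": random.randint(1, 10)}
        self.need = random.choice(["mass", "power"])  # the scarce resource

    def surplus(self):
        # The resource this team does not need is available for barter.
        return "power" if self.need == "mass" else "mass"

def barter_round(teams):
    """One round of pairwise barter: a team trades a unit of its surplus
    resource for a unit of the resource it needs."""
    trades = 0
    for a in teams:
        for b in teams:
            if a is b:
                continue
            give = a.surplus()
            if a.reserve[give] > 0 and b.need == give and b.reserve[a.need] > 0:
                a.reserve[give] -= 1
                b.reserve[give] += 1
                b.reserve[a.need] -= 1
                a.reserve[a.need] += 1
                trades += 1
    return trades

teams = [Team(f"instr{i}") for i in range(6)]
grand_total = sum(t.reserve["mass"] + t.reserve["power"] for t in teams)
needed_before = sum(t.reserve[t.need] for t in teams)

while barter_round(teams) > 0:   # trade until no mutually beneficial swap
    pass

needed_after = sum(t.reserve[t.need] for t in teams)
# Every executed trade moves one unit of each resource to a team that
# needs it, so holdings of needed resources can only grow.
```

    The loop terminates because each trade strictly increases the total holdings of needed resources, which are bounded by the fixed overall stock; this conservative, mutually beneficial reallocation is the intuition behind the lower cost growth reported for the barter mechanism.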

  12. Optimization in underground mine planning - developments and opportunities

    OpenAIRE

    Musingwini, C.

    2016-01-01

    The application of mining-specific and generic optimization techniques in the mining industry is deeply rooted in the discipline of operations research (OR). OR has its origins in the British Royal Air Force and Army around the early 1930s. Its development continued during and after World War II. The application of OR techniques to optimization in the mining industry started to emerge in the early 1960s. Since then, optimization techniques have been applied to solve widely different mine plan...

  13. Development of a General Form CO2 and Brine Flux Input Model

    Energy Technology Data Exchange (ETDEWEB)

    Mansoor, K. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sun, Y. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Carroll, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-08-01

    The National Risk Assessment Partnership (NRAP) project is developing a science-based toolset for the quantitative analysis of the potential risks associated with changes in groundwater chemistry from CO2 injection. In order to address uncertainty probabilistically, NRAP is developing efficient, reduced-order models (ROMs) as part of its approach. These ROMs are built from detailed, physics-based process models to provide confidence in the predictions over a range of conditions. The ROMs are designed to reproduce accurately the predictions from the computationally intensive process models at a fraction of the computational time, thereby allowing the utilization of Monte Carlo methods to probe variability in key parameters. This report presents the procedures used to develop a generalized model for CO2 and brine leakage fluxes based on the output of a numerical wellbore simulation. The resulting generalized parameters and ranges reported here will be used for the development of third-generation groundwater ROMs.
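    The ROM workflow described here, training a cheap surrogate on a few expensive runs and then running Monte Carlo on the surrogate, can be sketched generically. The "expensive model" below is an illustrative closed form, not the NRAP wellbore simulator:

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for an expensive process model: leakage flux as a smooth
# function of a log-permeability parameter (illustrative closed form).
def expensive_model(log_perm):
    return 1.0 / (1.0 + np.exp(-1.5 * log_perm))

# Build the reduced-order model (ROM) from a handful of expensive runs.
train_x = np.linspace(-3.0, 3.0, 15)
rom = np.poly1d(np.polyfit(train_x, expensive_model(train_x), deg=5))

# Spot-check the ROM against the expensive model on held-out points.
test_x = np.linspace(-2.5, 2.5, 7)
max_err = np.max(np.abs(rom(test_x) - expensive_model(test_x)))

# Monte Carlo on the cheap ROM: propagate parameter uncertainty with a
# sample count that would be infeasible for the expensive model itself.
samples = rng.normal(0.0, 1.0, 100_000)
flux_samples = rom(samples)
mean_flux = float(flux_samples.mean())
```

    The accuracy spot-check is the step that "provides confidence in the predictions over a range of conditions"; only after the surrogate reproduces the process model acceptably is the Monte Carlo sweep meaningful.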

  14. Simultaneous bilingual language acquisition: The role of parental input on receptive vocabulary development.

    Science.gov (United States)

    Macleod, Andrea An; Fabiano-Smith, Leah; Boegner-Pagé, Sarah; Fontolliet, Salomé

    2013-02-01

    Parents often turn to educators and healthcare professionals for advice on how best to support their child's language development. These professionals frequently suggest implementing the 'one-parent-one-language' approach to ensure consistent exposure to both languages. The goal of this study was to understand how language exposure influences the receptive vocabulary development of simultaneous bilingual children. To this end, we studied nine German-French children growing up in bilingual families. Their exposure to each language within and outside the home was measured, as were their receptive vocabulary abilities in German and French. The results indicate that the children are receiving imbalanced exposure to each language. This imbalance leads to slower development of receptive vocabulary in the minority language, while the majority language keeps pace with that of monolingual peers. The one-parent-one-language approach does not appear to support the development of both of the child's languages in the context described in the present study. Bilingual families may need to consider other options for supporting the bilingual language development of their children. As professionals, we need to provide parents with advice that is based on available data and that is flexible with regard to the current and future needs of the child and their family.

  15. Creating Optimal Environments for Talent Development

    DEFF Research Database (Denmark)

    Henriksen, Kristoffer; Storm, Louise Kamuk; Larsen, Carsten Hvid

    The holistic ecological approach (HEA) to talent development in sport shifts researchers' attention from the individual athletes to the broader environment in which they develop. The HEA provides a theoretical grounding, ecologically inferred definitions of talent development, two working models, and methodological guidelines. The HEA highlights two interconnected ways of analyzing athletic talent development environments (ATDE). First, there is a focus on the structure of the environment, particularly the roles and cooperation of key persons. Second, there is a focus on the organizational culture of the team. A number of in-depth case studies of successful talent development environments in Scandinavia have shown that while each environment is unique, they also share a number of features. They are characterized by proximal role modeling; an integration of efforts among the different agents (family...

  16. Design or development? Beyond the LP-STS debate; inputs from a Volvo truck case

    NARCIS (Netherlands)

    Kuipers, BS; De Witte, MC; van der Zwaan, AH

    2004-01-01

    In this paper, we will show that the debate between advocates of Lean production and the socio-technical approach has concentrated too much on the design aspect of the production structure, while neglecting the development aspect of teamwork. This paper addresses the question whether it is

  17. Experience of web-complex development of NPP thermophysical optimization

    International Nuclear Information System (INIS)

    Nikolaev, M.A.

    2014-01-01

    The current state of development of a computation web complex (CWC) for thermophysical optimization of nuclear power plants is described. The main databases of the CWC are implemented on the MySQL platform. The CWC information architecture, its functionality, its optimization algorithms and its user interface are considered.

  18. DEVELOPING A SMARTPHONE APPLICATION TO IMPROVE CARE AND OUTCOMES IN ADOLESCENT ARTHRITIS THROUGH PATIENT INPUT

    OpenAIRE

    Alice Ran Cai; Dominik Beste

    2015-01-01

    Background: Juvenile idiopathic arthritis (JIA) affects around 1 in 1,000 young people (YP) in the UK. Flare-ups of JIA cause joint pain and swelling, and are often accompanied with fatigue, stiffness, sleep problems, higher negative emotions, and reduced participation in activities. As a result, JIA can negatively impact educational, psychosocial, and physical development and wellbeing, especially during puberty. In addition, missing medications, poor clinic attendance, as well as low levels...

  19. Developing and Optimizing Applications in Hadoop

    Science.gov (United States)

    Kothuri, P.; Garcia, D.; Hermans, J.

    2017-10-01

    This contribution shares our recent experience of building a Hadoop-based application. The Hadoop ecosystem now offers a myriad of tools that can overwhelm new users, yet these tools can be leveraged successfully to solve problems. We look at factors to consider when using Hadoop to model and store data, best practices for moving data in and out of the system, and common processing patterns, relating each stage to the real-world experience gained while developing such an application. We share many of the design choices and tools developed, and describe how to profile a distributed application, which can be applied to other scenarios as well. In conclusion, the goal of the presentation is to provide guidance on architecting Hadoop-based applications and to share some of the reusable components developed in this process.
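    The canonical Hadoop processing pattern, MapReduce, can be sketched in plain Python (Hadoop Streaming would run a mapper and reducer like these over stdin/stdout; the word-count job here is the standard textbook example, not one of the authors' applications):

```python
from itertools import groupby
from operator import itemgetter

def mapper(line):
    # map: emit (word, 1) for every token
    for word in line.split():
        yield (word.lower(), 1)

def reducer(key, counts):
    # reduce: sum the counts for one key
    return (key, sum(counts))

def run_job(lines):
    # shuffle/sort phase: group the mapper output by key
    pairs = sorted(kv for line in lines for kv in mapper(line))
    return dict(reducer(key, (c for _, c in group))
                for key, group in groupby(pairs, key=itemgetter(0)))

counts = run_job(["big data big tools", "data in and out"])
# counts["big"] == 2, counts["data"] == 2
```

    The same map / shuffle / reduce decomposition underlies most of the higher-level tools in the ecosystem (Hive, Pig, Spark), which is why it is the first pattern to internalize when architecting a Hadoop application.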

  20. Developing an evaluation framework for consumer-centred collaborative care of depression using input from stakeholders.

    Science.gov (United States)

    McCusker, Jane; Yaffe, Mark; Sussman, Tamara; Kates, Nick; Mulvale, Gillian; Jayabarathan, Ajantha; Law, Susan; Haggerty, Jeannie

    2013-03-01

    To develop a framework for research and evaluation of collaborative mental health care for depression, which includes attributes or domains of care that are important to consumers. A literature review on collaborative mental health care for depression was completed and used to guide discussion at an interactive workshop with pan-Canadian participants comprising people treated for depression with collaborative mental health care, as well as their family members; primary care and mental health practitioners; decision makers; and researchers. Thematic analysis of qualitative data from the workshop identified key attributes of collaborative care that are important to consumers and family members, as well as factors that may contribute to improved consumer experiences. The workshop identified an overarching theme of partnership between consumers and practitioners involved in collaborative care. Eight attributes of collaborative care were considered to be essential or very important to consumers and family members: respectfulness; involvement of consumers in treatment decisions; accessibility; provision of information; coordination; whole-person care; responsiveness to changing needs; and comprehensiveness. Three inter-related groups of factors may affect the consumer experience of collaborative care, namely, organizational aspects of care; consumer characteristics and personal resources; and community resources. A preliminary evaluation framework was developed and is presented here to guide further evaluation and research on consumer-centred collaborative mental health care for depression.

  1. Development and validation of a virtual reality simulator: human factors input to interventional radiology training.

    Science.gov (United States)

    Johnson, Sheena Joanne; Guediri, Sara M; Kilkenny, Caroline; Clough, Peter J

    2011-12-01

    This study developed and validated a virtual reality (VR) simulator for use by interventional radiologists. Research in the area of skill acquisition identifies practice as essential to becoming a task expert, and studies on simulation show that skills learned in VR can be successfully transferred to a real-world task. Recently, with improvements in technology, VR simulators have been developed that allow complex medical procedures to be practiced without risking the patient. Three studies are reported. In Study 1, 35 consultant interventional radiologists took part in a cognitive task analysis to empirically establish the key competencies of the Seldinger procedure. In Study 2, 62 participants performed one simulated procedure, and their performance was compared by expertise. In Study 3, the transferability of simulator training to a real-world procedure was assessed with 14 trainees. Study 1 produced 23 key competencies that were implemented as performance measures in the simulator. Study 2 showed that the simulator had both face and construct validity, although some issues were identified. Study 3 showed that the group that had undergone simulator training received significantly higher mean performance ratings on a subsequent patient procedure. The findings support the centrality of validation in the successful design of simulators and show the utility of simulators as a training device. The studies illustrate the key elements of a validation program for a simulator. In addition to task analysis and face and construct validity, the authors highlight the importance of transfer of training in validation studies.

  2. Estimation of Global 1km-grid Terrestrial Carbon Exchange Part I: Developing Inputs and Modelling

    Science.gov (United States)

    Sasai, T.; Murakami, K.; Kato, S.; Matsunaga, T.; Saigusa, N.; Hiraki, K.

    2015-12-01

    The global terrestrial carbon cycle depends strongly on the spatial pattern of land cover, which is heterogeneously distributed at regional and global scales. However, most studies aimed at estimating carbon exchanges between ecosystems and the atmosphere have worked at grid resolutions of several tens of kilometers, which is not fine enough to resolve the detailed pattern of carbon exchanges at the level of ecological communities. Improving the spatial resolution is clearly necessary to enhance the accuracy of estimated carbon exchanges, and such improvement may also support climate awareness, policy making and other social activities. In this study, we present global terrestrial carbon exchanges (net ecosystem production, net primary production, and gross primary production) at 1km-grid resolution. To compute the exchanges, we 1) developed a global 1km-grid climate and satellite dataset based on the approach in Setoyama and Sasai (2013); 2) used the satellite-driven biosphere model BEAMS (Biosphere model integrating Eco-physiological And Mechanistic approaches using Satellite data) (Sasai et al., 2005, 2007, 2011); and 3) simulated the carbon exchanges using the new dataset and BEAMS on a supercomputer comprising 1280 CPU and 320 GPGPU cores (the GOSAT RCF of NIES). As a result, we were able to develop a globally uniform system for realistically estimating terrestrial carbon exchange and to evaluate net ecosystem production at the community level, leading to a highly detailed understanding of terrestrial carbon exchanges.
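    BEAMS itself is a full mechanistic biosphere model; as a toy illustration of the kind of per-cell computation a gridded carbon-flux system performs, the sketch below applies a generic light-use-efficiency formulation (GPP = ε × fPAR × PAR), not the actual BEAMS equations, and every coefficient value is invented:

```python
# Toy gridded GPP estimate with a generic light-use-efficiency (LUE) model:
# GPP = epsilon * fPAR * PAR, computed cell by cell. Illustrative only;
# these are NOT the BEAMS model equations or calibrated parameters.

EPSILON = {            # gC per MJ of absorbed PAR, invented per-class values
    "forest": 1.8,
    "grassland": 1.2,
    "cropland": 1.5,
}

def gpp_grid(land_cover, fpar, par):
    """Compute GPP (gC/m2) for equally shaped 2-D grids of land cover,
    fraction of absorbed PAR, and incident PAR."""
    rows = []
    for lc_row, fp_row, par_row in zip(land_cover, fpar, par):
        rows.append([EPSILON[lc] * fp * p
                     for lc, fp, p in zip(lc_row, fp_row, par_row)])
    return rows

# A 2x2 toy grid standing in for a 1km-grid global mosaic.
land_cover = [["forest", "grassland"], ["cropland", "forest"]]
fpar = [[0.8, 0.5], [0.6, 0.9]]          # dimensionless, hypothetical
par = [[10.0, 10.0], [12.0, 8.0]]        # MJ/m2, hypothetical daily totals
gpp = gpp_grid(land_cover, fpar, par)
```

    At global 1km resolution the same per-cell independence is what makes the computation embarrassingly parallel across CPU/GPGPU cores.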

  3. MDS MIC Catalog Inputs

    Science.gov (United States)

    Johnson-Throop, Kathy A.; Vowell, C. W.; Smith, Byron; Darcy, Jeannette

    2006-01-01

    This viewgraph presentation reviews the inputs to the MDS Medical Information Communique (MIC) catalog. The purpose of the group is to provide input for updating the MDS MIC Catalog and to request that MMOP assign Action Items to other working groups and FSs to support the MITWG process for developing MIC-DDs.

  4. Development and optimization of containment structure concepts

    International Nuclear Information System (INIS)

    Reuter, H.R.; Whitcraft, J.S. Jr.

    1976-01-01

    The development of prestressed concrete containment structures for nuclear power plants designed, constructed, and tested by Bechtel Power Corporation has been divided into three general stages or generations. The distinctions that characterize these generations are: the shape of the dome, the number of buttresses, the size and arrangement of the post-tensioning tendons, and the design level of the prestressing forces. (author)

  5. Population health metrics: crucial inputs to the development of evidence for health policy

    Directory of Open Access Journals (Sweden)

    Salomon Joshua A

    2003-04-01

    Valid, reliable and comparable measures of the health states of individuals and of the health status of populations are critical components of the evidence base for health policy. We need to develop population health measurement strategies that coherently address the relationships between epidemiological measures (such as risk exposures, incidence, and mortality rates and multi-domain measures of population health status, while ensuring validity and cross-population comparability. Studies reporting on descriptive epidemiology of major diseases, injuries and risk factors, and on the measurement of health at the population level – either for monitoring trends in health levels or inequalities or for measuring broad outcomes of health systems and social interventions – are not well-represented in traditional epidemiology journals, which tend to concentrate on causal studies and on quasi-experimental design. In particular, key methodological issues relating to the clear conceptualisation of, and the validity and comparability of measures of population health are currently not addressed coherently by any discipline, and cross-disciplinary debate is fragmented and often conducted in mutually incomprehensible language or paradigms. Population health measurement potentially bridges a range of currently disjoint fields of inquiry relating to health: biology, demography, epidemiology, health economics, and broader social science disciplines relevant to assessment of health determinants, health state valuations and health inequalities. This new journal will focus on the importance of a population based approach to measurement as a way to characterize the complexity of people's health, the diseases and risks that affect it, its distribution, and its valuation, and will attempt to provide a forum for innovative work and debate that bridge the many fields of inquiry relevant to population health in order to contribute to the development of valid

  6. Beam Delivery Simulation: BDSIM - Development & Optimization

    CERN Document Server

    Nevay, Laurence James; Garcia-Morales, H; Gibson, S M; Kwee-Hinzmann, R; Snuverink, J; Deacon, L C

    2014-01-01

    Beam Delivery Simulation (BDSIM) is a Geant4 and C++ based particle tracking code that seamlessly tracks particles through accelerators and detectors, including the full range of particle interaction physics processes from Geant4. BDSIM has been successfully used to model beam loss and background conditions for many current and future linear accelerators such as the Accelerator Test Facility 2 (ATF2) and the International Linear Collider (ILC). Current developments extend its application for use with storage rings, in particular for the Large Hadron Collider (LHC) and the High Luminosity upgrade project (HL-LHC). This paper presents the latest results from using BDSIM to model the LHC as well as the developments underway to improve performance.

  7. Developing new methods to investigate nuclear physics input to the cosmological lithium problem

    Directory of Open Access Journals (Sweden)

    Cook K.J.

    2013-12-01

    A significant challenge to nuclear astrophysics is the cosmological lithium problem, where models of Big Bang nucleosynthesis indicate abundances of 7Li two to four times larger than what is inferred via spectroscopic measurements of metal-poor stars. Recent experimental techniques developed for nuclear reaction studies at energies near the fusion barrier, if extended to reactions of astrophysical interest, may help understand nuclear reactions that can affect the production of 7Li during the Big Bang. Experiments at the ANU, using new experimental techniques, have provided complete pictures of the breakup mechanisms of light nuclei in collisions with heavy targets, such as 208Pb and 209Bi [1]. These experiments revealed dominant breakup mechanisms which had not even been considered in theoretical models. The study of the breakup of 6Li and 7Li following interactions with 58,64Ni and 27Al acts as a stepping stone from this previous work towards future experimental studies of breakup reactions of astrophysical relevance. In all cases studied, breakup is dominantly triggered by nucleon transfer between the colliding partners, but the transfer mechanisms are different. The findings of these experiments and experimental considerations for extensions to reactions of light nuclei, such as d +7Be, will be presented.

  8. Multiobjective optimization of low impact development stormwater controls

    Science.gov (United States)

    Eckart, Kyle; McPhee, Zach; Bolisetti, Tirupati

    2018-07-01

    Green infrastructure such as Low Impact Development (LID) controls is being employed to manage urban stormwater and restore predevelopment hydrological conditions, as well as to improve stormwater runoff quality. Since runoff generation and infiltration processes are nonlinear, there is a need to identify optimal combinations of LID controls. A coupled optimization-simulation model was developed by linking the U.S. EPA Stormwater Management Model (SWMM) to the Borg Multiobjective Evolutionary Algorithm (Borg MOEA). The coupled model performs multiobjective optimization, using SWMM simulations to evaluate potential solutions to the optimization problem. The optimization-simulation tool was used to evaluate LID stormwater controls. A SWMM model was developed, calibrated, and validated for a sewershed in Windsor, Ontario, and LID stormwater controls were tested for three different return periods. LID implementation strategies were optimized using the optimization-simulation model for five different implementation scenarios for each of the three storm events, with the objectives of minimizing peak flow in the storm sewers, reducing total runoff, and minimizing cost. For the sewershed in Windsor, Ontario, peak runoff and total runoff volume were found to be reduced by 13% and 29%, respectively.
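    The coupling pattern described above (an evolutionary search calling a hydrological simulator to score candidate LID layouts) can be sketched without SWMM or the Borg MOEA. The stand-in simulator, its coefficients, and the two hypothetical LID types below are invented for illustration; only the non-dominated filtering step is generic:

```python
import random

def simulate(design):
    """Stand-in for a SWMM run. x and y are fractions of the catchment treated
    by two hypothetical LID types; the relationships are invented for
    illustration, a real study would invoke the SWMM engine here."""
    x, y = design
    peak_flow = 100.0 * (1.0 - 0.4 * x - 0.2 * y)   # m3/s, toy response
    cost = 1.0e6 * (x + y)                           # $, toy cost
    return (peak_flow, cost)

def dominates(a, b):
    """Objective vector a dominates b if it is no worse on every
    objective (lower is better) and strictly better on at least one."""
    return all(ai <= bi for ai, bi in zip(a, b)) and a != b

def pareto_front(designs):
    """Evaluate every candidate and keep only the non-dominated ones."""
    evaluated = [(d, simulate(d)) for d in designs]
    return [(d, obj) for d, obj in evaluated
            if not any(dominates(other, obj) for _, other in evaluated)]

random.seed(1)
designs = [(random.random(), random.random()) for _ in range(50)]
front = pareto_front(designs)
```

    An MOEA such as Borg replaces the random sampling with guided variation, but each candidate is still scored by a full simulator run, which is why simulation cost dominates the optimization budget.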

  9. RDS; A systematic approach towards system thermal hydraulics input code development for a comprehensive deterministic safety analysis

    International Nuclear Information System (INIS)

    Mohd Faiz Salim; Ridha Roslan; Mohd Rizal Mamat

    2013-01-01

    Deterministic Safety Analysis (DSA) is one of the mandatory requirements of the Nuclear Power Plant licensing process, with the aim of ensuring safety compliance with relevant regulatory acceptance criteria. DSA is a technique whereby a set of conservative deterministic rules and requirements is applied to the design and operation of facilities or activities. Computer codes are normally used to assist in performing all required analyses under DSA. To ensure a comprehensive analysis, the conduct of DSA should follow a systematic approach. One of the methodologies proposed is the Standardized and Consolidated Reference Experimental (and Calculated) Database (SCRED), developed by the University of Pisa. Based on this methodology, the use of a Reference Data Set (RDS) as a pre-requisite reference document for developing the input nodalization was proposed. This paper describes the application of RDS with the purpose of assessing its effectiveness. Two RDS documents were developed, for the Integral Test Facility LOBI-MOD2 and its associated Test A1-83. Data and information from various reports and drawings were consulted in preparing the RDS. The results showed that developing the RDS made it possible to consolidate all relevant information in one single document. This is beneficial as it enables preservation of information, promotes quality assurance, allows traceability, facilitates continuous improvement, promotes the resolution of contradictions and, finally, assists in developing the thermal hydraulic input regardless of which code is selected. However, some disadvantages were also recognized, such as the need for experience in making engineering judgments, language barriers in accessing foreign information and limitations of resources. Some possible improvements are suggested to overcome these challenges. (author)

  10. RDS - A systematic approach towards system thermal hydraulics input code development for a comprehensive deterministic safety analysis

    International Nuclear Information System (INIS)

    Salim, Mohd Faiz; Roslan, Ridha; Ibrahim, Mohd Rizal Mamat

    2014-01-01

    Deterministic Safety Analysis (DSA) is one of the mandatory requirements of the Nuclear Power Plant licensing process, with the aim of ensuring safety compliance with relevant regulatory acceptance criteria. DSA is a technique whereby a set of conservative deterministic rules and requirements is applied to the design and operation of facilities or activities. Computer codes are normally used to assist in performing all required analyses under DSA. To ensure a comprehensive analysis, the conduct of DSA should follow a systematic approach. One of the methodologies proposed is the Standardized and Consolidated Reference Experimental (and Calculated) Database (SCRED), developed by the University of Pisa. Based on this methodology, the use of a Reference Data Set (RDS) as a pre-requisite reference document for developing the input nodalization was proposed. This paper describes the application of RDS with the purpose of assessing its effectiveness. Two RDS documents were developed, for the Integral Test Facility LOBI-MOD2 and its associated Test A1-83. Data and information from various reports and drawings were consulted in preparing the RDS. The results showed that developing the RDS made it possible to consolidate all relevant information in one single document. This is beneficial as it enables preservation of information, promotes quality assurance, allows traceability, facilitates continuous improvement, promotes the resolution of contradictions and, finally, assists in developing the thermal hydraulic input regardless of which code is selected. However, some disadvantages were also recognized, such as the need for experience in making engineering judgments, language barriers in accessing foreign information and limitations of resources. Some possible improvements are suggested to overcome these challenges.

  11. RDS - A systematic approach towards system thermal hydraulics input code development for a comprehensive deterministic safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Salim, Mohd Faiz, E-mail: mohdfaizs@tnb.com.my [Nuclear Energy Department, Tenaga Nasional Berhad, Level 32, Dua Sentral, 50470 Kuala Lumpur (Malaysia); Roslan, Ridha [Nuclear Installation Division, Atomic Energy Licensing Board, Batu 24, Jalan Dengkil, 43800 Dengkil, Selangor (Malaysia); Ibrahim, Mohd Rizal Mamat [Technical Support Division, Malaysian Nuclear Agency, Bangi, 43000 Kajang, Selangor (Malaysia)

    2014-02-12

    Deterministic Safety Analysis (DSA) is one of the mandatory requirements of the Nuclear Power Plant licensing process, with the aim of ensuring safety compliance with relevant regulatory acceptance criteria. DSA is a technique whereby a set of conservative deterministic rules and requirements is applied to the design and operation of facilities or activities. Computer codes are normally used to assist in performing all required analyses under DSA. To ensure a comprehensive analysis, the conduct of DSA should follow a systematic approach. One of the methodologies proposed is the Standardized and Consolidated Reference Experimental (and Calculated) Database (SCRED), developed by the University of Pisa. Based on this methodology, the use of a Reference Data Set (RDS) as a pre-requisite reference document for developing the input nodalization was proposed. This paper describes the application of RDS with the purpose of assessing its effectiveness. Two RDS documents were developed, for the Integral Test Facility LOBI-MOD2 and its associated Test A1-83. Data and information from various reports and drawings were consulted in preparing the RDS. The results showed that developing the RDS made it possible to consolidate all relevant information in one single document. This is beneficial as it enables preservation of information, promotes quality assurance, allows traceability, facilitates continuous improvement, promotes the resolution of contradictions and, finally, assists in developing the thermal hydraulic input regardless of which code is selected. However, some disadvantages were also recognized, such as the need for experience in making engineering judgments, language barriers in accessing foreign information and limitations of resources. Some possible improvements are suggested to overcome these challenges.

  12. Integrated design optimization research and development in an industrial environment

    Science.gov (United States)

    Kumar, V.; German, Marjorie D.; Lee, S.-J.

    1989-01-01

    An overview is given of a design optimization project that has been in progress at the GE Research and Development Center for the past few years. The objective of this project is to develop a methodology and a software system for design automation and optimization of structural/mechanical components and systems. The effort focuses on research and development issues as well as on optimization applications that can be related to real-life industrial design problems. The overall technical approach is based on the integration of numerical optimization techniques, finite element methods, CAE and software engineering, and artificial intelligence/expert systems (AI/ES) concepts. The role of each of these engineering technologies in the development of a unified design methodology is illustrated. A software system, DESIGN-OPT, has been developed for both size and shape optimization of structural components subjected to static as well as dynamic loadings. By integrating this software with an automatic mesh generator, a geometric modeler and an attribute specification computer code, a software module, SHAPE-OPT, has been developed for shape optimization. Details of these software packages, together with their applications to some 2- and 3-dimensional design problems, are described.

  13. Development of a fuzzy logic method to build objective functions in optimization problems: application to BWR fuel lattice design

    International Nuclear Information System (INIS)

    Martin-del-Campo, C.; Francois, J.L.; Barragan, A.M.; Palomera, M.A.

    2005-01-01

    In this paper we develop a methodology based on the Fuzzy Logic technique for building multi-objective functions to be used in optimization processes applied to in-core nuclear fuel management. As an example, we selected the problem of determining optimal radial fuel enrichment and gadolinia distributions in a typical Boiling Water Reactor (BWR) fuel lattice. The methodology rests on the mathematical capability of Fuzzy Logic to model nonlinear functions of arbitrary complexity. The utility of Fuzzy Logic is to map an input space into an output space, and the primary mechanism for doing this is a list of if-then statements called rules. The rules refer to variables and to adjectives that describe those variables; the Fuzzy Logic technique interprets the values in the input vectors and, based on the set of rules, assigns values to the output vector. The methodology was developed for the radial optimization of a BWR lattice, where the optimization algorithm employed is Tabu Search. The global objective is to find the optimal distribution of enrichments and burnable poison concentrations in a 10×10 BWR lattice. To that end, a fuzzy control inference system was developed using the Fuzzy Logic Toolbox of Matlab and linked to the Tabu Search optimization process. Results show that Tabu Search combined with Fuzzy Logic performs very well, obtaining lattices with optimal fuel utilization. (authors)
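    A minimal sketch of how a fuzzy rule base can turn lattice performance measures into a single objective value follows. The membership shapes, variable ranges, and rule outputs are all invented for illustration; the paper's actual rule base, built with the Matlab Fuzzy Logic Toolbox, is not reproduced here:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b on support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def shoulder_low(x, b, c):
    """Full membership up to b, falling linearly to zero at c."""
    return 1.0 if x <= b else max(0.0, (c - x) / (c - b))

def lattice_score(k_inf_error, peaking_factor):
    """Sugeno-style weighted-average score in [0, 1]; higher is a better
    lattice. Two if-then rules over invented linguistic terms:
      rule 1: k-inf near target AND low power peaking -> excellent (1.0)
      rule 2: k-inf near target                       -> fair (0.5)"""
    good_k = tri(k_inf_error, -0.01, 0.0, 0.01)        # "k-inf near target"
    low_ppf = shoulder_low(peaking_factor, 1.05, 1.4)  # "low local peaking"
    w1, out1 = min(good_k, low_ppf), 1.0
    w2, out2 = good_k, 0.5
    den = w1 + w2
    return (w1 * out1 + w2 * out2) / den if den > 0 else 0.0
```

    A search heuristic such as Tabu Search would then simply maximize this score over candidate enrichment/gadolinia distributions.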

  14. Beam Delivery Simulation - Recent Developments and Optimization

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00232566; Boogert, Stewart Takashi; Garcia-Morales, H; Gibson, Stephen; Kwee-Hinzmann, Regina; Nevay, Laurence James; Deacon, Lawrence Charles

    2015-01-01

    Beam Delivery Simulation (BDSIM) is a particle tracking code that simulates the passage of particles through both the magnetic accelerator lattice as well as their interaction with the material of the accelerator itself. The Geant4 toolkit is used to give a full range of physics processes needed to simulate both the interaction of primary particles and the production and subsequent propagation of secondaries. BDSIM has already been used to simulate linear accelerators such as the International Linear Collider (ILC) and the Compact Linear Collider (CLIC), but it has recently been adapted to simulate circular accelerators as well, producing loss maps for the Large Hadron Collider (LHC). In this paper the most recent developments, which extend BDSIM’s functionality as well as improve its efficiency are presented. Improvement and refactorisation of the tracking algorithms are presented alongside improved automatic geometry construction for increased particle tracking speed.

  15. Service Operations Optimization: Recent Development in Supply Chain Management

    Directory of Open Access Journals (Sweden)

    Bin Shen

    2015-01-01

    Services are the key to success in operations management. Designing effective strategies through optimization techniques is a fundamental condition for performance improvement in service operations (SOs) management. In this paper, we focus on SOs optimization in the areas of supply chain management, where it creates the greatest business value. Specifically, we study recent developments in SOs optimization for supply chains, categorizing them by four industries (i.e., the e-commerce industry, the consumer service industry, the public sector, and the fashion industry) and four SOs features (i.e., advertising, channel coordination, pricing, and inventory). Moreover, we provide a technical review of these industries/topics and of typical optimization models. The classical optimization approaches for SOs management in supply chains are presented, and the managerial implications of SOs in supply chains are discussed.

  16. Development of High Heat Input Welding High Strength Steel Plate for Oil Storage Tank in Xinyu Steel Company

    Science.gov (United States)

    Zhao, Hemin; Dong, Fujun; Liu, Xiaolin; Xiong, Xiong

    This paper introduces 12MnNiVR, a quenched and tempered pressure vessel steel for oil storage tanks developed by Xinyu Steel for high-heat-input welding, which passed the review of the Boiler and Pressure Vessel Standards Technical Committee in 2009. The review concluded that, compared with similar domestic and foreign steel standards, the key technical indices of the enterprise standard were at an advanced level. Even after electro-gas welding with a heat input of 100 kJ/cm, the welded joints retained excellent low-temperature toughness at -20°C. The steel plate is approved for oil storage tank construction in thicknesses from 10 to 40 mm and at design temperatures from -20°C to 100°C. The paper also studies how the microstructure inherited through processing affects the mechanical properties of the steel. Extensive production practice, with hot metal pretreatment, converter refining, external refining, protective casting, TMCP and heat treatment process measures, has shown that the mechanical properties of the products, including steel given stress-relief heat treatment, are excellent. The stability of performance and the mature technology of Xinyu Steel mean the products can fully serve the demand for steel used in 10-15 million cubic meter large oil storage tanks.

  17. The effectiveness of visual input enhancement on the noticing and L2 development of the Spanish past tense

    Directory of Open Access Journals (Sweden)

    Shawn Loewen

    2016-03-01

    Textual manipulation is a common pedagogic tool used to emphasize specific features of a second language (L2) text, thereby facilitating noticing and, ideally, second language development. Visual input enhancement has been used to investigate the effects of highlighting specific grammatical structures in a text. The current study uses a quasi-experimental design to determine the extent to which textual manipulation increases (a) learners' perception of targeted forms and (b) their knowledge of those forms. Input enhancement was used to highlight the Spanish preterit and imperfect verb forms, and an eye tracker measured the frequency and duration of participants' fixations on the targeted items. In addition, pretests and posttests of the Spanish past tense provided information about participants' knowledge of the targeted forms. Results indicate that learners were aware of the highlighted grammatical forms in the text; however, there was no difference in the amount of attention between the enhanced and unenhanced groups. Both groups improved in their knowledge of the L2 forms, but again there was no differential improvement between the two groups.

  18. Development of Optimal Stressor Scenarios for New Operational Energy Systems

    Science.gov (United States)

    2017-12-01

    Naval Postgraduate School thesis: "Development of Optimal Stressor Scenarios for New Operational Energy Systems," by Geoffrey E. Fastabend, December 2017; thesis advisor: Alejandro S... The thesis develops and tests a simulation model for operational energy related systems in order to develop better stressor scenarios for acceptance testing.

  19. GIS-based approach for optimal siting and sizing of renewables considering techno-environmental constraints and the stochastic nature of meteorological inputs

    Science.gov (United States)

    Daskalou, Olympia; Karanastasi, Maria; Markonis, Yannis; Dimitriadis, Panayiotis; Koukouvinos, Antonis; Efstratiadis, Andreas; Koutsoyiannis, Demetris

    2016-04-01

    Following the legislative EU targets and taking advantage of its high renewable energy potential, Greece can obtain significant benefits from developing its water, solar and wind energy resources. In this context we present a GIS-based methodology for the optimal sizing and siting of solar and wind energy systems at the regional scale, which is tested in the Prefecture of Thessaly. First, we assess the wind and solar potential, taking into account the stochastic nature of the associated meteorological processes (i.e. wind speed and solar radiation, respectively), which is an essential component for both planning (i.e., type selection and sizing of photovoltaic panels and wind turbines) and management purposes (i.e., real-time operation of the system). For the optimal siting, we assess the efficiency and economic performance of the energy system, also accounting for a number of constraints associated with topographic limitations (e.g., terrain slope, proximity to the road and electricity grid networks), environmental legislation and other land use constraints. Based on this analysis, we investigate favorable alternatives using technical, environmental and financial criteria. The final outcome is a set of GIS maps that depict the available energy potential and the optimal layout for photovoltaic panels and wind turbines over the study area. We also consider a hypothetical scenario of future development of the study area, in which we assume the combined operation of the above renewables with major hydroelectric dams and pumped-storage facilities, thus providing a unique hybrid renewable system extended at the regional scale.
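    The constraint-screening step of such a siting methodology reduces, per grid cell, to intersecting boolean masks. The sketch below uses invented threshold values and land-use classes, not the study's actual criteria:

```python
def siting_mask(slope_deg, dist_road_km, land_use,
                max_slope=15.0, max_dist_road_km=2.0,
                excluded=frozenset({"protected", "urban"})):
    """Combine techno-environmental constraints into a boolean eligibility
    grid: a cell is eligible only if every constraint is satisfied.
    All thresholds and class names are illustrative placeholders."""
    mask = []
    for s_row, d_row, lu_row in zip(slope_deg, dist_road_km, land_use):
        mask.append([s <= max_slope and d <= max_dist_road_km and lu not in excluded
                     for s, d, lu in zip(s_row, d_row, lu_row)])
    return mask

# A 2x2 toy raster standing in for the regional GIS layers.
slope = [[5.0, 20.0], [10.0, 8.0]]        # terrain slope, degrees
dist = [[0.5, 0.5], [3.0, 1.0]]           # distance to road network, km
land = [["agricultural", "agricultural"], ["agricultural", "protected"]]
mask = siting_mask(slope, dist, land)
```

    In a GIS the same intersection is performed as raster overlay; the eligible cells then feed the sizing and economic-performance analysis.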

  20. Development of 20 kW input power coupler for 1.3 GHz ERL main linac. Component test at 30 kW IOT test stand

    International Nuclear Information System (INIS)

    Sakai, Hiroshi; Umemori, Kensei; Sakanaka, Shogo; Takahashi, Takeshi; Furuya, Takaaki; Shinoe, Kenji; Ishii, Atsushi; Nakamura, Norio; Sawamura, Masaru

    2009-01-01

    We have started development of an input coupler for a 1.3 GHz ERL superconducting cavity. The required input power is about 20 kW for a cavity accelerating field of 20 MV/m and a beam current of 100 mA in energy-recovery operation. The input coupler is designed based on the STF-BL input coupler, with some modifications applied for CW 20 kW power operation. We fabricated input coupler components such as ceramic windows and bellows and carried out high-power tests of the components using a 30 kW IOT power source and a test stand constructed for the high-power tests. In this report, we mainly describe the results of the high-power tests of the ceramic windows and bellows. (author)

  1. PLEXOS Input Data Generator

    Energy Technology Data Exchange (ETDEWEB)

    2017-02-01

    The PLEXOS Input Data Generator (PIDG) is a tool that enables PLEXOS users to better version their data, automate data processing, collaborate in developing inputs, and transfer data between different production cost modeling and other power systems analysis software. PIDG can process data that is in a generalized format from multiple input sources, including CSV files, PostgreSQL databases, and PSS/E .raw files and write it to an Excel file that can be imported into PLEXOS with only limited manual intervention.
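    PIDG's own interface is not documented in this record; the sketch below only illustrates the general idea of flattening tabular source data into a normalized (object, property, value) table of the kind a production-cost-model import expects. The column names are invented, and stdlib CSV handling stands in for the real PostgreSQL/PSS/E readers and Excel writer:

```python
import csv
import io

def normalize_generators(csv_text):
    """Read generator records from one CSV source and emit rows in a single
    normalized (object, property, value) layout. Column names here are
    hypothetical, not PIDG's actual schema."""
    reader = csv.DictReader(io.StringIO(csv_text))
    rows = []
    for rec in reader:
        name = rec.pop("name")              # the PLEXOS object name
        for prop, value in rec.items():     # every other column is a property
            rows.append({"object": name, "property": prop, "value": value})
    return rows

# Toy input standing in for one of many heterogeneous sources.
source = "name,max_capacity_mw,heat_rate\nGen_A,400,9.5\nGen_B,250,10.2\n"
table = normalize_generators(source)
```

    Versioning the flat table (rather than the binary model file) is what makes diffs and collaboration practical, which is the workflow PIDG targets.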

  2. A Novel Approach to Develop the Lower Order Model of Multi-Input Multi-Output System

    Science.gov (United States)

    Rajalakshmy, P.; Dharmalingam, S.; Jayakumar, J.

    2017-10-01

    A mathematical model is a virtual entity that uses mathematical language to describe the behavior of a system. Mathematical models are used particularly in the natural sciences and engineering disciplines such as physics, biology, and electrical engineering, as well as in social sciences such as economics, sociology and political science. Physicists, engineers, computer scientists, and economists use mathematical models most extensively. With the advent of high-performance processors and advanced mathematical computation, it is possible to develop high-performing simulators for complicated Multi Input Multi Output (MIMO) systems such as quadruple-tank systems, aircraft, boilers, etc. This paper presents the development of the mathematical model of a 500 MW utility boiler, a highly complex system. A synergistic combination of operational experience, system identification and lower-order modeling philosophy has been used effectively to develop a simplified but accurate model of the circulation system of a utility boiler, which is a MIMO system. The results obtained are in good agreement with the physics of the process and with the results obtained through the design procedure. The model can be used directly for control system studies and to realize hardware simulators for boiler testing and operator training.
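    One common lower-order-modelling step, fitting a reduced first-order model to step-response data from a higher-order plant by least squares, can be sketched as follows. The plant parameters and search ranges are invented, and this is not the boiler model of the paper:

```python
import math

def second_order_step(t, k=2.0, tau1=5.0, tau2=1.0):
    """'True' plant: step response of two first-order lags in series,
    y(t) = k[1 - (tau1*e^(-t/tau1) - tau2*e^(-t/tau2)) / (tau1 - tau2)]."""
    return k * (1 - (tau1 * math.exp(-t / tau1) - tau2 * math.exp(-t / tau2))
                / (tau1 - tau2))

def fit_first_order(ts, ys):
    """Grid-search a reduced model y = K(1 - e^(-t/tau)) in least squares.
    Grid ranges (tau in 0.1..20, K in 0.1..5) are arbitrary choices."""
    best = None
    for tau10 in range(1, 201):
        tau = tau10 / 10.0
        resp = [1 - math.exp(-t / tau) for t in ts]   # shape for this tau
        for k10 in range(1, 51):
            k = k10 / 10.0
            err = sum((y - k * r) ** 2 for r, y in zip(resp, ys))
            if best is None or err < best[0]:
                best = (err, k, tau)
    return best[1], best[2]

ts = [i * 0.5 for i in range(1, 61)]          # 0.5 .. 30 s
ys = [second_order_step(t) for t in ts]       # "measured" step response
K, tau = fit_first_order(ts, ys)
```

    The reduced model recovers the steady-state gain and an effective time constant close to the sum of the two lags, which is the sense in which a lower-order model remains accurate for control studies.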

  3. The Impact of Input Quality on Early Sign Development in Native and Non-Native Language Learners

    Science.gov (United States)

    Lu, Jenny; Jones, Anna; Morgan, Gary

    2016-01-01

    There is debate about how input variation influences child language. Most deaf children are exposed to a sign language from their non-fluent hearing parents and experience a delay in exposure to accessible language. A small number of children receive language input from their deaf parents who are fluent signers. Thus it is possible to document the…

  4. Development of a Multi-Event Trajectory Optimization Tool for Noise-Optimized Approach Route Design

    NARCIS (Netherlands)

    Braakenburg, M.L.; Hartjes, S.; Visser, H.G.; Hebly, S.J.

    2011-01-01

    This paper presents preliminary results from an ongoing research effort towards the development of a multi-event trajectory optimization methodology that allows the synthesis of RNAV approach routes that minimize a cumulative measure of noise, taking into account the total noise effect aggregated for

  5. Optimization and Simulation in Drug Development - Review and Analysis

    DEFF Research Database (Denmark)

    Schjødt-Eriksen, Jens; Clausen, Jens

    2003-01-01

    We give a review of pharmaceutical R&D and of the mathematical simulation and optimization methods used to support decision making within the pharmaceutical development process. The complex nature of drug development is pointed out through a description of the various phases of the pharmaceutical development process. A part of the paper is dedicated to the use of simulation techniques to support clinical trials. The paper ends with a section describing portfolio modelling methods in the context of the pharmaceutical industry.

  6. Gestures and multimodal input

    OpenAIRE

    Keates, Simeon; Robinson, Peter

    1999-01-01

    For users with motion impairments, the standard keyboard and mouse arrangement for computer access often presents problems, and other approaches have to be adopted to overcome this. In this paper, we describe the development of a prototype multimodal input system based on two gestural input channels. Results from extensive user trials of this system are presented. These trials showed that the physical and cognitive loads on the user can quickly become excessive and detrimental to the interaction…

  7. Development of optimized segmentation map in dual energy computed tomography

    Science.gov (United States)

    Yamakawa, Keisuke; Ueki, Hironori

    2012-03-01

    Dual energy computed tomography (DECT) has been widely used in clinical practice and has been particularly effective for tissue diagnosis. In DECT, the difference between two attenuation coefficients acquired at two X-ray energies enables tissue segmentation. One problem in conventional DECT is that the segmentation deteriorates in some cases, such as bone removal, for two reasons. Firstly, the segmentation map is optimized without considering the X-ray condition (tube voltage and current). If we consider the tube voltage, it is possible to create an optimized map, but unfortunately we cannot consider the tube current. Secondly, the X-ray condition itself is not optimized. The condition can be set empirically, but this means that the optimal condition is not used reliably. To solve these problems, we have developed methods for optimizing the map (Method-1) and the condition (Method-2). In Method-1, the map is optimized to minimize segmentation errors, with the distribution of the attenuation coefficient modeled in a way that accounts for the tube current. In Method-2, the optimal condition is chosen to minimize segmentation errors over tube voltage-current combinations while keeping the total exposure constant. We evaluated the effectiveness of Method-1 in a phantom experiment under a fixed condition, and of Method-2 in a phantom experiment under different combinations calculated from the constant total exposure. When Method-1 was followed by Method-2, the segmentation error was reduced from 37.8 to 13.5%. These results demonstrate that our methods can achieve highly accurate segmentation while keeping the total exposure constant.
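The map-optimization idea (Method-1) can be sketched in a toy one-dimensional form: two materials produce noisy attenuation-ratio measurements, and we scan a decision threshold to minimize the segmentation error. The class means, the noise level (standing in for the tube-current dependence), and the balanced error criterion are all illustrative assumptions.

```python
import numpy as np

# Toy version of optimizing a DECT segmentation map: "bone" and
# "soft tissue" yield attenuation ratios mu_low/mu_high with
# class-dependent noise (noise grows as tube current drops). We scan
# a threshold on the ratio and keep the one minimizing the error.
rng = np.random.default_rng(1)
sigma = 0.05                             # measurement noise level
bone = rng.normal(1.6, sigma, 1000)      # attenuation ratio, bone
soft = rng.normal(1.3, sigma, 1000)      # attenuation ratio, soft tissue

thresholds = np.linspace(1.2, 1.7, 501)

def seg_error(t):
    # balanced misclassification rate at threshold t
    return (np.mean(bone < t) + np.mean(soft >= t)) / 2

errors = [seg_error(t) for t in thresholds]
best_t = thresholds[int(np.argmin(errors))]
print(best_t)   # near the midpoint 1.45 for equal-variance classes
```

The real method optimizes a two-dimensional map over both attenuation coefficients, but the minimize-the-error structure is the same.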

  8. Social and Linguistic Input in Low-Income African American Mother-Child Dyads from 1 Month through 2 Years: Relations to Vocabulary Development

    Science.gov (United States)

    Shimpi, Priya M.; Fedewa, Alicia; Hans, Sydney

    2012-01-01

    The relation of social and linguistic input measures to early vocabulary development was examined in 30 low-income African American mother-infant pairs. Observations were conducted when the child was 0 years, 1 month (0;1), 0;4, 0;8, 1;0, 1;6, and 2;0. Maternal input was coded for word types and tokens, contingent responsiveness, and…

  9. Development of EPICS Input Output Controller and User Interface for the PEFP Low Level RF Control System

    Energy Technology Data Exchange (ETDEWEB)

    Song, Young Gi; Kim, Han Sung; Seol, Kyung Tae; Kwon, Hyeok Jung; Cho, Yong Sub [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2010-05-15

    The Low-Level RF (LLRF) control system of the Proton Engineering Frontier Project (PEFP) was developed in 2006 to handle the driving frequency for the Radio Frequency Quadrupole (RFQ) and Drift Tube Linac (DTL) cavities. The RF amplitude and phase of the accelerating field were controlled to within 1% and 1 degree, respectively, as required for stability. Operators have been using the LLRF control system through a Windows-based text console as the operator interface, and the system could not be integrated with the Experimental Physics and Industrial Control System (EPICS) Input Output Controllers (IOCs) of the other subsections of the PEFP facility. The main objective of this study is to provide operators of the LLRF control system with a user-friendly and convenient operating environment. The new LLRF control system is composed of a Versa Module Eurocard (VME) baseboard, a PCI Mezzanine Card (PMC), a Board Support Package (BSP), EPICS software tools and the Real-Time Operating System (RTOS) VxWorks. A test of the new LLRF control system with a dummy cavity shows that operators can control and monitor operating parameters for a desired feedback action by using EPICS Channel Access (CA).
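The amplitude/phase feedback idea behind the 1% and 1 degree figures can be sketched with a trivial integral control loop. The plant model, gain, and starting errors are illustrative assumptions, not the PEFP hardware or its actual control law.

```python
# Hypothetical sketch of the LLRF feedback idea: an integral
# controller drives the measured cavity amplitude and phase toward
# their setpoints, inside the 1% / 1 degree stability bands quoted
# in the abstract. All numbers here are assumed for illustration.
amp_set, phase_set = 1.0, 0.0   # normalized amplitude, phase (deg)
amp, phase = 0.8, 5.0           # initial perturbed cavity state
ki = 0.3                        # integral gain (assumed)

for _ in range(100):            # feedback iterations
    amp += ki * (amp_set - amp)         # nudge drive amplitude
    phase += ki * (phase_set - phase)   # nudge drive phase

amp_err_pct = abs(amp - amp_set) / amp_set * 100
phase_err_deg = abs(phase - phase_set)
print(amp_err_pct, phase_err_deg)   # both well inside 1% / 1 degree
```

In the real system the loop runs on the VME/VxWorks IOC and the setpoints and readbacks are exposed as EPICS process variables over Channel Access.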

  10. Development of EPICS Input Output Controller and User Interface for the PEFP Low Level RF Control System

    International Nuclear Information System (INIS)

    Song, Young Gi; Kim, Han Sung; Seol, Kyung Tae; Kwon, Hyeok Jung; Cho, Yong Sub

    2010-01-01

    The Low-Level RF (LLRF) control system of the Proton Engineering Frontier Project (PEFP) was developed in 2006 to handle the driving frequency for the Radio Frequency Quadrupole (RFQ) and Drift Tube Linac (DTL) cavities. The RF amplitude and phase of the accelerating field were controlled to within 1% and 1 degree, respectively, as required for stability. Operators have been using the LLRF control system through a Windows-based text console as the operator interface, and the system could not be integrated with the Experimental Physics and Industrial Control System (EPICS) Input Output Controllers (IOCs) of the other subsections of the PEFP facility. The main objective of this study is to provide operators of the LLRF control system with a user-friendly and convenient operating environment. The new LLRF control system is composed of a Versa Module Eurocard (VME) baseboard, a PCI Mezzanine Card (PMC), a Board Support Package (BSP), EPICS software tools and the Real-Time Operating System (RTOS) VxWorks. A test of the new LLRF control system with a dummy cavity shows that operators can control and monitor operating parameters for a desired feedback action by using EPICS Channel Access (CA).

  11. Contributions for larval development optimization of Homarus gammarus

    Directory of Open Access Journals (Sweden)

    Pedro Tiago Fonseca Sá

    2014-06-01

    Rising seawater temperature resulted in a decrease of the intermoult period in all larval development stages at all tested temperatures, ranging from 4.77 days (Z1) to 16.5 days (Z3) at 16°C, whereas at 23°C it ranged from 3.02 days (Z1) to 9.75 days (Z3). The results obtained are an extremely useful guide for future optimization of protocols for larval development of H. gammarus.

  12. Development and optimization of a device for differential pressure measurement

    International Nuclear Information System (INIS)

    Santarine, G.A.

    1980-01-01

    The measurement of small values of differential pressure is studied. Several situations are described where accurate differential-pressure measurement is necessary in routine work at the Thermohydraulic Laboratory, and the major pressure-measurement devices and their respective ranges are reviewed. The development of a device for differential pressure measurement is presented, followed by the design of a calibration bench covering the foreseen range, start-up tests, optimization, calibration, performance analysis and conclusions. (Author) [pt

  13. Optimal application of climate data to the development of design wind speeds

    DEFF Research Database (Denmark)

    Kruger, Andries C.; Retief, Johan V.; Goliger, Adam M.

    2014-01-01

    The Wind Atlas for South Africa (WASA) project focuses, amongst others, on the development of a Regional Extreme Wind Climate (REWC) for South Africa. Wind farms are planned for areas with relatively strong and sustained winds, with wind turbines classed according to their suitability for different wind conditions. The REWC statistics are used during the construction and design phase to make assumptions about the local strong-wind climate that the wind turbines will be exposed to, with the local environment and topography as additional input. The simultaneous development of the REWC and revision of the extreme wind statistics of South Africa created an opportunity to bring together a range of expertise that could contribute to the optimal development of design wind speed information. This includes knowledge of the statistical extraction of extreme wind observations from reanalysis and model data, the quality control…
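The extreme-value step behind design wind speeds can be sketched as a Gumbel fit to annual-maximum gust speeds, here by the method of moments, followed by a return-level calculation. The data, the assumed distribution parameters, and the 50-year return period are illustrative, not WASA results.

```python
import numpy as np

# Fit a Gumbel distribution to synthetic annual-maximum wind speeds
# (method of moments) and derive the 50-year return-period speed:
#   v_T = loc - scale * ln(-ln(1 - 1/T))
rng = np.random.default_rng(3)
loc_true, scale_true = 25.0, 3.0                    # m/s, assumed
u = rng.uniform(size=500)
annual_max = loc_true + scale_true * (-np.log(-np.log(u)))

scale = annual_max.std(ddof=1) * np.sqrt(6) / np.pi
loc = annual_max.mean() - 0.5772 * scale            # Euler-Mascheroni

T = 50.0                                            # return period, years
v50 = loc - scale * np.log(-np.log(1.0 - 1.0 / T))
print(v50)   # close to loc_true - scale_true*ln(-ln(0.98)) ≈ 36.7 m/s
```

Production extreme-wind work would use maximum-likelihood or peaks-over-threshold fits with quality-controlled observations, but the return-level formula is the same.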

  14. Investigation, development and application of optimal output feedback theory. Volume 2: Development of an optimal, limited state feedback outer-loop digital flight control system for 3-D terminal area operation

    Science.gov (United States)

    Broussard, J. R.; Halyo, N.

    1984-01-01

    This report contains the development of a digital outer-loop three dimensional radio navigation (3-D RNAV) flight control system for a small commercial jet transport. The outer-loop control system is designed using optimal stochastic limited state feedback techniques. Options investigated using the optimal limited state feedback approach include integrated versus hierarchical control loop designs, 20 samples per second versus 5 samples per second outer-loop operation and alternative Type 1 integration command errors. Command generator tracking techniques used in the digital control design enable the jet transport to automatically track arbitrary curved flight paths generated by waypoints. The performance of the design is demonstrated using detailed nonlinear aircraft simulations in the terminal area, frequency domain multi-input sigma plots, frequency domain single-input Bode plots and closed-loop poles. The response of the system to a severe wind shear during a landing approach is also presented.
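The "optimal stochastic limited state feedback" design above builds on LQR-style gain synthesis. A minimal sketch, assuming a toy discrete double-integrator plant rather than the jet-transport model: iterate the Riccati difference equation to a steady-state gain and verify the closed loop is stable. Limited state feedback additionally restricts which states the gain may use; the unconstrained full-state case shown here is the baseline.

```python
import numpy as np

# Discrete-time LQR via backward Riccati iteration on an assumed
# double-integrator plant x[k+1] = A x[k] + B u[k].
dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.0], [dt]])
Q = np.eye(2)               # state cost
R = np.array([[1.0]])       # control cost

P = Q.copy()
for _ in range(500):        # iterate to the steady-state solution
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ (A - B @ K)

# Closed-loop spectral radius < 1 means the gain is stabilizing.
rho = max(abs(np.linalg.eigvals(A - B @ K)))
print(K, rho)
```

Command-generator tracking then augments such a regulator so the closed loop follows waypoint-generated reference trajectories.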

  15. Data Mining and Optimization Tools for Developing Engine Parameters Tools

    Science.gov (United States)

    Dhawan, Atam P.

    1998-01-01

    This project was awarded for understanding the problem and developing a plan for data mining tools for use in designing and implementing an Engine Condition Monitoring System. Tricia Erhardt and I studied the problem domain for developing an engine condition monitoring system using the sparse and non-standardized datasets to be made available through a consortium at NASA Lewis Research Center. We visited NASA three times to discuss additional issues related to the dataset, which was not made available to us. We discussed and developed a general framework of data mining and optimization tools to extract useful information from sparse and non-standard datasets. These discussions led to the training of Tricia Erhardt to develop genetic algorithm (GA) based search programs, which were written in C++ and used to demonstrate the capability of the GA in searching for an optimal solution in noisy datasets. From the study and discussions with NASA LeRC personnel, we then prepared a proposal, which is being submitted to NASA, for future work on the development of data mining algorithms for engine condition monitoring. The proposed set of algorithms uses wavelet processing to create a multi-resolution pyramid of the data for GA-based multi-resolution optimal search.
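The GA-based search described above can be sketched in a few lines. The noisy toy objective, population size, and operators are illustrative assumptions, not the C++ programs from the project.

```python
import random

# Minimal genetic algorithm: maximize a noisy fitness
# f(x) = -(x - 3)^2 + noise over real x, using tournament
# selection, blend crossover, and Gaussian mutation.
random.seed(0)

def fitness(x):
    return -(x - 3.0) ** 2 + random.gauss(0, 0.01)   # noisy objective

pop = [random.uniform(-10, 10) for _ in range(40)]
for _ in range(60):                       # generations
    scored = [(fitness(x), x) for x in pop]
    def pick():                           # tournament selection, size 3
        return max(random.sample(scored, 3))[1]
    nxt = []
    for _ in range(len(pop)):
        a, b = pick(), pick()
        child = 0.5 * (a + b)             # blend crossover
        child += random.gauss(0, 0.1)     # mutation
        nxt.append(child)
    pop = nxt

best = max(pop, key=fitness)
print(best)   # converges near the optimum x = 3
```

Averaging fitness over repeated evaluations is the usual extra step when the noise is large relative to the objective's curvature.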

  16. Optimization and Validation of the Developed Uranium Isotopic Analysis Code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. H.; Kang, M. Y.; Kim, Jinhyeong; Choi, H. D. [Seoul National Univ., Seoul (Korea, Republic of)

    2014-10-15

    γ-ray spectroscopy is a representative non-destructive assay (NDA) for nuclear material, and is less time-consuming and less expensive than destructive analysis methods. Although the destructive technique is more precise than the NDA technique, correction algorithms can improve the performance of γ-spectroscopy. For this reason, an analysis code for uranium isotopic analysis was developed by the Applied Nuclear Physics Group at Seoul National University. Overlapped γ- and X-ray peaks in the 89-101 keV Xα region are fitted with Gaussian and Lorentzian peak functions plus tail and background functions. In this study, optimizations of the full-energy peak efficiency calibration and of the fitting parameters for the peak tail and background are performed, and validated with 24-hour acquisitions of CRM uranium samples. The analysis performance is improved for HEU samples, but further optimization of the fitting parameters is required for LEU sample analysis. In the future, optimization of the fitting parameters for various types of uranium samples will be performed, and ²³⁴U isotopic analysis algorithms and correction algorithms (coincidence effect, self-attenuation effect) will be developed.
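The peak-fitting step can be illustrated with a reduced model: a single Gaussian photopeak on a linear background, fitted by nonlinear least squares. The synthetic spectrum, peak parameters, and the omission of the Lorentzian and tail components are simplifying assumptions relative to the code described above.

```python
import numpy as np
from scipy.optimize import curve_fit

# Fit a Gaussian photopeak plus linear background to a synthetic
# spectrum in the 89-101 keV region of interest.
def model(x, amp, mu, sig, b0, b1):
    return amp * np.exp(-0.5 * ((x - mu) / sig) ** 2) + b0 + b1 * x

x = np.linspace(89.0, 101.0, 240)          # energy axis, keV
rng = np.random.default_rng(2)
truth = (1000.0, 94.7, 0.4, 50.0, -0.3)    # assumed true parameters
y = model(x, *truth) + rng.normal(0, 5.0, x.size)   # counting noise

popt, _ = curve_fit(model, x, y, p0=(800.0, 95.0, 0.5, 40.0, 0.0))
print(popt[1])   # fitted peak centroid, close to 94.7 keV
```

The real analysis fits several overlapped Gaussian and Lorentzian components simultaneously, which mainly changes the model function, not the fitting machinery.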

  17. Development of an optimized procedure bridging design and structural analysis codes for the automatized design of the SMART

    International Nuclear Information System (INIS)

    Kim, Tae Wan; Park, Keun Bae; Choi, Suhn; Kim, Kang Soo; Jeong, Kyeong Hoon; Lee, Gyu Mahn

    1998-09-01

    In this report, an optimized design and analysis procedure is established for application to the SMART (System-integrated Modular Advanced ReacTor) development. The purpose of the optimized procedure is to minimize time consumption and engineering effort by streamlining the design and feedback interactions. To achieve this goal, the data and information generated through design development should be transferred directly to the analysis program with minimum manual operation. Verification of the design concept requires considerable effort, since communication between design and analysis involves a time-consuming stage for the conversion of input information. In this report, an optimized procedure bridging the design and analysis stages is established utilizing IDEAS, ABAQUS and ANSYS. (author). 3 refs., 2 tabs., 5 figs

  18. A three-phase to three-phase series-resonant power converter with optimal input current waveforms, Part II: implementation and results

    NARCIS (Netherlands)

    Huisman, H.

    1988-01-01

    For pt.I see ibid., vol.35, no.2, p.263-8 (1988). A 15 kW three-phase prototype series-resonant power converter is constructed. The converter features sinusoidal output voltage and sinusoidal input currents. The control concepts and necessary electronics, as well as the layout of the power circuit, are described…

  19. A three-phase to three-phase series-resonant power converter with optimal input current waveforms, Part I: control strategy

    NARCIS (Netherlands)

    Huisman, H.

    1988-01-01

    A control strategy for multiphase-input multiphase-output AC to AC series-resonant (SR) power converters is presented. After reviewing some basics in SR power converters, a hierarchy of control mechanisms is presented, together with their respective theoretical backgrounds and practical limitations.

  20. Development of Inventory Optimization System for Operation Nuclear Plants

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Se-Jin; Park, Jong-Hyuk; Yoo, Sung-Soo; Lee, Sang-Guk [Korea Electric Power Research Institutes, Taejon (Korea, Republic of)

    2006-07-01

    Inventory control of spare parts plays an increasingly important role in operations management, which is why inventory management functions have been added to systems such as manufacturing resources planning (MRP) and enterprise resource planning (ERP). However, most of these contributions share a similar theoretical background: the concepts and techniques are mainly based on mathematical assumptions and on modeling of spare-parts inventory situations. Nuclear utilities in Korea have had problems managing the optimum level of spare parts even though they use an MRP system, because most items have long lead times and are imported from the United States, Canada, France and elsewhere. We developed an inventory optimization system for operating nuclear plants to resolve these problems. In this paper, we report the data flow process, data loading and the inventory calculation process. The main contribution of this paper is the development of an inventory optimization system that can be used in domestic power plants.
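The kind of calculation such a system automates can be sketched with textbook inventory formulas: an economic order quantity and a reorder point with safety stock for a long-lead-time imported part. All figures are illustrative assumptions, not plant data.

```python
import math

# Classic inventory sizing for one spare part:
#   EOQ = sqrt(2 * D * S / H)
#   reorder point = demand over lead time + safety stock
annual_demand = 120.0        # units/year (assumed)
order_cost = 500.0           # cost per purchase order
holding_cost = 40.0          # cost to hold one unit for a year
lead_time_years = 0.5        # imported part: 6-month lead time
daily_sigma = 0.6            # std dev of daily demand
z = 1.65                     # ~95% service level factor

eoq = math.sqrt(2 * annual_demand * order_cost / holding_cost)
safety = z * daily_sigma * math.sqrt(lead_time_years * 365)
reorder_point = annual_demand * lead_time_years + safety
print(eoq, reorder_point)
```

Long lead times inflate both the lead-time demand and the safety-stock term, which is exactly why imported spares dominate the optimization problem described above.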

  1. Development of Inventory Optimization System for Operation Nuclear Plants

    International Nuclear Information System (INIS)

    Jang, Se-Jin; Park, Jong-Hyuk; Yoo, Sung-Soo; Lee, Sang-Guk

    2006-01-01

    Inventory control of spare parts plays an increasingly important role in operations management, which is why inventory management functions have been added to systems such as manufacturing resources planning (MRP) and enterprise resource planning (ERP). However, most of these contributions share a similar theoretical background: the concepts and techniques are mainly based on mathematical assumptions and on modeling of spare-parts inventory situations. Nuclear utilities in Korea have had problems managing the optimum level of spare parts even though they use an MRP system, because most items have long lead times and are imported from the United States, Canada, France and elsewhere. We developed an inventory optimization system for operating nuclear plants to resolve these problems. In this paper, we report the data flow process, data loading and the inventory calculation process. The main contribution of this paper is the development of an inventory optimization system that can be used in domestic power plants.

  2. Development of a MODIS-Derived Surface Albedo Data Set: An Improved Model Input for Processing the NSRDB

    Energy Technology Data Exchange (ETDEWEB)

    Maclaurin, Galen [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sengupta, Manajit [National Renewable Energy Lab. (NREL), Golden, CO (United States); Xie, Yu [National Renewable Energy Lab. (NREL), Golden, CO (United States); Gilroy, Nicholas [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-12-01

    A significant source of bias in the transposition of global horizontal irradiance to plane-of-array (POA) irradiance arises from inaccurate estimation of surface albedo. The current physics-based model used to produce the National Solar Radiation Database (NSRDB) relies on estimates of surface albedo from a reanalysis climatology produced at relatively coarse spatial resolution compared to that of the NSRDB. As an input to spectral decomposition and transposition models, more accurate surface albedo data from remotely sensed imagery at finer spatial resolution would improve accuracy in the final product. The National Renewable Energy Laboratory (NREL) developed an improved white-sky (bi-hemispherical reflectance) broadband (0.3-5.0 μm) surface albedo data set for processing the NSRDB from two existing data sets: a gap-filled albedo product and a daily snow cover product. The Moderate Resolution Imaging Spectroradiometer (MODIS) sensors onboard the Terra and Aqua satellites have provided high-quality measurements of surface albedo at 30 arc-second spatial resolution and 8-day temporal resolution since 2001. The high spatial and temporal resolutions and the temporal coverage of the MODIS sensor allow for improved modeling of POA irradiance in the NSRDB. However, cloud and snow cover interfere with MODIS observations of ground surface albedo, so the observations require post-processing. The MODIS production team applied a gap-filling methodology to interpolate observations obscured by clouds or ephemeral snow. This approach filled pixels with ephemeral snow cover because the 8-day temporal resolution is too coarse to accurately capture the variability of snow cover and its impact on albedo estimates. However, for this project, accurate representation of daily snow cover change is important in producing the NSRDB. Therefore, NREL also used the Integrated Multisensor Snow and Ice Mapping System data set, which provides daily snow cover observations of the…
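The merge of the two data sets described above can be sketched per pixel: keep the gap-filled MODIS albedo except where the daily snow map flags snow, where a snow albedo is substituted. The array values and the constant snow albedo are illustrative assumptions, not the NREL processing.

```python
import numpy as np

# Overwrite gap-filled 8-day MODIS albedo with a representative snow
# albedo wherever the daily snow/ice map reports snow cover.
albedo_8day = np.array([[0.15, 0.22],
                        [0.18, 0.30]])            # gap-filled MODIS
snow_today = np.array([[True, False],
                       [False, True]])            # daily snow mask
SNOW_ALBEDO = 0.80                                # assumed constant

albedo_daily = np.where(snow_today, SNOW_ALBEDO, albedo_8day)
print(albedo_daily)
```

A production pipeline would vary the snow albedo by land cover and snow age, but the masked substitution is the core of combining an 8-day product with a daily one.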

  3. EXIOBASE 3: Developing a Time Series of Detailed Environmentally Extended Multi-Regional Input-Output Tables

    NARCIS (Netherlands)

    Stadler, K.; Wood, R.; Bulavskaya, T.; Södersten, C.J.; Simas, M.; Schmidt, S.; Usubiaga, A.; Acosta-Fernández, J.; Kuenen, J.; Bruckner, M.; Giljum, S.; Lutter, S.; Merciai, S.; Schmidt, J.H.; Theurl, M.C.; Plutzar, C.; Kastner, T.; Eisenmenger, N.; Erb, K.H.; Koning, A. de; Tukker, A.

    2018-01-01

    Environmentally extended multiregional input-output (EE MRIO) tables have emerged as a key framework to provide a comprehensive description of the global economy and analyze its effects on the environment. Of the available EE MRIO databases, EXIOBASE stands out as a database compatible with the…
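The calculation an EE MRIO table supports can be shown in a two-sector toy form: total output follows from the Leontief inverse, and an environmental footprint follows from per-output emission intensities. The coefficients, demand vector, and intensities below are toy values, not EXIOBASE data.

```python
import numpy as np

# Leontief model: x = (I - A)^-1 y, then footprint = f . x,
# where A holds technical coefficients, y final demand, and
# f emissions per unit of sector output.
A = np.array([[0.1, 0.2],
              [0.3, 0.1]])         # inter-industry coefficients
y = np.array([100.0, 50.0])        # final demand per sector
f = np.array([0.5, 1.2])           # e.g. kg CO2 per unit output

L = np.linalg.inv(np.eye(2) - A)   # Leontief inverse
x = L @ y                          # total sector output
footprint = f @ x                  # emissions embodied in demand
print(x, footprint)
```

Real EE MRIO work does the same algebra on matrices with thousands of region-sector pairs and many environmental extensions.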

  4. Pharmaceutical development and optimization of azithromycin suppository for paediatric use

    Science.gov (United States)

    Kauss, Tina; Gaubert, Alexandra; Boyer, Chantal; Ba, Boubakar B.; Manse, Muriel; Massip, Stephane; Léger, Jean-Michel; Fawaz, Fawaz; Lembege, Martine; Boiron, Jean-Michel; Lafarge, Xavier; Lindegardh, Niklas; White, Nicholas J.; Olliaro, Piero; Millet, Pascal; Gaudin, Karen

    2013-01-01

    Pharmaceutical development and manufacturing process optimization work was undertaken in order to propose a potential paediatric rectal formulation of azithromycin as an alternative to existing oral or injectable formulations. The target product profile was to be easy-to-use, cheap and stable in tropical conditions, with bioavailability comparable to oral forms, rapidly achieving and maintaining bactericidal concentrations. PEG solid solution suppositories were characterized in vitro using visual, HPLC, DSC, FTIR and XRD analyses. In vitro drug release and in vivo bioavailability were assessed; a study in rabbits compared the bioavailability of the optimized solid solution suppository to rectal solution and intra-venous product (as reference) and to the previous, non-optimized formulation (suspended azithromycin suppository). The bioavailability of azithromycin administered as solid solution suppositories relative to intra-venous was 43%, which compared well to the target of 38% (oral product in humans). The results of 3-month preliminary stability and feasibility studies were consistent with industrial production scale-up. This product has potential both as a classical antibiotic and as a product for use in severely ill children in rural areas. Industrial partners for further development are being sought. PMID:23220079

  5. Pharmaceutical development and optimization of azithromycin suppository for paediatric use.

    Science.gov (United States)

    Kauss, Tina; Gaubert, Alexandra; Boyer, Chantal; Ba, Boubakar B; Manse, Muriel; Massip, Stephane; Léger, Jean-Michel; Fawaz, Fawaz; Lembege, Martine; Boiron, Jean-Michel; Lafarge, Xavier; Lindegardh, Niklas; White, Nicholas J; Olliaro, Piero; Millet, Pascal; Gaudin, Karen

    2013-01-30

    Pharmaceutical development and manufacturing process optimization work was undertaken in order to propose a potential paediatric rectal formulation of azithromycin as an alternative to existing oral or injectable formulations. The target product profile was to be easy-to-use, cheap and stable in tropical conditions, with bioavailability comparable to oral forms, rapidly achieving and maintaining bactericidal concentrations. PEG solid solution suppositories were characterized in vitro using visual, HPLC, DSC, FTIR and XRD analyses. In vitro drug release and in vivo bioavailability were assessed; a study in rabbits compared the bioavailability of the optimized solid solution suppository to rectal solution and intra-venous product (as reference) and to the previous, non-optimized formulation (suspended azithromycin suppository). The bioavailability of azithromycin administered as solid solution suppositories relative to intra-venous was 43%, which compared well to the target of 38% (oral product in humans). The results of 3-month preliminary stability and feasibility studies were consistent with industrial production scale-up. This product has potential both as a classical antibiotic and as a product for use in severely ill children in rural areas. Industrial partners for further development are being sought. Copyright © 2012 Elsevier B.V. All rights reserved.

  6. Development of a methodology for maintenance optimization at Kozloduy NPP

    International Nuclear Information System (INIS)

    Kitchev, E.

    1997-01-01

    The paper presents an overview of a project for the development of an applicable strategy and methods for Kozloduy NPP (KNPP) to optimize its maintenance program in order to meet current risk-based maintenance requirements. The strategy, in the format of an Integrated Maintenance Program (IMP) manual, will define the targets of the optimization process, the major stages and elements of this process, and their relationships. The IMP embodies aspects of US NRC Maintenance Rule compliance and facilitates the integration of the KNPP programs and processes that impact plant maintenance and safety. The methods, in the format of IMP Instructions (IM-PI), will define how the different IMP stages can be implemented and the IMP targets achieved in the KNPP environment. (author). 8 refs

  7. Phasing Out a Polluting Input

    OpenAIRE

    Eriksson, Clas

    2015-01-01

    This paper explores economic policies related to the potential conflict between economic growth and the environment. It applies a model with directed technological change and focuses on the case with low elasticity of substitution between clean and dirty inputs in production. New technology is substituted for the polluting input, which results in a gradual decline in pollution along the optimal long-run growth path. In contrast to some recent work, the era of pollution and environmental polic...

  8. Design of an X-band accelerating structure using a newly developed structural optimization procedure

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Xiaoxia [Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai 201800 (China); University of Chinese Academy of Sciences, Beijing 100049 (China); Fang, Wencheng; Gu, Qiang [Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai 201800 (China); Zhao, Zhentang, E-mail: zhaozhentang@sinap.ac.cn [Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai 201800 (China); University of Chinese Academy of Sciences, Beijing 100049 (China)

    2017-05-11

    An X-band high gradient accelerating structure is a challenging technology for implementation in advanced electron linear accelerator facilities. The present work discusses the design of an X-band accelerating structure for dedicated application to a compact hard X-ray free electron laser facility at the Shanghai Institute of Applied Physics, and numerous design optimizations are conducted with consideration for radio frequency (RF) breakdown, RF efficiency, short-range wakefields, and dipole/quadrupole field modes, to ensure good beam quality and a high accelerating gradient. The designed X-band accelerating structure is a constant gradient structure with a 4π/5 operating mode and input and output dual-feed couplers in a racetrack shape. The design process employs a newly developed effective optimization procedure for optimization of the X-band accelerating structure. In addition, the specific design of couplers providing high beam quality by eliminating dipole field components and reducing quadrupole field components is discussed in detail.

  9. A Development of a System Enables Character Input and PC Operation via Voice for a Physically Disabled Person with a Speech Impediment

    Science.gov (United States)

    Tanioka, Toshimasa; Egashira, Hiroyuki; Takata, Mayumi; Okazaki, Yasuhisa; Watanabe, Kenzi; Kondo, Hiroki

    We have designed and implemented a voice-based PC operation support system for a physically disabled person with a speech impediment. Voice operation is an effective method for a physically disabled person with involuntary movement of the limbs and head. We applied a commercial speech recognition engine to develop our system for practical purposes; adopting a commercial engine reduces development cost and helps make the system useful to other people with speech impediments. We customized the engine so that it can recognize the utterances of a person with a speech impediment. We restricted the words that the recognition engine recognizes and separated target words from words with similar pronunciations to avoid misrecognition. The huge number of words registered in commercial speech recognition engines causes frequent misrecognition of a speech-impaired user's utterances, because those utterances are unclear and unstable. We solved this problem by narrowing the input choices down to a small number and by registering the user's ambiguous pronunciations in addition to the original ones. To realize full character input and PC operation with a small vocabulary, we designed multiple input modes with categorized dictionaries and introduced two-step input in each mode except numeral input, enabling correct operation with a small number of words. The system we have developed is at a practical level. The first author of this paper is physically disabled with a speech impediment. Using this system, he has been able not only to input characters into a PC but also to operate the Windows system smoothly; he uses it in his daily life, and this paper was written by him with it. At present, the speech recognition is customized to him. It is, however, possible to customize the system for other users by changing the registered words and pronunciations according to each user's utterances.
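The two-step, categorized-dictionary scheme can be sketched as nested lookups: a mode word first selects a small dictionary, then an item word selects the output, so the recognizer only ever discriminates among a handful of candidates. The mode and item words below are illustrative, not the system's actual vocabulary.

```python
# Two-step restricted-vocabulary input: small candidate sets at each
# step keep misrecognition low for unclear, unstable utterances.
modes = {
    "letter": {"alpha": "a", "bravo": "b", "charlie": "c"},
    "number": {"one": "1", "two": "2", "three": "3"},
}

def two_step_input(mode_word, item_word):
    dictionary = modes[mode_word]      # step 1: pick the mode
    return dictionary[item_word]       # step 2: pick the item

print(two_step_input("letter", "bravo"))   # -> "b"
```

A real deployment would also register each user's ambiguous pronunciations as extra keys mapping to the same outputs.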

  10. Development of Earthquake Ground Motion Input for Preclosure Seismic Design and Postclosure Performance Assessment of a Geologic Repository at Yucca Mountain, NV

    International Nuclear Information System (INIS)

    I. Wong

    2004-01-01

    This report describes a site-response model and its implementation for developing earthquake ground motion input for preclosure seismic design and postclosure assessment of the proposed geologic repository at Yucca Mountain, Nevada. The model implements a random-vibration theory (RVT), one-dimensional (1D) equivalent-linear approach to calculate site response effects on ground motions. The model provides results in terms of spectral acceleration including peak ground acceleration, peak ground velocity, and dynamically-induced strains as a function of depth. In addition to documenting and validating this model for use in the Yucca Mountain Project, this report also describes the development of model inputs, implementation of the model, its results, and the development of earthquake time history inputs based on the model results. The purpose of the site-response ground motion model is to incorporate the effects on earthquake ground motions of (1) the approximately 300 m of rock above the emplacement levels beneath Yucca Mountain and (2) soil and rock beneath the site of the Surface Facilities Area. A previously performed probabilistic seismic hazard analysis (PSHA) (CRWMS M and O 1998a [DIRS 103731]) estimated ground motions at a reference rock outcrop for the Yucca Mountain site (Point A), but those results do not include these site response effects. Thus, the additional step of applying the site-response ground motion model is required to develop ground motion inputs that are used for preclosure and postclosure purposes
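A far simpler cousin of the site-response model above is the closed-form SH-wave transfer function for a single uniform soil layer over an elastic half-space; it is shown here only to illustrate what "site response effects on ground motions" means quantitatively, and its layer properties are assumptions, not Yucca Mountain values.

```python
import numpy as np

# One-layer site amplification:
#   |H(f)| = 1 / sqrt(cos^2(2*pi*f*h/Vs) + a^2 * sin^2(2*pi*f*h/Vs)),
# with a the soil/rock impedance ratio. The peak amplification 1/a
# occurs at the fundamental frequency f0 = Vs / (4 h).
h, vs = 50.0, 300.0                  # layer thickness (m), shear-wave speed (m/s)
rho_s, rho_r, vr = 1800.0, 2400.0, 1500.0   # densities and rock Vs (assumed)
alpha = (rho_s * vs) / (rho_r * vr)  # impedance ratio (~0.15)

f = np.linspace(0.1, 10.0, 2000)     # frequency axis, Hz
k = 2 * np.pi * f * h / vs
H = 1.0 / np.sqrt(np.cos(k) ** 2 + alpha**2 * np.sin(k) ** 2)

f0 = vs / (4 * h)                    # fundamental frequency, 1.5 Hz
print(f0, H.max())                   # peak amplification near 1/alpha
```

The RVT equivalent-linear approach generalizes this to many layers with strain-dependent stiffness and damping, but the notion of frequency-dependent amplification is the same.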

  11. Development of Earthquake Ground Motion Input for Preclosure Seismic Design and Postclosure Performance Assessment of a Geologic Repository at Yucca Mountain, NV

    Energy Technology Data Exchange (ETDEWEB)

    I. Wong

    2004-11-05

    This report describes a site-response model and its implementation for developing earthquake ground motion input for preclosure seismic design and postclosure assessment of the proposed geologic repository at Yucca Mountain, Nevada. The model implements a random-vibration theory (RVT), one-dimensional (1D) equivalent-linear approach to calculate site response effects on ground motions. The model provides results in terms of spectral acceleration including peak ground acceleration, peak ground velocity, and dynamically-induced strains as a function of depth. In addition to documenting and validating this model for use in the Yucca Mountain Project, this report also describes the development of model inputs, implementation of the model, its results, and the development of earthquake time history inputs based on the model results. The purpose of the site-response ground motion model is to incorporate the effects on earthquake ground motions of (1) the approximately 300 m of rock above the emplacement levels beneath Yucca Mountain and (2) soil and rock beneath the site of the Surface Facilities Area. A previously performed probabilistic seismic hazard analysis (PSHA) (CRWMS M&O 1998a [DIRS 103731]) estimated ground motions at a reference rock outcrop for the Yucca Mountain site (Point A), but those results do not include these site response effects. Thus, the additional step of applying the site-response ground motion model is required to develop ground motion inputs that are used for preclosure and postclosure purposes.

  12. Developing Green GDP Accounting for Thai Agricultural Sector Using the Economic Input Output - Life Cycle Assessment to Assess Green Growth

    OpenAIRE

    Attavanich, Witsanu; Mungkung, Rattanawan; Mahathanaseth, Itthipong; Sanglestsawai, Santi; Jirajari, Athiwatr

    2016-01-01

    There is no indicator measuring Thailand’s green growth by valuing the resource degradation and environmental damage costs. This article aims to estimate Thailand’s green gross domestic product (GDP), which takes into account environmental damage costs, with a detailed analysis of the agricultural sector using the Economic Input Output - Life Cycle Assessment (EIO-LCA) approach. The representative product in each sector was selected based on the available life cycle inventory data, economic values and...

  13. The Effects of Input Flood and Consciousness-Raising Approach on Collocation Knowledge Development of Language Learners

    Directory of Open Access Journals (Sweden)

    Elaheh Hamed Mahvelati

    2012-11-01

    Full Text Available Many researchers stress the importance of lexical coherence and emphasize the need for teaching collocations at all levels of language proficiency. Thus, this study was conducted to measure the relative effectiveness of explicit (consciousness-raising) versus implicit (input flood) collocation instruction with regard to learners’ knowledge of both lexical and grammatical collocations. Ninety-five upper-intermediate learners, who were randomly assigned to the control and experimental groups, served as the participants of this study. While one of the experimental groups was provided with input flood treatment, the other group received explicit collocation instruction. In contrast, the participants in the control group did not receive any instruction on learning collocations. The results of the study, which were collected through a pre-test, an immediate post-test and a delayed post-test, revealed that although both methods of teaching collocations proved effective, the explicit consciousness-raising approach was significantly superior to the implicit input flood treatment.

  14. Word reading skill predicts anticipation of upcoming spoken language input: a study of children developing proficiency in reading.

    Science.gov (United States)

    Mani, Nivedita; Huettig, Falk

    2014-10-01

    Despite the efficiency with which language users typically process spoken language, a growing body of research finds substantial individual differences in both the speed and accuracy of spoken language processing, potentially attributable to participants' literacy skills. Against this background, the current study examined the role of word reading skill in listeners' anticipation of upcoming spoken language input in children at the cusp of learning to read; if reading skills affect predictive language processing, then children at this stage of literacy acquisition should be most susceptible to the effects of reading skills on spoken language processing. We tested 8-year-olds on their prediction of upcoming spoken language input in an eye-tracking task. Although children, as in previous studies, were successfully able to anticipate upcoming spoken language input, there was a strong positive correlation between children's word reading skills (but not their pseudo-word reading and meta-phonological awareness or their spoken word recognition skills) and their prediction skills. We suggest that these findings are most compatible with the notion that the process of learning orthographic representations during reading acquisition sharpens pre-existing lexical representations, which in turn also supports anticipation of upcoming spoken words. Copyright © 2014 Elsevier Inc. All rights reserved.
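The correlation the study reports can be computed with a standard Pearson coefficient; a minimal sketch follows, with made-up reading and prediction scores that are not the study's data.

```python
# Minimal Pearson correlation between two score lists (hypothetical data).
import math

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

reading = [12, 15, 9, 20, 17, 11]               # hypothetical reading scores
prediction = [0.3, 0.5, 0.2, 0.8, 0.6, 0.35]    # hypothetical anticipation scores
r = pearson_r(reading, prediction)
print(f"r = {r:.3f}")
```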

  15. Development and innovation of system resources to optimize patient care.

    Science.gov (United States)

    Johnson, Thomas J; Brownlee, Michael J

    2018-04-01

    Various incremental and disruptive healthcare innovations that are occurring or may occur are discussed, with insights on how multihospital health systems can prepare for the future and optimize the continuity of patient care provided. Innovation in patient care is occurring at an ever-increasing rate, and this is especially true relative to the transition of patients through the care continuum. Health systems must leverage their ability to standardize and develop electronic health record (EHR) systems and other infrastructure necessary to support patient care and optimize outcomes; examples include 3D printing of patient-specific medication dosage forms to enhance precision medicine, the use of drones for medication delivery, and the expansion of telehealth capabilities to improve patient access to the services of pharmacists and other healthcare team members. Disruptive innovations in pharmacy services and delivery will alter how medications are prescribed and delivered to patients now and in the future. Further, technology may also fundamentally alter how and where pharmacists and pharmacy technicians care for patients. This article explores the various innovations that are occurring and that will likely occur in the future, particularly as they apply to multihospital health systems and patient continuity of care. Pharmacy departments that anticipate and are prepared to adapt to incremental and disruptive innovations can demonstrate value in the multihospital health system through strategies such as optimizing the EHR, identifying telehealth opportunities, supporting infrastructure, and integrating services. Copyright © 2018 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  16. Housing Development Building Management System (HDBMS) for Optimized Electricity Bills

    Directory of Open Access Journals (Sweden)

    Weixian Li

    2017-08-01

    Full Text Available Smart buildings are modern buildings that allow residents to enjoy sustainable comfort with high efficiency of electricity usage. These objectives can be achieved by applying appropriate, capable optimization algorithms and techniques. This paper presents a Housing Development Building Management System (HDBMS) strategy inspired by the Building Energy Management System (BEMS) concept that integrates smart buildings with Supply Side Management (SSM) and Demand Side Management (DSM) systems. HDBMS is a Multi-Agent System (MAS) based decentralized decision-making system proposed by various authors. The MAS-based HDBMS was created using JAVA on an IEEE FIPA compliant multi-agent platform named JADE. It allows agents to communicate, interact and negotiate over the energy supply and demand of the smart buildings to provide optimal energy usage and minimal electricity costs. This reduces the load on the power distribution system in smart buildings, and simulation studies have shown the potential of the proposed HDBMS strategy to provide an optimal solution for smart building energy management.

  17. ICRF array module development and optimization for high power density

    International Nuclear Information System (INIS)

    Ryan, P.M.; Swain, D.W.

    1997-02-01

    This report describes the analysis and optimization of the proposed International Thermonuclear Experimental Reactor (ITER) antenna array for the ion cyclotron range of frequencies (ICRF). The objectives of this effort were to: (1) minimize the applied radio-frequency (rf) voltages occurring in vacuum by proper layout and shaping of components, and limit the component surfaces/volumes where the rf voltage is high; (2) study the effects of magnetic insulation, as applied to the current design; (3) provide electrical characteristics of the antenna for the development and analysis of tuning, arc detection/suppression, and systems for discriminating between arcs and edge-localized modes (ELMs); and (4) maintain a close interface with the mechanical design.

  18. An Optimal Method for Developing Global Supply Chain Management System

    Directory of Open Access Journals (Sweden)

    Hao-Chun Lu

    2013-01-01

    Full Text Available Owing to the transparency of supply chains, enhancing the competitiveness of industries has become a vital concern. Therefore, many developing countries look for possible methods to save costs. From this point of view, this study deals with the complicated liberalization policies in the global supply chain management system and proposes a mathematical model, via flow-control constraints, which is utilized to cope with bonded warehouses for obtaining maximal profits. Numerical experiments illustrate that the proposed model can be effectively solved to obtain the optimal profits in the global supply chain environment.

  19. Development and optimization of a stove-powered thermoelectric generator

    Science.gov (United States)

    Mastbergen, Dan

    Almost a third of the world's population still lacks access to electricity. Most of these people use biomass stoves for cooking which produce significant amounts of wasted thermal energy, but no electricity. Less than 1% of this energy in the form of electricity would be adequate for basic tasks such as lighting and communications. However, an affordable and reliable means of accomplishing this is currently nonexistent. The goal of this work is to develop a thermoelectric generator to convert a small amount of wasted heat into electricity. Although this concept has been around for decades, previous attempts have failed due to insufficient analysis of the system as a whole, leading to ineffective and costly designs. In this work, a complete design process is undertaken including concept generation, prototype testing, field testing, and redesign/optimization. Detailed component models are constructed and integrated to create a full system model. The model encompasses the stove operation, thermoelectric module, heat sinks, charging system and battery. A 3000 cycle endurance test was also conducted to evaluate the effects of operating temperature, module quality, and thermal interface quality on the generator's reliability, lifetime and cost effectiveness. The results from this testing are integrated into the system model to determine the lowest system cost in $/Watt over a five year period. Through this work the concept of a stove-based thermoelectric generator is shown to be technologically and economically feasible. In addition, a methodology is developed for optimizing the system for specific regional stove usage habits.
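A system model like the one described integrates component relations such as the thermoelectric module's output. A back-of-the-envelope version is the matched-load power estimate below; the module parameters are hypothetical, not the generator's measured values.

```python
# Matched-load thermoelectric output: P = (S*dT)^2 / (4*R), the maximum-power
# operating point where the load resistance equals the internal resistance.

def teg_power_matched(seebeck_v_per_k, delta_t_k, internal_ohm):
    """Open-circuit voltage S*dT, then max power transfer into a matched load."""
    v_open = seebeck_v_per_k * delta_t_k
    return v_open ** 2 / (4.0 * internal_ohm)

# Hypothetical module: 50 mV/K effective Seebeck coefficient, 150 K across
# the module, 3 ohm internal resistance.
p = teg_power_matched(seebeck_v_per_k=0.05, delta_t_k=150.0, internal_ohm=3.0)
print(f"matched-load power: {p:.2f} W")
```

A few watts at the matched point is consistent with the abstract's claim that well under 1% of a stove's thermal output suffices for lighting and communications.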

  20. Development and design optimization of water hydraulic manipulator for ITER

    International Nuclear Information System (INIS)

    Kekaelaeinen, Teemu; Mattila, Jouni; Virvalo, Tapio

    2009-01-01

    This paper describes one of the research projects carried out in The Preparation of Remote Handling Engineers for ITER (PREFIT) program within the European Fusion Training Scheme (EFTS). This research project focuses on the design and optimization of water hydraulic manipulators used to test several remote handling tasks of ITER at the Divertor Test Platform 2 (DTP2), Tampere, Finland, and later in ITER. In this project, a water hydraulic manipulator designed and built by the Department of Intelligent Hydraulics and Automation at Tampere University of Technology, Finland (TUT/IHA) is further optimized as a case study for a given manipulator requirement specification in order to illustrate and verify the developed comprehensive design guidelines and performance metrics. Without meaningful manipulator performance parameters, the evaluation of alternative robot manipulator designs remains ad hoc at best. Therefore, more comprehensive design guidelines and performance metrics are needed for comparing and improving different existing manipulators against task requirements or for comparing different digital prototypes at an early design phase of manipulators. In this paper the description of the project, its background and developments are presented and discussed.

  1. An analytical fuzzy-based approach to L2-gain optimal control of input-affine nonlinear systems using Newton-type algorithm

    Science.gov (United States)

    Milic, Vladimir; Kasac, Josip; Novakovic, Branko

    2015-10-01

    This paper is concerned with L2-gain optimisation of input-affine nonlinear systems controlled by an analytic fuzzy logic system. Unlike conventional fuzzy-based strategies, the non-conventional analytic fuzzy control method does not require an explicit fuzzy rule base. As the first contribution of this paper, we prove, by using the Stone-Weierstrass theorem, that the proposed fuzzy system without a rule base is a universal approximator. The second contribution of this paper is an algorithm for solving a finite-horizon minimax problem for L2-gain optimisation. The proposed algorithm consists of a recursive chain rule for first- and second-order derivatives, Newton's method, the multi-step Adams method and automatic differentiation. Finally, the results of this paper are evaluated on a second-order nonlinear system.
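The Newton step at the heart of such an algorithm can be sketched in the scalar case: update x ← x − f′(x)/f″(x) to find a stationary point, with first- and second-order derivatives supplied (in the paper, by recursive chain rule and automatic differentiation; here hand-coded). The cost function below is a stand-in, not the paper's minimax objective.

```python
# Scalar Newton iteration toward a stationary point of a smooth cost.

def newton_stationary(df, d2f, x0, tol=1e-12, max_iter=50):
    """Drive the first derivative df to zero using curvature d2f."""
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Stand-in convex cost f(x) = (x - 2)^4 + x^2, with hand-coded derivatives.
df = lambda x: 4 * (x - 2) ** 3 + 2 * x
d2f = lambda x: 12 * (x - 2) ** 2 + 2
x_star = newton_stationary(df, d2f, x0=0.0)
print(f"stationary point near {x_star:.6f}")
```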

  2. Hypothesis: Low frequency heart rate variability (LF-HRV) is an input for undisclosed yet biological adaptive control, governing the cardiovascular regulations to assure optimal functioning.

    Science.gov (United States)

    Gabbay, Uri; Bobrovsky, Ben Zion

    2012-02-01

    Cardiovascular regulation is considered today as having three levels: autoregulation, neural regulation and hormonal regulation. We hypothesize that cardiovascular regulation has an additional (fourth) control level, an outer hierarchical (adaptive) loop in which the LF-HRV amplitude serves as a reference input that the neural cardiovascular center detects and responds to in order to maintain LF-HRV around some prescribed level. Supporting evidence: the absence of LF-HRV during artificial cardiac pacing may be associated with "pacemaker syndrome", which has not been sufficiently understood despite apparently unimpaired cardiovascular performance. The hypothesis may provide an essential basis for understanding several cardiovascular morbidities and insight toward diagnostic measures and treatments (including but not limited to adding variability to the pulse generator of artificial pacemakers to eliminate "pacemaker syndrome"). Copyright © 2011 Elsevier Ltd. All rights reserved.

  3. Vision-based coaching: optimizing resources for leader development

    Science.gov (United States)

    Passarelli, Angela M.

    2015-01-01

    Leaders develop in the direction of their dreams, not in the direction of their deficits. Yet many coaching interactions intended to promote a leader’s development fail to leverage the benefits of the individual’s personal vision. Drawing on intentional change theory, this article postulates that coaching interactions that emphasize a leader’s personal vision (future aspirations and core identity) evoke a psychophysiological state characterized by positive emotions, cognitive openness, and optimal neurobiological functioning for complex goal pursuit. Vision-based coaching, via this psychophysiological state, generates a host of relational and motivational resources critical to the developmental process. These resources include: formation of a positive coaching relationship, expansion of the leader’s identity, increased vitality, activation of learning goals, and a promotion–orientation. Organizational outcomes as well as limitations to vision-based coaching are discussed. PMID:25926803

  4. Reliability Evaluation for Optimizing Electricity Supply in a Developing Country

    Directory of Open Access Journals (Sweden)

    Mark Ndubuka NWOHU

    2007-09-01

    Full Text Available The reliability standards for electricity supply in a developing country, like Nigeria, have to be determined based on past engineering principles and practice. Because of the high demand for electrical power due to rapid development, industrialization and rural electrification, the economic, social and political climate in which the electric power supply industry now operates should be critically viewed to ensure that the production of electrical power is augmented and remains uninterrupted. This paper presents an economic framework that can be used to optimize electric power system reliability. Finally, cost models are investigated to take into account the economic analysis of system reliability, which can be periodically updated to improve the overall reliability of the electric power system.

  5. Vision-based coaching: Optimizing resources for leader development

    Directory of Open Access Journals (Sweden)

    Angela M. Passarelli

    2015-04-01

    Full Text Available Leaders develop in the direction of their dreams, not in the direction of their deficits. Yet many coaching interactions intended to promote a leader’s development fail to leverage the developmental benefits of the individual’s personal vision. Drawing on Intentional Change Theory, this article postulates that coaching interactions that emphasize a leader’s personal vision (future aspirations and core identity) evoke a psychophysiological state characterized by positive emotions, cognitive openness, and optimal neurobiological functioning for complex goal pursuit. Vision-based coaching, via this psychophysiological state, generates a host of relational and motivational resources critical to the developmental process. These resources include: formation of a positive coaching relationship, expansion of the leader’s identity, increased vitality, activation of learning goals, and a promotion-orientation. Organizational outcomes as well as limitations to vision-based coaching are discussed.

  6. Enabling optimal energy options under the Clean Development Mechanism

    International Nuclear Information System (INIS)

    Gilau, Asmerom M.; Van Buskirk, Robert; Small, Mitchell J.

    2007-01-01

    This paper addresses the cost effectiveness of renewable energy technologies in achieving low abatement costs and promoting sustainable development under the Clean Development Mechanism (CDM). According to the results of our optimal energy options analysis, at project scale, compared with a diesel-only energy option, photovoltaic (PV)-diesel (PVDB), wind-diesel (WDB) and PV-wind-diesel (PVWDB) hybrids are very cost-effective energy options. Moreover, energy options with high levels of renewable energy, including 100% renewables, have the lowest net present cost and they are already cost effective without CDM. On the other hand, while the removal of about 87% of carbon dioxide emissions could be achieved at negative cost, the initial investment could increase by a factor of 40, which is one of the primary barriers hindering wider renewable energy application in developing countries. Thus, in order to increase developing countries' participation in the carbon market, CDM policy should shift from a purely market-oriented approach to investigating how to facilitate renewable energy projects through barrier removal. We therefore recommend that further research focus on how to efficiently remove renewable energy implementation barriers as a means to improve developing countries' participation in meaningful emission reduction while at the same time meeting the needs of sustainable economic development.
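The net-present-cost comparison underlying such an analysis can be sketched as capital plus discounted annual fuel and O&M costs over a project life. All figures below are hypothetical placeholders, not the study's data; they only illustrate how a high-capital, low-fuel hybrid can undercut a diesel-only option over time.

```python
# Net present cost: capital outlay plus discounted recurring costs.

def net_present_cost(capital, annual_cost, rate, years):
    """Sum annual costs discounted at `rate` over `years`, plus capital."""
    return capital + sum(annual_cost / (1 + rate) ** t for t in range(1, years + 1))

# Hypothetical 20-year project at an 8% discount rate.
diesel_only = net_present_cost(capital=10_000, annual_cost=8_000, rate=0.08, years=20)
pv_diesel = net_present_cost(capital=45_000, annual_cost=2_500, rate=0.08, years=20)
print(f"diesel-only NPC: {diesel_only:,.0f}")
print(f"PV-diesel NPC:   {pv_diesel:,.0f}")
```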

  7. A planning support system to optimize approval of private housing development projects

    Science.gov (United States)

    Hussnain, M. Q.; Wakil, K.; Waheed, A.; Tahir, A.

    2016-06-01

    Out of Pakistan's population of 182 million, 38% reside in urban areas with an average growth rate of 1.6%, raising the urban housing demand significantly. Poor state response to fulfil the housing needs has resulted in a mushroom growth of private housing schemes (PHS) over the years. Consequently, in just five major cities of Punjab, there are 383 legal and 150 illegal private housing development projects against 120 public sector housing schemes. A major factor behind the uncontrolled growth of unapproved PHS is the prolonged approval process in the concerned approval authorities, requiring 13 months on average. Currently, manual and paper-based approaches are used for vetting and granting permission, which is highly subjective and non-transparent. This study aims to design a flexible planning support system (PSS) to optimize the vetting process of PHS projects under any development authority in Pakistan by reducing the time and cost required for site and document investigations. Relying on the review of regulatory documents and interviews with professional planners and land developers, this study describes the structure of a PSS developed using open-source geo-spatial tools such as OpenGeo Suite, PHP, and PostgreSQL. It highlights the development of a Knowledge Module (based on regulatory documents) containing equations related to scheme type, size (area), location, access road, components of the layout plan, planning standards and other related approval checks. Furthermore, it presents the architecture of the database module and system data requirements categorized as base datasets (built-in part of PSS) and input datasets (related to the housing project under approval). It is practically demonstrated that developing a customized PSS to optimize the PHS approval process in Pakistan is achievable with geospatial technology.
With the provision of such a system, the approval process for private housing schemes not only becomes quicker and user-friendly but also
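A Knowledge Module of the kind described turns each regulation into a machine-checkable predicate over a submitted scheme's attributes. The sketch below is illustrative only: the rule names and thresholds are invented, not Punjab's actual planning standards.

```python
# Hypothetical vetting rules: each maps a scheme's attributes to pass/fail.
RULES = {
    "min_area_acres": lambda s: s["area_acres"] >= 100,
    "min_access_road_ft": lambda s: s["access_road_ft"] >= 40,
    "open_space_share": lambda s: s["open_space_pct"] >= 7.0,
}

def vet_scheme(scheme):
    """Return the list of failed rule names (empty list = passes vetting)."""
    return [name for name, check in RULES.items() if not check(scheme)]

scheme = {"area_acres": 120, "access_road_ft": 30, "open_space_pct": 8.5}
failures = vet_scheme(scheme)
print("failed checks:", failures)  # the access road is too narrow here
```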

  8. A model of optimization for local energy infrastructure development

    International Nuclear Information System (INIS)

    Juroszek, Zbigniew; Kudelko, Mariusz

    2016-01-01

    The authors present a non-linear, optimization model supporting the planning of local energy systems development. The model considers two forms of final energy – heat and electricity. The model reflects both private and external costs and is designed to show the social perspective. It considers the variability of the marginal costs attributed to local renewable resources. In order to demonstrate the capacity of the model, the authors present a case study by modelling the development of the energy infrastructure in a municipality located in the south of Poland. The ensuing results show that a swift and significant shift in the local energy policy of typical central European municipalities is needed. The modelling is done in two scenarios – with and without the internalization of external environmental costs. The results confirm that the internalization of the external costs of energy production on a local scale leads to a significant improvement in the allocation of resources. - Highlights: • A model for municipal energy system development in Central European environment has been developed. • The variability of marginal costs of local, renewable fuels is considered. • External, environmental costs are considered. • The model reflects both network and individual energy infrastructure (e.g. individual housing boilers). • A swift change in Central European municipal energy infrastructure is necessary.

  9. Development of a pump-turbine runner based on multiobjective optimization

    International Nuclear Information System (INIS)

    Xuhe, W; Baoshan, Z; Lei, T; Jie, Z; Shuliang, C

    2014-01-01

    As a key component of a reversible pump-turbine unit, the pump-turbine runner rotates in the pump or turbine direction according to the demand of the power grid, so higher efficiencies under both operating modes are of great importance for energy saving. In the present paper, a multiobjective optimization design strategy, which includes a 3D inverse design method, CFD calculations, the response surface method (RSM) and a multiobjective genetic algorithm (MOGA), is introduced to develop a model pump-turbine runner for a middle-high head pumped storage plant. Parameters controlling the blade shape, such as blade loading and the blade lean angle at the high pressure side, are chosen as input parameters, while runner efficiencies under both pump and turbine modes are selected as objective functions. In order to validate the availability of the optimization design system, one runner configuration from the Pareto front was manufactured for experimental research. Test results show that the highest unit efficiency is 91.0% under turbine mode and 90.8% under pump mode for the designed runner, with prototype efficiencies of 93.88% and 93.27%, respectively. Viscous CFD calculations for the full passage model are also conducted, aimed at identifying the hydraulic improvements through internal flow analyses.
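The MOGA step produces a Pareto front over the two objectives (pump-mode and turbine-mode efficiency, both maximized); extracting the non-dominated set can be sketched as below. The candidate efficiency pairs are invented for illustration, not the paper's designs.

```python
# Keep the designs not dominated by any other (maximization in both axes).

def pareto_front(points):
    """A point is dominated if another point is at least as good in both objectives."""
    front = []
    for p in points:
        dominated = any(q[0] >= p[0] and q[1] >= p[1] and q != p for q in points)
        if not dominated:
            front.append(p)
    return front

# Hypothetical (pump efficiency %, turbine efficiency %) candidates.
designs = [(90.8, 91.0), (90.2, 91.3), (91.1, 90.1), (89.9, 90.5), (90.5, 90.9)]
front = pareto_front(designs)
print("non-dominated designs:", front)
```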

  10. Developing optimal nurses work schedule using integer programming

    Science.gov (United States)

    Shahidin, Ainon Mardhiyah; Said, Mohd Syazwan Md; Said, Noor Hizwan Mohamad; Sazali, Noor Izatie Amaliena

    2017-08-01

    Time management is the art of arranging, organizing and scheduling one's time for the purpose of generating more effective work and productivity. Scheduling is the process of deciding how to commit resources between a variety of possible tasks. Thus, it is crucial for every organization to have a good work schedule for its staff. Ward nurses at hospitals work around the clock every day, so their duties are organized by shift scheduling. This study is aimed at solving the nurse scheduling problem at an emergency ward of a private hospital. A 7-day work schedule for 7 consecutive weeks satisfying all the constraints set by the hospital is developed using Integer Programming. The work schedule obtained for the nurses gives an optimal solution in which all the constraints are satisfied.
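The structure of such a scheduling integer program can be shown on a toy instance: binary variables x[nurse][shift], a minimum coverage per shift, a maximum load per nurse, and total assignments minimized. The instance below is tiny enough to solve by brute force; the paper's 7-week problem would use an IP solver instead. The sizes and limits are invented for illustration.

```python
# Brute-force 0/1 scheduling: 4 nurses x 3 shifts, each shift needs >= 2
# nurses, each nurse works <= 2 shifts; minimize total assigned shifts.
from itertools import product

NURSES, SHIFTS = 4, 3
MIN_COVER = 2   # nurses required on each shift
MAX_LOAD = 2    # maximum shifts per nurse

best, best_cost = None, None
for bits in product((0, 1), repeat=NURSES * SHIFTS):
    x = [bits[n * SHIFTS:(n + 1) * SHIFTS] for n in range(NURSES)]
    if any(sum(row) > MAX_LOAD for row in x):          # load constraint
        continue
    if any(sum(x[n][s] for n in range(NURSES)) < MIN_COVER
           for s in range(SHIFTS)):                    # coverage constraint
        continue
    cost = sum(bits)
    if best_cost is None or cost < best_cost:
        best, best_cost = x, cost

print("minimum total assignments:", best_cost)
```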

  11. Summary report of the 3. research co-ordination meeting on development of reference input parameter library for nuclear model calculations of nuclear data (Phase 1: Starter File)

    International Nuclear Information System (INIS)

    Oblozinsky, P.

    1997-09-01

    The report contains the summary of the third and last Research Co-ordination Meeting on "Development of Reference Input Parameter Library for Nuclear Model Calculations of Nuclear Data (Phase I: Starter File)", held at the ICTP, Trieste, Italy, from 26 to 29 May 1997. Details are given on the status of the Handbook and the Starter File - two major results of the project. (author)

  12. Optimization of Application of Nitrogen Fertilizers for Growth and Yield of Forage Sorghum under Low-Input and Conventional Farming Systems

    Directory of Open Access Journals (Sweden)

    M. Pourazizi

    2013-10-01

    Full Text Available In order to maintain sustainable agriculture and prevent excessive use of chemical fertilizers, supplying part of the plant needs by organic fertilizers is necessary. In this respect, the effects of nitrogen (N) source and rate on yield and yield components of forage sorghum were evaluated in a factorial experiment arranged in a randomized complete block design with three replications at the Research Farm of Shahrekord University in 2010. Treatments consisted of three N sources (urea fertilizer, cow manure, and an equal combination of urea fertilizer + cow manure) and three N levels (80, 160 and 240 kg/ha N), equivalent to 174, 348 and 522 kg/ha of urea, 26.2, 52.5 and 78.7 Mg/ha of cow manure, and the corresponding combinations of urea fertilizer + cow manure at each nitrogen level, respectively. The results showed that increasing N application, by increasing leaf, stem and panicle weights and stem diameter, caused a linear increase of forage yield in the urea fertilizer and cow manure treatments and a quadratic increase in the combined fertilizer. The highest leaf, stem and panicle weights (600, 3789 and 823 g/m2) and the highest fresh forage yield (44 Mg/ha) were observed in the 240 kg/ha N combined treatment. However, there was no significant difference in forage yield between this treatment and the 160 kg/ha N treatment. Overall, the results indicated that the potential of sorghum production can be increased by conjunctive use of animal manure and chemical fertilizers, even at low levels of these fertilizers, i.e., in low-input agriculture.

  13. A nonlinear optimal control approach to stabilization of a macroeconomic development model

    Science.gov (United States)

    Rigatos, G.; Siano, P.; Ghosh, T.; Sarno, D.

    2017-11-01

    A nonlinear optimal (H-infinity) control approach is proposed for the problem of stabilization of the dynamics of a macroeconomic development model that is known as the Grossman-Helpman model of endogenous product cycles. The dynamics of the macroeconomic development model is divided in two parts. The first one describes economic activities in a developed country and the second part describes the variation of economic activities in a country under development, which tries to modify its production so as to serve the needs of the developed country. The article shows that through control of the macroeconomic model of the developed country, one can finally control the dynamics of the economy in the country under development. The control method through which this is achieved is nonlinear H-infinity control. The macroeconomic model for the country under development undergoes approximate linearization around a temporary operating point. This is defined at each time instant by the present value of the system's state vector and the last value of the control input vector that was exerted on it. The linearization is based on Taylor series expansion and the computation of the associated Jacobian matrices. For the linearized model an H-infinity feedback controller is computed. The controller's gain is calculated by solving an algebraic Riccati equation at each iteration of the control method. The asymptotic stability of the control approach is proven through Lyapunov analysis. This assures that the state variables of the macroeconomic model of the country under development will finally converge to the designated reference values.
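The per-iteration Riccati solve has a closed form in the scalar case, which illustrates the structure without the matrix machinery. For a linearized scalar plant x' = a·x + b·u + k·w with cost weights q, r and attenuation level γ, one common sign convention gives 2ap + q − p²(b²/r − k²/γ²) = 0, solved for the positive root. The numbers below are illustrative, not the macroeconomic model's coefficients.

```python
# Scalar H-infinity algebraic Riccati equation and state-feedback gain.
import math

def hinf_riccati_scalar(a, b, k, q, r, gamma):
    """Positive root of 2*a*p + q - p**2*(b**2/r - k**2/gamma**2) = 0."""
    c = b ** 2 / r - k ** 2 / gamma ** 2
    if c <= 0:
        raise ValueError("gamma too small: attenuation level infeasible")
    p = (a + math.sqrt(a ** 2 + c * q)) / c
    return p, -(b / r) * p   # Riccati solution p and feedback gain K (u = K*x)

p, K = hinf_riccati_scalar(a=0.5, b=1.0, k=0.2, q=1.0, r=1.0, gamma=2.0)
print(f"p = {p:.4f}, K = {K:.4f}")
```

Note the closed-loop pole a + bK is negative here, i.e. the unstable open-loop plant (a > 0) is stabilized, mirroring the Lyapunov-based stability claim in the abstract.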

  14. Recent developments in KTF. Code optimization and improved numerics

    International Nuclear Information System (INIS)

    Jimenez, Javier; Avramova, Maria; Sanchez, Victor Hugo; Ivanov, Kostadin

    2012-01-01

    The rapid increase of computer power in the last decade facilitated the development of high fidelity simulations in nuclear engineering, allowing a more realistic and accurate optimization as well as safety assessment of reactor cores and power plants compared to the legacy codes. Thermal hydraulic subchannel codes together with time dependent neutron transport codes are the options of choice for an accurate prediction of local safety parameters. Moreover, fast running codes with the best physical models are needed for high fidelity coupled thermal hydraulic / neutron kinetic solutions. Hence at KIT, different subchannel codes such as SUBCHANFLOW and KTF are being improved, validated and coupled with different neutron kinetics solutions. KTF is a subchannel code developed for best-estimate analysis of both pressurized water reactors (PWRs) and boiling water reactors (BWRs). It is based on the Pennsylvania State University (PSU) version of COBRA-TF (Coolant Boiling in Rod Arrays Two Fluids) named CTF. In this paper, the investigations devoted to the enhancement of the code's numerics and informatics structure are presented and discussed. Examples demonstrate the gain in code speed-up, and finally an outlook of further activities concentrated on code improvements is given. (orig.)

  15. Recent developments in KTF. Code optimization and improved numerics

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, Javier; Avramova, Maria; Sanchez, Victor Hugo; Ivanov, Kostadin [Karlsruhe Institute of Technology (KIT) (Germany). Inst. for Neutron Physics and Reactor Technology (INR)

    2012-11-01

    The rapid increase of computer power in the last decade facilitated the development of high fidelity simulations in nuclear engineering, allowing a more realistic and accurate optimization as well as safety assessment of reactor cores and power plants compared to the legacy codes. Thermal hydraulic subchannel codes together with time dependent neutron transport codes are the options of choice for an accurate prediction of local safety parameters. Moreover, fast running codes with the best physical models are needed for high fidelity coupled thermal hydraulic / neutron kinetic solutions. Hence at KIT, different subchannel codes such as SUBCHANFLOW and KTF are being improved, validated and coupled with different neutron kinetics solutions. KTF is a subchannel code developed for best-estimate analysis of both pressurized water reactors (PWRs) and boiling water reactors (BWRs). It is based on the Pennsylvania State University (PSU) version of COBRA-TF (Coolant Boiling in Rod Arrays Two Fluids) named CTF. In this paper, the investigations devoted to the enhancement of the code's numerics and informatics structure are presented and discussed. Examples demonstrate the gain in code speed-up, and finally an outlook of further activities concentrated on code improvements is given. (orig.)

  16. Development of a tokamak plasma optimized for stability and confinement

    International Nuclear Information System (INIS)

    Politzer, P.A.

    1995-02-01

    Design of an economically attractive tokamak fusion reactor depends on producing steady-state plasma operation with simultaneously high energy density (β) and high energy confinement (τ_E); either of these, by itself, is insufficient. In operation of the DIII-D tokamak, both high confinement enhancement (H ≡ τ_E/τ_ITER-89P = 4) and high normalized β (β_N ≡ β/(I/aB) = 6 %-m-T/MA) have been obtained. For the present, these conditions have been produced separately and in transient discharges. The DIII-D advanced tokamak development program is directed toward developing an understanding of the characteristics which lead to high stability and confinement, and to use that understanding to demonstrate stationary, high performance operation through active control of the plasma shape and profiles. The authors have identified some of the features of the operating modes in DIII-D that contribute to better performance. These are control of the plasma shape, control of both bulk plasma rotation and shear in the rotation and E_r profiles, and particularly control of the toroidal current profile. In order to guide their future experiments, they are developing optimized scenarios based on their anticipated plasma control capabilities, particularly using fast wave current drive (on-axis) and electron cyclotron current drive (off-axis). The most highly developed model is the second-stable core VH-mode, which has a reversed-magnetic-shear safety factor profile [q(0) = 3.9, q_min = 2.6, and q_95 = 6]. This model plasma uses profiles which the authors expect to be realizable. At β_N ≥ 6, it is stable to n=1 kink modes and ideal ballooning modes, and is expected to reach H ≥ 3 with VH-mode-like confinement

  17. Developing an Integrated Design Strategy for Chip Layout Optimization

    NARCIS (Netherlands)

    Wits, Wessel Willems; Jauregui Becker, Juan Manuel; van Vliet, Frank Edward; te Riele, G.J.

    2011-01-01

    This paper presents an integrated design strategy for chip layout optimization. The strategy couples both electric and thermal aspects during the conceptual design phase to improve chip performance, thermal management being one of the major topics. The layout of the chip circuitry is optimized

  18. Optimizing the Noticing of Recasts via Computer-Delivered Feedback: Evidence That Oral Input Enhancement and Working Memory Help Second Language Learning

    Science.gov (United States)

    Sagarra, Nuria; Abbuhl, Rebekha

    2013-01-01

    This study investigates whether practice with computer-administered feedback in the absence of meaning-focused interaction can help second language learners notice the corrective intent of recasts and develop linguistic accuracy. A group of 218 beginning Anglophone learners of Spanish received 1 of 4 types of automated feedback (no feedback,…

  19. Development of Combinatorial Methods for Alloy Design and Optimization

    International Nuclear Information System (INIS)

    Pharr, George M.; George, Easo P.; Santella, Michael L

    2005-01-01

    The primary goal of this research was to develop a comprehensive methodology for designing and optimizing metallic alloys by combinatorial principles. Because conventional techniques for alloy preparation are unavoidably restrictive in the range of alloy compositions that can be examined, combinatorial methods promise to significantly reduce the time, energy, and expense needed for alloy design. Combinatorial methods can be developed not only to optimize existing alloys, but to explore and develop new ones as well. The scientific approach involved fabricating an alloy specimen with a continuous distribution of binary and ternary alloy compositions across its surface, an "alloy library", and then using spatially resolved probing techniques to characterize its structure, composition, and relevant properties. The three specific objectives of the project were: (1) to devise means by which simple test specimens with a library of alloy compositions spanning the range of interest can be produced; (2) to assess how well the properties of the combinatorial specimen reproduce those of conventionally processed alloys; and (3) to devise screening tools which can be used to rapidly assess the important properties of the alloys. As proof of principle, the methodology was applied to the Fe-Ni-Cr ternary alloy system that constitutes many commercially important materials such as stainless steels and the H-series and C-series heat- and corrosion-resistant casting alloys. Three different techniques were developed for making alloy libraries: (1) vapor deposition of discrete thin films on an appropriate substrate and then alloying them together by solid-state diffusion; (2) co-deposition of the alloying elements from three separate magnetron sputtering sources onto an inert substrate; and (3) localized melting of thin films with a focused electron-beam welding system. Each of the techniques was found to have its own advantages and disadvantages. A new and very powerful technique for

  20. Optimal planning in a developing industrial microgrid with sensitive loads

    Directory of Open Access Journals (Sweden)

    M. Naderi

    2017-11-01

    Computer numerical control (CNC) machines are known as sensitive loads in industrial estates. These machines require reliable, high-quality electricity over their often long work periods. Supplying these loads with distributed energy resources (DERs) in a microgrid (MG) is an appropriate solution. The aim of this paper is to analyze the implementation potential of a real, developing MG in the Shad-Abad industrial estate, Tehran, Iran. Three MG planning objectives are considered: ensuring sustainable and secure operation of CNC machines as sensitive loads; minimizing the costs of MG construction and operation; and using available capacity to achieve the highest possible penetration of renewable energy sources (RESs), which in turn decreases air pollutants, especially carbon dioxide (CO2). The HOMER (hybrid optimization model for electric renewables) software is used to determine the technical feasibility of MG planning and to select the best plan economically and environmentally. Different scenarios are considered to determine the suitable capacity of production participants and to assess MG indices such as reliability.

  1. Optimization of automation: III. Development of optimization method for determining automation rate in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Kim, Man Cheol; Seong, Poong Hyun

    2016-01-01

    Highlights: • We propose an appropriate automation rate that enables the best human performance. • We analyze the shortest working time considering Situation Awareness Recovery (SAR). • The optimized automation rate is estimated by integrating the automation and ostracism rate estimation methods. • The process to derive the optimized automation rate is demonstrated through case studies. - Abstract: Automation has been introduced in various industries, including the nuclear field, because it is commonly believed to promise greater efficiency, lower workloads, and fewer operator errors by enhancing operator and system performance. However, the excessive introduction of automation has deteriorated operator performance due to its side effects, referred to as Out-of-the-Loop (OOTL) problems, and this is a critical issue that must be resolved. Thus, in order to determine the level of automation that assures the best human operator performance, a quantitative method of optimizing automation is proposed in this paper. The automation rate and ostracism rate, estimation methods that quantitatively analyze the positive and negative effects of automation, respectively, are integrated. The integration derives the shortest working time by considering the concept of situation awareness recovery (SAR): the automation rate with the shortest working time assures the best human performance. The process to derive the optimized automation rate is demonstrated through an emergency operation scenario-based case study. In this case study, four types of procedures are assumed by redesigning the original emergency operating procedure according to the introduced automation and ostracism levels. Using the
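
The shortest-working-time idea above can be illustrated with a toy grid search. The working-time model below (a manual-task term that shrinks with the automation rate r, plus a quadratic SAR recovery penalty that grows with it) and all its coefficients are hypothetical placeholders, not the authors' estimation method:

```python
# Hypothetical working-time model: manual task time falls as the automation
# rate r rises, while the situation-awareness-recovery (SAR) penalty grows.
def total_working_time(r, manual_time=100.0, sar_penalty=60.0):
    return manual_time * (1.0 - r) + sar_penalty * r ** 2

def optimal_automation_rate(step=0.01):
    """Grid-search the automation rate in [0, 1] with the shortest total time."""
    n = round(1 / step)
    rates = [i * step for i in range(n + 1)]
    return min(rates, key=total_working_time)

best = optimal_automation_rate()  # analytic optimum of this toy model: r = 5/6
```

With these made-up coefficients the minimum falls strictly between full manual operation and full automation, mirroring the paper's claim that an intermediate automation rate gives the best human performance.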

  2. DEVELOPMENT OF THE ALGORITHM FOR CHOOSING THE OPTIMAL SCENARIO FOR THE DEVELOPMENT OF THE REGION'S ECONOMY

    Directory of Open Access Journals (Sweden)

    I. S. Borisova

    2018-01-01

    Purpose: the article deals with the development of an algorithm for choosing the optimal scenario for the development of a regional economy. Since the "Strategy for socio-economic development of the Lipetsk region for the period until 2020" does not contain scenarios for the development of the region, an algorithm for choosing the optimal scenario is formalized. Scenarios for the development of the economy of the Lipetsk region are calculated according to the indicators of the Program of social and economic development: "Quality of life index", "Average monthly nominal wage", "Level of registered unemployment", "Growth rate of gross regional product", "Share of innovative products in the total volume of goods shipped, works performed and services rendered by industrial organizations", "Total volume of atmospheric pollution per unit of GRP" and "Satisfaction of the population with the activity of executive bodies of state power of the region". Based on these calculations, the dynamics of the indicator values under each scenario for the development of the Lipetsk region economy in 2016–2020 were derived. The discounted financial costs to economic participants of implementing each scenario are estimated. It is shown that the current situation in the economy of the Russian Federation presupposes the choice of a paradigm of innovative territorial development and requires all participants in economic relations at the regional level to concentrate their resources on the creation of new science-intensive products. The effects of implementing the substantiated scenarios were assessed, showing that the most acceptable is the "base" scenario, which assumes a consistent change in the main indicators. The specific economic

  3. Development of input data layers for the FARSITE fire growth model for the Selway-Bitterroot Wilderness Complex, USA

    Science.gov (United States)

    Robert E. Keane; Janice L. Garner; Kirsten M. Schmidt; Donald G. Long; James P. Menakis; Mark A. Finney

    1998-01-01

    Fuel and vegetation spatial data layers required by the spatially explicit fire growth model FARSITE were developed for all lands in and around the Selway-Bitterroot Wilderness Area in Idaho and Montana. Satellite imagery and terrain modeling were used to create the three base vegetation spatial data layers of potential vegetation, cover type, and structural stage....

  4. Development and Optimization of Insulin-Chitosan Nanoparticles

    African Journals Online (AJOL)

    to optimize formulation. Properties such as particle shape, size, zeta potential and release behavior ... the positively charged chitosan nanoparticles had strong electrostatic ... dropping 3 ml of TPP reserve liquid quickly into the system at 40 ºC ...

  5. development of optimized s hollow block with block with ment

    African Journals Online (AJOL)

    eobe

    optimize the design computation process the design ... acy is checked using the control factors. Finally a ... utation process which takes the desired property of the mix, and generat of the mix ...... We recall our statistical hypothesis as follows: 1.

  6. REGION AGRICULTURE DEVELOPMENT ON THE BASIS OF OPTIMIZATION MODELLING

    Directory of Open Access Journals (Sweden)

    V.P. Neganova

    2008-06-01

    The scientific substantiation of the location of agricultural production across the territorial divisions of a region is a complex socio-economic problem whose solution demands market-oriented criteria of optimality. The author considers three such criteria: maximum profit; maximum gross output net of production costs and the costs of simple reproduction of soil fertility; and maximum marginal income. The conclusion is drawn that the best criterion for optimizing production is maximum marginal income (income net of constant costs), which will raise the economic and ecological efficiency of agricultural production at all management levels. As a result of optimizing agricultural production, the Republic of Bashkortostan would become a self-sufficient, food-exporting region of Russia. In this case the republic would be capable of providing 5.8–6.5 million people with food substances (protein, carbohydrates, etc.), exceeding the population of the republic by 40–60%.

  7. GARFEM input deck description

    Energy Technology Data Exchange (ETDEWEB)

    Zdunek, A.; Soederberg, M. (Aeronautical Research Inst. of Sweden, Bromma (Sweden))

    1989-01-01

    The input card deck for the finite element program GARFEM version 3.2 is described in this manual. The program includes, but is not limited to, capabilities to handle the following problems: * Linear bar and beam element structures, * Geometrically non-linear problems (bar and beam), both static and transient dynamic analysis, * Transient response dynamics from a catalog of time-varying external forcing function types or input function tables, * Eigenvalue solution (modes and frequencies), * Multi-point constraints (MPC) for the modelling of mechanisms and e.g. rigid links. The MPC definition is used only in the geometrically linearized sense, * Beams with disjunct shear axis and neutral axis, * Beams with rigid offset. An interface exists that connects GARFEM with the program GAROS, a program for aeroelastic analysis of rotating structures. Since this interface was developed, GARFEM now serves as a preprocessor in place of NASTRAN, which was formerly used. Documentation of the methods applied in GARFEM exists but is so far limited to the capabilities in existence before the GAROS interface was developed.

  8. Input or intimacy

    Directory of Open Access Journals (Sweden)

    Judit Navracsics

    2014-01-01

    According to the critical period hypothesis, the earlier the acquisition of a second language starts, the better. Owing to the plasticity of the brain, up until a certain age a second language can, on this view, be acquired successfully. Early second language learners are commonly said to have an advantage over later ones, especially in phonetic/phonological acquisition, and native-like pronunciation is said to be most likely achieved by young learners. However, there is evidence of accent-free speech in second languages learnt after puberty as well; occasionally, on the other hand, a non-native accent may appear even in early second (or third) language acquisition. Cross-linguistic influences are natural in multilingual development, and we would expect the dominant language to have an impact on the weaker one(s). The dominant language is usually the one that provides the largest amount of input for the child. But is it always the amount that counts? Perhaps sometimes other factors, such as emotions, come into play? In this paper, data obtained from an English-Persian-Hungarian trilingual pair of siblings (aged under 4 and 3, respectively) is analyzed, with a special focus on cross-linguistic influences at the phonetic/phonological levels. It will be shown that beyond the amount of input there are more important factors that trigger interference in multilingual development.

  9. Development of a fast optimization preview in radiation treatment planning

    International Nuclear Information System (INIS)

    Hoeffner, J.; Decker, P.; Schmidt, E.L.; Herbig, W.; Rittler, J.; Weiss, P.

    1996-01-01

    Usually, the speed of convergence of some iterative algorithms is restricted by a bounded relaxation parameter. Exploiting the special altering behavior of the weighting factors at each step, many iteration steps are avoided by overrelaxing this relaxation parameter: the relaxation parameter is increased as long as the optimization result improves. This can be performed without loss of accuracy. The optimization technique is demonstrated on the case of a right lung carcinoma. The solution space for this case is 36 isocentric X-ray beams evenly spaced at 10°. Each beam is restricted to 23 MV X-ray fields with the planning target volume matched by irregular field shapes, similar to those produced by a multileaf collimator. Four organs at risk plus the planning target volume are considered in the optimization process. The convergence behavior of the optimization algorithm is shown by overrelaxing the relaxation parameter in comparison to conventional relaxation parameter control. The new approach offers a fast preview of the expected final result. If the clinician agrees with the preview, the algorithm is continued and achieves the result proven by the Cimmino optimization algorithm. Otherwise, the clinician is able to change the optimization parameters (e.g. field entry points) and restart the algorithm. (orig./MG) [de

  10. Optimal Recycling of Steel Scrap and Alloying Elements: Input-Output based Linear Programming Method with Its Application to End-of-Life Vehicles in Japan.

    Science.gov (United States)

    Ohno, Hajime; Matsubae, Kazuyo; Nakajima, Kenichi; Kondo, Yasushi; Nakamura, Shinichiro; Fukushima, Yasuhiro; Nagasaka, Tetsuya

    2017-11-21

    The importance of end-of-life vehicles (ELVs) as an urban mine is expected to grow as more people in developing countries experience increased standards of living and automobiles are increasingly made from high-quality materials to meet stricter environmental and safety requirements. While most materials in ELVs, particularly steel, have been recycled at high rates, quality issues have not been adequately addressed due to the complex use of automobile materials, leading to considerable losses of valuable alloying elements. This study highlights the maximal potential of quality-oriented recycling of ELV steel by exploring methods of utilizing scrap, sorted by parts, to produce electric-arc-furnace-based crude alloy steel with minimal losses of alloying elements. Using linear programming on the case of the Japanese economy in 2005, we found that adoption of parts-based scrap sorting could recover around 94-98% of the alloying elements occurring in parts scrap (manganese, chromium, nickel, and molybdenum), which may replace 10% of the virgin sources in electric-arc-furnace-based crude alloy steel production.
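
To make the quality-oriented blending idea concrete, here is a deliberately tiny sketch with invented stream compositions and a brute-force grid search in place of the paper's linear program: choose how much of each parts-sorted scrap stream to charge into a 1 t electric-arc-furnace melt so that the chromium and nickel contents stay on spec while the virgin ferroalloy addition is minimized.

```python
# Toy data: mass fractions of (Cr, Ni) in each charge stream. All numbers are
# invented for illustration and are not the paper's data.
streams = {
    "engine_scrap":  (0.010, 0.002),   # plain-steel parts
    "exhaust_scrap": (0.170, 0.090),   # stainless-rich parts
    "virgin_ferro":  (0.600, 0.300),   # virgin ferroalloy additions
}
TARGET = (0.18, 0.08)  # desired (Cr, Ni) fractions in the melt
TOL = 0.02             # allowed deviation from the target

def melt_composition(masses):
    """(Cr, Ni) fractions of a melt charged with the given stream masses (t)."""
    total = sum(masses.values())
    cr = sum(m * streams[s][0] for s, m in masses.items()) / total
    ni = sum(m * streams[s][1] for s, m in masses.items()) / total
    return cr, ni

# Brute-force search over a 1 t charge in 1/20 t increments; virgin ferroalloy
# makes up the balance. Keep the on-spec blend that uses the least virgin input.
N = 20
best = None
for i in range(N + 1):
    for j in range(N + 1 - i):
        masses = {"engine_scrap": i / N, "exhaust_scrap": j / N,
                  "virgin_ferro": 1 - i / N - j / N}
        cr, ni = melt_composition(masses)
        if abs(cr - TARGET[0]) <= TOL and abs(ni - TARGET[1]) <= TOL:
            if best is None or masses["virgin_ferro"] < best["virgin_ferro"]:
                best = masses
```

With real data the same trade-off is what the paper's linear program resolves at scale: sorted, stainless-rich part scrap can displace virgin alloying additions while keeping the heat on specification.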

  11. Development, validation and application of a fixed district heating model structure that requires small amounts of input data

    International Nuclear Information System (INIS)

    Aberg, Magnus; Widén, Joakim

    2013-01-01

    Highlights: • A fixed model structure for cost-optimisation studies of DH systems is developed. • A method for approximating heat demands using outdoor temperature data is developed. • Six different Swedish district heating systems are modelled and studied. • The impact of heat demand change on heat and electricity production is examined. • Reduced heat demand leads to less use of fossil fuels and biomass in the modelled systems. - Abstract: Reducing the energy use of buildings is an important part of reaching the European energy efficiency targets. Consequently, local energy systems need to adapt to a lower demand for heating. About 90% of Swedish multi-family residential buildings use district heating (DH), produced in Sweden's more than 400 DH systems, which use different heat production technologies and fuels. DH system modelling results obtained until now are mostly for particular DH systems and cannot be easily generalised. Here, a fixed model structure (FMS) based on linear programming for cost-optimisation studies of DH systems is developed, requiring only general DH system information. A method for approximating heat demands based on local outdoor temperature data is also developed. A scenario is studied where the FMS is applied to six Swedish DH systems and heat demands are reduced due to energy efficiency improvements in buildings. The results show that the FMS is a useful tool for DH system optimisation studies and that building energy efficiency improvements lead to reduced use of fossil fuels and biomass in DH systems. Also, the share of CHP in the production mix is increased in five of the six DH systems when the heat demand is reduced
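
The outdoor-temperature-based demand approximation can be sketched with a simple piecewise-linear (degree-day-style) model; the functional form and all coefficients below are assumptions for illustration, not the method developed in the paper:

```python
# Piecewise-linear demand model (assumed form): below the balance temperature
# the demand grows linearly with the heating deficit; above it only the base
# load (e.g. domestic hot water) remains. All coefficients are illustrative.
def heat_demand(t_out, t_balance=17.0, slope=2.5, base_load=5.0):
    """Approximate district-heating demand in MW for an outdoor temp in deg C."""
    return base_load + slope * max(0.0, t_balance - t_out)

# Demand profile for a few outdoor temperatures:
temps = [-10.0, 0.0, 10.0, 20.0]
profile = [heat_demand(t) for t in temps]  # [72.5, 47.5, 22.5, 5.0]
```

Feeding an hourly outdoor-temperature series through such a function yields the heat-demand profile that a cost-optimisation model of a DH system needs as input.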

  12. Development of optimization-based probabilistic earthquake scenarios for the city of Tehran

    Science.gov (United States)

    Zolfaghari, M. R.; Peyghaleh, E.

    2016-01-01

    This paper presents the methodology and a practical example of applying an optimization process to select earthquake scenarios which best represent probabilistic earthquake hazard in a given region. The method is based on simulation of a large dataset of potential earthquakes representing the long-term seismotectonic characteristics of the region. The simulation process uses Monte-Carlo simulation and regional seismogenic source parameters to generate a synthetic earthquake catalogue consisting of a large number of earthquakes, each characterized by magnitude, location, focal depth and fault characteristics. Such a catalogue provides full distributions of events in time, space and size; however, it demands large computation power when used for risk assessment, particularly when other sources of uncertainty are involved in the process. To reduce the number of selected earthquake scenarios, a mixed-integer linear programming formulation is developed in this study. This approach results in a reduced set of optimization-based probabilistic earthquake scenarios, while maintaining the shape of the hazard curves and the full probabilistic picture, by minimizing the error between hazard curves driven by the full and reduced sets of synthetic earthquake scenarios. To test the model, the regional seismotectonic and seismogenic characteristics of northern Iran are used to simulate a set of 10,000 years' worth of events consisting of some 84,000 earthquakes. The optimization model is then performed multiple times with various input data, taking the probabilistic seismic hazard for the city of Tehran as the main constraint. The sensitivity of the selected scenarios to the user-specified site/return-period error weight is also assessed. The methodology could shorten the run time of full probabilistic earthquake studies such as seismic hazard and risk assessment. The reduced set is representative of the contributions of all possible earthquakes; however, it requires far less
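
The scenario-reduction step can be illustrated on a toy event set. The sketch below replaces the paper's mixed-integer linear program with an exhaustive search over subsets and a simple rate rescaling; the events, rates and thresholds are invented for illustration:

```python
from itertools import combinations

# Invented event set: (annual rate, peak ground acceleration in g).
events = [
    (0.20, 0.05), (0.10, 0.10), (0.05, 0.15),
    (0.02, 0.25), (0.01, 0.35), (0.005, 0.50),
]
THRESHOLDS = [0.05, 0.1, 0.2, 0.4]  # PGA levels of the hazard curve

def exceedance_curve(subset, scale=1.0):
    """Annual exceedance rate at each threshold, rates rescaled by `scale`."""
    return [scale * sum(rate for rate, pga in subset if pga >= t)
            for t in THRESHOLDS]

full_curve = exceedance_curve(events)
total_rate = sum(rate for rate, _ in events)

def best_subset(k):
    """Exhaustively pick the k events whose rescaled curve is closest to full."""
    best, best_err = None, float("inf")
    for subset in combinations(events, k):
        scale = total_rate / sum(rate for rate, _ in subset)
        err = sum((a - b) ** 2 for a, b in
                  zip(exceedance_curve(subset, scale), full_curve))
        if err < best_err:
            best, best_err = subset, err
    return best, best_err

reduced, err = best_subset(3)
```

For realistic catalogues of tens of thousands of events, exhaustive search is infeasible, which is exactly why the authors formulate the selection as a mixed-integer linear program.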

  13. Developing optimized CT scan protocols: Phantom measurements of image quality

    International Nuclear Information System (INIS)

    Zarb, Francis; Rainford, Louise; McEntee, Mark F.

    2011-01-01

    Purpose: The increasing frequency of computerized tomography (CT) examinations is well documented, leading to concern about potential radiation risks for patients. However, the consequences of not performing a CT examination and missing injuries and disease are potentially serious, impacting upon correct patient management. The ALARA principle of dose optimization must be employed for all justified CT examinations. Dose indicators displayed on the CT console, either the CT dose index (CTDI) and/or the dose length product (DLP), are used to indicate dose and can quantify improvements achieved through optimization. Key scan parameters contributing to dose have been identified in previous literature and in previous work by our group. The aim of this study was to optimize the scan parameters mA, kV and pitch, maintaining image quality whilst reducing dose. This research was conducted using psychophysical image quality measurements on a CT quality assurance (QA) phantom, establishing the impact of dose optimization on image quality parameters. Method: Current CT scan parameters for head (posterior fossa and cerebrum), abdomen and chest examinations were collected from 57% of the CT suites available nationally in Malta (n = 4). Current scan protocols were used to image a Catphan 600 CT QA phantom, whereby image quality was assessed. Each scan parameter (mA, kV and pitch) was systematically reduced until the contrast resolution (CR), spatial resolution (SR) and noise were significantly lowered. The Catphan 600 images produced by the range of protocols were evaluated by two expert observers assessing CR, SR and noise. The protocol considered the optimization threshold was just above the setting that resulted in a significant reduction in CR and noise, without affecting SR, at the 95% confidence interval. Results: The limit of the optimization threshold was determined for each CT suite. Employing optimized parameters, CTDI and DLP were both significantly reduced (p ≤ 0.001) by

  14. Development and application of computer assisted optimal method for treatment of femoral neck fracture.

    Science.gov (United States)

    Wang, Monan; Zhang, Kai; Yang, Ning

    2018-04-09

    To help doctors decide on treatment from the standpoint of mechanical analysis, this work built a computer-assisted optimization system for the treatment of femoral neck fracture oriented to clinical application. The system encompasses three parts: a preprocessing module, a finite element mechanical analysis module, and a post-processing module. The preprocessing module includes parametric modeling of the bone, the fracture face, and the fixing screws and their positions, as well as input and transmission of model parameters. The finite element mechanical analysis module includes grid division, element type setting, material property setting, contact setting, constraint and load setting, analysis method setting and batch processing. The post-processing module includes extraction and display of batch processing results, image generation, operation of the optimization program and display of the optimal result. The system implements the whole workflow from input of fracture parameters to output of the optimal fixation plan according to the specific patient's fracture parameters and the optimization rules, which demonstrates its effectiveness. The system also has a friendly interface and simple operation, and its functionality can be extended quickly by modifying individual modules.

  15. TART input manual

    International Nuclear Information System (INIS)

    Kimlinger, J.R.; Plechaty, E.F.

    1982-01-01

    The TART code is a Monte Carlo neutron/photon transport code that runs only on the CRAY computer. All the input cards for the TART code are listed, and definitions for all input parameters are given. The execution and limitations of the code are described, and input for two sample problems is given

  16. Influence Of Tools Input/Output Requirements On Managers Core Front End Activities In New Product Development

    DEFF Research Database (Denmark)

    Appio, Francesco P.; Achiche, Sofiane; Minin, Alberto Di

    2011-01-01

    The object of analysis of this explorative research is the Fuzzy Front End of Innovation in Product Development, described by those activities going from opportunity identification to concept definition. Business scholars have shown that confusion in terms of goals and different ideas about... opportunities make this early phase of the innovation process uncertain and extremely risky. Literature suggests that the understanding, selection and use of appropriate tools/techniques to support decision making are instrumental for a less fuzzy front end of innovation. This paper considers the adoption...

  17. On the Need for Reliable Seismic Input Assessment for Optimized Design and Retrofit of Seismically Isolated Civil and Industrial Structures, Equipment, and Cultural Heritage

    Science.gov (United States)

    Martelli, Alessandro

    2011-01-01

    laterally is sufficient to create a structural gap compatible with the design displacement, overestimating this displacement may lead to unnecessarily renouncing the use of such a very efficient method, especially in the case of retrofits of existing buildings. Finally, for long structures (e.g. several bridges or viaducts and even some buildings) an accurate evaluation of the possibly different ground displacements along the structure is required (this also applies to conventionally built structures). In order to overcome the limits of PSHA, this method shall be complemented by the development and application of deterministic models. In particular, the lack of displacement records requires the use of modelling, once the models are calibrated against more commonly available velocity or acceleration records. The aforesaid remarks are now particularly important in the P.R. China and Italy, to ensure safe reconstruction after the Wenchuan earthquake of May 12, 2008 and the Abruzzo earthquake of April 6, 2009: in fact, wide use of SI and other anti-seismic systems has been planned in the areas struck by both events.

  18. DRIFTING TO SOCIALLY-ORIENTED ECONOMY AND SUSTAINABLE DEVELOPMENT OF THE RUSSIAN ARCTIC: THE INPUT OF TECHNOLOGICAL MODERNIZATION

    Directory of Open Access Journals (Sweden)

    B. N. Porfiryev

    2017-01-01

    Purpose: the research paper substantiates the necessity and opportunity of consistently integrating the policies of technological modernization, transition to a socially-oriented economy, and sustainable spatial development as an imperative requirement of the re-development of the Arctic zone of the Russian Federation (AZRF). Methods: the research methodology employs an interdisciplinary approach which integrates tools for the study of economic, sociological, political-science, ecological, legal and other issues of the functioning of spatial systems. Results: the results reveal that, given budget constraints, volatility of hydrocarbon prices, ongoing international confrontation, climate change and other external and internal challenges, socially and ecologically oriented technological modernization should become a priority of AZRF state (public) and corporate policies. The issues of technological modernization should be tackled concurrently with those of healthcare, supporting the working capacity of industrial personnel, and reducing risks to local environments and communities. Case studies illustrating successful implementation of these policies in the mining and energy sectors of selected AZRF regions are introduced. Conclusions and relevance: the conclusion is substantiated that public policy must stimulate socially and ecologically oriented technological modernization in the AZRF. Implementation of this policy should be preceded by an exhaustive inventory (survey) of the technical condition of every industrial, civil engineering and social infrastructure facility, to reveal critical areas of technological obsolescence and deterioration and to assess the amount and sources of the resources necessary for the early (urgent) phase of technological modernization. Whatever issue is considered, the proposed solution should involve a

  19. Optimal policy of energy innovation in developing countries: Development of solar PV in Iran

    International Nuclear Information System (INIS)

    Shafiei, Ehsan; Saboohi, Yadollah; Ghofrani, Mohammad B.

    2009-01-01

    The purpose of this study is to apply managerial economics and methods of decision analysis to study the optimal pattern of innovation activities for the development of new energy technologies in developing countries. For this purpose, a model of energy research and development (R and D) planning is developed and then linked to a bottom-up energy-systems model. The set of interlinked models provides a comprehensive analytical tool for the assessment of energy technologies and innovation planning, taking into account the specific conditions of developing countries. An energy-system model is used as a tool for the assessment and prioritization of new energy technologies. Based on the results of the technology assessment model, the optimal allocation of R and D resources to new energy technologies is estimated with the help of the R and D planning model. The R and D planning model is based on maximization of the total net present value of the resulting R and D benefits, taking into account the dynamics of technological progress, knowledge and experience spillovers from advanced economies, technology adoption, and R and D constraints. Application of the set of interlinked models is illustrated through an analysis of the development of solar PV in the Iranian electricity supply system, from which some important policy insights are drawn.
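The budget-constrained, diminishing-returns allocation that such an R and D planning model performs can be sketched in miniature. Everything below is invented for illustration: the technology names, the benefit scales, and the saturating "learning-curve" benefit function are assumptions, not the paper's model (which also handles spillovers, adoption dynamics, and discounting).

```python
from itertools import product

# Hypothetical saturating NPV benefit of giving a technology b units of
# R and D budget (a crude stand-in for learning-curve dynamics).
def npv_benefit(b, scale):
    return scale * (1 - 0.8 ** b)

TECHS = {"solar_pv": 10.0, "wind": 6.0, "geothermal": 4.0}  # invented scales
BUDGET = 5  # discrete budget units to allocate

best = None
for alloc in product(range(BUDGET + 1), repeat=len(TECHS)):
    if sum(alloc) != BUDGET:
        continue  # benefits are non-decreasing, so spend the whole budget
    total = sum(npv_benefit(b, s) for b, s in zip(alloc, TECHS.values()))
    if best is None or total > best[1]:
        best = (dict(zip(TECHS, alloc)), total)
```

Because the benefit curve is concave, the exhaustive enumeration reproduces what a greedy marginal-benefit rule would pick: most units go to the technology with the largest remaining marginal NPV.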

  20. Nuclear Industry Input to the Development of Concepts for the Consolidated Storage of Used Nuclear Fuel - 13411

    International Nuclear Information System (INIS)

    Phillips, Chris; Thomas, Ivan; McNiven, Steven; Lanthrum, Gary

    2013-01-01

    EnergySolutions and its team partners, NAC International, Exelon Nuclear Partners, Talisman International, TerranearPMC, Booz Allen Hamilton and Sargent and Lundy, have carried out a study to develop concepts for a Consolidated Storage Facility (CSF) for the USA's stocks of commercial Used Nuclear Fuel (UNF), and the packaging and transport provisions required to move the UNF to the CSF. The UNF is currently stored at all 65 operating nuclear reactor sites in the US, and at 10 shutdown sites. The study was funded by the US Department of Energy and followed the recommendations of the Blue Ribbon Commission on America's Nuclear Future (BRC), one of which was that the US should make prompt efforts to develop one or more consolidated storage facilities for commercial UNF. The study showed that viable schemes can be devised to move all UNF and store it at a CSF, but that a range of schemes is required to accommodate the present widely varying UNF storage arrangements. Although most UNF that is currently stored at operating reactor sites is in water-filled pools, a significant amount is now dry stored in concrete casks. At the shutdown sites, the UNF is dry stored at all but two of the ten sites. Various types of UNF dry storage configurations are used at the operating and shutdown sites, including vertical storage casks that are also licensed for transportation, vertical casks that are licensed for storage only, and horizontally oriented storage modules. The shutdown sites have limited or nonexistent UNF handling infrastructure, and several no longer have railroad connections, complicating UNF handling and transport off the site. However, four methods were identified that will satisfactorily retrieve the UNF canisters within the storage casks and transport them to the CSF. The study showed that all of the issues associated with the transportation and storage of UNF from all sites in the US can be accommodated by adopting a staged approach to the construction of

  1. Nuclear Industry Input to the Development of Concepts for the Consolidated Storage of Used Nuclear Fuel - 13411

    Energy Technology Data Exchange (ETDEWEB)

    Phillips, Chris; Thomas, Ivan; McNiven, Steven [EnergySolutions Federal EPC., 2345 Stevens Drive, Richland, WA, 99354 (United States); Lanthrum, Gary [NAC International, 3930 East Jones Bridge Road, Norcross, GA, 30092 (United States)

    2013-07-01

    EnergySolutions and its team partners, NAC International, Exelon Nuclear Partners, Talisman International, TerranearPMC, Booz Allen Hamilton and Sargent and Lundy, have carried out a study to develop concepts for a Consolidated Storage Facility (CSF) for the USA's stocks of commercial Used Nuclear Fuel (UNF), and the packaging and transport provisions required to move the UNF to the CSF. The UNF is currently stored at all 65 operating nuclear reactor sites in the US, and at 10 shutdown sites. The study was funded by the US Department of Energy and followed the recommendations of the Blue Ribbon Commission on America's Nuclear Future (BRC), one of which was that the US should make prompt efforts to develop one or more consolidated storage facilities for commercial UNF. The study showed that viable schemes can be devised to move all UNF and store it at a CSF, but that a range of schemes is required to accommodate the present widely varying UNF storage arrangements. Although most UNF that is currently stored at operating reactor sites is in water-filled pools, a significant amount is now dry stored in concrete casks. At the shutdown sites, the UNF is dry stored at all but two of the ten sites. Various types of UNF dry storage configurations are used at the operating and shutdown sites, including vertical storage casks that are also licensed for transportation, vertical casks that are licensed for storage only, and horizontally oriented storage modules. The shutdown sites have limited or nonexistent UNF handling infrastructure, and several no longer have railroad connections, complicating UNF handling and transport off the site. However, four methods were identified that will satisfactorily retrieve the UNF canisters within the storage casks and transport them to the CSF. The study showed that all of the issues associated with the transportation and storage of UNF from all sites in the US can be accommodated by adopting a staged approach to the

  2. FLUTAN input specifications

    International Nuclear Information System (INIS)

    Borgwaldt, H.; Baumann, W.; Willerding, G.

    1991-05-01

    FLUTAN is a highly vectorized computer code for 3-D fluid-dynamic and thermal-hydraulic analyses in Cartesian and cylinder coordinates. It is related to the family of COMMIX codes originally developed at Argonne National Laboratory, USA. To a large extent, FLUTAN relies on basic concepts and structures imported from COMMIX-1B and COMMIX-2, which were made available to KfK within the framework of cooperation contracts in the fast reactor safety field. While not all features of the original COMMIX versions have been implemented in FLUTAN, the code does include some essential innovative options: the CRESOR solution algorithm, a general 3-dimensional rebalancing scheme for solving the pressure equation, and LECUSSO-QUICK-FRAM techniques suitable for reducing 'numerical diffusion' in both the enthalpy and momentum equations. This report provides users with detailed input instructions, presents formulations of the various model options, and explains by means of a comprehensive sample input how to use the code. (orig.) [de

  3. Factors on Enhancing the Competitive Edge and Attributes of Graduates as Inputs to the Development of Teacher Education Enhancement Program

    Directory of Open Access Journals (Sweden)

    Susan S. Janer

    2015-11-01

    Full Text Available This research was conceptualized in response to the CHED's Higher Education Development Project and the need to track the status of Sorsogon State College (SSC) teacher education graduates. The study aims to gauge the teacher education program's thrust of providing a quality and relevant education that can ensure worthwhile and appropriate employment opportunities for its graduates. A descriptive research design was employed. Surveys, unstructured interviews, and documentary analysis were undertaken to gather pertinent data from the respondents, 427 teacher education graduates selected through a stratified random sampling technique. This tracer study determined the employability of Teacher Education graduates of SSC, Sorsogon Campus from 2009 to 2013, with an end-view of proposing a Teacher Education Enhancement Program (TEEP) to enhance the competitive edge of SSC Teacher Education graduates in all teaching job opportunities. The intellectual, social and linguistic attributes of the SSC graduates were likewise identified in this study. Some of the factors identified by the respondents that could help improve their competitive edge are pre-service trainings, a job placement program, teacher education curriculum enrichment, and a Licensure Examination for Teachers (LET) review program.

  4. A pilot study of an online workplace nutrition program: the value of participant input in program development.

    Science.gov (United States)

    Cousineau, Tara; Houle, Brian; Bromberg, Jonas; Fernandez, Kathrine C; Kling, Whitney C

    2008-01-01

    Tailored nutrition Web programs constitute an emerging trend in obesity prevention. Initial investment in innovative technology necessitates that the target population be well understood. This pilot study's purpose was to determine the feasibility of a workplace nutrition Web program. Formative research was conducted with gaming industry employees and benefits managers to develop a consensus on workplace-specific nutrition needs. A demonstration Web program was piloted with stakeholders to determine feasibility. Indiana, Mississippi, Nevada, and New Jersey gaming establishments. 86 employees, 18 benefits managers. Prototype Web program. Concept mapping; 16-item nutrition knowledge test; satisfaction. Concept mapping was used to aggregate importance ratings on programmatic content, which informed Web program curriculum. Chi-square tests were performed postintervention to determine knowledge improvement. (1) Employees and benefits managers exhibited moderate agreement about content priorities for the program (r = 0.48). (2) There was a significant increase in employees' nutrition knowledge scores postintervention (t = 7.16, df = 36, P benefit managers do not necessarily agree on the priority of nutrition-related content, suggesting a need for programs to appeal to various stakeholders. Computer-based approaches can address various stakeholder health concerns via tailored, customized programming.

  5. Developing an optimal electricity generation mix for the UK 2050 future

    International Nuclear Information System (INIS)

    Sithole, H.; Cockerill, T.T.; Hughes, K.J.; Ingham, D.B.; Ma, L.; Porter, R.T.J.; Pourkashanian, M.

    2016-01-01

    The UK electricity sector is undergoing a transition driven by domestic and regional climate change and environmental policies. Aging electricity generating infrastructure is set to affect capacity margins after 2015. These developments, coupled with the increased proportion of inflexible and variable generation technologies, will impact the security of electricity supply. Investment in low-carbon technologies is central to the UK meeting its energy policy objectives. The complexity of these challenges over the future development of the UK electricity generation sector has motivated this study, which aims to develop a policy-informed optimal electricity generation scenario to assess the sector's transition to 2050. The study analyses the level of deployment of electricity generating technologies in line with the 80% by 2050 emission target. This is achieved by using an Excel-based “Energy Optimisation Calculator” which captures the interaction of various inputs to produce a least-cost generation mix. The key results focus on the least-cost electricity generation portfolio, emission intensity, and the total investment required to assemble a sustainable electricity generation mix. A carbon-neutral electricity sector is feasible if low-carbon technologies are deployed on a large scale. This requires a robust policy framework that supports the development and deployment of mature and emerging technologies. - Highlights: • Electricity generation decarbonised in 2030 and nearly carbon neutral in 2050. • Nuclear, CCS and offshore wind are central in decarbonising electricity generation. • Uncertainty over future fuel and investment cost has no impact on decarbonisation. • Unabated fossil fuel generation is limited unless fitted with Carbon Capture and Storage. • Decarbonising electricity generation could cost about £213.4 billion by 2030.
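The idea of combining technology costs, capacities and an emissions limit into a least-cost mix can be illustrated with a toy merit-order dispatch. This is not the study's "Energy Optimisation Calculator": every cost, capacity and emission factor below is invented, and a real tool would solve a full optimisation rather than a cheapest-first fill.

```python
# Toy merit-order dispatch toward a least-cost mix under a CO2 ceiling.
# Every figure below is invented for illustration.
TECHS = [
    # (name, levelised cost in GBP/MWh, capacity in TWh/yr, tCO2/MWh)
    ("nuclear",       95, 120, 0.0),
    ("offshore_wind", 85, 150, 0.0),
    ("gas_ccs",       90, 100, 0.05),
    ("unabated_gas",  70,  80, 0.35),
]
DEMAND = 350       # TWh/yr to be served
CO2_CAP = 30e6     # tCO2/yr ceiling

def dispatch(techs, demand, co2_cap):
    """Fill demand cheapest-first, curtailing emitters at the CO2 ceiling."""
    mix, cost, co2 = {}, 0.0, 0.0
    for name, price, cap, ef in sorted(techs, key=lambda t: t[1]):
        take = min(cap, demand)
        if ef > 0:  # 1 TWh = 1e6 MWh
            take = min(take, max(0.0, (co2_cap - co2) / (ef * 1e6)))
        mix[name] = take
        cost += take * 1e6 * price
        co2 += take * 1e6 * ef
        demand -= take
    return mix, cost, co2

mix, cost, co2 = dispatch(TECHS, DEMAND, CO2_CAP)
```

With these made-up numbers, cheap unabated gas is capped by the emissions ceiling and the residual demand falls to wind, CCS gas and nuclear, which is the qualitative pattern the abstract's highlights describe.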

  6. Development and optimization of SPECT gated blood pool cluster analysis for the prediction of CRT outcome

    Energy Technology Data Exchange (ETDEWEB)

    Lalonde, Michel, E-mail: mlalonde15@rogers.com; Wassenaar, Richard [Department of Physics, Carleton University, Ottawa, Ontario K1S 5B6 (Canada); Wells, R. Glenn; Birnie, David; Ruddy, Terrence D. [Division of Cardiology, University of Ottawa Heart Institute, Ottawa, Ontario K1Y 4W7 (Canada)

    2014-07-15

    Purpose: Phase analysis of single photon emission computed tomography (SPECT) radionuclide angiography (RNA) has been investigated for its potential to predict the outcome of cardiac resynchronization therapy (CRT). However, phase analysis may be limited in its ability to predict CRT outcome, as valuable information may be lost by assuming that time-activity curves (TAC) follow a simple sinusoidal shape. A new method, cluster analysis, is proposed which directly evaluates the TACs and may lead to a better understanding of dyssynchrony patterns and CRT outcome. Cluster analysis algorithms were developed and optimized to maximize their ability to predict CRT response. Methods: Forty-nine patients (27 of ischemic etiology) received a SPECT RNA scan as well as positron emission tomography (PET) perfusion and viability scans prior to undergoing CRT. A semiautomated algorithm sampled the left ventricle wall to produce 568 TACs from SPECT RNA data. The TACs were then subjected to two different cluster analysis techniques, K-means and normal average, where several input metrics were also varied to determine the optimal settings for the prediction of CRT outcome. Each TAC was assigned to a cluster group based on the comparison criteria, and global and segmental cluster sizes and scores were used as measures of dyssynchrony and used to predict response to CRT. A repeated random twofold cross-validation technique was used to train and validate the cluster algorithm. Receiver operating characteristic (ROC) analysis was used to calculate the area under the curve (AUC) and compare results to those obtained for SPECT RNA phase analysis and PET scar size analysis methods. Results: Using the normal average cluster analysis approach, the septal wall produced statistically significant results for predicting CRT results in the ischemic population (ROC AUC = 0.73; p < 0.05 vs. equal-chance ROC AUC = 0.50) with an optimal operating point of 71% sensitivity and 60% specificity. 
Cluster
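The clustering step can be sketched with a plain two-cluster K-means over synthetic time-activity curves: one group peaking early and one late, loosely mimicking synchronous versus dyssynchronous wall segments. The curve shapes, group sizes, Euclidean distance and deterministic initialisation are all assumptions for illustration, not the study's actual SPECT RNA data or comparison criteria.

```python
import math, random

random.seed(0)

# Synthetic time-activity curves (TACs): one group peaks early, one late.
def tac(phase, n=16):
    return [math.cos(2 * math.pi * t / n - phase) for t in range(n)]

curves = ([tac(0.2 + random.gauss(0, 0.05)) for _ in range(20)]    # "normal"
          + [tac(1.5 + random.gauss(0, 0.05)) for _ in range(10)]) # "delayed"

def kmeans2(data, iters=20):
    """Two-cluster K-means, deterministically seeded with the end curves."""
    centers = [list(data[0]), list(data[-1])]
    groups = [[], []]
    for _ in range(iters):
        groups = [[], []]
        for x in data:  # assign each curve to the nearest center
            d = [sum((a - c) ** 2 for a, c in zip(x, ctr)) for ctr in centers]
            groups[d.index(min(d))].append(x)
        # recompute each center as the pointwise mean of its group
        centers = [[sum(col) / len(g) for col in zip(*g)] for g in groups]
    return groups

groups = kmeans2(curves)
cluster_sizes = sorted(len(g) for g in groups)
```

The group sizes recovered by the clustering are the kind of "global cluster size" statistic the abstract uses as a dyssynchrony measure.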

  7. Development and optimization of SPECT gated blood pool cluster analysis for the prediction of CRT outcome

    International Nuclear Information System (INIS)

    Lalonde, Michel; Wassenaar, Richard; Wells, R. Glenn; Birnie, David; Ruddy, Terrence D.

    2014-01-01

    Purpose: Phase analysis of single photon emission computed tomography (SPECT) radionuclide angiography (RNA) has been investigated for its potential to predict the outcome of cardiac resynchronization therapy (CRT). However, phase analysis may be limited in its ability to predict CRT outcome, as valuable information may be lost by assuming that time-activity curves (TAC) follow a simple sinusoidal shape. A new method, cluster analysis, is proposed which directly evaluates the TACs and may lead to a better understanding of dyssynchrony patterns and CRT outcome. Cluster analysis algorithms were developed and optimized to maximize their ability to predict CRT response. Methods: Forty-nine patients (27 of ischemic etiology) received a SPECT RNA scan as well as positron emission tomography (PET) perfusion and viability scans prior to undergoing CRT. A semiautomated algorithm sampled the left ventricle wall to produce 568 TACs from SPECT RNA data. The TACs were then subjected to two different cluster analysis techniques, K-means and normal average, where several input metrics were also varied to determine the optimal settings for the prediction of CRT outcome. Each TAC was assigned to a cluster group based on the comparison criteria, and global and segmental cluster sizes and scores were used as measures of dyssynchrony and used to predict response to CRT. A repeated random twofold cross-validation technique was used to train and validate the cluster algorithm. Receiver operating characteristic (ROC) analysis was used to calculate the area under the curve (AUC) and compare results to those obtained for SPECT RNA phase analysis and PET scar size analysis methods. Results: Using the normal average cluster analysis approach, the septal wall produced statistically significant results for predicting CRT results in the ischemic population (ROC AUC = 0.73; p < 0.05 vs. equal-chance ROC AUC = 0.50) with an optimal operating point of 71% sensitivity and 60% specificity. 
Cluster

  8. Radiation safety assessment and development of environmental radiation monitoring technology; standardization of input parameters for the calculation of annual dose from routine releases from commercial reactor effluents

    Energy Technology Data Exchange (ETDEWEB)

    Rhee, I. H.; Cho, D.; Youn, S. H.; Kim, H. S.; Lee, S. J.; Ahn, H. K. [Soonchunhyang University, Ahsan (Korea)

    2002-04-01

    This research aims to develop a standard methodology for determining the input parameters that have a substantial impact on the radiation doses of residential individuals in the vicinity of the four nuclear power plants in Korea. We selected critical nuclides, pathways and organs related to human exposure via simulated estimation with K-DOSE 60, based on the updated ICRP-60, and sensitivity analyses. From the results, we found that 1) the critical nuclides were {sup 3}H, {sup 133}Xe, {sup 60}Co for the Kori plants and {sup 14}C, {sup 41}Ar for the Wolsong plants. The most critical pathway was 'vegetable intake' for adults and 'milk intake' for infants. There was, however, no preference among the affected organs. 2) Sensitivity analyses showed that the chemical composition of a nuclide influenced the radiation dose far more than any other input parameter, such as food intake, radiation discharge, or transfer/concentration coefficients, by a factor of more than 10{sup 2}; the effect of the transfer/concentration coefficients on the radiation dose was negligible. All input parameters showed correlations with the radiation dose close to 1.0, except for food intake at the Wolsong power plant (partial correlation coefficient (PCC) = 0.877). Consequently, we suggest that a prediction model or scenarios for food intake reflecting current living trends, and formal publications detailing the chemical components of the critical nuclides from each plant, are needed. Also, standardized domestic values of the parameters used in the calculation should replace the existing or default imported factors, via properly designed experiments and/or modelling, such as transport of liquid discharge in waters near the plants, exposure tests on crops and plants, and so on. 4 figs., 576 tabs. (Author)

  9. Growing up with Frisian and Dutch: The role of language input in the early development of Frisian and Dutch among preschool children in Friesland

    NARCIS (Netherlands)

    Dijkstra, J.E.

    2013-01-01

    Bilingual acquisition largely depends on the input that children receive in each language. The more input in a language, the more proficient a child becomes in that language. The current project studied the role of language input among bilingual Frisian-Dutch preschool children (age 2.5-4 years) in

  10. Development of optimized nanogap plasmonic substrate for improved SERS enhancement

    Directory of Open Access Journals (Sweden)

    Jayakumar Perumal

    2017-05-01

    Full Text Available The SERS enhancement factor (EF) of planar substrates depends on the size and shape of the fine nanostructures forming a defect-free, well-arranged matrix. Nano-lithographic processes are considered the most advanced methods for the fabrication of SERS substrates. Nanostructured plasmonic substrates with a nanogap (NG) pattern often yield stable, efficient and reproducible SERS enhancement. For such substrates, the NG and its diagonal length (DL) need to be optimized. Theoretically, smaller NGs (∼30-40 nm or below) result in higher SERS enhancement. However, fabrication of NG substrates below this limit is a challenge even for the most advanced lithography processes. In this context, we report herein the optimization of the fabrication process, whereby higher SERS enhancement can be realized from larger-NG substrates by optimizing the DL of the nanostructures between the NGs. Based on simulation, we demonstrate that, by optimizing the DL, the SERS enhancement from larger NG substrates such as 60 and 80 nm can be comparable to that of smaller (40 nm) NG substrates. We envision that this concept will open up a new regime in the nanofabrication of practically feasible NG-based plasmonic substrates with higher SERS enhancement. Initial experimental results are in close agreement with our simulation study.

  11. Development of optimized dosimetric models for HDR brachytherapy

    International Nuclear Information System (INIS)

    Thayalan, K.; Jagadeesan, M.

    2003-01-01

    High dose rate brachytherapy (HDRB) systems have been in clinical use for more than four decades, particularly in cervical cancer. Optimization is the method of producing a dose distribution which assures that doses are not compromised at the treatment sites whilst reducing the risk of overdosing critical organs. Hence HDRB optimization begins with the desired dose distribution and requires calculation of the relative weighting factors for each dwell position without changing the source activity. The optimization for Ca. uterine cervix treatment is simply a duplication of the dose distribution used for low dose rate (LDR) applications. In the present work, two optimized dosimetric models were proposed and studied thoroughly to suit local clinical conditions. These models are named HDR-C and HDR-D, where C and D represent configuration and distance respectively. The models exactly duplicate the LDR pear-shaped dose distribution, which is the gold standard. The validity of these models was tested in different clinical situations and in actual patients (n=92). The models HDR-C and HDR-D reduce bladder dose by 11.11% and 10% and rectal dose by 8% and 7% respectively. The treatment time is also reduced by 12-14%. In a busy hospital setup, these models make it possible to cater to a large number of patients while addressing individual patient geometry. (author)

  12. Development of a novel ultrasonic motor resonator using topology optimization

    CSIR Research Space (South Africa)

    M'Boungui, G

    2011-01-01

    Full Text Available , in which the objective function is to minimize the amount of material with intermediate density, while satisfying a constraint related to the frequency ratio of selected resonant modes. The planar design produced using the optimization procedure was refined...

  13. Development of free-piston Stirling engine performance and optimization codes based on Martini simulation technique

    Science.gov (United States)

    Martini, William R.

    1989-01-01

    A FORTRAN computer code is described that could be used to design and optimize a free-displacer, free-piston Stirling engine similar to the RE-1000 engine made by Sunpower. The code contains options for specifying displacer and power piston motion or for allowing these motions to be calculated by a force balance. The engine load may be a dashpot, inertial compressor, hydraulic pump or linear alternator. Cycle analysis may be done by isothermal analysis or adiabatic analysis. Adiabatic analysis may be done using the Martini moving gas node analysis or the Rios second-order Runge-Kutta analysis. Flow loss and heat loss equations are included. Graphical displays of engine motions, pressures and temperatures are included. Programming for optimizing up to 15 independent dimensions is included. Sample performance results are shown for both specified and unconstrained piston motions; these results are shown as generated by each of the two Martini analyses. Two sample optimization searches are shown using specified piston motion with isothermal analysis, one for three adjustable inputs and one for four. Also, two optimization searches for calculated piston motion are presented, for three and for four adjustable inputs. The effect of leakage is evaluated. Suggestions for further work are given.
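The kind of multi-dimensional optimization search such a code performs can be illustrated with a simple coordinate pattern search over a handful of adjustable inputs. The objective below is a made-up quadratic surrogate, not a Stirling-cycle model, and the variable names (bore, stroke, phase) are hypothetical placeholders for the code's adjustable dimensions.

```python
# Coordinate pattern search over three made-up adjustable dimensions;
# the surrogate "efficiency" peaks at bore=7, stroke=3, phase=90.
def efficiency(x):
    bore, stroke, phase = x
    return -((bore - 7.0) ** 2) - 2 * (stroke - 3.0) ** 2 \
           - 0.005 * (phase - 90.0) ** 2

def pattern_search(x, step=1.0, tol=1e-4):
    best = efficiency(x)
    while step > tol:
        improved = False
        for i in range(len(x)):        # try a move along each axis
            for d in (step, -step):
                trial = x[:i] + [x[i] + d] + x[i + 1:]
                f = efficiency(trial)
                if f > best:
                    x, best, improved = trial, f, True
        if not improved:
            step /= 2                  # refine once no axis move helps
    return x, best

x_opt, f_opt = pattern_search([5.0, 5.0, 80.0])
```

Pattern searches like this need only objective evaluations, which is why they suit simulation-based engine models where gradients are unavailable.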

  14. GAROS input deck description

    Energy Technology Data Exchange (ETDEWEB)

    Vollan, A.; Soederberg, M. (Aeronautical Research Inst. of Sweden, Bromma (Sweden))

    1989-01-01

    This report describes the input for the programs GAROS1 and GAROS2, version 5.8 and later, February 1988. The GAROS system, developed by Arne Vollan, Omega GmbH, is used for the analysis of the mechanical and aeroelastic properties of general rotating systems. It has been specially designed to meet the requirements of aeroelastic stability and dynamic response of horizontal-axis wind energy converters. Some of the special characteristics are: * The rotor may have one or more blades. * The blades may be rigidly attached to the hub, or they may be fully articulated. * The full elastic properties of the blades, the hub, the machine house and the tower are taken into account. * With the same basic model, a number of different analyses can be performed: snap-shot analysis, Floquet method, transient response analysis, frequency response analysis, etc.

  15. Research on magnetorheological damper suspension with permanent magnet and magnetic valve based on developed FOA-optimal control algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Xiao, Ping; Gao, Hong [Anhui Polytechnic University, Wuhu (China); Niu, Limin [Anhui University of Technology, Maanshan (China)

    2017-07-15

    Due to the fail-safe problem, it has been difficult for existing Magnetorheological dampers (MD) to be widely applied in automotive suspensions. Therefore, permanent magnets and magnetic valves were introduced to existing MDs so that the fail-safe problem could be solved by the magnets and the damping force could be adjusted easily by the magnetic valve. Thus, a new Magnetorheological damper with permanent magnet and magnetic valve (MDPMMV) was developed and the MDPMMV suspension was studied. First of all, the mechanical structure of the existing magnetorheological damper applied in automobile suspensions was redesigned to comprise a permanent magnet and a magnetic valve. In addition, a prediction model of the damping force was built based on electromagnetics theory and the Bingham model. Experimental research was conducted on the newly designed damper, and the goodness of fit between experimental results and model simulations was high. On this basis, a quarter suspension model was built. Then, a Fruit fly optimization algorithm (FOA)-optimal control algorithm suitable for automobile suspensions was designed by extending the standard FOA. Finally, simulation experiments and bench tests with pulse-road and class-B road surface inputs were carried out, and the results indicated that the working performance of the MDPMMV suspension based on the FOA-optimal control algorithm was good.

  16. Development of Optimization Procedure for Design of Package Cushioning

    Science.gov (United States)

    1975-01-01

    that we seek. Vibration Optimization from Random Excitation. The excitation power spectral density S(ω) must be specified in 1-octave frequency ...

  17. Development of frozen-fried yam slices: Optimization of the ...

    African Journals Online (AJOL)

    The frying conditions optimized with the Box-Behnken experimental design were either short pre-frying and frying at high temperature, characterized by a pre-frying temperature of 157-170°C for 5-9 s and a frying temperature of 181-188°C for 2 min 15 s-2 min 30 s; or long pre-frying and frying at low temperature ...

  18. Input-output supervisor

    International Nuclear Information System (INIS)

    Dupuy, R.

    1970-01-01

    The input-output supervisor is the program which monitors the flow of information between core storage and the peripheral equipment of a computer. This work is composed of three parts: 1 - A study of a generalized input-output supervisor; with simple modifications it resembles most of the input-output supervisors now running on computers. 2 - An application of this theory to a magnetic drum. 3 - Hardware requirements for time-sharing. (author) [fr

  19. Development Optimization and Uncertainty Analysis Methods for Oil and Gas Reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Ettehadtavakkol, Amin, E-mail: amin.ettehadtavakkol@ttu.edu [Texas Tech University (United States); Jablonowski, Christopher [Shell Exploration and Production Company (United States); Lake, Larry [University of Texas at Austin (United States)

    2017-04-15

    Uncertainty complicates the development optimization of oil and gas exploration and production projects, but methods have been devised to analyze uncertainty and its impact on optimal decision-making. This paper compares two methods for development optimization and uncertainty analysis: Monte Carlo (MC) simulation and stochastic programming. Two example problems for a gas field development and an oilfield development are solved and discussed to elaborate the advantages and disadvantages of each method. Development optimization involves decisions regarding the configuration of initial capital investment and subsequent operational decisions. Uncertainty analysis involves the quantification of the impact of uncertain parameters on the optimum design concept. The gas field development problem is designed to highlight the differences in the implementation of the two methods and to show that both methods yield the exact same optimum design. The results show that both MC optimization and stochastic programming provide unique benefits, and that the choice of method depends on the goal of the analysis. While the MC method generates more useful information, along with the optimum design configuration, the stochastic programming method is more computationally efficient in determining the optimal solution. Reservoirs comprise multiple compartments and layers with multiphase flow of oil, water, and gas. We present a workflow for development optimization under uncertainty for these reservoirs, and solve an example on the design optimization of a multicompartment, multilayer oilfield development.
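The Monte Carlo side of the comparison can be sketched in miniature: each candidate design is scored by its sampled expected profit under an uncertain price, and the best-scoring design is selected. Every number below (designs, capex, the demand curve, the price distribution) is invented for the example and is not taken from the paper's gas field or oilfield problems.

```python
import random

random.seed(1)

# Score each candidate capacity by Monte Carlo expected profit under an
# uncertain price; all figures are invented for the illustration.
DESIGNS = [50, 100, 150, 200]   # candidate plant capacities
CAPEX_PER_UNIT = 8.0
N_SAMPLES = 20000

def profit(capacity, price):
    demand = max(0.0, 300 - 10 * price)     # toy linear demand curve
    return price * min(capacity, demand) - CAPEX_PER_UNIT * capacity

def expected_profit(capacity):
    total = 0.0
    for _ in range(N_SAMPLES):
        price = random.gauss(20, 4)         # uncertain price realisation
        total += profit(capacity, price)
    return total / N_SAMPLES

best_design = max(DESIGNS, key=expected_profit)
```

A stochastic-programming formulation would instead fold the same scenarios into a single optimisation problem; on a small convex example like this, both routes select the same design, echoing the paper's observation that the two methods agree on the optimum while differing in cost and by-product information.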

  20. Development Optimization and Uncertainty Analysis Methods for Oil and Gas Reservoirs

    International Nuclear Information System (INIS)

    Ettehadtavakkol, Amin; Jablonowski, Christopher; Lake, Larry

    2017-01-01

    Uncertainty complicates the development optimization of oil and gas exploration and production projects, but methods have been devised to analyze uncertainty and its impact on optimal decision-making. This paper compares two methods for development optimization and uncertainty analysis: Monte Carlo (MC) simulation and stochastic programming. Two example problems for a gas field development and an oilfield development are solved and discussed to elaborate the advantages and disadvantages of each method. Development optimization involves decisions regarding the configuration of initial capital investment and subsequent operational decisions. Uncertainty analysis involves the quantification of the impact of uncertain parameters on the optimum design concept. The gas field development problem is designed to highlight the differences in the implementation of the two methods and to show that both methods yield the exact same optimum design. The results show that both MC optimization and stochastic programming provide unique benefits, and that the choice of method depends on the goal of the analysis. While the MC method generates more useful information, along with the optimum design configuration, the stochastic programming method is more computationally efficient in determining the optimal solution. Reservoirs comprise multiple compartments and layers with multiphase flow of oil, water, and gas. We present a workflow for development optimization under uncertainty for these reservoirs, and solve an example on the design optimization of a multicompartment, multilayer oilfield development.

  1. Molecular descriptor subset selection in theoretical peptide quantitative structure-retention relationship model development using nature-inspired optimization algorithms.

    Science.gov (United States)

    Žuvela, Petar; Liu, J Jay; Macur, Katarzyna; Bączek, Tomasz

    2015-10-06

In this work, the performance of five nature-inspired optimization algorithms, genetic algorithm (GA), particle swarm optimization (PSO), artificial bee colony (ABC), firefly algorithm (FA), and flower pollination algorithm (FPA), was compared in molecular descriptor selection for the development of quantitative structure-retention relationship (QSRR) models for 83 peptides that originate from eight model proteins. A matrix with 423 descriptors was used as input, and QSRR models based on selected descriptors were built using partial least squares (PLS), with root mean square error of prediction (RMSEP) serving as the fitness function for their selection. Three performance criteria, prediction accuracy, computational cost, and the number of selected descriptors, were used to evaluate the developed QSRR models. The results show that all five variable selection methods outperform interval PLS (iPLS), sparse PLS (sPLS), and the full PLS model, with GA proving superior because of its lowest computational cost and higher accuracy (RMSEP of 5.534%) with a smaller number of variables (nine descriptors). The GA-QSRR model was validated initially through Y-randomization. In addition, it was successfully validated with an external test set of 102 peptides originating from Bacillus subtilis proteomes (RMSEP of 22.030%). Its applicability domain was defined, from which it was evident that the developed GA-QSRR exhibited strong robustness. All the sources of the model's error were identified, thus allowing for further application of the developed methodology in proteomics.
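
    A minimal sketch of GA-driven descriptor selection in the spirit of the abstract, with a toy fitness standing in for the PLS model's RMSEP (the descriptor pool, the "true" descriptors, and all GA settings are invented for illustration):

    ```python
    import random

    random.seed(7)

    N_DESC = 40                   # size of the candidate descriptor pool (toy)
    TRUE = {3, 8, 15, 22, 31}     # descriptors assumed to drive retention

    def fitness(subset):
        """Stand-in for a PLS model's RMSEP: one unit per informative
        descriptor left out, plus a small parsimony penalty per extra."""
        return len(TRUE - subset) + 0.1 * len(subset - TRUE)

    def ga_select(pop_size=30, gens=80, p_mut=0.05):
        """Plain generational GA over descriptor subsets (bit sets)."""
        pop = [set(random.sample(range(N_DESC), 10)) for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=fitness)                 # lower "RMSEP" is better
            survivors = pop[: pop_size // 2]      # elitist truncation selection
            children = []
            while len(children) < pop_size - len(survivors):
                a, b = random.sample(survivors, 2)
                # uniform crossover over the union of the parents' descriptors
                child = {d for d in a | b if random.random() < 0.5} or set(a)
                for d in range(N_DESC):           # bit-flip mutation
                    if random.random() < p_mut:
                        child ^= {d}
                children.append(child)
            pop = survivors + children
        return min(pop, key=fitness)

    best = ga_select()
    print(sorted(best), fitness(best))
    ```

    In the paper the fitness evaluation is a full PLS fit and RMSEP computation per candidate subset, which is where the algorithms' computational costs diverge.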

  2. SSYST-3. Input description

    International Nuclear Information System (INIS)

    Meyder, R.

    1983-12-01

    The code system SSYST-3 is designed to analyse the thermal and mechanical behaviour of a fuel rod during a LOCA. The report contains a complete input-list for all modules and several tested inputs for a LOCA analysis. (orig.)

  3. Design and development of bio-inspired framework for reservoir operation optimization

    Science.gov (United States)

    Asvini, M. Sakthi; Amudha, T.

    2017-12-01

Frameworks for optimal reservoir operation play an important role in the management of water resources and delivery of economic benefits. Effective utilization and conservation of water from reservoirs helps to manage water deficit periods. The main challenge in reservoir optimization is to design operating rules that can be used to inform real-time decisions on reservoir release. We develop a bio-inspired framework for the optimization of reservoir release to satisfy the diverse needs of various stakeholders. In this work, single-objective and multiobjective optimization problems are formulated using an algorithm known as "strawberry optimization" and tested with actual reservoir data. Results indicate that well-planned reservoir operations lead to efficient deployment of the reservoir water with the help of optimal release patterns.

  4. Design, Development and Optimization of a Low Cost System for Digital Industrial Radiology

    International Nuclear Information System (INIS)

    2013-01-01

regional training courses in which participants from Member States were given training in DIR techniques. The IAEA also supported establishing facilities for DIR techniques in some Member States. Realizing the need for easy construction and assembly of a low cost, more economically viable system for DIR technology, the IAEA conducted a coordinated research project (CRP) during 2007-2010 for research and development in the field of digital radiology, with the participation of 12 Member State laboratories. The current publication on design, development and optimization of a low cost DIR system is based on the findings of this CRP and inputs from other experts. The report provides guidelines to enable interested Member States to build their own DIR system in an affordable manner.

  5. Development of an approach to facilitate optimal equipment replacement.

    Science.gov (United States)

    1999-10-01

The principal objective of this study was to develop a procedure using available departmental data on operation, maintenance and replacement costs to provide the Louisiana Department of Transportation and Development with guidelines for the identific...

  6. Development and flight testing of UV optimized Photon Counting CCDs

    Science.gov (United States)

    Hamden, Erika T.

    2018-06-01

I will discuss the latest results from the Hamden UV/Vis Detector Lab and our ongoing work using a UV optimized EMCCD in flight. Our lab is currently testing the efficiency and performance of delta-doped, anti-reflection coated EMCCDs, in collaboration with JPL. The lab has been set up to test quantum efficiency, dark current, clock-induced charge, and read noise. I will describe our improvements to our circuit boards for lower noise, updates from a new, more flexible NUVU controller, and the integration of an EMCCD in the FIREBall-2 UV spectrograph. I will also briefly describe future plans to conduct radiation testing on delta-doped EMCCDs (both warm, unbiased and cold, biased configurations) this summer, and longer-term plans for testing newer photon counting CCDs as I move the HUVD Lab to the University of Arizona in the fall of 2018.

  7. ColloInputGenerator

    DEFF Research Database (Denmark)

    2013-01-01

    This is a very simple program to help you put together input files for use in Gries' (2007) R-based collostruction analysis program. It basically puts together a text file with a frequency list of lexemes in the construction and inserts a column where you can add the corpus frequencies. It requires...... it as input for basic collexeme collostructional analysis (Stefanowitsch & Gries 2003) in Gries' (2007) program. ColloInputGenerator is, in its current state, based on programming commands introduced in Gries (2009). Projected updates: Generation of complete work-ready frequency lists....

  8. Development of the complex general linear model in the Fourier domain: application to fMRI multiple input-output evoked responses for single subjects.

    Science.gov (United States)

    Rio, Daniel E; Rawlings, Robert R; Woltz, Lawrence A; Gilman, Jodi; Hommer, Daniel W

    2013-01-01

    A linear time-invariant model based on statistical time series analysis in the Fourier domain for single subjects is further developed and applied to functional MRI (fMRI) blood-oxygen level-dependent (BOLD) multivariate data. This methodology was originally developed to analyze multiple stimulus input evoked response BOLD data. However, to analyze clinical data generated using a repeated measures experimental design, the model has been extended to handle multivariate time series data and demonstrated on control and alcoholic subjects taken from data previously analyzed in the temporal domain. Analysis of BOLD data is typically carried out in the time domain where the data has a high temporal correlation. These analyses generally employ parametric models of the hemodynamic response function (HRF) where prewhitening of the data is attempted using autoregressive (AR) models for the noise. However, this data can be analyzed in the Fourier domain. Here, assumptions made on the noise structure are less restrictive, and hypothesis tests can be constructed based on voxel-specific nonparametric estimates of the hemodynamic transfer function (HRF in the Fourier domain). This is especially important for experimental designs involving multiple states (either stimulus or drug induced) that may alter the form of the response function.

  9. Experimental evaluation of optimization method for developing ultraviolet barrier coatings

    Science.gov (United States)

    Gonome, Hiroki; Okajima, Junnosuke; Komiya, Atsuki; Maruyama, Shigenao

    2014-01-01

Ultraviolet (UV) barrier coatings can be used to protect many industrial products from UV attack. This study introduces a method of optimizing UV barrier coatings using pigment particles. The radiative properties of the pigment particles were evaluated theoretically, and the optimum particle size was decided from the absorption efficiency and the back-scattering efficiency. UV barrier coatings were prepared with zinc oxide (ZnO) and titanium dioxide (TiO2). The transmittance of the UV barrier coating was calculated theoretically. The radiative transfer in the UV barrier coating was modeled using the radiation element method by ray emission model (REM2). In order to validate the calculated results, the transmittances of these coatings were measured by a spectrophotometer. A UV barrier coating with a low UV transmittance and high VIS transmittance could be achieved. The calculated transmittance showed a spectral tendency similar to the measured one. The use of appropriate particles with optimum size, coating thickness and volume fraction will result in effective UV barrier coatings, which can thus be achieved by the application of optical engineering.

  10. Development of an optimal velocity selection method with velocity obstacle

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Min Geuk; Oh, Jun Ho [KAIST, Daejeon (Korea, Republic of)

    2015-08-15

The velocity obstacle (VO) method is one of the most well-known methods for local path planning, allowing consideration of dynamic obstacles and unexpected obstacles. Typical VO methods separate a velocity map into a collision area and a collision-free area. A robot can avoid collisions by selecting its velocity from within the collision-free area. However, if there are numerous obstacles near a robot, the robot will have very few velocity candidates. In this paper, a method for choosing optimal velocity components using the concepts of pass-time and vertical clearance is proposed for the efficient movement of a robot. The pass-time is the time required for a robot to pass by an obstacle. By generating a latticized available-velocity map for a robot, each velocity component can be evaluated using a cost function that considers the pass-time and other aspects. From the output of the cost function, even a velocity component that will cause a collision in the future can be chosen as the final velocity if the pass-time is sufficiently long.
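
    The selection step described above can be sketched as follows; the obstacle layout, cost weights, and lattice resolution are hypothetical, and the VO membership test is a simple discretized ray check rather than the paper's formulation:

    ```python
    import math

    GOAL_V = (1.0, 0.0)                   # preferred velocity toward the goal
    OBS_POS, OBS_RAD = (2.0, 0.1), 0.6    # one obstacle: relative position, radius

    def in_velocity_obstacle(v, horizon=4.0, steps=40):
        """Does a straight ray at velocity v enter the obstacle disc
        within the time horizon? (discretized VO membership test)"""
        return any(
            math.hypot(OBS_POS[0] - v[0] * t, OBS_POS[1] - v[1] * t) < OBS_RAD
            for t in (horizon * i / steps for i in range(1, steps + 1))
        )

    def cost(v):
        """Illustrative cost: deviation from the goal velocity plus a
        pass-time term (taking longer to pass the obstacle costs more)."""
        dev = math.hypot(v[0] - GOAL_V[0], v[1] - GOAL_V[1])
        speed = math.hypot(*v)
        t_pass = OBS_POS[0] / speed if speed > 1e-9 else float("inf")
        return dev + 0.1 * t_pass

    # Latticized available-velocity map: keep collision-free nodes only,
    # then pick the minimum-cost node instead of the one nearest GOAL_V.
    lattice = [(0.1 * i, 0.1 * j) for i in range(13) for j in range(-12, 13)]
    free = [v for v in lattice if not in_velocity_obstacle(v)]
    best = min(free, key=cost)
    print("chosen velocity:", best)
    ```

    The pass-time term is what breaks ties between equally goal-aligned candidates: a slightly faster velocity that clears the obstacle sooner can win over one that merely minimizes deviation.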

  11. Developing Fast Fluorescent Protein Voltage Sensors by Optimizing FRET Interactions.

    Directory of Open Access Journals (Sweden)

    Uhna Sung

FRET (Förster Resonance Energy Transfer)-based protein voltage sensors can be useful for monitoring neuronal activity in vivo because the ratio of signals between the donor and acceptor pair reduces common sources of noise such as heart beat artifacts. We improved the performance of FRET-based genetically encoded fluorescent protein (FP) voltage sensors by optimizing the location of donor and acceptor FPs flanking the voltage sensitive domain of the Ciona intestinalis voltage sensitive phosphatase. First, we created 39 different "Nabi1" constructs by positioning the donor FP, UKG, at 8 different locations downstream of the voltage-sensing domain and the acceptor FP, mKO, at 6 positions upstream. Several of these combinations resulted in large voltage-dependent signals and relatively fast kinetics. Nabi1 probes responded with signal sizes up to 11% ΔF/F for a 100 mV depolarization and fast response time constants for both signal activation (~2 ms) and signal decay (~3 ms). We improved expression in neuronal cells by replacing the mKO and UKG FRET pair with Clover (donor FP) and mRuby2 (acceptor FP) to create Nabi2 probes. Nabi2 probes also had large signals and relatively fast time constants in HEK293 cells. In primary neuronal culture, a Nabi2 probe was able to differentiate individual action potentials at 45 Hz.

  12. XFEL diffraction: developing processing methods to optimize data quality

    Energy Technology Data Exchange (ETDEWEB)

    Sauter, Nicholas K., E-mail: nksauter@lbl.gov [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States)

    2015-01-29

    Bragg spots recorded from a still crystal necessarily give partial measurements of the structure factor intensity. Correction to the full-spot equivalent, relying on both a physical model for crystal disorder and postrefinement of the crystal orientation, improves the electron density map in serial crystallography. Serial crystallography, using either femtosecond X-ray pulses from free-electron laser sources or short synchrotron-radiation exposures, has the potential to reveal metalloprotein structural details while minimizing damage processes. However, deriving a self-consistent set of Bragg intensities from numerous still-crystal exposures remains a difficult problem, with optimal protocols likely to be quite different from those well established for rotation photography. Here several data processing issues unique to serial crystallography are examined. It is found that the limiting resolution differs for each shot, an effect that is likely to be due to both the sample heterogeneity and pulse-to-pulse variation in experimental conditions. Shots with lower resolution limits produce lower-quality models for predicting Bragg spot positions during the integration step. Also, still shots by their nature record only partial measurements of the Bragg intensity. An approximate model that corrects to the full-spot equivalent (with the simplifying assumption that the X-rays are monochromatic) brings the distribution of intensities closer to that expected from an ideal crystal, and improves the sharpness of anomalous difference Fourier peaks indicating metal positions.

  13. A complex of optimization problems in planning for the development of mining operations in coal mines

    Energy Technology Data Exchange (ETDEWEB)

    Todorov, A K; Arnaudov, B K; Brankova, B A; Gyuleva, B I; Zakhariyev, G K

    1977-01-01

The system for planning for the development of coal mines is a complex of interrelated plan optimization, plan calculation and supporting (accounting-analytical and standards) tasks. An important place in this complex is occupied by the plan optimization tasks. The synthesis and structural peculiarities of the system, and the essence and machine implementation of the tasks, are examined.

  14. Simple Design Tool for Development of Well Insulated Window Frames and Optimization of the Frame Geometry

    DEFF Research Database (Denmark)

    Zajas, Jan Jakub; Heiselberg, Per

    2012-01-01

This paper describes a design tool created with the purpose of designing highly insulated window frames. The design tool is based on a parametric model of the frame geometry, where various parameters describing the frame can be easily changed by the user. Based on this input, geometry of the frame is generated by the program and is used by the finite element simulator to calculate the thermal performance of the frame (the U value). After the initial design is evaluated, the user can quickly modify chosen parameters and generate a new design. This process can then be repeated in multiple iterations in order to approach an optimal solution. The program was also used to conduct an optimization process of the frame geometry. A large number of various window frame designs were created and evaluated, based on their insulation properties. The paper presents the investigation process and some of the best designs.

  15. Input description for BIOPATH

    International Nuclear Information System (INIS)

    Marklund, J.E.; Bergstroem, U.; Edlund, O.

    1980-01-01

    The computer program BIOPATH describes the flow of radioactivity within a given ecosystem after a postulated release of radioactive material and the resulting dose for specified population groups. The present report accounts for the input data necessary to run BIOPATH. The report also contains descriptions of possible control cards and an input example as well as a short summary of the basic theory.(author)

  16. Input and execution

    International Nuclear Information System (INIS)

    Carr, S.; Lane, G.; Rowling, G.

    1986-11-01

    This document describes the input procedures, input data files and operating instructions for the SYVAC A/C 1.03 computer program. SYVAC A/C 1.03 simulates the groundwater mediated movement of radionuclides from underground facilities for the disposal of low and intermediate level wastes to the accessible environment, and provides an estimate of the subsequent radiological risk to man. (author)

  17. Developments in model-based optimization and control distributed control and industrial applications

    CERN Document Server

    Grancharova, Alexandra; Pereira, Fernando

    2015-01-01

    This book deals with optimization methods as tools for decision making and control in the presence of model uncertainty. It is oriented to the use of these tools in engineering, specifically in automatic control design with all its components: analysis of dynamical systems, identification problems, and feedback control design. Developments in Model-Based Optimization and Control takes advantage of optimization-based formulations for such classical feedback design objectives as stability, performance and feasibility, afforded by the established body of results and methodologies constituting optimal control theory. It makes particular use of the popular formulation known as predictive control or receding-horizon optimization. The individual contributions in this volume are wide-ranging in subject matter but coordinated within a five-part structure covering material on: · complexity and structure in model predictive control (MPC); · collaborative MPC; · distributed MPC; · optimization-based analysis and desi...

  18. Reliability Evaluation for Optimizing Electricity Supply in a Developing Country

    OpenAIRE

    Mark Ndubuka NWOHU

    2007-01-01

The reliability standards for electricity supply in a developing country like Nigeria have to be determined based on past engineering principles and practice. Because of the high demand for electrical power due to rapid development, industrialization and rural electrification, the economic, social and political climate in which the electric power supply industry now operates should be critically examined to ensure that the production of electrical power is augmented and remains uninterrupted. ...

  19. Visual gene developer: a fully programmable bioinformatics software for synthetic gene optimization

    Directory of Open Access Journals (Sweden)

    McDonald Karen

    2011-08-01

Background: Direct gene synthesis is becoming more popular owing to decreases in gene synthesis pricing. Compared with using natural genes, gene synthesis provides a good opportunity to optimize gene sequence for specific applications. In order to facilitate gene optimization, we have developed a stand-alone software called Visual Gene Developer. Results: The software not only provides general functions for gene analysis and optimization along with an interactive user-friendly interface, but also includes unique features such as programming capability, dedicated mRNA secondary structure prediction, artificial neural network modeling, network & multi-threaded computing, and user-accessible programming modules. The software allows a user to analyze and optimize a sequence using main menu functions or specialized module windows. Alternatively, gene optimization can be initiated by designing a gene construct and configuring an optimization strategy. A user can choose several predefined or user-defined algorithms to design a complicated strategy. The software provides expandable functionality as platform software supporting module development using popular script languages such as VBScript and JScript in the software programming environment. Conclusion: Visual Gene Developer is useful for both researchers who want to quickly analyze and optimize genes, and those who are interested in developing and testing new algorithms in bioinformatics. The software is available for free download at http://www.visualgenedeveloper.net.
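
    As a flavor of the simplest optimization strategy such a tool can apply, here is a hedged sketch of "one amino acid, one codon" back-translation; the usage table is a truncated toy, not Visual Gene Developer's data or API:

    ```python
    # Hypothetical codon-usage fractions for a toy host organism; a real
    # table would cover all 20 amino acids and the 61 sense codons.
    USAGE = {
        "M": {"ATG": 1.0},
        "K": {"AAA": 0.74, "AAG": 0.26},
        "F": {"TTT": 0.57, "TTC": 0.43},
        "L": {"CTG": 0.47, "TTA": 0.14, "TTG": 0.13, "CTT": 0.12,
              "CTC": 0.10, "CTA": 0.04},
        "*": {"TAA": 0.61, "TGA": 0.30, "TAG": 0.09},
    }

    def optimize(protein):
        """'One amino acid, one codon' strategy: pick the host's most
        frequent codon for every residue of the protein sequence."""
        return "".join(max(USAGE[aa], key=USAGE[aa].get) for aa in protein)

    dna = optimize("MKFL*")
    print(dna)   # → ATGAAATTTCTGTAA
    ```

    Real strategies layer further objectives on top of this (mRNA secondary structure, repeat avoidance, restriction sites), which is why the software supports user-defined multi-step strategies.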

  20. Integration of safety engineering into a cost optimized development program.

    Science.gov (United States)

    Ball, L. W.

    1972-01-01

    A six-segment management model is presented, each segment of which represents a major area in a new product development program. The first segment of the model covers integration of specialist engineers into 'systems requirement definition' or the system engineering documentation process. The second covers preparation of five basic types of 'development program plans.' The third segment covers integration of system requirements, scheduling, and funding of specialist engineering activities into 'work breakdown structures,' 'cost accounts,' and 'work packages.' The fourth covers 'requirement communication' by line organizations. The fifth covers 'performance measurement' based on work package data. The sixth covers 'baseline requirements achievement tracking.'

  1. Optimal Height Of Land Development – An Economic Perspective

    Directory of Open Access Journals (Sweden)

    Żelazowski Konrad

    2015-03-01

Skyscrapers are rising in the panorama of big modern cities more and more often, becoming a symbol of dynamic growth and prestige. High-rise development appears to be an answer to the expanding demand for new residential and commercial space as urban land prices continue to go up and the availability of land decreases.

  2. Developing a Quality Improvement Process to Optimize Faculty Success

    Science.gov (United States)

    Merillat, Linda; Scheibmeir, Monica

    2016-01-01

    As part of a major shift to embed quality improvement processes within a School of Nursing at a medium-sized Midwestern university, a faculty enrichment program using a Plan-Do-Act-Study design was implemented. A central focus for the program was the development and maintenance of an online faculty resource center identified as "My Faculty…

  3. Intentional Design of Student Organizations to Optimize Leadership Development.

    Science.gov (United States)

    Mainella, Felicia C

    2017-09-01

    This chapter addresses how a group's organizational structure can promote or hinder the leadership capacity of its members. The information in this chapter provides insight into structuring student organizations in a way to maximize all members' leadership development. © 2017 Wiley Periodicals, Inc., A Wiley Company.

  4. Concept development and needs identification for intelligent network flow optimization (INFLO) : concept of operations.

    Science.gov (United States)

    2012-06-01

The purpose of this project is to develop the Intelligent Network Flow Optimization (INFLO), which is one collection (or bundle) of high-priority transformative applications identified by the United States Department of Transportation (USDOT) Mob...

  5. Concept development and needs identification for intelligent network flow optimization (INFLO) : test readiness assessment.

    Science.gov (United States)

    2012-11-01

The purpose of this project is to develop the Intelligent Network Flow Optimization (INFLO), which is one collection (or bundle) of high-priority transformative applications identified by the United States Department of Transportation (USDOT) Mob...

  6. Development of Design Tools for the Optimization of Biologically Based Control Systems

    Data.gov (United States)

    National Aeronautics and Space Administration — I plan to develop software that aids in the design of biomimetic control systems by optimizing the properties of the system in order to produce the desired output....

  7. How to Optimally Interdict a Belligerent Project to Develop a Nuclear Weapon

    National Research Council Canada - National Science Library

    Skroch, Eric

    2004-01-01

... a large-scale project management model that includes alternate development paths to achieve certain key technical milestones. We show how such a project can be optimally accelerated by expediting critical...

  8. Methods of the enterprise cash flows optimization in the context of sustainable development

    OpenAIRE

    O. Bardyn

    2015-01-01

This paper deals with the nature of, and an analysis of current approaches to, the optimization of enterprise cash flows. Ways and management directions for achieving sustainable development of the enterprise are justified.

  9. Development of an external exposure computational model for studying the entrance skin dose in chest and vertebral column radiographs

    International Nuclear Information System (INIS)

    Muniz, Bianca C.; Menezes, Claudio J.M.; Vieira, Jose W.

    2014-01-01

Dosimetric measurements cannot always be made directly in the human body. Such assessments can instead be performed using anthropomorphic models (phantoms) combined into computational exposure models (MCE) that apply Monte Carlo techniques in virtual simulations. These processing techniques, coupled with more powerful and affordable computers, make the Monte Carlo method one of the most widely used tools in the radiation transport area. In this work, the Monte Carlo EGS4 program was used to develop a computational model of external exposure to study the entrance skin dose for chest and vertebral column X-radiography, aiming to optimize these practices by reducing doses to patients, the professionals involved and the general public. The results obtained experimentally with the Radcal electrometer, model 9015, associated with the model 10X5-6 ionization chamber for radiology, showed that the proposed computational model can be used in quality assurance programs in radiodiagnosis, evaluating the entrance skin dose when varying parameters of the radiation beam such as the kilovoltage peak (kVp), current-time product (mAs), total filtration and source-to-surface distance (DFS), optimizing radiodiagnostic practices while meeting current regulations.
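
    The beam parameters mentioned (kVp, mAs, distance) enter a textbook entrance-skin-dose estimate; a minimal sketch with an assumed kVp-squared tube-output model and illustrative calibration constants (none of these numbers are from the paper):

    ```python
    def entrance_skin_dose(kvp, mas, fsd_m, bsf=1.35,
                           y_ref=0.08, kvp_ref=80.0, d_ref=1.0):
        """ESD (mGy) from an inverse-square tube-output model:
        ESD = Y_ref * (kVp/kVp_ref)^2 * mAs * (d_ref/FSD)^2 * BSF.
        y_ref (mGy/mAs at 1 m and 80 kVp) and bsf (backscatter factor)
        are illustrative values a real QA program would measure."""
        return y_ref * (kvp / kvp_ref) ** 2 * mas * (d_ref / fsd_m) ** 2 * bsf

    # Hypothetical chest technique: 102 kVp, 10 mAs, 1.8 m focus-skin distance
    print(round(entrance_skin_dose(kvp=102, mas=10, fsd_m=1.8), 3))
    ```

    The inverse-square and kVp-squared dependencies are exactly the knobs the abstract varies (kVp, mAs, filtration, source-to-surface distance) when checking the model against the ionization-chamber measurements.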

  10. Sizing solar home systems for optimal development impact

    International Nuclear Information System (INIS)

    Bond, M.; Fuller, R.J.; Aye, Lu

    2012-01-01

The paper compares the development impact of three different sized solar home systems (SHS) (10, 40 and 80 Wp) installed in rural East Timor. It describes research aimed to determine whether the higher cost of the larger systems was justified by additional household benefits. To assess the development impact of these different sizes of SHS the research used a combination of participatory and quantitative tools. Participatory exercises were conducted with seventy-seven small groups of SHS users in twenty-four rural communities and supplemented with a household survey of 195 SHS users. The combined results of these evaluation processes enabled the three sizes of SHS to be compared for two types of benefits—those associated with carrying out important household tasks and attributes of SHS which were advantageous compared to the use of non-electric lighting sources. The research findings showed that the small, 10 Wp SHS provided much of the development impact of the larger systems. It suggests three significant implications for the design of SHS programs in contexts such as East Timor: provide more small systems rather than fewer large ones; provide lighting in the kitchen wherever possible; and carefully match SHS operating costs to the incomes of rural users. - Highlights: ► We compare development benefits for 3 sizes of solar home systems—10, 40 and 80 Wp. ► Benefit assessment uses a combination of qualitative and quantitative approaches. ► Small systems are found to provide much of the benefits of the larger systems. ► To maximise benefits systems should be fitted with luminaires in kitchen areas. ► Financial benefits are important to users and may not accrue for large systems.

  11. Development and Optimization of Silver Nanoparticle Formulation for Fabrication

    Science.gov (United States)

    2015-08-14

steady increase in multi-drug resistant organisms. Therefore, the development of next generation antimicrobial compounds, such as silver nanoparticles, is a priority. However, due to the inconsistencies in current fabrication and processing methods, it is unclear whether the antimicrobial

  12. Development of a VVER-1000 core loading pattern optimization program based on perturbation theory

    International Nuclear Information System (INIS)

    Hosseini, Mohammad; Vosoughi, Naser

    2012-01-01

Highlights: ► We use perturbation theory to find an optimum fuel loading pattern in a VVER-1000. ► We provide software for in-core fuel management optimization. ► We consider two objectives for our method (perturbation theory). ► We show that the perturbation theory method is very fast and accurate for optimization. - Abstract: In-core nuclear fuel management is one of the most important concerns in the design of nuclear reactors. Two main goals in core fuel loading pattern design optimization are maximizing the core effective multiplication factor in order to extract the maximum energy, and keeping the local power peaking factor lower than a predetermined value to maintain fuel integrity. Because of the numerous possible patterns of fuel assemblies in the reactor core, finding the best configuration is both important and challenging. Different techniques for the optimization of the fuel loading pattern in the reactor core have been introduced by now. In this study, software was written in the C language to find an ordering of the fuel loading pattern of a VVER-1000 reactor core using perturbation theory. Our optimization method is based on minimizing the radial power peaking factor. The optimization process is launched by considering an initial loading pattern and the specifications of the fuel assemblies, which are given as input to the software. The results for a typical VVER-1000 reactor reveal that the method can reach a pattern with an acceptable radial power peaking factor and increases the cycle length by 1.1 days as well.
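
    A toy illustration of loading-pattern search against the radial power peaking factor: a greedy pairwise-swap loop over a one-dimensional "core", standing in for the paper's perturbation-theory-guided method (weights and reactivities are invented):

    ```python
    import itertools

    # Toy 1-D core: importance weights per position (center-peaked) and
    # k-infinity values for eight assemblies; all numbers are illustrative.
    WEIGHT = [0.6, 0.8, 1.1, 1.4, 1.4, 1.1, 0.8, 0.6]
    FRESH = [0.95, 1.00, 1.05, 1.10, 1.15, 1.20, 1.25, 1.30]

    def ppf(pattern):
        """Radial power peaking factor: max node power over mean node power,
        with node power modeled as importance weight times reactivity."""
        p = [w * k for w, k in zip(WEIGHT, pattern)]
        return max(p) / (sum(p) / len(p))

    def improve(pattern):
        """Greedy pairwise-swap search: keep any swap that lowers the PPF,
        repeat until no swap helps (terminates: each kept swap strictly
        improves, and the number of distinct patterns is finite)."""
        pattern = list(pattern)
        improved = True
        while improved:
            improved = False
            for i, j in itertools.combinations(range(len(pattern)), 2):
                before = ppf(pattern)
                pattern[i], pattern[j] = pattern[j], pattern[i]
                if ppf(pattern) < before - 1e-12:
                    improved = True          # keep the swap
                else:
                    pattern[i], pattern[j] = pattern[j], pattern[i]  # undo
        return pattern

    best = improve(FRESH)
    print(f"PPF {ppf(FRESH):.3f} -> {ppf(best):.3f}")
    ```

    The flattening behavior (high-reactivity assemblies migrating toward low-importance peripheral positions) mirrors why minimizing the peaking factor is a sensible single objective for the search; the paper's contribution is using perturbation theory to rank swaps cheaply instead of re-solving the core for each candidate.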

  13. Developing a simulation framework for safe and optimal trajectories considering drivers’ driving style

    DEFF Research Database (Denmark)

    Gruber, Thierry; Larue, Grégoire S.; Rakotonirainy, Andry

    2017-01-01

Advanced driving assistance systems (ADAS) have huge potential for improving road safety and travel times. However, their take-up in the market is very slow, and these systems should consider drivers' preferences to increase adoption rates. The aim of this study is to develop a model providing drivers with the optimal trajectory considering the motorist's driving style in real time. Travel duration and safety are the main parameters used to find the optimal trajectory. A simulation framework to determine the optimal trajectory was developed in which the ego car travels in a highway environment…

  14. Developing an optimal valve closing rule curve for real-time pressure control in pipes

    Energy Technology Data Exchange (ETDEWEB)

    Bazarganlari, Mohammad Reza; Afshar, Hossein [Islamic Azad University, Tehran (Iran, Islamic Republic of); Kerachian, Reza [University of Tehran, Tehran (Iran, Islamic Republic of); Bashiazghadi, Seyyed Nasser [Iran University of Science and Technology, Tehran (Iran, Islamic Republic of)

    2013-01-15

    Sudden valve closure in pipeline systems can cause high pressures that may lead to serious damage. Using an optimal valve closing rule can play an important role in managing extreme pressures during sudden valve closure. In this paper, an optimal closing rule curve is developed using a multi-objective optimization model and Bayesian networks (BNs) for controlling water pressure during valve closure, instead of traditional step functions or single linear functions. The method of characteristics is used to simulate transient flow caused by valve closure. The non-dominated sorting genetic algorithm-II (NSGA-II) is used to develop a Pareto front among three objectives related to the maximum and minimum water pressures and the amount of water passing through the valve during the valve-closing process. Simulation and optimization processes are usually time-consuming, thus the results of the optimization model are used for training the BN. The trained BN is capable of determining optimal real-time closing rules without running costly simulation and optimization models. To demonstrate its efficiency, the proposed methodology is applied to a reservoir-pipe-valve system and the optimal closing rule curve is calculated for the valve. The results of the linear and BN-based valve closure rules show that the latter can significantly reduce the range of variations in water hammer pressures.
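
    The heart of the NSGA-II step mentioned above is non-dominated filtering. The sketch below applies it to hypothetical (maximum surge pressure, pressure undershoot, negative throughput) triples, all treated as objectives to minimise; the actual study couples this kind of sorting with a method-of-characteristics transient solver.

```python
def dominates(a, b):
    # a dominates b if it is no worse in every objective (all minimised)
    # and strictly better in at least one
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    # keep only the points not dominated by any other candidate
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# hypothetical objective triples for four candidate closing rules:
# (max surge pressure, pressure undershoot, negative volume passed)
candidates = [(180.0, 30.0, -5.0), (150.0, 45.0, -4.0),
              (150.0, 45.0, -3.0), (200.0, 20.0, -6.0)]
front = pareto_front(candidates)
```

    The third candidate is dominated by the second (same pressures, less throughput) and drops out, while the remaining three represent genuine trade-offs among the objectives.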

  15. Curcumin phytosomal softgel formulation: Development, optimization and physicochemical characterization.

    Science.gov (United States)

    Allam, Ahmed N; Komeil, Ibrahim A; Abdallah, Ossama Y

    2015-09-01

    Curcumin, a naturally occurring lipophilic molecule, can exert multiple and diverse bioactivities. However, its limited aqueous solubility and extensive presystemic metabolism restrict its bioavailability. Curcumin phytosomes were prepared by a simple solvent evaporation method in which a free-flowing powder was obtained, in addition to a newly developed semisolid formulation to increase curcumin content in softgels. The phytosomal powder was characterized in terms of drug content and zeta potential. Thirteen different softgel formulations were developed using oils such as Miglyol 812, castor oil and oleic acid, a hydrophilic vehicle such as PEG 400, and bioactive surfactants such as Cremophor EL and KLS P 124. Selected formulations were characterized in terms of curcumin in vitro dissolution. TEM analysis revealed good stability and a spherical, self-closed structure of curcumin phytosomes in complex formulations. Stability studies of the chosen formulations prepared using the hydrophilic vehicle revealed a stable curcumin dissolution pattern. In contrast, a dramatic decrease in curcumin dissolution was observed in the case of phytosomes formulated in oily vehicles.

  16. Curcumin phytosomal softgel formulation: Development, optimization and physicochemical characterization

    Directory of Open Access Journals (Sweden)

    Allam Ahmed N.

    2015-09-01

    Full Text Available Curcumin, a naturally occurring lipophilic molecule, can exert multiple and diverse bioactivities. However, its limited aqueous solubility and extensive presystemic metabolism restrict its bioavailability. Curcumin phytosomes were prepared by a simple solvent evaporation method in which a free-flowing powder was obtained, in addition to a newly developed semisolid formulation to increase curcumin content in softgels. The phytosomal powder was characterized in terms of drug content and zeta potential. Thirteen different softgel formulations were developed using oils such as Miglyol 812, castor oil and oleic acid, a hydrophilic vehicle such as PEG 400, and bioactive surfactants such as Cremophor EL and KLS P 124. Selected formulations were characterized in terms of curcumin in vitro dissolution. TEM analysis revealed good stability and a spherical, self-closed structure of curcumin phytosomes in complex formulations. Stability studies of the chosen formulations prepared using the hydrophilic vehicle revealed a stable curcumin dissolution pattern. In contrast, a dramatic decrease in curcumin dissolution was observed in the case of phytosomes formulated in oily vehicles.

  17. Critical overview of the development of the optimization requirement and its implementation

    International Nuclear Information System (INIS)

    Gonzalez, A.J.

    1988-01-01

    This paper is intended to provide a critical overview of the development of the optimization requirement of the system of dose limitation recommended by the International Commission on Radiological Protection (ICRP), as well as of its implementation. The concept of optimization began to evolve in the mid-1950s and was formally introduced as a requirement for radiation protection in 1978. Recommendations on its practical implementation have been available for five years. After such a long evolution, it seems reasonable to make a critical assessment of the development of the optimization concept, and this summary paper is intended to provide such an assessment. It does not include a description of the requirement or of any method for implementing it, since it is assumed that optimization is familiar by now. The paper concentrates rather on misunderstandings of, and achievements owed to, optimization and explores some of their underlying causes, the intention being to draw lessons from past experience and to apply them to future developments in this area. The paper closes with an outlook on some remaining policy issues that are still pending, as well as some suggestions on prioritizing the implementation of optimization and on standardizing the optimization of protection.

  18. Optimally combined regional geoid models for the realization of height systems in developing countries - ORG4heights

    Science.gov (United States)

    Lieb, Verena; Schmidt, Michael; Willberg, Martin; Pail, Roland

    2017-04-01

    Precise height systems require high-resolution, high-quality gravity data. However, such data sets are sparse, especially in developing or newly industrializing countries. We therefore initiated the DFG project "ORG4heights" to formulate a general scientific concept for how to (1) optimally combine all available data sets and (2) estimate realistic errors. The resulting regional gravity field models then deliver the fundamental basis for (3) establishing physical national height systems. The key innovation of the project is the development of a method that links gravity satellite mission data (low to mid resolution) with terrestrial data (high to low quality). An optimal combination of the data that exploits their full information content, including uncertainty quantification and analysis of systematic omission errors, is pursued. Regional gravity field modeling via Multi-Resolution Representation (MRR) and Least Squares Collocation (LSC) are studied in detail and compared on the basis of their theoretical fundamentals. Building on these findings, the MRR shall be further developed towards a pyramid algorithm. Within the project, we investigate comprehensive case studies in Saudi Arabia and South America, i.e., regions with varying topography, by means of simulated data with heterogeneous distribution, resolution, quality and altitude. GPS and tide gauge records serve as complementary input or validation data. The resulting products include error propagation as well as internal and external validation. A generalized concept is then derived for establishing physical height systems in developing countries; the recommendations may serve as guidelines for science and administration. We present the ideas and strategies of the project, which combines methodical development and practical applications with high socio-economic impact.

  19. Generalized DSS shell for developing simulation and optimization hydro-economic models of complex water resources systems

    Science.gov (United States)

    Pulido-Velazquez, Manuel; Lopez-Nicolas, Antonio; Harou, Julien J.; Andreu, Joaquin

    2013-04-01

    Hydrologic-economic models allow integrated analysis of water supply, demand and infrastructure management at the river basin scale. These models simultaneously analyze the engineering, hydrology and economic aspects of water resources management. Two new tools have been designed to develop models within this approach: a simulation tool (SIM_GAMS), for models in which water is allocated each month based on supply priorities to competing uses and system operating rules, and an optimization tool (OPT_GAMS), in which water resources are allocated optimally following economic criteria. The characterization of the water resource network system requires a connectivity matrix representing the topology of the elements, generated using HydroPlatform. HydroPlatform, an open-source software platform for network (node-link) models, allows users to store, display and export all the information needed to characterize the system. Two generic non-linear models have been programmed in GAMS to use the inputs from HydroPlatform in simulation and optimization models. The simulation model allocates water resources on a monthly basis, according to different targets (demands, storage, environmental flows, hydropower production, etc.), priorities and other system operating rules (such as reservoir operating rules). The optimization model's objective function is designed so that the system meets operational targets (ranked according to priorities) each month while following system operating rules. This function is analogous to the one used in the simulation module of the DSS AQUATOOL. Each element of the system has its own contribution to the objective function through unit cost coefficients that preserve the relative priority rank and the system operating rules. The model incorporates groundwater and stream-aquifer interaction (allowing conjunctive use simulation) with a wide range of modeling options, from lumped and analytical approaches to parameter-distributed models (eigenvalue approach). Such
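
    The simulation tool's monthly priority-based allocation can be sketched as a greedy loop. The demand names, volumes and priorities below are hypothetical; SIM_GAMS additionally handles network topology, storage and operating rules.

```python
def allocate(available, demands):
    # demands: (name, requested volume, priority); a lower priority number
    # is served first, and each demand gets what remains up to its request
    delivered = {}
    for name, requested, _ in sorted(demands, key=lambda d: d[2]):
        delivered[name] = max(0.0, min(requested, available))
        available -= delivered[name]
    return delivered

# hypothetical demands (hm3) for a single month with 100 hm3 available
month = allocate(100.0, [("urban", 40.0, 1),
                         ("environmental", 20.0, 2),
                         ("irrigation", 70.0, 3)])
```

    In this toy month the lowest-priority irrigation demand absorbs the shortage, which is exactly the behaviour a priority-ranked objective function reproduces in optimization form.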

  20. Development of a method of robust rain gauge network optimization based on intensity-duration-frequency results

    Directory of Open Access Journals (Sweden)

    A. Chebbi

    2013-10-01

    Full Text Available Based on rainfall intensity-duration-frequency (IDF) curves, fitted at several locations of a given area, a robust optimization approach is proposed to identify the best locations to install new rain gauges. The advantage of robust optimization is that the resulting design solutions yield networks which behave acceptably under hydrological variability. Robust optimization can overcome the problem of selecting representative rainfall events when building the optimization process. This paper reports an original approach based on the Montana IDF model parameters. The latter are assumed to be geostatistical variables, and their spatial interdependence is taken into account through the adoption of cross-variograms in the kriging process. The problem of optimally locating a fixed number of new monitoring stations based on an existing rain gauge network is addressed. The objective function is based on the mean spatial kriging variance and the rainfall variogram structure, using a variance-reduction method. Hydrological variability was taken into account by considering and implementing several return periods to define the robust objective function. Variance minimization is performed using a simulated annealing algorithm. In addition, knowledge of the time horizon is needed for the computation of the robust objective function. A short- and a long-term horizon were studied, and optimal networks are identified for each. The method developed is applied to northern Tunisia (area = 21,000 km²). Data inputs for the variogram analysis were IDF curves provided by the hydrological bureau and available for 14 tipping-bucket rain gauges. The recording period was from 1962 to 2001, depending on the station. The study concerns an imaginary network augmentation based on the network configuration in 1973, which is a very significant year in Tunisia because there was an exceptional regional flood event in March 1973. This network consisted of 13 stations and did not meet World
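
    The variance-reduction idea can be sketched with simulated annealing over candidate sites. The objective below is a deliberately crude proxy (mean squared distance from grid points to the nearest gauge) standing in for the mean kriging variance built from the fitted IDF-parameter cross-variograms; all coordinates are hypothetical.

```python
import math
import random

GRID = [(x / 10.0, y / 10.0) for x in range(11) for y in range(11)]

def objective(stations):
    # proxy for the mean kriging variance: average squared distance from
    # each grid point to its nearest gauge (the real objective uses the
    # fitted cross-variograms and several return periods)
    total = 0.0
    for gx, gy in GRID:
        total += min((gx - sx) ** 2 + (gy - sy) ** 2 for sx, sy in stations)
    return total / len(GRID)

def anneal(existing, pool, n_new, iters=2000, t0=0.05, seed=0):
    rng = random.Random(seed)
    current = rng.sample(pool, n_new)
    best, f_cur = current[:], objective(existing + current)
    f_best = f_cur
    for k in range(iters):
        t = t0 * (1.0 - k / iters) + 1e-9          # linear cooling schedule
        cand = current[:]
        cand[rng.randrange(n_new)] = rng.choice(pool)  # move one new site
        f_cand = objective(existing + cand)
        if f_cand < f_cur or rng.random() < math.exp(-(f_cand - f_cur) / t):
            current, f_cur = cand, f_cand
            if f_cur < f_best:
                best, f_best = current[:], f_cur
    return best, f_best

existing = [(0.2, 0.2), (0.8, 0.8)]            # gauges already installed
pool = [(x / 5.0, y / 5.0) for x in range(6) for y in range(6)]
new_sites, proxy_variance = anneal(existing, pool, n_new=3)
```

    Swapping one site per move and occasionally accepting uphill moves is the standard annealing recipe; robustness in the paper's sense comes from folding several return periods into the objective rather than from the search itself.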

  1. Multi-Objective Optimization for Analysis of Changing Trade-Offs in the Nepalese Water-Energy-Food Nexus with Hydropower Development

    DEFF Research Database (Denmark)

    Dhaubanjar, Sanita; Davidsen, Claus; Bauer-Gottwein, Peter

    2017-01-01

    …established water and power system models to develop a decision support tool combining multiple nexus objectives in a linear objective function. To demonstrate our framework, we compare eight Nepalese power development scenarios based on five nexus objectives: minimization of power deficit, maintenance of water availability for irrigation to support food self-sufficiency, reduction in flood risk, maintenance of environmental flows, and maximization of power export. The deterministic multi-objective optimization model is spatially resolved to enable realistic representation of the nexus linkages and accounts for power transmission constraints using an optimal power flow approach. Basin inflows, hydropower plant specifications, reservoir characteristics, reservoir rules, irrigation water demand, environmental flow requirements, power demand, and transmission line properties are provided as model inputs. The trade…

  2. Managing the Public Sector Research and Development Portfolio Selection Process: A Case Study of Quantitative Selection and Optimization

    Science.gov (United States)

    2016-09-01

    Public Sector Research & Development Portfolio Selection Process: A Case Study of Quantitative Selection and Optimization, by Jason A. Schwartz. The report describes how public sector organizations can implement a research and development (R&D) portfolio optimization strategy to maximize the cost…

  3. Development and Application of a Tool for Optimizing Composite Matrix Viscoplastic Material Parameters

    Science.gov (United States)

    Murthy, Pappu L. N.; Naghipour Ghezeljeh, Paria; Bednarcyk, Brett A.

    2018-01-01

    This document describes a recently developed analysis tool, and its application, that enhances the resident capabilities of the Micromechanics Analysis Code with the Generalized Method of Cells (MAC/GMC). MAC/GMC is a composite material and laminate analysis software package developed at NASA Glenn Research Center. The primary focus of the current effort is to provide a graphical user interface (GUI) capability that helps users optimize highly nonlinear viscoplastic constitutive law parameters by fitting experimentally observed/measured stress-strain responses under various thermo-mechanical conditions for braided composites. The tool has been developed using the MATrix LABoratory (MATLAB) (The Mathworks, Inc., Natick, MA) programming language. The illustrative examples shown are for a specific braided composite system in which the matrix viscoplastic behavior is represented by a constitutive law described by seven parameters. The tool is general enough to fit any number of experimentally observed stress-strain responses of the material. The number of parameters to be optimized, as well as the importance given to each stress-strain response, are user choices. Three different optimization algorithms are included: (1) optimization based on the gradient method, (2) genetic algorithm (GA) based optimization, and (3) particle swarm optimization (PSO). The user can mix and match the three algorithms; for example, one can start optimization with either (2) or (3) and then use the optimized solution to fine-tune further with approach (1). The secondary focus of this paper is to demonstrate the application of this tool to optimize/calibrate parameters for a nonlinear viscoplastic matrix to predict stress-strain curves (at the constituent and composite levels) at different rates, temperatures and/or loading conditions using the Generalized Method of Cells. After preliminary validation of the tool through comparison with experimental results, a detailed virtual parametric study is
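
    Of the three algorithms, particle swarm optimization is the simplest to sketch. The two-parameter saturating curve below is a stand-in for the seven-parameter viscoplastic law, and the bounds and "measured" data are synthetic.

```python
import math
import random

def model(params, strain):
    # illustrative two-parameter saturating stress-strain curve standing
    # in for the seven-parameter viscoplastic constitutive law
    a, b = params
    return [a * (1.0 - math.exp(-b * e)) for e in strain]

def sse(params, strain, stress):
    # sum of squared errors between model prediction and measurements
    return sum((m - s) ** 2 for m, s in zip(model(params, strain), stress))

def pso(objective, bounds, n_particles=20, iters=150, seed=1):
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pval = [objective(p) for p in pos]
    g = min(range(n_particles), key=pval.__getitem__)
    gbest, gval = pbest[g][:], pval[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                # keep particles inside the physically plausible bounds
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]),
                                bounds[d][1])
            v = objective(pos[i])
            if v < pval[i]:
                pbest[i], pval[i] = pos[i][:], v
                if v < gval:
                    gbest, gval = pos[i][:], v
    return gbest, gval

# synthetic "measured" stress-strain points generated from known parameters
strain = [i * 0.005 for i in range(11)]
stress = model((200.0, 30.0), strain)
fitted, err = pso(lambda p: sse(p, strain, stress), [(50.0, 400.0), (1.0, 60.0)])
```

    Because the data were generated from known parameters, the swarm should recover values near (200, 30); weighting several stress-strain curves, as the tool allows, amounts to summing their individual error terms in the objective.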

  4. Development of a codon optimization strategy using the efor RED reporter gene as a test case

    Science.gov (United States)

    Yip, Chee-Hoo; Yarkoni, Orr; Ajioka, James; Wan, Kiew-Lian; Nathan, Sheila

    2018-04-01

    Synthetic biology is a platform that enables high-level synthesis of useful products such as pharmaceutically related drugs, bioplastics and green fuels from synthetic DNA constructs. Large-scale expression of these products can be achieved in an industrial compliant host such as Escherichia coli. To maximise the production of recombinant proteins in a heterologous host, the genes of interest are usually codon optimized based on the codon usage of the host. However, the bioinformatics freeware available for standard codon optimization might not be ideal in determining the best sequence for the synthesis of synthetic DNA. Synthesis of incorrect sequences can prove to be a costly error and to avoid this, a codon optimization strategy was developed based on the E. coli codon usage using the efor RED reporter gene as a test case. This strategy replaces codons encoding for serine, leucine, proline and threonine with the most frequently used codons in E. coli. Furthermore, codons encoding for valine and glycine are substituted with the second highly used codons in E. coli. Both the optimized and original efor RED genes were ligated to the pJS209 plasmid backbone using Gibson Assembly and the recombinant DNAs were transformed into E. coli E. cloni 10G strain. The fluorescence intensity per cell density of the optimized sequence was improved by 20% compared to the original sequence. Hence, the developed codon optimization strategy is proposed when designing an optimal sequence for heterologous protein production in E. coli.
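
    The substitution rule described above can be sketched as a codon-table lookup. The codon families are standard, but the "preferred" codons below are illustrative picks from published E. coli usage tables (most frequent for Ser/Leu/Pro/Thr, second most frequent for Val/Gly), not necessarily the exact table used in the study.

```python
# codon families for the six residues targeted by the strategy; the
# "preferred" picks are illustrative assumptions, not the paper's table
FAMILIES = {
    "Ser": ({"TCT", "TCC", "TCA", "TCG", "AGT", "AGC"}, "AGC"),
    "Leu": ({"TTA", "TTG", "CTT", "CTC", "CTA", "CTG"}, "CTG"),
    "Pro": ({"CCT", "CCC", "CCA", "CCG"}, "CCG"),
    "Thr": ({"ACT", "ACC", "ACA", "ACG"}, "ACC"),
    "Val": ({"GTT", "GTC", "GTA", "GTG"}, "GTT"),
    "Gly": ({"GGT", "GGC", "GGA", "GGG"}, "GGT"),
}

def optimize_codons(seq):
    # walk the coding sequence codon by codon (assumes the length is a
    # multiple of three) and swap targeted codons for the preferred one
    out = []
    for i in range(0, len(seq) - 2, 3):
        codon = seq[i:i + 3]
        for codons, preferred in FAMILIES.values():
            if codon in codons:
                codon = preferred
                break
        out.append(codon)
    return "".join(out)
```

    Codons outside the six targeted families pass through unchanged, so the encoded protein sequence is preserved while the targeted positions shift toward the host's codon bias.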

  5. Development of a standard methodology for optimizing remote visual display for nuclear maintenance tasks

    Science.gov (United States)

    Clarke, M. M.; Garin, J.; Preston-Anderson, A.

    A fuel recycle facility being designed at Oak Ridge National Laboratory involves the Remotex concept: advanced servo-controlled master/slave manipulators, with remote television viewing, will totally replace direct human contact with the radioactive environment. The design of optimal viewing conditions is a critical component of the overall man/machine system. A methodology was developed for optimizing remote visual displays for nuclear maintenance tasks. The usefulness of this approach was demonstrated by preliminary specification of optimal closed circuit TV systems for such tasks.

  6. Multidisciplinary Design, Analysis, and Optimization Tool Development Using a Genetic Algorithm

    Science.gov (United States)

    Pak, Chan-gi; Li, Wesley

    2009-01-01

    Multidisciplinary design, analysis, and optimization using a genetic algorithm is being developed at the National Aeronautics and Space Administration Dryden Flight Research Center (Edwards, California) to automate the analysis and design process by leveraging existing tools, enabling true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic, and hypersonic aircraft. This is a promising technology, but it faces many challenges in large-scale, real-world application. This report describes current approaches, recent results, and challenges for multidisciplinary design, analysis, and optimization as demonstrated by experience with the Ikhana fire pod design.
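
    A bare-bones genetic algorithm of the kind wrapped by such frameworks can be sketched in a few lines; the quadratic objective below is a stand-in for a real multidisciplinary analysis, and all parameter choices are illustrative.

```python
import random

def genetic_minimise(objective, bounds, pop_size=30, gens=60, seed=2):
    # minimal GA: elitist truncation selection, blend crossover and
    # Gaussian mutation; MDAO frameworks wrap a far richer loop like this
    # around discipline analyses such as structures and aerodynamics
    rng = random.Random(seed)

    def clamp(x, d):
        lo, hi = bounds[d]
        return min(max(x, lo), hi)

    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=objective)
        elite = scored[: pop_size // 2]          # survivors
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            w = rng.random()
            child = [clamp(w * x + (1 - w) * y + rng.gauss(0.0, 0.05), d)
                     for d, (x, y) in enumerate(zip(a, b))]
            children.append(child)
        pop = elite + children
    return min(pop, key=objective)

# toy objective: squared distance from a hypothetical design target
best = genetic_minimise(lambda v: (v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2,
                        [(-5.0, 5.0), (-5.0, 5.0)])
```

    The appeal for preliminary design is that the loop never needs gradients: each discipline code is treated as a black box that scores candidate designs.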

  7. Development of a standard methodology for optimizing remote visual display for nuclear-maintenance tasks

    International Nuclear Information System (INIS)

    Clarke, M.M.; Garin, J.; Preston-Anderson, A.

    1981-01-01

    The aim of the present study is to develop a methodology for optimizing remote viewing systems for a fuel recycle facility (HEF) being designed at Oak Ridge National Laboratory (ORNL). An important feature of this design involves the Remotex concept: advanced servo-controlled master/slave manipulators, with remote television viewing, will totally replace direct human contact with the radioactive environment. Therefore, the design of optimal viewing conditions is a critical component of the overall man/machine system. A methodology has been developed for optimizing remote visual displays for nuclear maintenance tasks. The usefulness of this approach has been demonstrated by preliminary specification of optimal closed circuit TV systems for such tasks.

  8. Development of a Pattern Recognition Methodology for Determining Operationally Optimal Heat Balance Instrumentation Calibration Schedules

    Energy Technology Data Exchange (ETDEWEB)

    Kurt Beran; John Christenson; Dragos Nica; Kenny Gross

    2002-12-15

    The goal of the project is to enable plant operators to detect, with high sensitivity and reliability, the onset of decalibration drifts in all of the instrumentation used as input to the reactor heat balance calculations. To achieve this objective, the collaborators developed and implemented at DBNPS an extension of the Multivariate State Estimation Technique (MSET) pattern recognition methodology pioneered by ANL. The extension was implemented during the second phase of the project and fully achieved the project goal.

  9. Effects of the input polarization on JET polarimeter horizontal channels

    International Nuclear Information System (INIS)

    Gaudio, P.; Gelfusa, M.; Murari, A.; Orsitto, F.; Boboc, A.

    2013-01-01

    In the past, the analysis of JET polarimetry measurements was carried out only for the vertical channels, using a polarimetry propagation code based on the Stokes vector formalism [1,2]. A new propagation code has therefore been developed for the horizontal chords to simulate and interpret the measurements of the Faraday rotation and Cotton–Mouton phase shift in JET. The code has been used in a theoretical study of the effect of the input polarization on the eventual quality of the measurements. The results allow the best polarization to be chosen to optimize the polarimetric measurements for the various experiments.

  10. A theoretical cost optimization model of reused flowback distribution network of regional shale gas development

    International Nuclear Information System (INIS)

    Li, Huajiao; An, Haizhong; Fang, Wei; Jiang, Meng

    2017-01-01

    The logistical issues surrounding the timing and transport of flowback from each shale gas well to the next are a big challenge. Because more and more flowback is stored temporarily near the shale gas well and reused in shale gas development, both transportation cost and storage cost are a heavy burden for developers. This research proposes a theoretical cost optimization model to find the optimal flowback distribution solution for multiple regional shale gas wells from a holistic perspective. We then conducted an empirical study using data from the Marcellus Shale. In addition, we compared the optimal flowback distribution solution that considers both transportation cost and storage cost with the solutions that minimize only the transportation cost or only the storage cost. - Highlights: • A theoretical cost optimization model to obtain the optimal flowback distribution solution. • An empirical study using shale gas data from Bradford County in the Marcellus Shale. • Visualization of optimal flowback distribution solutions under different scenarios. • Transportation cost is the more important factor for reducing the total cost. • Helps developers cut the storage and transportation costs of reusing flowback.
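
    The per-batch trade-off at the core of the model can be caricatured as a choice between trucking flowback to a reusing well and storing it on site. All rates and distances below are hypothetical, and the paper optimises this jointly over all wells and periods rather than batch by batch.

```python
def cheapest_destination(volume, distances, transport_rate,
                         storage_days, storage_rate):
    # hypothetical unit costs: trucking scales with volume and distance,
    # on-site storage with volume and holding time
    options = {"well_%d" % i: volume * d * transport_rate
               for i, d in enumerate(distances)}
    options["store_on_site"] = volume * storage_days * storage_rate
    return min(options, key=options.get)

# a 1000-unit batch with three candidate reuse wells at 5, 2 and 9
# distance units away
choice = cheapest_destination(1000, [5.0, 2.0, 9.0],
                              transport_rate=0.1, storage_days=30,
                              storage_rate=0.02)
```

    With these invented rates the nearby well wins, but once every reuse well is far enough away, on-site storage becomes the cheaper option, which is the tension the joint optimization resolves system-wide.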

  11. Internal combustion engine report: Spark ignited ICE GenSet optimization and novel concept development

    Energy Technology Data Exchange (ETDEWEB)

    Keller, J.; Blarigan, P. Van [Sandia National Labs., Livermore, CA (United States)

    1998-08-01

    In this manuscript the authors report on two projects each of which the goal is to produce cost effective hydrogen utilization technologies. These projects are: (1) the development of an electrical generation system using a conventional four-stroke spark-ignited internal combustion engine generator combination (SI-GenSet) optimized for maximum efficiency and minimum emissions, and (2) the development of a novel internal combustion engine concept. The SI-GenSet will be optimized to run on either hydrogen or hydrogen-blends. The novel concept seeks to develop an engine that optimizes the Otto cycle in a free piston configuration while minimizing all emissions. To this end the authors are developing a rapid combustion homogeneous charge compression ignition (HCCI) engine using a linear alternator for both power take-off and engine control. Targeted applications include stationary electrical power generation, stationary shaft power generation, hybrid vehicles, and nearly any other application now being accomplished with internal combustion engines.

  12. To an optimal electricity supply system. Possible bottlenecks in the development to an optimal electricity supply system in northwest Europe

    International Nuclear Information System (INIS)

    Van Werven, M.J.N.; De Joode, J.; Scheepers, M.J.J.

    2006-02-01

    It is uncertain how the electricity system in Europe, and in particular northwest Europe and the Netherlands, will develop over the next fifteen years. The main objective of this report is to identify possible bottlenecks that may hamper the northwest European electricity system from developing into an optimal system in the long term (until 2020). Subsequently, based on the identified bottlenecks, the report attempts to indicate relevant market responses and policy options. To identify possible bottlenecks in the development towards an optimal electricity system, an analytical framework has been set up with the aim of identifying possible (future) problems in a structured way. The segments generation, network, demand, balancing, and policy and regulation are analysed, as well as the interactions between these segments. Each identified bottleneck is assessed against the criteria of reliability, sustainability and affordability. Three bottlenecks are analysed in more detail: (1) The increasing penetration of distributed generation (DG) and its interaction with the electricity network. Dutch policy could be aimed at: (a) Gaining more insight into the costs and benefits that result from the increasing penetration of DG; (b) Creating possibilities for DSOs to experiment with innovative (network management) concepts; (c) Introducing locational signals; and (d) Further analysing the possibility of ownership unbundling. (2) The problem of intermittency and its implications for balancing the electricity system. Dutch policy could be aimed at: (a) Creating the environment in which the market is able to respond in an efficient way; (b) Monitoring market responses; (c) Market coupling; and (d) Discussing the timing of the gate closure. (3) Interconnection and congestion issues in combination with generation. Dutch policy could be aimed at: (a) Using the existing interconnection capacity as efficiently as possible; (b) Identifying the causes behind price differences; and (c) Harmonising market

  13. Development and testing of VTT approach to risk-informed in-service inspection methodology. Final report of SAFIR INTELI INPUT Project RI-ISI

    International Nuclear Information System (INIS)

    Cronvall, O.; Maennistoe, I.; Simola, K.

    2007-04-01

    This report summarises the results of a research project on risk-informed in-service inspection (RI-ISI) methodology conducted within the Finnish national nuclear energy research programme SAFIR (2003-2006). The purpose of this work was to increase the accuracy of risk estimates used in RI-ISI analyses of nuclear power plant (NPP) piping systems, and to quantitatively evaluate the effects of different piping inspection strategies on risk. Piping failure occurrences were sampled using probabilistic fracture mechanics (PFM) analyses. The PFM results for crack growth were used to construct transition matrices for a discrete-time Markov process model, which in turn was applied to examine the effects of various inspection strategies on the failure probabilities and risks. The applicability of the developed quantitative risk matrix approach was examined in a pilot study performed on the shutdown cooling piping system 321 in NPP unit OL1 of Teollisuuden Voima Oy (TVO). The analysed degradation mechanisms were stress corrosion cracking (SCC) and thermal fatigue induced cracking (in the mixing points). A new and rather straightforward approach was developed to model thermal fatigue induced cracking, a degradation mechanism that is much more difficult to model than SCC. This study further demonstrated the usefulness of the Markov analysis procedure developed by VTT in RI-ISI applications. The most important results are the quantified comparisons of different inspection strategies. It was shown in this study that Markov models are useful for this purpose when combined with PFM analyses. While the numerical results could benefit from further consideration of inspection reliability, this does not affect the feasibility of the method itself. The approach can be used to identify an optimal inspection strategy for achieving a balanced risk profile across piping segments. (orig.)
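
    The discrete-time Markov treatment of inspections can be sketched with a three-state chain. The yearly transition probabilities and the 80% detection-and-repair probability below are invented for illustration, not the PFM-derived values from the pilot study.

```python
def step(dist, P):
    # one transition of a discrete-time Markov chain: dist (row vector)
    # multiplied by the transition matrix P
    n = len(dist)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# states: 0 = intact, 1 = cracked, 2 = failed (absorbing); yearly
# crack-growth probabilities are illustrative assumptions
GROWTH = [[0.98, 0.02, 0.00],
          [0.00, 0.95, 0.05],
          [0.00, 0.00, 1.00]]

# an inspection finds and repairs an existing crack with probability 0.8
INSPECT = [[1.0, 0.0, 0.0],
           [0.8, 0.2, 0.0],
           [0.0, 0.0, 1.0]]

def failure_probability(years, inspect_every=None):
    dist = [1.0, 0.0, 0.0]                  # start fully intact
    for year in range(1, years + 1):
        dist = step(dist, GROWTH)
        if inspect_every and year % inspect_every == 0:
            dist = step(dist, INSPECT)
    return dist[2]

p_never = failure_probability(40)
p_decadal = failure_probability(40, inspect_every=10)
```

    Comparing `p_never` and `p_decadal` is the toy version of the report's quantified comparison of inspection strategies: inspections move probability mass from the cracked state back to the intact state and thereby lower the long-run failure probability.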

  14. Concept development and needs identification for INFLO : report on stakeholder input on transformative goals, performance measures and high level user needs for INFLO.

    Science.gov (United States)

    2012-04-01

    The purpose of this report is to document the stakeholder input received at the February 8, 2012, stakeholder workshop at the Hall of States in Washington, D.C. on goals, performance measures, transformative performance targets, and high-level user n...

  15. Tamoxifen-loaded lecithin organogel (LO) for topical application: Development, optimization and characterization.

    Science.gov (United States)

    Bhatia, Amit; Singh, Bhupinder; Raza, Kaisar; Wadhwa, Sheetu; Katare, Om Prakash

    2013-02-28

    Lecithin organogels (LOs) are semi-solid systems with an immobilized organic liquid phase in a 3-D network of self-assembled gelators. This paper studies the various attributes of LOs, from selection of materials and optimization of influential components to LO-specific characterization. After screening of the various components (type of gelator, organic and aqueous phase) and construction of phase diagrams, a D-optimal mixture design was employed for the systematic optimization of the LO composition. Response surface plots were constructed for the response variables, viz. viscosity, gel strength, spreadability and consistency index. The optimized LO composition was identified using overlay plots. Subsequent validation of the optimization study, employing check-point formulations located using a grid search, indicated a high degree of prognostic ability of the experimental design. The optimized formulation was characterized for morphology, drug content, rheology, spreadability, pH, phase transition temperatures, and physical and chemical stability. The outcomes of the study show a high dependence of LO attributes on the type and amount of phospholipid, Poloxamer™, auxiliary gelators and organic solvent. The optimized LO was found to be quite stable, easily applicable and biocompatible. The findings of the study can be utilized for the development of LO systems of other drugs for safer and more effective topical delivery.

  16. Optimization of capillary zone electrophoresis for charge heterogeneity testing of biopharmaceuticals using enhanced method development principles.

    Science.gov (United States)

    Moritz, Bernd; Locatelli, Valentina; Niess, Michele; Bathke, Andrea; Kiessig, Steffen; Entler, Barbara; Finkler, Christof; Wegele, Harald; Stracke, Jan

    2017-12-01

    CZE is a well-established technique for charge heterogeneity testing of biopharmaceuticals. It is based on differences in the ratio of net charge to hydrodynamic radius. In an extensive intercompany study, it was recently shown that CZE is very robust and can easily be implemented in labs that have not performed it before. However, individual characteristics of some examined proteins resulted in suboptimal resolution. Therefore, enhanced method development principles were applied here to investigate possibilities for further method optimization. For this purpose, a large number of different method parameters were evaluated with the aim of improving CZE separation. For the relevant parameters, design of experiments (DoE) models were generated and optimized in several ways for different sets of responses such as resolution, peak width and number of peaks. Despite product-specific DoE optimization, the resulting combination of optimized parameters yielded significantly improved separation for 13 out of 16 different antibodies and other molecule formats. These results clearly demonstrate the generic applicability of the optimized CZE method. Adaptation to individual molecular properties may sometimes still be required to achieve optimal separation, but the adjustable parameters discussed in this study [mainly pH, identity of the polymer additive (HPC versus HPMC) and the concentrations of additives such as acetonitrile, butanolamine and TETA] are expected to significantly reduce the effort for specific optimization. 2017 The Authors. Electrophoresis published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. DETERMINATION OF THE OPTIMAL CAPITAL INVESTMENTS TO ENSURE THE SUSTAINABLE DEVELOPMENT OF THE RAILWAY

    Directory of Open Access Journals (Sweden)

    O. I. Kharchenko

    2015-04-01

    Full Text Available Purpose. Every year more attention is paid to the theoretical and practical issues of sustainable development of railway transport, yet the mechanisms of financial support for this development remain poorly understood. Therefore, the aim of this article is to determine the optimal investment allocation to ensure sustainable development of railway transport, using the example of the State Enterprise «Prydniprovsk Railway», and to create the preconditions for developing a mathematical model. Methodology. The task of ensuring sustainable development of railway transport is solved on the basis of an integral indicator of sustainable development effectiveness and is formulated as the maximization of this criterion. Optimization of measures of a technological and technical character is proposed in order to increase the values of the components of this integral performance indicator. Optimization measures of a technological nature that enhance the performance criteria include: optimization of the number of train and shunting locomotives, optimization of the power of handling mechanisms at stations, and optimization of train flow routes. Measures of a technical nature include: modernization of railways through electrification, modernization of the running gear and coupler drawbars of rolling stock, and mechanization of separator facilities at stations to reduce noise impacts on the environment. Findings. The work resulted in an optimal allocation of investments to ensure sustainable development of railway transportation at the State Enterprise «Prydniprovsk Railway». This provides the kind of railway development under which the functioning of the State Enterprise «Prydniprovsk Railway» is characterized by the maximum value of the integral indicator of efficiency. Originality. A new approach was proposed to determine the optimal allocation of capital investments to ensure sustainable
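    The allocation problem described can be illustrated with a minimal sketch. Assuming (purely for illustration; the paper's model is not reproduced here) that each measure yields a linear gain in the integral effectiveness indicator per unit of capital and that spending is divisible, a greedy fractional-knapsack allocation is optimal:

```python
# Hypothetical sketch: allocate a fixed capital budget across candidate
# measures (electrification, rolling-stock modernization, station
# mechanization, ...) to maximize a linear integral-effectiveness gain.
# All figures and the linear-gain assumption are illustrative, not the paper's.

def allocate(budget, measures):
    """measures: list of (name, cost, gain). Greedy fractional allocation
    by gain-per-cost ratio (optimal for the linear, divisible case)."""
    plan = {}
    remaining = budget
    for name, cost, gain in sorted(measures, key=lambda m: m[2] / m[1], reverse=True):
        spend = min(cost, remaining)
        if spend <= 0:
            break
        plan[name] = spend
        remaining -= spend
    return plan

measures = [
    ("electrification", 50.0, 12.0),
    ("rolling-stock modernization", 30.0, 9.0),
    ("station mechanization", 20.0, 4.0),
]
print(allocate(80.0, measures))
```

    Under these linear assumptions the budget goes to the highest gain-per-cost measures first; the paper's actual model maximizes a more elaborate integral criterion.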

  18. Sequential development of tidal ravinement surfaces in macro- to hypertidal estuaries with high volcaniclastic input: the Miocene Puerto Madryn Formation (Patagonia, Argentina)

    Science.gov (United States)

    Scasso, Roberto A.; Cuitiño, José I.

    2017-08-01

    The late Miocene beds of the Puerto Madryn Formation (Provincia del Chubut, Argentina) are formed by shallow marine and estuarine sediments. The latter include several tidal-channel infills well exposed on the cliffy coast of the Peninsula Valdés. The Bahía Punta Fósil and Cerro Olazábal paleochannels are end members of these tidal channels and show a fining-upward infilling starting with intraformational channel lag conglomerates above deeply erosional surfaces interpreted as fluvial ravinement surfaces (the erosion surface formed in the purely fluvial or the fluvially dominated part of the estuary, where erosion is driven by fluvial processes). These are overlain and eventually truncated (and suppressed) by the tidal ravinement surface (TRS), in turn covered with high-energy, bioclastic conglomerates mostly formed in the "tidally dominated/fluvially influenced" part of an estuary. Above, large straight or arcuate point bars with alternating sandy/muddy seasonal beds and varying trace and body fossil contents were deposited from the freshwater fluvially dominated to the saline-water tidally dominated part of the estuary. The upper channel infill is formed by cross-bedded sands with mud drapes and seaward-directed paleocurrents, together with barren, volcaniclastic sandy to muddy heterolithic seasonal rhythmites, both deposited in the fluvially dominated part of the estuary. Volcanic ash delivered by the rivers after large explosive volcanic eruptions on land resulted in sedimentation rates as high as 0.9 m per year, preserving (through burial) the morphology of tidal channels and TRSs. The channel deposits were formed in a tide-dominated, macrotidal to hypertidal open estuary with well-developed TRSs resulting from strong tidal currents deeply scouring into the transgressive filling of the channels and eventually cutting the fluvial ravinement surface. The TRSs extended upstream to the inner part of the estuary during long periods of low sedimentation rates

  19. Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques

    Science.gov (United States)

    2017-11-01

    Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques, by Canh Ly, Nghia Tran, and Ozlem Kilic. Approved for public release; distribution is...

  20. Spatial Optimization of Future Urban Development with Regards to Climate Risk and Sustainability Objectives.

    Science.gov (United States)

    Caparros-Midwood, Daniel; Barr, Stuart; Dawson, Richard

    2017-11-01

    Future development in cities needs to manage increasing populations, climate-related risks, and sustainable development objectives such as reducing greenhouse gas emissions. Planners therefore face a challenge of multidimensional, spatial optimization in order to balance potential tradeoffs and maximize synergies between risks and other objectives. To address this, a spatial optimization framework has been developed. This uses a spatially implemented genetic algorithm to generate a set of Pareto-optimal results that provide planners with the best set of trade-off spatial plans for six risk and sustainability objectives: (i) minimize heat risks, (ii) minimize flooding risks, (iii) minimize transport travel costs to minimize associated emissions, (iv) maximize brownfield development, (v) minimize urban sprawl, and (vi) prevent development of greenspace. The framework is applied to Greater London (U.K.) and shown to generate spatial development strategies that are optimal for specific objectives and differ significantly from the existing development strategies. In addition, the analysis reveals tradeoffs between different risks as well as between risk and sustainability objectives. While increases in heat or flood risk can be avoided, there are no strategies that do not increase at least one of these. Tradeoffs between risk and other sustainability objectives can be more severe, for example, minimizing heat risk is only possible if future development is allowed to sprawl significantly. The results highlight the importance of spatial structure in modulating risks and other sustainability objectives. However, not all planning objectives are suited to quantified optimization and so the results should form part of an evidence base to improve the delivery of risk and sustainability management in future urban development. © 2017 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
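    The Pareto-optimal set at the heart of this framework can be sketched in a few lines. The following is an illustrative dominance filter for minimization objectives, not the authors' spatially implemented genetic algorithm:

```python
# Illustrative sketch: extract the Pareto-optimal set from candidate spatial
# plans scored on several minimization objectives, e.g. (heat risk, flood risk).
# Objective values below are made up for demonstration.

def dominates(a, b):
    """True if plan a is at least as good as b on every objective
    and strictly better on at least one (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(plans):
    return [p for p in plans
            if not any(dominates(q, p) for q in plans if q is not p)]

plans = [(3, 5), (4, 4), (5, 3), (4, 6), (6, 6)]
print(pareto_front(plans))  # the non-dominated trade-off set
```

    The non-dominated plans are exactly the trade-off strategies handed to planners; a genetic algorithm such as the one in the paper evolves toward this front rather than enumerating all candidates.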

  1. Development of an Optimizing Control Concept for Fossil-Fired Boilers using a Simulation Model

    DEFF Research Database (Denmark)

    Mortensen, J. H.; Mølbak, T.; Commisso, M.B.

    1997-01-01

    An optimizing control system for improving the load following capabilities of power plant units has been developed. The system is implemented as a complement producing additive control signals to the existing boiler control system, a concept which has various practical advantages in terms of implementation and commissioning. The optimizing control system takes into account the multivariable and nonlinear characteristics of the boiler process, as a gain-scheduled LQG-controller is utilized. For the purpose of facilitating the control concept development, a dynamic simulation model of the boiler process was developed; the benefits of using this model when designing a new control concept are discussed.

  2. Optimal gasoline tax in developing, oil-producing countries: The case of Mexico

    International Nuclear Information System (INIS)

    Antón-Sarabia, Arturo; Hernández-Trillo, Fausto

    2014-01-01

    This paper uses the methodology of Parry and Small (2005) to estimate the optimal gasoline tax for a less-developed, oil-producing country. The relevance of the estimation rests on the differences between less-developed countries (LDCs) and industrial countries. We argue that lawless roads, general subsidies on gasoline, poor mass transportation systems, older vehicle fleets and unregulated city growth make the tax rates in LDCs differ substantially from the rates in the developed world. We find that the optimal gasoline tax is $1.90 per gallon at 2011 prices and show that the estimated differences are in line with the factors hypothesized. In contrast to the existing literature on industrial countries, we show that the relative gasoline tax incidence may be progressive in Mexico and, more generally, in LDCs. - Highlights: • We estimate the optimal gasoline tax for a typical less-developed, oil-producing country like Mexico. • The relevance of the estimation relies on the differences between less-developed and industrial countries. • The optimal gasoline tax is $1.90 per gallon at 2011 prices. • Distance-related pollution damages, accident costs and gas subsidies account for the major differences. • Gasoline tax incidence may be progressive in less developed countries
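    The structure of a Parry-Small-style estimate can be sketched as follows. A central ingredient is the adjusted Pigouvian term, in which distance-related externalities are scaled by the share of the fuel-demand response that comes from reduced mileage rather than improved fuel economy; all numbers below are illustrative placeholders, not the paper's estimates:

```python
# Back-of-envelope sketch of the adjusted Pigouvian term in a
# Parry-Small-style optimal fuel tax. Distance-related externalities
# (congestion, accidents) only respond to the tax insofar as drivers cut
# miles, so they are scaled by the mileage share of the demand response.
# All figures below are illustrative, not estimates from the paper.

def adjusted_pigouvian_tax(fuel_ext, distance_ext_per_gal, mileage_share):
    """fuel_ext: per-gallon externality tied to fuel burned (e.g. CO2).
    distance_ext_per_gal: per-gallon value of distance-related externalities.
    mileage_share: fraction of the fuel-demand elasticity due to reduced miles."""
    return fuel_ext + mileage_share * distance_ext_per_gal

tax = adjusted_pigouvian_tax(fuel_ext=0.35, distance_ext_per_gal=3.10, mileage_share=0.4)
print(round(tax, 2))  # 1.59 $/gallon under these illustrative inputs
```

    The full Parry-Small framework adds a Ramsey (revenue-raising) component and country-specific adjustments, which is where the LDC-specific factors listed above enter.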

  3. Report on planning of input earthquake vibration for design of vibration controlling structure, in the Tokai Works, Power Reactor and Nuclear Fuel Development Corporation

    International Nuclear Information System (INIS)

    Uryu, Mitsuru; Shinohara, Takaharu; Terada, Shuji; Yamazaki, Toshihiko; Nakayama, Kazuhiko; Kondo, Toshinari; Hosoya, Hisashi

    1997-05-01

    When adopting a vibration-controlling structure for a nuclear facility building, it is necessary to properly evaluate somewhat longer-period earthquake ground motion. Although various evaluation methods have been proposed, none is definitive. Regarding the input earthquake motion itself, factors such as the effect of surface waves and distant great earthquakes must be considered, and further evaluations and investigations are required. This report presents an evaluation method for the input earthquake vibration used in the vibration-controlling design adopted for the buildings of the reprocessing facility in the Tokai Works, which employ vibration control devices consisting of laminated rubber bearings and lead dampers. The input earthquake vibration for vibration-controlling design shown in this report is to be adopted for vibration-controlled facility buildings in the Tokai Works. (G.K.)

  4. Realizing an Optimization Approach Inspired from Piaget’s Theory on Cognitive Development

    Directory of Open Access Journals (Sweden)

    Utku Kose

    2015-09-01

    Full Text Available The objective of this paper is to introduce an artificial-intelligence-based optimization approach inspired by Piaget’s theory of cognitive development. The approach has been designed according to the essential processes that an individual may experience while learning something new or improving his or her knowledge. These processes are associated with Piaget’s ideas on an individual’s cognitive development. The approach expressed in this paper is a simple algorithm employing swarm-intelligence-oriented tasks to solve single-objective optimization problems. To evaluate the effectiveness of this early version of the algorithm, tests were performed on several benchmark functions. The obtained results show that the approach / algorithm can be an alternative to those in the literature for single-objective optimization. The authors have suggested the name Cognitive Development Optimization Algorithm (CoDOA) for the related intelligent optimization approach.
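    The flavor of such a swarm-intelligence-oriented, single-objective optimizer can be conveyed with a minimal sketch on a classic benchmark function (a generic illustration, not the CoDOA algorithm itself):

```python
# Minimal swarm-style single-objective optimizer, in the spirit of the
# benchmark evaluation described above (a generic sketch, not CoDOA itself).
import random

def sphere(x):  # classic benchmark: global minimum 0 at the origin
    return sum(v * v for v in x)

def swarm_minimize(f, dim=5, agents=30, iters=200, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(agents)]
    best = min(pop, key=f)
    for t in range(iters):
        step = 0.9 * (1 - t / iters)  # shrink exploration over time
        for i, x in enumerate(pop):
            # move toward the swarm's best solution, plus decaying noise;
            # accept the candidate only if it improves the agent
            cand = [xi + step * (bi - xi) + rng.gauss(0, step)
                    for xi, bi in zip(x, best)]
            if f(cand) < f(x):
                pop[i] = cand
        best = min(pop, key=f)
    return best

best = swarm_minimize(sphere)
print(sphere(best))  # small residual near the optimum
```

    Like the algorithm in the paper, this is population-based and derivative-free; the specific update rule here is only a stand-in for the Piaget-inspired operators.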

  5. A dynamic optimization on economic energy efficiency in development: A numerical case of China

    International Nuclear Information System (INIS)

    Wang, Dong

    2014-01-01

    This paper uses dynamic optimization methodology to investigate economic energy efficiency issues in developing countries. The paper introduces definitions of energy efficiency in both economics and physics, and establishes a quantitative way of measuring economic energy efficiency. The linkage between economic energy efficiency, energy consumption and other macroeconomic variables is first demonstrated. Using dynamic optimization, a problem of maximizing economic energy efficiency over time, subject to the extended Solow growth model and an instantaneous investment rate, is modelled. In this model, energy consumption is set as the control variable and capital is regarded as the state variable. Analytic solutions can be derived, and the diagrammatic analysis yields a saddle-point equilibrium. A numerical simulation based on China is also presented, and the optimal paths of investment and energy consumption are drawn. The dynamic optimization encourages governments in developing countries to pursue higher economic energy efficiency by controlling energy consumption and regulating the investment state, since this can conserve energy without hindering the achievement of the steady state of the Solow model. If so, sustainable development will be achieved. - Highlights: • A new definition of economic energy efficiency is proposed mathematically. • A dynamic optimization model links economic energy efficiency with other macroeconomic variables in the long run. • Economic energy efficiency is determined by the capital stock level and energy consumption. • Energy saving is a key solution for improving economic energy efficiency
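    The optimal-control structure described above can be sketched in standard current-value Hamiltonian form (symbols are illustrative: η is the economic energy efficiency index, k capital, E energy consumption, s the saving rate, δ depreciation, ρ the discount rate; the paper's exact functional forms are not reproduced here):

```latex
\max_{E(t)} \int_0^{\infty} e^{-\rho t}\,\eta\bigl(k(t),E(t)\bigr)\,dt
\quad \text{s.t.} \quad \dot{k} = s\,f(k,E) - \delta k,
\qquad
H = \eta(k,E) + \lambda\bigl[s\,f(k,E) - \delta k\bigr],
\qquad
\frac{\partial H}{\partial E} = 0,
\qquad
\dot{\lambda} = \rho\lambda - \frac{\partial H}{\partial k}.
```

    The first-order condition balances the immediate efficiency gain from energy use against its effect on capital accumulation; the saddle-point equilibrium mentioned in the abstract corresponds to the stable arm of the resulting (k, λ) dynamics.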

  6. Statistical identification of effective input variables

    International Nuclear Information System (INIS)

    Vaurio, J.K.

    1982-09-01

    A statistical sensitivity analysis procedure has been developed for ranking the input data of large computer codes in order of sensitivity-importance. The method is economical for large codes with many input variables, since it uses a relatively small number of computer runs. No prior judgemental elimination of input variables is needed. The screening method is based on stagewise correlation and extensive regression analysis of output values calculated with selected input value combinations. The regression process deals with multivariate nonlinear functions, and statistical tests are also available for identifying input variables that contribute to threshold effects, i.e., discontinuities in the output variables. A computer code SCREEN has been developed for implementing the screening techniques. Its efficiency has been demonstrated by several examples, and it has been applied to a fast reactor safety analysis code (Venus-II). However, the methods and the coding are general and not limited to such applications
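    The first, correlation-based stage of such a screening can be sketched in a few lines of pure Python (illustrative only; SCREEN's stagewise regression analysis and threshold tests go well beyond this):

```python
# Sketch: rank candidate inputs by the magnitude of their correlation with
# the code output, using a modest number of sampled runs. The synthetic
# "code" below depends strongly on input 0 and moderately on input 2.
import random, math

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

rng = random.Random(0)
runs = [[rng.uniform(-1, 1) for _ in range(4)] for _ in range(300)]
output = [3.0 * r[0] + 1.0 * r[2] + rng.gauss(0, 0.05) for r in runs]

scores = [abs(pearson([r[i] for r in runs], output)) for i in range(4)]
ranking = sorted(range(4), key=lambda i: -scores[i])
print(ranking)  # input 0 first, then input 2
```

    Ranking by |correlation| captures linear influence only; the stagewise regression in the actual procedure also handles nonlinear and threshold contributions.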

  7. Brain Emotional Learning Based Intelligent Decoupler for Nonlinear Multi-Input Multi-Output Distillation Columns

    Directory of Open Access Journals (Sweden)

    M. H. El-Saify

    2017-01-01

    Full Text Available The distillation process is vital in many fields of the chemical industries, such as the two coupled distillation columns that are usually highly nonlinear Multi-Input Multi-Output (MIMO) coupled processes. The control of a MIMO process is usually implemented via a decentralized approach using a set of Single-Input Single-Output (SISO) loop controllers. Decoupling the MIMO process into a group of single loops requires proper input-output pairing and development of a decoupling compensator unit. This paper proposes a novel intelligent decoupling approach for MIMO processes based on a new MIMO brain emotional learning architecture. A MIMO architecture of the Brain Emotional Learning Based Intelligent Controller (BELBIC) is developed and applied as a decoupler for a 4-input/4-output highly nonlinear coupled distillation columns process. Moreover, the performance of the proposed Brain Emotional Learning Based Intelligent Decoupler (BELBID) is enhanced using the Particle Swarm Optimization (PSO) technique. The performance is compared with a PSO-optimized steady-state decoupling compensation matrix. Mathematical models of the distillation columns and the decouplers are built and tested in a simulation environment by applying the same inputs. The results prove the remarkable success of the BELBID in minimizing loop interactions without degrading the output paired with each input.
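    For contrast, the classical steady-state decoupling compensator that the BELBID is compared against can be sketched directly: premultiplying the plant by the inverse of its DC gain matrix G(0) makes each loop see an (ideally) diagonal plant at steady state. The 2x2 gain matrix below is illustrative, not the columns' actual model:

```python
# Sketch of a steady-state decoupling compensation matrix: D = G(0)^-1,
# so that G(0) @ D is the identity and each SISO loop acts on "its own"
# output at DC. The gain matrix below is illustrative only.

def decoupler_2x2(G):
    """Return D = G^-1 for a 2x2 steady-state gain matrix G."""
    (a, b), (c, d) = G
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def matmul_2x2(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

G0 = [[2.0, 0.5], [0.4, 1.5]]      # coupled DC gains (illustrative)
D = decoupler_2x2(G0)
print(matmul_2x2(G0, D))           # ~identity: loops are decoupled at DC
```

    Such a static decoupler is exact only at steady state; dynamic coupling at other frequencies is what motivates learning-based decouplers like the one proposed.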

  8. Experimental design and multiple response optimization. Using the desirability function in analytical methods development.

    Science.gov (United States)

    Candioti, Luciana Vera; De Zan, María M; Cámara, María S; Goicoechea, Héctor C

    2014-06-01

    A review about the application of response surface methodology (RSM) when several responses have to be simultaneously optimized in the field of analytical methods development is presented. Several critical issues like response transformation, multiple response optimization and modeling with least squares and artificial neural networks are discussed. Most recent analytical applications are presented in the context of analytical methods development, especially in multiple response optimization procedures using the desirability function. Copyright © 2014 Elsevier B.V. All rights reserved.
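    The desirability approach itself is compact enough to sketch. Each response is mapped onto [0, 1] by a Derringer-type function, and the overall desirability is the geometric mean, which collapses to zero if any single response is unacceptable; the targets below are illustrative:

```python
# Sketch of a Derringer-type desirability function as discussed in the
# review. Targets, bounds and responses below are illustrative only.

def d_maximize(y, low, high, s=1.0):
    """1 above `high`, 0 below `low`, power ramp in between."""
    if y <= low:
        return 0.0
    if y >= high:
        return 1.0
    return ((y - low) / (high - low)) ** s

def overall_desirability(ds):
    """Geometric mean of individual desirabilities."""
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

# Two responses to maximize, e.g. resolution and recovery (illustrative):
d1 = d_maximize(1.8, low=1.0, high=2.0)    # 0.8
d2 = d_maximize(95.0, low=80.0, high=98.0)
print(round(overall_desirability([d1, d2]), 3))  # 0.816
```

    In a full RSM workflow, this scalar is evaluated on the fitted response surfaces and maximized over the factor space to locate the compromise optimum.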

  9. Developing a computationally efficient dynamic multilevel hybrid optimization scheme using multifidelity model interactions.

    Energy Technology Data Exchange (ETDEWEB)

    Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Gray, Genetha Anne (Sandia National Laboratories, Livermore, CA); Castro, Joseph Pete Jr. (; .); Giunta, Anthony Andrew

    2006-01-01

    Many engineering application problems use optimization algorithms in conjunction with numerical simulators to search for solutions. The formulation of the relevant objective functions and constraints dictates the possible optimization algorithms. Often, a gradient-based approach is not possible since objective functions and constraints can be nonlinear, nonconvex, non-differentiable, or even discontinuous, and the simulations involved can be computationally expensive. Moreover, computational efficiency and accuracy are desirable and also influence the choice of solution method. With the advent and increasing availability of massively parallel computers, computational speed has increased tremendously. Unfortunately, the numerical and model complexities of many problems still demand significant computational resources. Moreover, in optimization, these expenses can be a limiting factor since obtaining solutions often requires the completion of numerous computationally intensive simulations. Therefore, we propose a multifidelity optimization algorithm (MFO) designed to improve the computational efficiency of an optimization method for a wide range of applications. In developing the MFO algorithm, we take advantage of the interactions between multifidelity models to develop a dynamic, computation-saving optimization algorithm. First, a direct search method is applied to the high-fidelity model over a reduced design space. In conjunction with this search, a specialized oracle is employed to map the design space of this high-fidelity model to that of a computationally cheaper low-fidelity model using space mapping techniques. Then, in the low-fidelity space, an optimum is obtained using gradient or non-gradient based optimization, and it is mapped back to the high-fidelity space. In this paper, we describe the theory and implementation details of our MFO algorithm.
We also demonstrate our MFO method on some example problems and on two applications: earth penetrators and

  10. Model development and optimization of operating conditions to maximize PEMFC performance by response surface methodology

    International Nuclear Information System (INIS)

    Kanani, Homayoon; Shams, Mehrzad; Hasheminasab, Mohammadreza; Bozorgnezhad, Ali

    2015-01-01

    Highlights: • The optimization of the operating parameters in a serpentine PEMFC is done using RSM. • The RSM model can predict the cell power over the wide range of operating conditions. • St-An, St-Ca and RH-Ca have an optimum value to obtain the best performance. • The interactions of the operating conditions affect the output power significantly. • The cathode and anode stoichiometry are the most effective parameters on the power. - Abstract: Optimization of operating conditions to obtain maximum power in PEMFCs could play a significant role in reducing the costs of this emerging technology. In the present experimental study, a single serpentine PEMFC is used to investigate the effects of operating conditions on the electrical power production of the cell. Four significant parameters, including cathode stoichiometry, anode stoichiometry, gas inlet temperature, and cathode relative humidity, are studied using Design of Experiments (DOE) to obtain an optimal power. Central composite second-order Response Surface Methodology (RSM) is used to model the relationship between the goal function (power) and the considered input parameters (operating conditions). This statistical-mathematical method leads to a second-order equation for the cell power. The model accounts for interactions and quadratic effects of different operating conditions and predicts the maximum or minimum power production over the entire working range of the parameters. In this range, high cathode stoichiometry and low anode stoichiometry result in minimum cell power, while, conversely, the medium range of fuel and oxidant stoichiometry leads to maximum power. Results show that there is an optimum value for the anode stoichiometry, cathode stoichiometry and relative humidity to reach the best performance. The predictions of the model are evaluated by experimental tests and are in good agreement over different ranges of the parameters
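    The core of the RSM step, fitting a second-order model and locating its stationary point, can be sketched for a single factor (data points are illustrative, not the paper's measurements):

```python
# Sketch: fit a second-order response surface in one factor (e.g. cell
# power vs. cathode stoichiometry) by least squares, then locate the
# stationary point analytically. Data below are illustrative only.

def gauss_solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(n):
            if r != i:
                f = M[r][i] / M[i][i]
                M[r] = [a - f * c for a, c in zip(M[r], M[i])]
    return [M[i][n] / M[i][i] for i in range(n)]

def fit_quadratic(xs, ys):
    # normal equations for y = b0 + b1*x + b2*x^2
    cols = [[1.0, x, x * x] for x in xs]
    A = [[sum(c[i] * c[j] for c in cols) for j in range(3)] for i in range(3)]
    b = [sum(c[i] * y for c, y in zip(cols, ys)) for i in range(3)]
    return gauss_solve(A, b)

xs = [1.0, 1.5, 2.0, 2.5, 3.0]           # stoichiometry settings
ys = [10.0, 14.0, 15.0, 14.0, 10.0]      # measured power (illustrative)
b0, b1, b2 = fit_quadratic(xs, ys)
x_opt = -b1 / (2 * b2)                   # stationary point of the surface
print(round(x_opt, 2))                   # 2.0: peak of the symmetric data
```

    The central composite design in the paper does the same in four factors, adding interaction terms; the stationary point then comes from solving the gradient system of the fitted quadratic.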

  11. Development of a biorefinery optimized biofuel supply curve for the western United States

    Science.gov (United States)

    Nathan Parker; Peter Tittmann; Quinn Hart; Richard Nelson; Ken Skog; Anneliese Schmidt; Edward Gray; Bryan Jenkins

    2010-01-01

    A resource assessment and biorefinery siting optimization model was developed and implemented to assess potential biofuel supply across the Western United States from agricultural, forest, urban, and energy crop biomass. Spatial information including feedstock resources, existing and potential refinery locations and a transportation network model is provided to a mixed...

  12. How to develop a customer satisfaction scale with optimal construct validity

    NARCIS (Netherlands)

    Terpstra, M.J.; Kuijlen, A.A.A.; Sijtsma, K.

    2014-01-01

    In this article, we investigate how to construct a customer satisfaction (CS) scale which yields optimally valid measurements of the construct of interest. For this purpose we compare three alternative methodologies for scale development and construct validation. Furthermore, we discuss a

  13. Development of a solar-powered residential air conditioner: System optimization preliminary specification

    Science.gov (United States)

    Rousseau, J.; Hwang, K. C.

    1975-01-01

    Investigations aimed at the optimization of a baseline Rankine cycle solar powered air conditioner and the development of a preliminary system specification were conducted. Efforts encompassed the following: (1) investigations of the use of recuperators/regenerators to enhance the performance of the baseline system, (2) development of an off-design computer program for system performance prediction, (3) optimization of the turbocompressor design to cover a broad range of conditions and permit operation at low heat source water temperatures, (4) generation of parametric data describing system performance (COP and capacity), (5) development and evaluation of candidate system augmentation concepts and selection of the optimum approach, (6) generation of auxiliary power requirement data, (7) development of a complete solar collector-thermal storage-air conditioner computer program, (8) evaluation of the baseline Rankine air conditioner over a five day period simulating the NASA solar house operation, and (9) evaluation of the air conditioner as a heat pump.

  14. Development & optimization of a rule-based energy management strategy for fuel economy improvement in hybrid electric vehicles

    Science.gov (United States)

    Asfoor, Mostafa

    The gradual decline of oil reserves and the increasing demand for energy over the past decades have resulted in automotive manufacturers seeking alternative solutions to reduce the dependency on fossil-based fuels for transportation. A viable technology that enables significant improvements in the overall energy conversion efficiency is the hybridization of conventional vehicle drive systems. This dissertation builds on prior hybrid powertrain development at the University of Idaho. Advanced vehicle models of a passenger car with a conventional powertrain and three different hybrid powertrain layouts were created using GT-Suite. These powertrain models were validated against a variety of standard driving cycles. The overall fuel economy, energy consumption, and losses were monitored, and a comprehensive energy analysis was performed to compare energy sources and sinks. The GT-Suite model was then used to predict the Formula Hybrid SAE vehicle performance. Inputs to this model were a numerically predicted engine performance map, an electric motor torque curve, vehicle geometry, and road load parameters derived from a roll-down test. In this case study, the vehicle had a supervisory controller that followed a rule-based energy management strategy to ensure a proper power split during hybrid mode operation. The supervisory controller parameters were optimized using a discrete grid optimization method that minimized the total amount of fuel consumed during a specific urban driving cycle with an average speed of approximately 30 [mph]. More than a 15% increase in fuel economy was achieved by adding supervisory control and managing the power split. The vehicle configuration without the supervisory controller displayed a fuel economy of 25 [mpg]. With the supervisory controller this rose to 29 [mpg]. Wider applications of this research include hybrid vehicle controller designs that can extend the range and survivability of military combat platforms.
Furthermore, the
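    The discrete grid optimization mentioned above can be sketched as an exhaustive search over candidate controller thresholds; the fuel model here is a toy surrogate standing in for the GT-Suite simulation, and the parameter names are illustrative:

```python
# Sketch of discrete grid optimization of rule-based supervisory-controller
# parameters: evaluate every threshold pair on a coarse grid and keep the
# one minimizing fuel over a drive cycle. The fuel function is a toy
# surrogate (illustrative), not the dissertation's GT-Suite model.
import itertools

def cycle_fuel(engine_on_speed, assist_torque_frac):
    """Toy surrogate for fuel consumed over a cycle: penalizes thresholds
    far from a nominal sweet spot (purely for demonstration)."""
    return ((engine_on_speed - 18.0) ** 2 / 50.0
            + (assist_torque_frac - 0.3) ** 2 + 1.0)

grid_speed = [10, 14, 18, 22, 26]          # engine-on speed thresholds [mph]
grid_assist = [0.1, 0.2, 0.3, 0.4, 0.5]    # power-split fractions

best = min(itertools.product(grid_speed, grid_assist),
           key=lambda p: cycle_fuel(*p))
print(best)  # (18, 0.3) minimizes the surrogate
```

    Grid search is expensive but trivially parallel and needs no gradients, which suits simulation-in-the-loop fuel evaluations like the one described.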

  15. H∞ memory feedback control with input limitation minimization for offshore jacket platform stabilization

    Science.gov (United States)

    Yang, Jia Sheng

    2018-06-01

    In this paper, we investigate an H∞ memory controller with input limitation minimization (HMCIM) for offshore jacket platform stabilization. The main objective of this study is to reduce the control consumption as well as protect the actuator while satisfying the requirements on system performance. First, we introduce a dynamic model of the offshore platform with low-order main modes, based on a mode reduction method from numerical analysis. Then, based on H∞ control theory and matrix inequality techniques, we develop a novel H∞ memory controller with input limitation. Furthermore, a non-convex optimization model to minimize input energy consumption is proposed. Since this non-convex model is difficult to solve directly, we use a relaxation method with matrix operations to transform it into a convex optimization model. Thus, it can be solved by a standard convex optimization solver in MATLAB or CPLEX. Finally, several numerical examples are given to validate the proposed models and methods.

  16. Access to Research Inputs

    DEFF Research Database (Denmark)

    Czarnitzki, Dirk; Grimpe, Christoph; Pellens, Maikel

    2015-01-01

    The viability of modern open science norms and practices depends on public disclosure of new knowledge, methods, and materials. However, increasing industry funding of research can restrict the dissemination of results and materials. We show, through a survey sample of 837 German scientists in life sciences, natural sciences, engineering, and social sciences, that scientists who receive industry funding are twice as likely to deny requests for research inputs as those who do not. Receiving external funding in general does not affect denying others access. Scientists who receive external funding of any kind are, however, 50% more likely to be denied access to research materials by others, but this is not affected by being funded specifically by industry.

  18. CBM First-level Event Selector Input Interface Demonstrator

    Science.gov (United States)

    Hutter, Dirk; de Cuveland, Jan; Lindenstruth, Volker

    2017-10-01

    CBM is a heavy-ion experiment at the future FAIR facility in Darmstadt, Germany. Featuring self-triggered front-end electronics and free-streaming read-out, event selection will be done exclusively by the First Level Event Selector (FLES). Designed as an HPC cluster with several hundred nodes, its task is the online analysis and selection of the physics data at a total input data rate exceeding 1 TByte/s. To allow efficient event selection, the FLES performs timeslice building, which combines the data from all given input links into self-contained, potentially overlapping processing intervals and distributes them to compute nodes. Partitioning the input data streams into specialized containers allows this task to be performed very efficiently. The FLES Input Interface defines the linkage between the FEE and the FLES data transport framework. A custom FPGA PCIe board, the FLES Interface Board (FLIB), is used to receive data via optical links and transfer them via DMA to the host's memory. The current prototype of the FLIB features a Kintex-7 FPGA and provides up to eight 10 GBit/s optical links. A custom FPGA design has been developed for this board. DMA transfers and data structures are optimized for subsequent timeslice building. Index tables generated by the FPGA enable fast random access to the written data containers. In addition, the DMA target buffers can directly serve as InfiniBand RDMA source buffers without copying the data. The use of POSIX shared memory for these buffers allows data access from multiple processes. An accompanying HDL module has been developed to integrate the FLES link into the front-end FPGA designs. It implements the front-end logic interface as well as the link protocol. Prototypes of all Input Interface components have been implemented and integrated into the FLES test framework. This allows the implementation and evaluation of the foreseen CBM read-out chain.
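    The multi-process buffer access via POSIX shared memory can be sketched in miniature. This is an illustration, not the FLES code: Python's `multiprocessing.shared_memory` wraps the same POSIX facility, and the second handle below stands in for a separate analysis process.

```python
from multiprocessing import shared_memory

# Writer side: create a named shared-memory buffer and fill it, the way
# DMA target buffers are exposed to other processes.
shm = shared_memory.SharedMemory(create=True, size=64)
shm.buf[:8] = b"timeslc1"                    # stand-in container header

# Reader side: attach to the same buffer by name and read the header
# back, with no data copy through pipes or sockets.
reader = shared_memory.SharedMemory(name=shm.name)
data = bytes(reader.buf[:8])
print(data)  # b'timeslc1'

reader.close()
shm.close()
shm.unlink()
```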

  19. Development of a biorefinery optimized biofuel supply curve for the Western United States

    International Nuclear Information System (INIS)

    Parker, Nathan; Tittmann, Peter; Hart, Quinn; Nelson, Richard; Skog, Ken; Schmidt, Anneliese; Gray, Edward; Jenkins, Bryan

    2010-01-01

    A resource assessment and biorefinery siting optimization model was developed and implemented to assess potential biofuel supply across the Western United States from agricultural, forest, urban, and energy crop biomass. Spatial information, including feedstock resources, existing and potential refinery locations, and a transportation network model, is provided to a mixed integer-linear optimization model that determines the optimal locations, technology types, and sizes of biorefineries to satisfy a maximum-profit objective function applied across the biofuel supply and demand chain, from the site of feedstock production to the product fuel terminal. The resource basis includes preliminary considerations of crop and residue sustainability. Sensitivity analyses explore possible effects of policy and technology changes. At a target market price of $19.6 GJ⁻¹, the model predicts a feasible production level of 610-1098 PJ, enough to supply up to 15% of current regional liquid transportation fuel demand. (author)
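    The structure of such a siting problem can be sketched on a toy instance. All data below (two candidate sites, two feedstocks, costs and yields) are invented; a real model is a MILP over thousands of variables, whereas this sketch simply enumerates the binary site-opening decisions.

```python
from itertools import product

# Invented toy data: fixed cost to open each candidate refinery site,
# available feedstock, and per-unit transport cost from feedstock to site.
sites = ["A", "B"]
fixed_cost = {"A": 40.0, "B": 55.0}
feedstock = {"f1": 100.0, "f2": 80.0}
transport = {("f1", "A"): 0.2, ("f1", "B"): 0.5,
             ("f2", "A"): 0.6, ("f2", "B"): 0.1}
revenue_per_unit = 1.0

best = None
for opens in product([0, 1], repeat=len(sites)):
    open_sites = [s for s, o in zip(sites, opens) if o]
    profit = -sum(fixed_cost[s] for s in open_sites)
    for f, amount in feedstock.items():
        if open_sites:  # ship each feedstock to its cheapest open site
            cost = min(transport[(f, s)] for s in open_sites)
            profit += amount * (revenue_per_unit - cost)
    if best is None or profit > best[0]:
        best = (profit, open_sites)

print(best)  # the profit-maximizing siting decision
```

    The MILP formulation replaces this enumeration with binary open/close variables and continuous shipment variables, which is what makes continent-scale instances tractable.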

  20. Optimization of upstream and development of cellulose hydrolysis process for cellulosic bio-ethanol production

    International Nuclear Information System (INIS)

    Bae, Hyeun Jong; Wi, Seung Gon; Lee, Yoon Gyo; Kim, Ho Myung; Kim, Su Bae

    2011-10-01

    The purpose of this project is the optimization of upstream processing and the development of a cellulose hydrolysis process for cellulosic bio-ethanol production. The second-year research scope includes: 1) optimization of pre-treatment conditions for enzymatic hydrolysis of lignocellulosic biomass and 2) demonstration of enzymatic hydrolysis by recombinant enzymes. To optimize the pretreatment, we applied two processes: a wet process (wet milling + popping) and a dry process (popping + dry milling). Of these, the wet process gave the best glucose yield with a 93.1% conversion, while the dry process yielded 69.6% and the unpretreated material yielded <20%. The recombinant cellulolytic enzymes showed very high specific activity, about 80-1000 times higher on CMC and 13-70 times higher on filter paper at pH 3.5 and 55 °C.

  1. Development of an evaluation method for optimization of maintenance strategy in commercial plant

    International Nuclear Information System (INIS)

    Ito, Satoshi; Shiraishi, Natsuki; Yuki, Kazuhisa; Hashizume, Hidetoshi

    2006-01-01

    In this study, a new simulation method is developed for the optimization of maintenance strategy in an NPP as a multiple-objective optimization problem (MOP). The result of operation is evaluated as the average of the following three measures over 3,000 trials: cost of electricity (COE) as economic risk, frequency of unplanned shutdown as plant reliability, and unavailability of the Regular Service System (RSS) and Engineered Safety Features (ESF) as safety measures. The following maintenance parameters are varied to evaluate several risks in plant operation under changing maintenance strategies: planned outage cycle, surveillance cycle, major inspection cycle, and surveillance cycle depending on the value of the Fussell-Vesely importance measure. Decision-making based on AHP reveals individual tendencies that depend on the decision-maker. This study could therefore be useful for resolving the problem of maintenance optimization as a MOP. (author)
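    The AHP weighting step mentioned above can be sketched as the classical principal-eigenvector computation. The pairwise judgments below are invented (and deliberately consistent, so the expected weights are obvious); in practice each decision-maker supplies their own matrix, which is where the individual tendencies come from.

```python
import numpy as np

# Invented pairwise-comparison matrix over three criteria, e.g.
# (COE, unplanned-shutdown frequency, safety-system unavailability).
M = np.array([[1.0, 2.0, 0.5],
              [0.5, 1.0, 0.25],
              [2.0, 4.0, 1.0]])

# AHP priority weights: normalized principal eigenvector of M.
vals, vecs = np.linalg.eig(M)
w = np.real(vecs[:, np.argmax(np.real(vals))])
w = w / w.sum()
print(np.round(w, 3))
```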

  2. Development of New Lipid-Based Paclitaxel Nanoparticles Using Sequential Simplex Optimization

    Science.gov (United States)

    Dong, Xiaowei; Mattingly, Cynthia A.; Tseng, Michael; Cho, Moo; Adams, Val R.; Mumper, Russell J.

    2008-01-01

    The objective of these studies was to develop Cremophor-free lipid-based paclitaxel (PX) nanoparticle formulations prepared from warm microemulsion precursors. To identify and optimize new nanoparticles, experimental design was performed combining a Taguchi array with sequential simplex optimization; the combination efficiently directed the design of paclitaxel nanoparticles. Two optimized paclitaxel nanoparticles (NPs) were obtained: G78 NPs composed of glyceryl tridodecanoate (GT) and polyoxyethylene 20-stearyl ether (Brij 78), and BTM NPs composed of Miglyol 812, Brij 78, and D-alpha-tocopheryl polyethylene glycol 1000 succinate (TPGS). Both nanoparticles successfully entrapped paclitaxel at a final concentration of 150 μg/ml (over 6% drug loading) with particle sizes less than 200 nm and over 85% entrapment efficiency. These novel paclitaxel nanoparticles were physically stable at 4°C over three months and in PBS at 37°C over 102 hours. Release of paclitaxel was slow and sustained, without an initial burst release. Cytotoxicity studies in MDA-MB-231 cancer cells showed that both nanoparticles have anticancer activity similar to that of Taxol®. Interestingly, PX BTM nanocapsules could be lyophilized without cryoprotectants. The lyophilized powder, comprising only PX BTM NPs, could be rapidly rehydrated in water with complete retention of the original physicochemical properties, in vitro release properties, and cytotoxicity profile. Sequential simplex optimization has thus been utilized to identify promising new lipid-based paclitaxel nanoparticles with useful attributes. PMID:19111929
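    The sequential simplex idea can be sketched with the Nelder-Mead simplex search. This is a sketch only: the two-variable response surface below is invented, not the actual formulation model from the study.

```python
import numpy as np
from scipy.optimize import minimize

# Invented response surface: predicted particle size (nm) as a function
# of two coded formulation variables, smallest at (1.2, 0.8).
def particle_size(x):
    surfactant, oil = x
    return 150 + 40 * (surfactant - 1.2) ** 2 + 25 * (oil - 0.8) ** 2

# Nelder-Mead is the classical sequential-simplex search: it moves a
# simplex of trial points downhill without needing gradients, which
# suits expensive experimental responses.
res = minimize(particle_size, x0=[0.5, 0.5], method="Nelder-Mead")
print(np.round(res.x, 3), round(res.fun, 2))
```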

  3. Development of a Deterministic Optimization Model for Design of an Integrated Utility and Hydrogen Supply Network

    International Nuclear Information System (INIS)

    Hwangbo, Soonho; Lee, In-Beum; Han, Jeehoon

    2014-01-01

    Many networks are constructed in a large-scale industrial complex. Each network meets its demands through the production or transportation of the materials needed by the companies it serves. A network either produces materials directly to satisfy a company's demand or purchases them from outside, owing to demand uncertainty, financial factors, and so on. Utility networks and hydrogen networks are typical major networks in a large-scale industrial complex. Many studies have focused on minimizing total cost or optimizing network structure, but little research has attempted an integrated model that connects the utility and hydrogen networks. In this study, a deterministic mixed-integer linear programming model is developed to integrate the two. A steam methane reforming (SMR) process is what links the networks: hydrogen produced by SMR, whose feedstock is steam vented from the utility network, enters the hydrogen network and meets its demand. The proposed model suggests an optimized configuration for the integrated network and calculates the optimal total cost. Its capability is tested by applying it to the Yeosu industrial complex in Korea, which contains one of the largest petrochemical complexes and for which data are available in a variety of published studies. In the case study, the integrated network model yields better solutions than previous results obtained by studying the utility network and the hydrogen network individually.
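    The linking idea between the two networks can be illustrated with a back-of-the-envelope balance. All numbers below, including the SMR yield, are invented for illustration; the actual model embeds this balance as constraints inside a MILP.

```python
# Invented data: vented steam feeds an SMR unit, and the hydrogen it
# yields offsets purchases by the hydrogen network.
steam_vent = 120.0         # t/h vented steam available to SMR
h2_demand = 40.0           # t/h hydrogen the network must supply
yield_h2_per_steam = 0.25  # t H2 per t steam through SMR (assumed)
purchase_price = 3.0       # $/t for hydrogen bought outside
smr_cost = 1.0             # $/t for hydrogen produced in-house

# Linking balance: in-house production is capped by the steam vent.
h2_from_smr = min(h2_demand, steam_vent * yield_h2_per_steam)
h2_purchased = h2_demand - h2_from_smr
total_cost = h2_from_smr * smr_cost + h2_purchased * purchase_price
print(h2_from_smr, h2_purchased, total_cost)
```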

  4. VISUALIZATION SOFTWARE DEVELOPMENT FOR PROCEDURE OF MULTI-DIMENSIONAL OPTIMIZATION OF TECHNOLOGICAL PROCESS FUNCTIONAL PARAMETERS

    Directory of Open Access Journals (Sweden)

    E. N. Ishakova

    2016-05-01

    A method for multi-criteria optimization of the design parameters of a technological object is described. Existing optimization methods are reviewed, and work on both basic research and applied problems is analyzed. The problem is formulated on the basis of the process requirements, making it possible to choose the geometrical dimensions of machine tips and the flow rate of the process so that the resulting technical and economic parameters are optimal. The formulation applies a performance method adapted to the particular domain. The implementation of the task is shown, including a method for constructing characteristics of the studied object under constraints on the parameters, in both analytical and graphical form. On the basis of this theoretical research, a software system has been developed that automates the search for optimal solutions to specific problems. Using the available information sources characterizing the object of study, the user can define identifiers and add one-sided or interval constraints on the parameters. The result is a visual depiction of the dependence of the main study parameters on the others, which may affect both the course of the process and the quality of the products. The resulting optimal region shows the use of different design options for the technological object within an acceptable kinematic range, allowing the researcher to choose the best design solution.

  5. Balancing development costs and sales to optimize the development time of product line additions

    NARCIS (Netherlands)

    Langerak, F.; Griffin, A.; Hultink, E.J.

    2010-01-01

    Development teams often use mental models to simplify development time decision making because a comprehensive empirical assessment of the trade-offs across the metrics of development time, development costs, proficiency in market-entry timing, and new product sales is simply not feasible.

  6. MARS code manual volume II: input requirements

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Kim, Kyung Doo; Bae, Sung Won; Jeong, Jae Jun; Lee, Seung Wook; Hwang, Moon Kyu

    2010-02-01

    The Korea Atomic Energy Research Institute (KAERI) conceived and started the development of the MARS code with the main objective of producing a state-of-the-art, realistic thermal-hydraulic systems analysis code with multi-dimensional analysis capability. MARS achieves this objective by very tightly integrating the one-dimensional RELAP5/MOD3 with the multi-dimensional COBRA-TF code. The integration of the two codes is based on dynamic link library techniques, and the system pressure equation matrices of both codes are implicitly integrated and solved simultaneously. In addition, the equation of state (EOS) for light water was unified by replacing the EOS of COBRA-TF with that of RELAP5. This input manual provides a complete list of the input required to run MARS. The manual is divided largely into two parts: the one-dimensional part and the multi-dimensional part. The inputs for auxiliary functions, such as minor edit requests and graph formatting, are shared by the two parts, and as such mixed input is possible. The overall structure of the input is modeled on that of RELAP5, and the layout of the manual is therefore very similar to that of the RELAP5 manual. This similarity to RELAP5 input is intentional, as the input scheme allows minimal modification between RELAP5 and MARS 3.1 inputs. The MARS 3.1 development team would like to express its appreciation to the RELAP5 Development Team and the USNRC for making this manual possible

  7. Soft sensor development and optimization of the commercial petrochemical plant integrating support vector regression and genetic algorithm

    Directory of Open Access Journals (Sweden)

    S.K. Lahiri

    2009-09-01

    Soft sensors have been widely used in industrial process control to improve product quality and assure safety in production. The core of a soft sensor is its soft-sensing model. This paper introduces support vector regression (SVR), a powerful new machine learning method based on statistical learning theory (SLT), into soft sensor modeling and proposes a new soft-sensing modeling method based on SVR. It presents an artificial-intelligence-based hybrid soft sensor modeling and optimization strategy, namely support vector regression-genetic algorithm (SVR-GA), for modeling and optimizing the mono ethylene glycol (MEG) quality variable in a commercial glycol plant. In the SVR-GA approach, a support vector regression model is constructed to correlate the process data, comprising values of operating and performance variables. Next, the model inputs describing the process operating variables are optimized using a genetic algorithm with a view to maximizing process performance. SVR-GA is a new strategy for soft sensor modeling and optimization. Its major advantage is that modeling and optimization can be conducted exclusively from historic process data, so detailed knowledge of the process phenomenology (reaction mechanism, kinetics, etc.) is not required. Using the SVR-GA strategy, a number of sets of optimized operating conditions were found. The optimized solutions, when verified in an actual plant, resulted in a significant improvement in quality.
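    The genetic-algorithm half of the SVR-GA strategy can be sketched as follows. The quadratic surrogate below is invented and stands in for a trained SVR model's prediction of MEG quality; in the actual strategy the fitness function would call the fitted SVR on candidate operating points.

```python
import random

random.seed(0)

# Invented surrogate for svr.predict(x): quality peaks at x = (0.6, 0.3)
# in normalized operating-variable coordinates.
def surrogate_quality(x):
    return -(x[0] - 0.6) ** 2 - (x[1] - 0.3) ** 2

def evolve(pop_size=30, gens=60):
    pop = [[random.random(), random.random()] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=surrogate_quality, reverse=True)
        parents = pop[: pop_size // 2]           # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)     # blend crossover + mutation
            children.append([(ai + bi) / 2 + random.gauss(0, 0.02)
                             for ai, bi in zip(a, b)])
        pop = parents + children                 # elitist replacement
    return max(pop, key=surrogate_quality)

best = evolve()
print(best)
```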

  8. Modeling and generating input processes

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, M.E.

    1987-01-01

    This tutorial paper provides information relevant to the selection and generation of stochastic inputs for simulation studies. The primary area considered is multivariate, but much of the philosophy is relevant to univariate inputs as well. 14 refs.
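    One standard recipe for generating correlated multivariate inputs, shown here as an illustration rather than taken from the paper, is to push independent standard normals through a Cholesky factor of the target covariance:

```python
import numpy as np

rng = np.random.default_rng(1)

# Target covariance for a 2-dimensional simulation input (invented).
target_cov = np.array([[1.0, 0.7],
                       [0.7, 2.0]])

# Draw independent standard normals and impose the covariance via its
# Cholesky factor L: cov(L z) = L I L^T = target_cov.
L = np.linalg.cholesky(target_cov)
z = rng.standard_normal((100_000, 2))
x = z @ L.T          # rows are correlated input vectors

print(np.round(np.cov(x, rowvar=False), 2))
```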

  9. Evaluating Optimism: Developing Children’s Version of Optimistic Attributional Style Questionnaire

    Directory of Open Access Journals (Sweden)

    Gordeeva T.O.,

    2017-08-01

    Full Text Available People differ significantly in how they usually explain to themselves the reasons of events, both positive and negative, that happen in their lives. Psychological research shows that children who tend to think optimistically have certain advantages as compared to their pessimistically thinking peers: they are less likely to suffer from depression, establish more positive relationships with peers, and demonstrate higher academic achievements. This paper describes the process of creating the children’s version of the Optimistic Attributional Style Questionnaire (OASQ-C. This technique is based on the theory of learned hopelessness and optimism developed by M. Seligman, L. Abramson and J. Teas dale and is an efficient (compact tool for measuring optimism as an explanatory style in children and adolescents (9-14 years. Confirmatory factor analysis revealed that this technique is a two-factor structure with acceptable reliability. Validity is supported by the presence of expected correlations between explanatory style and rates of psychological well-being, dispositional optimism, positive attitude to life and its aspects, depression, and academic performance. The outcomes of this technique are not affected by social desirability. The developed questionnaire may be recommended to researchers and school counsellors for evaluating optimism (optimistic thinking as one of the major factors in psychological well-being of children; it may also be used in assessing the effectiveness of cognitive oriented training for adolescents.

  10. Development and optimization of power plant concepts for local wet fuels

    Energy Technology Data Exchange (ETDEWEB)

    Raiko, M.O.; Gronfors, T.H.A. [Fortum Energy Solutions, Fortum (Finland); Haukka, P. [Tampere University of Technology (Finland)

    2003-01-01

    Many changes in business drivers are now affecting power-producing companies. The power market has been opened up and the number of locally operating companies has increased. At the same time the need to utilize locally produced biofuels is increasing because of environmental benefits and regulations. In this situation, power-producing companies have on focus their in-house skills for generating a competitive edge over their rivals, such as the skills needed for developing the most economical energy investments for the best-paying customer for the local biomass producers. This paper explores the role of optimization in the development of small-sized energy investments. The paper provides an overview on a new design process for power companies for improved use of in-house technical and business expertise. As an example, illustrative design and optimization of local wet peat-based power investment is presented. Three concept alternatives are generated. Only power plant production capacity and peat moisture content are optimized for all alternatives. Long commercial experience of using peat as a power plant fuel in Finland can be transferred to bioenergy investments. In this paper, it is shown that conventional technology can be feasible for bioenergy production even in quite small size (below 10 MW). It is important to optimize simultaneously both the technology and the two businesses, power production and fuel production. Further, such high moisture content biomass as sludge, seaweed, grass, etc. can be economical fuels, if advanced drying systems are adopted in a power plant. (author)

  11. Off-line learning from clustered input examples

    NARCIS (Netherlands)

    Marangi, Carmela; Solla, Sara A.; Biehl, Michael; Riegler, Peter; Marinaro, Maria; Tagliaferri, Roberto

    1996-01-01

    We analyze the generalization ability of a simple perceptron acting on a structured input distribution for the simple case of two clusters of input data and a linearly separable rule. The generalization ability, computed for three learning scenarios (maximal stability, Gibbs, and optimal learning), is
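    A minimal concrete version of this setting (cluster geometry and labels invented, not the paper's distribution) is a perceptron trained on two linearly separable Gaussian clusters:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two Gaussian clusters with a linearly separable labelling.
n = 200
X = np.vstack([rng.normal(loc=[2, 2], scale=0.5, size=(n, 2)),
               rng.normal(loc=[-2, -2], scale=0.5, size=(n, 2))])
y = np.hstack([np.ones(n), -np.ones(n)])

# Plain perceptron updates: bump the weights on each misclassified point.
w = np.zeros(2)
for _ in range(20):                 # a few passes suffice here
    for xi, yi in zip(X, y):
        if yi * (w @ xi) <= 0:
            w += yi * xi

accuracy = np.mean(np.sign(X @ w) == y)
print(accuracy)
```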

  12. Development and optimization of a self-microemulsifying drug delivery system for atorvastatin calcium by using D-optimal mixture design.

    Science.gov (United States)

    Yeom, Dong Woo; Song, Ye Seul; Kim, Sung Rae; Lee, Sang Gon; Kang, Min Hyung; Lee, Sangkil; Choi, Young Wook

    2015-01-01

    In this study, we developed and optimized a self-microemulsifying drug delivery system (SMEDDS) formulation for improving the dissolution and oral absorption of atorvastatin calcium (ATV), a poorly water-soluble drug. Solubility and emulsification tests were performed to select a suitable combination of oil, surfactant, and cosurfactant. A D-optimal mixture design was used to optimize the concentrations of the components in the SMEDDS formulation for achieving excellent physicochemical characteristics, such as small droplet size and high dissolution. The optimized ATV-loaded SMEDDS formulation, containing 7.16% Capmul MCM (oil), 48.25% Tween 20 (surfactant), and 44.59% Tetraglycol (cosurfactant), significantly enhanced the dissolution rate of ATV in different types of medium, including simulated intestinal fluid, simulated gastric fluid, and distilled water, compared with an ATV suspension. Good agreement was observed between predicted and experimental values for mean droplet size and the percentage of drug released in 15 minutes. Further, pharmacokinetic studies in rats showed that the optimized SMEDDS formulation considerably enhanced the oral absorption of ATV, with 3.4-fold and 4.3-fold increases in the area under the concentration-time curve and the time taken to reach peak plasma concentration, respectively, compared with the ATV suspension. Thus, we successfully developed an optimized ATV-loaded SMEDDS formulation using the D-optimal mixture design that could potentially be used for improving the oral absorption of poorly water-soluble drugs.
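    The D-optimality criterion behind such a mixture design can be sketched by exhaustively scoring small candidate designs. The candidate blends below are invented simplex-lattice points, and a real D-optimal package searches much larger candidate sets with exchange algorithms rather than brute force.

```python
import numpy as np
from itertools import combinations

# Invented candidate blends (oil, surfactant, cosurfactant), each
# summing to 1 as mixture fractions must.
candidates = np.array([
    [1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0],
    [0.5, 0.5, 0.0], [0.5, 0.0, 0.5], [0.0, 0.5, 0.5],
    [1/3, 1/3, 1/3],
])

# D-optimality for the linear Scheffe mixture model: pick the 4-run
# design whose model matrix X maximizes det(X^T X).
best_det, best_set = -1.0, None
for idx in combinations(range(len(candidates)), 4):
    X = candidates[list(idx)]
    d = np.linalg.det(X.T @ X)
    if d > best_det:
        best_det, best_set = d, idx

print(best_set, round(best_det, 3))
```

    The winning design keeps the three pure-component vertices, which matches the intuition that D-optimal mixture designs push runs toward the extremes of the feasible region.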

  13. The decision optimization of product development by considering the customer demand saturation

    Directory of Open Access Journals (Sweden)

    Qing-song Xing

    2015-05-01

    Purpose: To analyze the impact of over-satisfying customer demands on the product development process, based on a quantitative model of customer demands, development cost, and development time, and to propose a corresponding optimization decision for product development. Design/methodology/approach: First, customer demand information is obtained through a survey, and demand weights are quantified with the variation coefficient method. Second, the relationship between customer demands and product development time and cost is analyzed on the basis of quality function deployment, and a corresponding mathematical model is established. On this basis, the concept of customer demand saturation and an optimization decision method for product development are put forward and applied to the notebook development process of a company. Finally, when customer demand is saturated, the consistency between satisfying strengthened demands and high-attention demands, and the stability of demand saturation under different parameters, are verified. Findings: Development cost and time rise sharply when customer demands are over-satisfied. By accounting for customer demand saturation, the relationship between customer demand and development time and cost is quantified and balanced. The sequence in which demands are met is essentially consistent with the customer demand survey results. Originality/value: The paper proposes a model of customer demand saturation and demonstrates the correctness and effectiveness of the resulting product development decision method.
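    The variation-coefficient weighting step can be sketched directly. The survey scores below are invented; the idea is that demands whose ratings vary more across respondents carry more discriminating information and thus receive larger weights.

```python
import numpy as np

# Invented survey scores: rows = respondents, columns = customer demands.
scores = np.array([[5, 3, 4],
                   [4, 3, 2],
                   [5, 2, 4],
                   [3, 3, 5]], dtype=float)

# Coefficient of variation per demand, normalized to sum to 1.
cv = scores.std(axis=0, ddof=1) / scores.mean(axis=0)
weights = cv / cv.sum()
print(np.round(weights, 3))
```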

  14. Optimization and Optimal Control

    CERN Document Server

    Chinchuluun, Altannar; Enkhbat, Rentsen; Tseveendorj, Ider

    2010-01-01

    During the last four decades there has been a remarkable development in optimization and optimal control. Due to its wide variety of applications, many scientists and researchers have paid attention to fields of optimization and optimal control. A huge number of new theoretical, algorithmic, and computational results have been observed in the last few years. This book gives the latest advances, and due to the rapid development of these fields, there are no other recent publications on the same topics. Key features: Provides a collection of selected contributions giving a state-of-the-art accou

  15. Reprocessing input data validation

    International Nuclear Information System (INIS)

    Persiani, P.J.; Bucher, R.G.; Pond, R.B.; Cornella, R.J.

    1990-01-01

    The Isotope Correlation Technique (ICT), in conjunction with the gravimetric (Pu/U ratio) method for mass determination, provides an independent verification of the input accountancy at the dissolver or accountancy stage of a reprocessing plant. The ICT has been applied to many classes of domestic and international reactor systems (light-water, heavy-water, graphite, and liquid-metal) operating in a variety of modes (power, research, production, and breeder) and for a variety of reprocessing fuel cycle management strategies. Analysis of reprocessing operations data, based on isotopic correlations derived for assemblies in a PWR environment and fuel management scheme, yielded differences between the measurement-derived and ICT-derived plutonium mass determinations of (-0.02 ± 0.23)% for the measured U-235 and (+0.50 ± 0.31)% for the measured Pu-239 for a core campaign. The ICT analysis has been implemented for the plutonium isotopics in a depleted uranium assembly in a heavy-water, enriched-uranium system and for the uranium isotopes in fuel assemblies in light-water, highly enriched systems. 7 refs., 5 figs., 4 tabs

  16. An engineering optimization method with application to STOL-aircraft approach and landing trajectories

    Science.gov (United States)

    Jacob, H. G.

    1972-01-01

    An optimization method has been developed that computes the optimal open-loop inputs for a dynamical system by observing only its output. The method reduces to static optimization by expressing the inputs as a series of functions with parameters to be optimized. Since the method is not concerned with the details of the dynamical system to be optimized, it works for both linear and nonlinear systems. The method and its application to optimizing longitudinal landing paths for a STOL aircraft with an augmented wing are discussed. Noise, fuel, time, and path-deviation minimizations are considered with and without angle-of-attack, acceleration-excursion, flight-path, endpoint, and other constraints.
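    The reduction to static optimization can be sketched as follows. The plant and cost below are invented stand-ins (a unit-inertia state driven to 1 at t=1 with an input-energy penalty), not the STOL landing problem; the point is only that the open-loop input u(t) becomes a short sine series whose coefficients are the sole decision variables.

```python
import numpy as np
from scipy.optimize import minimize

t = np.linspace(0.0, 1.0, 201)
dt = t[1] - t[0]
# Input parameterization: u(t) = sum_k c_k sin(pi k t).
basis = np.array([np.sin(np.pi * k * t) for k in (1, 2, 3)])

def cost(coeffs):
    u = coeffs @ basis
    v = np.cumsum(u) * dt          # crude integration of x'' = u
    x = np.cumsum(v) * dt          # starting from rest at x(0) = 0
    # Endpoint error plus a small input-energy penalty.
    return (x[-1] - 1.0) ** 2 + 1e-3 * np.sum(u ** 2) * dt

# Static optimization over the 3 series coefficients only.
res = minimize(cost, x0=np.ones(3), method="Nelder-Mead")
print(round(res.fun, 4))
```

    Because the optimizer never needs the system's internal equations, only the simulated output, the same scheme applies unchanged to nonlinear plants.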

  17. Optimization of a large integrated area development of gas fields offshore Sarawak, Malaysia

    International Nuclear Information System (INIS)

    Inyang, S.E.; Tak, A.N.H.; Costello, G.

    1995-01-01

    Optimization of field development plans is routine in the industry. The size, schedule, and nature of the upstream gas supply project for the second Malaysia LNG (MLNG Dua) plant in Bintulu, Sarawak made extensive optimization critical to realizing a robust and cost-effective development scheme, and makes the work of more general interest. The project comprises the upstream development of 11 offshore fields for gas supply to the MLNG Dua plant at an initial plateau production of 7.8 million tons per year of LNG. The gas fields span a large geographical area in medium water depths (up to 440 ft) and contain gas reserves of distinctly variable quality. This paper describes the project optimization efforts aimed at ensuring an upstream gas supply system effectiveness of over 99% throughout the project life while maintaining high safety and environmental standards and achieving an economic development in an era of low hydrocarbon prices. Fifty percent of the first of the three phases of this gas supply project has already been completed, and first gas from these fields is scheduled to be available by the end of 1995

  18. Development and Field Test of Voltage VAR Optimization in the Korean Smart Distribution Management System

    Directory of Open Access Journals (Sweden)

    Sang-Yun Yun

    2014-02-01

    This paper summarizes the development and demonstration of an optimization program, voltage VAR optimization (VVO), in the Korean Smart Distribution Management System (KSDMS). KSDMS was developed to address the lack of receptivity to distributed generators (DGs), the lack of standardization and compatibility, and the manual failure recovery of the existing Korean automated distribution system. Focusing on the receptivity of DGs, we developed a real-time system analysis and control program. The KSDMS VVO enhances the manual system operation of the existing distribution system and provides a solution in which all control equipment is operated at the system level. The developed VVO is an optimal power flow (OPF) method that resolves violations, minimizes switching costs, and minimizes losses, and its function can vary depending on the operator's command. The sequential mixed integer linear programming (SMILP) method was adopted to find the solution of the OPF. We tested the precision of the proposed VVO on selected simulated systems and its applicability to actual systems at two substations on Jeju Island. Running the KSDMS VVO on a regular basis improved system stability and raised no issues regarding its applicability to actual systems.

  19. Optimization of urban spatial development against flooding and other climate risks, and wider sustainability objectives

    Directory of Open Access Journals (Sweden)

    Caparros-Midwood Daniel

    2016-01-01

    A spatial optimization framework has been developed to help urban areas mitigate climate risks such as flooding and to curb resource use and greenhouse gas emissions. Measures required to address these issues often conflict with each other; for example, more compact cities typically use less energy for transportation but increase runoff from high-intensity rainfall events. Balancing potential trade-offs and maximizing synergies between these risks and vulnerabilities is therefore a multi-dimensional spatial challenge for urban planners. The framework minimizes the following objectives: (1) risk from heat waves; (2) risk from flooding; (3) the distance of new development from the current central business district; (4) urban sprawl, to prevent increased travel costs; and (5) development of green space. The framework is applied to a real case study in the North East of England. From an initial configuration, alternative spatial configurations are tested against these objectives and the spatial pattern is evolved over successive generations to search for spatially optimal configurations. The resulting solutions provide planners with a range of robust spatial development patterns that represent the best trade-offs, mitigating conflicts between risk and sustainability objectives.
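    The trade-off search at the core of such a framework rests on Pareto dominance, which can be sketched directly. The candidate spatial plans and their two objective scores (heat risk, flood risk, both to be minimized) are invented.

```python
# Invented candidate configurations scored on two objectives to minimize.
candidates = {
    "plan_a": (0.2, 0.9),
    "plan_b": (0.5, 0.5),
    "plan_c": (0.9, 0.1),
    "plan_d": (0.6, 0.6),   # worse than plan_b on both objectives
}

def dominates(p, q):
    """p dominates q if p is no worse everywhere and better somewhere."""
    return all(pi <= qi for pi, qi in zip(p, q)) and p != q

# Keep the non-dominated (Pareto-optimal) plans: the "best trade-offs".
pareto = {name for name, score in candidates.items()
          if not any(dominates(other, score)
                     for other in candidates.values())}
print(sorted(pareto))
```

    An evolutionary search such as the one described above repeatedly mutates configurations and retains this non-dominated set across generations.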

  20. Development and optimization of locust bean gum and sodium alginate interpenetrating polymeric network of capecitabine.

    Science.gov (United States)

    Upadhyay, Mansi; Adena, Sandeep Kumar Reddy; Vardhan, Harsh; Pandey, Sureshwar; Mishra, Brahmeshwar

    2018-03-01

    The objective of the study was to develop an interpenetrating polymeric network (IPN) of capecitabine (CAP) using the natural polymers locust bean gum (LBG) and sodium alginate (NaAlg). The IPN microbeads were optimized by a Box-Behnken design (BBD) to provide the anticipated particle size with good drug entrapment efficiency. Comparison of the dissolution profile of the CAP IPN microbeads with that of the marketed preparation showed them to be an excellent sustained drug delivery vehicle. An ionotropic gelation method utilizing the metal ion calcium (Ca²⁺) as a cross-linker was used to prepare the IPN microbeads. The optimization study was carried out by response surface methodology based on the Box-Behnken design. The effects of the factors on the responses of the optimized batch were exhibited through response surface and contour plots. The optimized batch was analyzed for particle size, % drug entrapment, pharmacokinetics, and in vitro drug release, and further characterized by FTIR, XRD, and SEM. To study the water uptake capacity and hydrodynamic activity of the polymers, swelling studies and viscosity measurements were performed, respectively. The particle size and % drug entrapment of the optimized batch were 494.37 ± 1.4 µm and 81.39 ± 2.9%, respectively, close to the values predicted by the Minitab 17 software. The in vitro drug release study showed sustained release of 92% over 12 h and followed an anomalous drug release pattern. The derived pharmacokinetic parameters of the optimized batch were improved over those of pure CAP. Thus, the IPN microbeads of CAP proved to be an effective extended drug delivery vehicle for the water-soluble antineoplastic drug.
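    The response-surface step behind a Box-Behnken style optimization can be sketched as a quadratic least-squares fit. Everything below is invented: a two-factor 3² grid stands in for the design (a true BBD requires at least three factors), and the "true" response is a known quadratic so the fitted coefficients can be checked.

```python
import numpy as np
from itertools import product

# Coded factor settings on a 3^2 grid (stand-in for the design runs).
runs = np.array(list(product([-1, 0, 1], repeat=2)), dtype=float)
x1, x2 = runs[:, 0], runs[:, 1]

# Invented "true" response, e.g. % drug entrapment at each run.
y = 80 - 5 * x1 + 3 * x2 - 4 * x1 ** 2 - 2 * x2 ** 2 + x1 * x2

# Fit the full second-order response-surface model by least squares.
X = np.column_stack([np.ones_like(x1), x1, x2, x1 ** 2, x2 ** 2, x1 * x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(coef, 3))
```

    Software such as Minitab fits exactly this kind of model and then locates the factor settings that optimize the fitted surface.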

  1. Optimized ex-ovo culturing of chick embryos to advanced stages of development.

    Science.gov (United States)

    Cloney, Kellie; Franz-Odendaal, Tamara Anne

    2015-01-24

    Research in anatomy, embryology, and developmental biology has largely relied on the use of model organisms. In order to study development in live embryos, model organisms such as the chicken are often used. The chicken is an excellent model organism due to its low cost and minimal maintenance; however, it presents observational challenges because the embryo is enclosed in an opaque eggshell. In order to properly view the embryo as it develops, the shell must be windowed or removed. Both windowing and ex ovo techniques have been developed to assist researchers in the study of embryonic development. However, each of these methods has limitations and challenges. Here, we present a simple, optimized ex ovo culture technique for chicken embryos that enables the observation of embryonic development from stage HH 19 into late stages of development (HH 40), when many organs have developed. This technique is easy to adopt in both undergraduate classes and more advanced research laboratories where embryo manipulations are conducted.

  2. Enhancement and Optimization Mechanisms of Biogas Production for Rural Household Energy in Developing Countries: A review

    Directory of Open Access Journals (Sweden)

    Yitayal Addis Alemayehu

    2015-10-01

    Full Text Available Anaerobic digestion is a common but vital process used for biogas and fertilizer production, as well as a method of waste treatment. The process is currently used in developing countries primarily for biogas production at the household level for rural people. The aim of this review is to indicate possible ways of including rural households that own fewer than four head of cattle in the biogas programs of developing countries. The review presents different research outputs on using biogas substrates other than cow dung, or mixes thereof, through different enhancement and optimization mechanisms. Many biodegradable materials have been studied for alternative methane production. Therefore, these substrates could be used for production by addressing the optimum conditions for each factor and each process for enhanced and optimized biogas production.

  3. Development of multidisciplinary design optimization procedures for smart composite wings and turbomachinery blades

    Science.gov (United States)

    Jha, Ratneshwar

    Multidisciplinary design optimization (MDO) procedures have been developed for smart composite wings and turbomachinery blades. The analysis and optimization methods used are computationally efficient and sufficiently rigorous. Therefore, the developed MDO procedures are well suited for actual design applications. The optimization procedure for the conceptual design of composite aircraft wings with surface bonded piezoelectric actuators involves the coupling of structural mechanics, aeroelasticity, aerodynamics and controls. The load carrying member of the wing is represented as a single-celled composite box beam. Each wall of the box beam is analyzed as a composite laminate using a refined higher-order displacement field to account for the variations in transverse shear stresses through the thickness. Therefore, the model is applicable for the analysis of composite wings of arbitrary thickness. Detailed structural modeling issues associated with piezoelectric actuation of composite structures are considered. The governing equations of motion are solved using the finite element method to analyze practical wing geometries. Three-dimensional aerodynamic computations are performed using a panel code based on the constant-pressure lifting surface method to obtain steady and unsteady forces. The Laplace domain method of aeroelastic analysis produces root-loci of the system which gives an insight into the physical phenomena leading to flutter/divergence and can be efficiently integrated within an optimization procedure. The significance of the refined higher-order displacement field on the aeroelastic stability of composite wings has been established. The effect of composite ply orientations on flutter and divergence speeds has been studied. The Kreisselmeier-Steinhauser (K-S) function approach is used to efficiently integrate the objective functions and constraints into a single envelope function. The resulting unconstrained optimization problem is solved using the
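
    The Kreisselmeier-Steinhauser (K-S) function named above folds many objectives or constraints g_i into a single smooth envelope, KS(g) = g_max + (1/ρ) ln Σ exp(ρ(g_i − g_max)), which approaches max(g_i) from above as the draw-down factor ρ grows. A stdlib sketch (the function name and default ρ are ours):

```python
import math

def ks_aggregate(values, rho=50.0):
    """Kreisselmeier-Steinhauser envelope function: a smooth, conservative
    approximation of max(values), suitable for gradient-based optimization.
    Shifting by the true maximum keeps the exponentials numerically stable."""
    g_max = max(values)
    return g_max + math.log(sum(math.exp(rho * (g - g_max)) for g in values)) / rho
```

    Because KS is differentiable everywhere (unlike a hard max), the many constraints of an aeroelastic optimization can be collapsed into one envelope function and handed to an unconstrained optimizer, which is the role it plays in the procedure described above.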

  4. Identifying the sociological implications of the main aspects affecting the optimal sporting career development

    OpenAIRE

    2014-01-01

    M.Phil. (Sport Management) This study is strengthened by several studies that have indicated that the dualist nature of student-athletes is problematic, as well as the management thereof. The study aimed to identify the sociological implications of the main aspects affecting the optimal sporting career development in athletics (throwers) at University of Johannesburg Sport, and offers recommendations for managing student-athletes. The methods utilized for this study included: i) self-desig...

  5. Optimization of urban spatial development against flooding and other climate risks, and wider sustainability objectives

    OpenAIRE

    Caparros-Midwood Daniel; Dawson Richard; Barr Stuart

    2016-01-01

    A spatial optimization framework has been developed to help urban areas mitigate climate risks such as flooding and to curb resource use and greenhouse gas emissions. Measures required to address these issues often conflict with each other, for example more compact cities typically use less energy for transportation but increase runoff from high intensity rainfall events. Balancing potential trade-offs and maximizing synergies between these risks and vulnerabilities is therefore a multi-dimen...

  6. A Technical and Economic Optimization Approach to Exploring Offshore Renewable Energy Development in Hawaii

    Energy Technology Data Exchange (ETDEWEB)

    Larson, Kyle B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Tagestad, Jerry D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Perkins, Casey J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Oster, Matthew R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Warwick, M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Geerlofs, Simon H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-09-01

    This study was conducted with the support of the U.S. Department of Energy’s (DOE’s) Wind and Water Power Technologies Office (WWPTO) as part of ongoing efforts to minimize key risks and reduce the cost and time associated with permitting and deploying ocean renewable energy. The focus of the study was to discuss a possible approach to exploring scenarios for ocean renewable energy development in Hawaii that attempts to optimize future development based on technical, economic, and policy criteria. The goal of the study was not to identify potentially suitable or feasible locations for development, but to discuss how such an approach may be developed for a given offshore area. Hawaii was selected for this case study due to the complex nature of the energy climate there and DOE’s ongoing involvement in supporting marine spatial planning for the West Coast. Primary objectives of the study included 1) discussing the political and economic context for ocean renewable energy development in Hawaii, especially with respect to how inter-island transmission may affect the future of renewable energy development in Hawaii; 2) applying a Geographic Information System (GIS) approach that has been used to assess the technical suitability of offshore renewable energy technologies in Washington, Oregon, and California to Hawaii’s offshore environment; and 3) formulating a mathematical model for exploring scenarios for ocean renewable energy development in Hawaii that seeks to optimize technical and economic suitability within the context of Hawaii’s existing energy policy and planning.

  7. Development of an optimized random amplified polymorphic DNA protocol for fingerprinting of Klebsiella pneumoniae.

    Science.gov (United States)

    Ashayeri-Panah, M; Eftekhar, F; Feizabadi, M M

    2012-04-01

    To develop an optimized random amplified polymorphic DNA (RAPD) protocol for fingerprinting clinical isolates of Klebsiella pneumoniae. Employing a factorial design of experiments, repeatable amplification patterns were obtained for 54 nosocomial isolates using 1 μmol l(-1) primer, 4 mmol l(-1) MgCl2, 0.4 mmol l(-1) dNTPs, 2.5 U Taq DNA polymerase and 90 ng DNA template in a total volume of 25 μl. The optimum thermocycling program was: initial denaturation at 94°C for 4 min, followed by 50 cycles of 1 min at 94°C, 2 min at 34°C and 2 min at 72°C, and a final extension at 72°C for 10 min. The optimized RAPD protocol was highly discriminatory (Simpson's diversity index, 0.982), and all isolates were typable with repeatable patterns (Pearson's similarity coefficient ≈ 100%). Seven main clusters were obtained at a similarity level of 70% and 32 distinct clusters at a similarity level of 85%, reflecting the heterogeneity of the isolates. Systematic optimization of RAPD generated reliable DNA fingerprints for nosocomial isolates of K. pneumoniae. This is the first report of RAPD optimization based on a factorial design of experiments for discrimination of K. pneumoniae. © 2012 The Authors. Letters in Applied Microbiology © 2012 The Society for Applied Microbiology.
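
    Simpson's diversity index quoted above (0.982) measures how likely two randomly chosen isolates are to fall into different fingerprint types: D = 1 − Σ n_j(n_j − 1) / (N(N − 1)), where n_j is the size of cluster j and N the total number of isolates. A stdlib sketch (the function name is ours):

```python
def simpsons_diversity(cluster_sizes):
    """Simpson's diversity index D = 1 - sum(n_j*(n_j-1)) / (N*(N-1)):
    the probability that two isolates drawn at random without replacement
    belong to different fingerprint types. D = 1 means every isolate is unique."""
    n = sum(cluster_sizes)
    return 1.0 - sum(c * (c - 1) for c in cluster_sizes) / (n * (n - 1))
```

    A typing method is usually considered highly discriminatory when D exceeds about 0.95, which the optimized protocol's 0.982 comfortably clears.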

  8. Development and Optimization of Osmotically Controlled Asymmetric Membrane Capsules for Delivery of Solid Dispersion of Lycopene

    Directory of Open Access Journals (Sweden)

    Nitin Jain

    2014-01-01

    Full Text Available The aim of the present investigation was to develop and statistically optimize osmotically controlled asymmetric membrane capsules of a solid dispersion of lycopene. Solid dispersions of lycopene with β-cyclodextrin in different ratios were prepared using the solvent evaporation method. Solubility studies showed that the solid dispersion at 1 : 5 (lycopene : β-cyclodextrin) exhibited optimum solubility (56.25 mg/mL) for osmotic controlled delivery. Asymmetric membrane capsules (AMCs) were prepared on glass mold pins via a dip coating method. Membrane characterization by scanning electron microscopy showed an inner porous region and an outer dense region. Central composite design response surface methodology was applied for the optimization of the AMCs. The independent variables were ethyl cellulose (X1), glycerol (X2), and NaCl (X3), which were varied at different levels to analyze the effect on the dependent variables: percentage of cumulative drug release (Y1) and correlation coefficient of drug release (Y2). The independent variables significantly influenced the responses. F18 was selected as the optimized formulation based on a percentage of CDR (cumulative drug release) of 85.63% and a correlation coefficient of 0.9994. The optimized formulation was analyzed for the effect of osmotic pressure and agitational intensity on the percentage of CDR. The drug release was independent of agitational intensity but dependent on the osmotic pressure of the dissolution medium.

  9. Development of Pangasius steaks by improved sous-vide technology and its process optimization.

    Science.gov (United States)

    Kumari, Namita; Singh, Chongtham Baru; Kumar, Raushan; Martin Xavier, K A; Lekshmi, Manjusha; Venkateshwarlu, Gudipati; Balange, Amjad K

    2016-11-01

    The present study embarked on the objective of optimizing improved sous-vide processing conditions for the development of ready-to-cook Pangasius steaks with extended shelf-life using response surface methodology. For the development of the improved sous-vide cooked product, Pangasius steaks were treated with additional hurdles in various combinations for optimization. Based on the study, a suitable combination of chitosan and spices was selected which enhanced the antimicrobial and oxidative stability of the product. The Box-Behnken experimental design with 15 trials per model was adopted for designing the experiment to determine the effect of the independent variables, namely chitosan concentration (X1), cooking time (X2) and cooking temperature (X3), on the dependent variable, i.e. TBARS value (Y1). From the RSM-generated model, the optimum conditions for sous-vide processing of Pangasius steaks were 1.08% chitosan concentration, 70.93 °C cooking temperature and 16.48 min cooking time, and the predicted minimum value of the multiple-response optimal condition was Y = 0.855 mg MDA/kg of fish. The high correlation coefficient (R² = 0.975) between the model and the experimental data showed that the model was able to efficiently predict processing conditions for the development of sous-vide processed Pangasius steaks. This research may help processing industries and Pangasius fish farmers, as it provides an alternative low-cost technology for the proper utilization of Pangasius.
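
    The reported R² = 0.975 is the coefficient of determination between the model's predicted and experimentally measured TBARS values, R² = 1 − SS_res/SS_tot. A stdlib sketch of that fit-quality check (the function name is ours):

```python
def r_squared(observed, predicted):
    """Coefficient of determination R^2 = 1 - SS_res / SS_tot.
    1.0 means a perfect fit; 0.0 means the model does no better than
    predicting the mean of the observations."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot
```

    Applied to the 15 design runs, a value this close to 1 indicates the fitted second-order surface reproduces the measured TBARS responses almost exactly.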

  10. A Fully Developed Flow Thermofluid Model for Topology Optimization of 3D-Printed Air-Cooled Heat Exchangers

    DEFF Research Database (Denmark)

    Haertel, Jan Hendrik Klaas; Nellis, Gregory F.

    2017-01-01

    In this work, density-based topology optimization is applied to the design of the air-side surface of dry-cooled power plant condensers. A topology optimization model assuming a steady-state, thermally and fluid dynamically fully developed internal flow is developed and used for this application....

  11. A new and fast image feature selection method for developing an optimal mammographic mass detection scheme.

    Science.gov (United States)

    Tan, Maxine; Pu, Jiantao; Zheng, Bin

    2014-08-01

    Selecting optimal features from a large image feature pool remains a major challenge in developing computer-aided detection (CAD) schemes for medical images. The objective of this study is to investigate a new approach to significantly improve the efficacy of image feature selection and classifier optimization in developing a CAD scheme for mammographic masses. An image dataset including 1600 regions of interest (ROIs), in which 800 are positive (depicting malignant masses) and 800 are negative (depicting CAD-generated false positive regions), was used in this study. After segmentation of each suspicious lesion by a multilayer topographic region growth algorithm, 271 features were computed in different feature categories including shape, texture, contrast, isodensity, spiculation and local topological features, as well as features related to the presence and location of fat and calcifications. Besides computing features from the original images, the authors also computed new texture features from the dilated lesion segments. In order to select optimal features from this initial feature pool and build a highly performing classifier, the authors examined and compared four feature selection methods to optimize an artificial neural network (ANN) based classifier, namely: (1) Phased Searching with NEAT in a Time-Scaled Framework, (2) a sequential floating forward selection (SFFS) method, (3) a genetic algorithm (GA), and (4) a sequential forward selection (SFS) method. Performances of the four approaches were assessed using a tenfold cross validation method. Among these four methods, SFFS had the highest efficacy, requiring 3%-5% of the computational time of the GA approach while yielding the highest performance level, with an area under the receiver operating characteristic curve (AUC) = 0.864 ± 0.034. The results also demonstrated that, except when using GA, including the new texture features computed from the dilated mass segments improved the AUC results of the ANNs optimized
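
    The SFS baseline in the comparison grows the feature set greedily, at each step adding whichever remaining feature most improves the wrapper score (e.g. cross-validated ANN performance); the floating variant (SFFS) additionally allows conditional removals. A stdlib sketch of plain forward selection with a hypothetical scoring callback (all names are ours):

```python
def sequential_forward_selection(features, score, k):
    """Greedy wrapper feature selection: starting empty, repeatedly add the
    feature whose inclusion maximizes score(subset), until k are chosen.
    `score` stands in for e.g. cross-validated classifier AUC."""
    selected = []
    while len(selected) < k:
        best = max((f for f in features if f not in selected),
                   key=lambda f: score(selected + [f]))
        selected.append(best)
    return selected

# Illustrative additive score: each feature contributes a fixed utility.
utilities = {"spiculation": 3.0, "contrast": 1.0, "texture": 2.0}
picked = sequential_forward_selection(list(utilities), lambda s: sum(utilities[f] for f in s), 2)
```

    Real wrapper scores interact (a feature's value depends on which others are present), which is exactly why the greedy choice can be suboptimal and why the floating SFFS variant performed best in the study.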

  12. Multi-Objective Optimization for Analysis of Changing Trade-Offs in the Nepalese Water–Energy–Food Nexus with Hydropower Development

    Directory of Open Access Journals (Sweden)

    Sanita Dhaubanjar

    2017-02-01

    Full Text Available While the water–energy–food nexus approach is becoming increasingly important for more efficient resource utilization and economic development, limited quantitative tools are available to incorporate the approach in decision-making. We propose a spatially explicit framework that couples two well-established water and power system models to develop a decision support tool combining multiple nexus objectives in a linear objective function. To demonstrate our framework, we compare eight Nepalese power development scenarios based on five nexus objectives: minimization of power deficit, maintenance of water availability for irrigation to support food self-sufficiency, reduction in flood risk, maintenance of environmental flows, and maximization of power export. The deterministic multi-objective optimization model is spatially resolved to enable realistic representation of the nexus linkages and accounts for power transmission constraints using an optimal power flow approach. Basin inflows, hydropower plant specifications, reservoir characteristics, reservoir rules, irrigation water demand, environmental flow requirements, power demand, and transmission line properties are provided as model inputs. The trade-offs and synergies among these objectives were visualized for each scenario under multiple environmental flow and power demand requirements. Spatially disaggregated model outputs allowed for the comparison of scenarios not only based on fulfillment of nexus objectives but also scenario compatibility with existing infrastructure, supporting the identification of projects that enhance overall system efficiency. Though the model is applied to the Nepalese nexus from a power development perspective here, it can be extended and adapted for other problems.
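
    One common way to fold several nexus objectives into a single linear objective function, as the framework above does, is min-max normalization followed by a weighted sum across scenarios. A stdlib sketch with illustrative scenario scores (names, numbers and weights are ours, not from the study):

```python
def normalize(values):
    """Min-max scale a list of objective values to [0, 1] (lower is better)."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def best_scenario(scores, weights):
    """scores: {scenario: [objective values]}, every objective to be minimized.
    Each objective is normalized across scenarios, then combined with the
    given weights; the scenario with the lowest weighted sum wins."""
    names = list(scores)
    columns = list(zip(*(scores[n] for n in names)))           # one column per objective
    normed_rows = zip(*(normalize(list(col)) for col in columns))
    totals = {n: sum(w * v for w, v in zip(weights, row))
              for n, row in zip(names, normed_rows)}
    return min(totals, key=totals.get)

# Hypothetical power-development scenarios scored on (power deficit, flood risk).
scenarios = {"A": [1.0, 10.0], "B": [2.0, 0.0], "C": [3.0, 5.0]}
```

    Sweeping the weights over a range and recording which scenario wins is a simple way to visualize the trade-offs among objectives, complementing a full Pareto analysis.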

  13. Analysis of metal(loid)s contamination and their continuous input in soils around a zinc smelter: Development of methodology and a case study in South Korea.

    Science.gov (United States)

    Yun, Sung-Wook; Baveye, Philippe C; Kim, Dong-Hyeon; Kang, Dong-Hyeon; Lee, Si-Young; Kong, Min-Jae; Park, Chan-Gi; Kim, Hae-Do; Son, Jinkwan; Yu, Chan

    2018-07-01

    Soil contamination due to atmospheric deposition of metals originating from smelters is a global environmental problem. A common difficulty associated with this contamination is discriminating between anthropic and natural contributions to soil metal concentrations. In this context, we investigated the characteristics of soil contamination in the area surrounding a world-class smelter. We combined several approaches in order to identify the sources of metals in soils and to examine contamination characteristics, such as pollution level, range, and spatial distribution. Soil samples were collected at 100 sites during a field survey, and total concentrations of As, Cd, Cr, Cu, Fe, Hg, Ni, Pb, and Zn were analyzed. We conducted a multivariate statistical analysis, and also examined the spatial distribution by 1) identifying the horizontal variation of metals according to particular wind directions and distance from the smelter and 2) drawing a distribution map by means of a GIS tool. As, Cd, Cu, Hg, Pb, and Zn in the soil were found to originate from smelter emissions, and As also originated from other sources such as abandoned mines and waste landfill. Among the anthropogenic metals, the horizontal distribution of Cd, Hg, Pb, and Zn according to the downwind direction and distance from the smelter showed a typical feature of atmospheric deposition (regression model: y = y0 + αe^(-βx)). Lithogenic Fe was used as an indicator, and it revealed the continuous input and accumulation of these four elements in the surrounding soils. Our approach was effective in clearly identifying the sources of metals and analyzing their contamination characteristics. We believe this study will provide useful information to future studies on soil pollution by metals around smelters. Copyright © 2018 Elsevier Ltd. All rights reserved.
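
    With the baseline y0 estimated (e.g. from background lithogenic levels), the deposition model y = y0 + αe^(−βx) can be fitted by linearizing, ln(y − y0) = ln α − βx, and applying ordinary least squares to the transformed data. A stdlib sketch (the function name is ours):

```python
import math

def fit_deposition(distances, concentrations, baseline):
    """Fit y = baseline + a*exp(-b*x) to (distance, concentration) data by
    log-linearizing about a known baseline and running ordinary least squares.
    Returns (a, b): the deposition amplitude and decay rate with distance."""
    xs = list(distances)
    ys = [math.log(c - baseline) for c in concentrations]  # ln(y - y0) = ln(a) - b*x
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx          # equals -b
    b = -slope
    a = math.exp(my + b * mx)  # intercept ln(a) = mean(y) - slope*mean(x)
    return a, b
```

    The linearization requires every concentration to exceed the baseline; in practice noisy near-baseline samples are either excluded or handled with a full nonlinear least-squares fit instead.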

  14. Study of Research and Development Processes through Fuzzy Super FRM Model and Optimization Solutions

    Directory of Open Access Journals (Sweden)

    Flavius Aurelian Sârbu

    2015-01-01

    Full Text Available The aim of this study is to measure resources for R&D (research and development) at the regional level in Romania and to obtain primary data that will be important in making the right decisions to increase competitiveness and development based on economic knowledge. As motivation, we would like to emphasize that by using the Super Fuzzy FRM model we want to determine the state of R&D processes at the regional level by a means different from the statistical survey, while through the two optimization methods we aim to provide optimization solutions for the R&D actions of enterprises. Therefore, to fulfill the above-mentioned aim, in this application-oriented paper we decided to use a questionnaire and, for the interpretation of the results, the Super Fuzzy FRM model; this represents the main novelty of our paper, as the theory provides a formalism based on matrix calculus which allows the processing of large volumes of information and also delivers results difficult or impossible to see through statistical processing. A further novelty of the paper is the optimization solutions presented here, given for the situation in which the sales price is variable and the quantity sold is constant in time, and for the reverse situation.

  15. Development and optimization of gastroretentive mucoadhesive microspheres of gabapentin by Box-Behnken design.

    Science.gov (United States)

    Gaur, Praveen Kumar; Mishra, Shikha; Kumar, Avdhesh; Panda, Bibhu Prasad

    2014-06-01

    Gabapentin follows saturation kinetics for absorption because of carrier-mediated transport and a narrow absorption window in the stomach. There is a need to develop a gastroretentive formulation to maximize absorption without crossing the saturation threshold. The aim was to develop a gastroretentive formulation of gabapentin to increase the fraction of drug absorbed in the stomach. Sodium alginate and sodium carboxymethylcellulose were used to formulate the microspheres by ionotropic gelation with calcium chloride. The formulation was optimized using a three-factor, three-level Box-Behnken design. The particle size varied from 559.50 to 801.10 μm, entrapment efficiency from 61.29 to 81.00% and in vitro release from 69.40 to 83.70%. The optimized formulation was found using point prediction, and formulation OF-3 showed optimum results: 608.21 μm size, 79.65% entrapment efficiency, 82.72% drug release and 81% mucoadhesion up to 10 h. The drug release was controlled for more than 12 h. The particle size was most influenced by sodium alginate, while entrapment efficiency and drug release depended upon both polymers. The release followed the Higuchi model. The gastroretentive formulation was successfully optimized by a three-factor, three-level Box-Behnken design and found to be useful.
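
    The Higuchi model that the release data followed predicts cumulative release proportional to the square root of time, Q = k·√t; a through-origin least-squares fit gives k = Σ Q√t / Σ t. A stdlib sketch (the function name is ours):

```python
import math

def higuchi_k(times, released):
    """Higuchi rate constant for Q = k*sqrt(t), fitted through the origin by
    least squares: minimizing sum (Q - k*sqrt(t))^2 gives
    k = sum(Q*sqrt(t)) / sum((sqrt(t))^2) = sum(Q*sqrt(t)) / sum(t)."""
    num = sum(q * math.sqrt(t) for t, q in zip(times, released))
    den = sum(times)
    return num / den
```

    A near-linear plot of cumulative release against √t (here, over the 12 h release window) is the usual visual check that the Higuchi model applies.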

  16. QbD for pediatric oral lyophilisates development: risk assessment followed by screening and optimization.

    Science.gov (United States)

    Casian, Tibor; Iurian, Sonia; Bogdan, Catalina; Rus, Lucia; Moldovan, Mirela; Tomuta, Ioan

    2017-12-01

    This study proposed the development of oral lyophilisates with respect to pediatric medicine development guidelines, by applying risk management strategies and DoE as an integrated QbD approach. Product critical quality attributes were overviewed by generating Ishikawa diagrams for risk assessment purposes, considering process-, formulation- and methodology-related parameters. Failure Mode Effect Analysis was applied to highlight critical formulation and process parameters with an increased probability of occurrence and a high impact on product performance. To investigate the effect of qualitative and quantitative formulation variables, D-optimal designs were used for screening and optimization purposes. Process parameters related to suspension preparation and lyophilization were classified as significant factors and were controlled by implementing risk mitigation strategies. Both quantitative and qualitative formulation variables introduced in the experimental design influenced the product's disintegration time, mechanical resistance and dissolution properties, selected as CQAs. The optimum formulation, selected through the Design Space, presented an ultra-fast disintegration time (5 seconds) and a good dissolution rate (above 90%) combined with a high mechanical resistance (above 600 g load). Combining FMEA and DoE allowed the science-based development of a product with respect to the defined quality target profile by providing better insight into the relevant parameters throughout the development process. The utility of risk management tools in pharmaceutical development was demonstrated.
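
    Failure Mode Effect Analysis typically ranks parameters by a Risk Priority Number, RPN = severity × occurrence × detection, with each factor scored on a 1-10 scale. A stdlib sketch with illustrative failure modes and scores (all names and values are ours, not from the study):

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number: product of the three FMEA scores (each 1-10).
    Higher detection scores mean the failure is HARDER to detect."""
    return severity * occurrence * detection

def critical_modes(modes, threshold):
    """Return (mode, RPN) pairs at or above the threshold, highest risk first;
    these are the parameters that warrant mitigation strategies."""
    scored = [(name, rpn(*scores)) for name, scores in modes.items()]
    return sorted((m for m in scored if m[1] >= threshold), key=lambda m: -m[1])

# Hypothetical (severity, occurrence, detection) scores for two process parameters.
modes = {"freezing rate": (7, 5, 4), "suspension homogeneity": (6, 3, 2)}
```

    Parameters exceeding the chosen RPN threshold are the ones carried forward into the experimental design, which is how FMEA narrows the factor list before DoE in the workflow described above.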

  17. Risky play and children's safety: balancing priorities for optimal child development.

    Science.gov (United States)

    Brussoni, Mariana; Olsen, Lise L; Pike, Ian; Sleet, David A

    2012-08-30

    Injury prevention plays a key role in keeping children safe, but emerging research suggests that imposing too many restrictions on children's outdoor risky play hinders their development. We explore the relationship between child development, play, and conceptions of risk taking with the aim of informing child injury prevention. Generational trends indicate children's diminishing engagement in outdoor play is influenced by parental and societal concerns. We outline the importance of play as a necessary ingredient for healthy child development and review the evidence for arguments supporting the need for outdoor risky play, including: (1) children have a natural propensity towards risky play; and, (2) keeping children safe involves letting them take and manage risks. Literature from many disciplines supports the notion that safety efforts should be balanced with opportunities for child development through outdoor risky play. New avenues for investigation and action are emerging seeking optimal strategies for keeping children "as safe as necessary," not "as safe as possible." This paradigm shift represents a potential for epistemological growth as well as cross-disciplinary collaboration to foster optimal child development while preserving children's safety.

  18. Risky Play and Children’s Safety: Balancing Priorities for Optimal Child Development

    Directory of Open Access Journals (Sweden)

    David A. Sleet

    2012-08-01

    Full Text Available Injury prevention plays a key role in keeping children safe, but emerging research suggests that imposing too many restrictions on children’s outdoor risky play hinders their development. We explore the relationship between child development, play, and conceptions of risk taking with the aim of informing child injury prevention. Generational trends indicate children’s diminishing engagement in outdoor play is influenced by parental and societal concerns. We outline the importance of play as a necessary ingredient for healthy child development and review the evidence for arguments supporting the need for outdoor risky play, including: (1) children have a natural propensity towards risky play; and, (2) keeping children safe involves letting them take and manage risks. Literature from many disciplines supports the notion that safety efforts should be balanced with opportunities for child development through outdoor risky play. New avenues for investigation and action are emerging seeking optimal strategies for keeping children “as safe as necessary,” not “as safe as possible.” This paradigm shift represents a potential for epistemological growth as well as cross-disciplinary collaboration to foster optimal child development while preserving children’s safety.

  19. Rotorcraft Optimization Tools: Incorporating Rotorcraft Design Codes into Multi-Disciplinary Design, Analysis, and Optimization

    Science.gov (United States)

    Meyn, Larry A.

    2018-01-01

    One of the goals of NASA's Revolutionary Vertical Lift Technology Project (RVLT) is to provide validated tools for multidisciplinary design, analysis and optimization (MDAO) of vertical lift vehicles. As part of this effort, the software package, RotorCraft Optimization Tools (RCOTOOLS), is being developed to facilitate incorporating key rotorcraft conceptual design codes into optimizations using the OpenMDAO multi-disciplinary optimization framework written in Python. RCOTOOLS, also written in Python, currently supports the incorporation of the NASA Design and Analysis of RotorCraft (NDARC) vehicle sizing tool and the Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics II (CAMRAD II) analysis tool into OpenMDAO-driven optimizations. Both of these tools use detailed, file-based inputs and outputs, so RCOTOOLS provides software wrappers to update input files with new design variable values, execute these codes and then extract specific response variable values from the file outputs. These wrappers are designed to be flexible and easy to use. RCOTOOLS also provides several utilities to aid in optimization model development, including Graphical User Interface (GUI) tools for browsing input and output files in order to identify text strings that are used to identify specific variables as optimization input and response variables. This paper provides an overview of RCOTOOLS and its use

  20. Optimal Control Allocation with Load Sensor Feedback for Active Load Suppression, Experiment Development

    Science.gov (United States)

    Miller, Christopher J.; Goodrick, Dan

    2017-01-01

    The problem of control command and maneuver induced structural loads is an important aspect of any control system design. The aircraft structure and the control architecture must be designed to achieve desired piloted control responses while limiting the imparted structural loads. The classical approach is to utilize high structural margins, restrict control surface commands to a limited set of analyzed combinations, and train pilots to follow procedural maneuvering limitations. With recent advances in structural sensing and the continued desire to improve safety and vehicle fuel efficiency, it is both possible and desirable to develop control architectures that enable lighter vehicle weights while maintaining and improving protection against structural damage. An optimal control technique has been explored and shown to achieve desirable vehicle control performance while limiting sensed structural loads. The subject of this paper is the design of the optimal control architecture, and provides the reader with some techniques for tailoring the architecture, along with detailed simulation results.

  1. Optimization and technological development strategies of an antimicrobial extract from Achyrocline alata assisted by statistical design.

    Directory of Open Access Journals (Sweden)

    Daniel P Demarque

    Full Text Available Achyrocline alata, known as Jateí-ka-há, is traditionally used to treat several health problems, including inflammations and infections. This study aimed to optimize an extract active against Streptococcus mutans, the main bacterium that causes caries. The extract was developed using accelerated solvent extraction and chemometric calculations. Factorial design and response surface methodologies were used to determine the most important variables, such as active compound selectivity. The standardized extraction recovered 99% of the four main compounds, gnaphaliin, helipyrone, obtusifolin and lepidissipyrone, which represent 44% of the extract. The optimized extract of A. alata has a MIC of 62.5 μg/mL against S. mutans and could be used in mouth care products.

  2. Development of optimal management of upper gastrointestinal bleeding secondary to pancreatic sinistral portal hypertension

    Directory of Open Access Journals (Sweden)

    SONG Yang

    2014-08-01

    Full Text Available: The pathogenesis of pancreatic sinistral portal hypertension (PSPH) is quite different from that of cirrhotic portal hypertension, and PSPH is the only curable type of portal hypertension. Gastric variceal bleeding is a less common manifestation of PSPH; however, it may exacerbate the patient's condition and lead to critical illness, and inappropriate management can result in death. It is therefore necessary to develop optimal management of upper gastrointestinal bleeding in PSPH patients. Splenectomy is considered a definitive procedure, together with surgical procedures to treat the underlying pancreatic disease. For patients in poor condition or ineligible for surgery, splenic artery coil embolization is a preferable and effective method to stop bleeding before a second-stage operation. The therapeutic decision should be made individually, and further multi-center studies to optimize the management of upper gastrointestinal bleeding from PSPH are warranted.

  3. Portfolio-Scale Optimization of Customer Energy Efficiency Incentive and Marketing: Cooperative Research and Development Final Report, CRADA Number CRD-13-535

    Energy Technology Data Exchange (ETDEWEB)

    Brackney, Larry J. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-02-17

    Northeast utility National Grid (NGrid) is developing a portfolio-scale application of OpenStudio designed to optimize incentive and marketing expenditures for its energy efficiency (EE) programs. NGrid wishes to leverage a combination of geographic information systems (GIS), public records, customer data, and content from the Building Component Library (BCL) to form a JavaScript Object Notation (JSON) input file that is consumed by an OpenStudio-based expert system for automated model generation. A baseline model for each customer building will be automatically tuned using electricity and gas consumption data, and a set of energy conservation measures (ECMs) associated with each NGrid incentive program will be applied to the model. The simulated energy performance and return on investment (ROI) will be compared with customer hurdle rates and available incentives to A) optimize the incentive required to overcome the customer hurdle rate and B) determine whether marketing activity associated with the specific ECM is warranted for that particular customer. Repeated across the portfolio, this process will enable NGrid to substantially optimize its marketing and incentive expenditures, targeting those customers that are likely to adopt and benefit from specific EE programs.
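
    The incentive-sizing step in A reduces to a simple ROI-versus-hurdle-rate comparison. The sketch below is a deliberately simplified stand-in for NGrid's screening logic (which the abstract does not detail); the function name and all numbers are illustrative:

```python
def incentive_needed(annual_savings, measure_cost, hurdle_rate):
    # Smallest incentive lifting the customer's simple ROI
    # (annual savings / net measure cost) up to their hurdle rate,
    # where net cost = measure cost - incentive.
    required_net_cost = annual_savings / hurdle_rate
    return max(0.0, measure_cost - required_net_cost)

# A customer saving $1,000/yr on an $8,000 measure with a 20% hurdle rate:
print(incentive_needed(1000.0, 8000.0, 0.20))  # -> 3000.0
```

    A zero result means the measure already clears the hurdle rate, so marketing rather than an incentive may be the cost-effective lever.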

  4. Extraction optimization and UHPLC method development for determination of the 20-hydroxyecdysone in Sida tuberculata leaves.

    Science.gov (United States)

    da Rosa, Hemerson S; Koetz, Mariana; Santos, Marí Castro; Jandrey, Elisa Helena Farias; Folmer, Vanderlei; Henriques, Amélia Teresinha; Mendez, Andreas Sebastian Loureiro

    2018-04-01

    Sida tuberculata (ST) is a Malvaceae species widely distributed in Southern Brazil. In traditional medicine, ST has been employed as a hypoglycemic, hypocholesterolemic, anti-inflammatory and antimicrobial agent. Additionally, this species is chemically characterized mainly by flavonoids, alkaloids and phytoecdysteroids. The present work aimed to optimize the extraction technique and to validate an UHPLC method for the determination of 20-hydroxyecdysone (20HE) in ST leaves. A Box-Behnken Design (BBD) was used in method optimization. The extraction methods tested were static and dynamic maceration, ultrasound, ultra-turrax and reflux. In the Box-Behnken design, three parameters were evaluated at three levels (-1, 0, +1): particle size, time and plant:solvent ratio. In method validation, the parameters of selectivity, specificity, linearity, limits of detection and quantification (LOD, LOQ), precision, accuracy and robustness were evaluated. The results indicate static maceration as the better technique to maximize the 20HE peak area in ST extract. The optimal extraction from response surface methodology was achieved with a particle size of 710 μm, 9 days of maceration and a plant:solvent ratio of 1:54 (w/v). The developed UHPLC-PDA method proved to be selective, linear, precise, accurate and robust for 20HE detection in ST leaves. The average content of 20HE was 0.56% of dry extract. Thus, the optimization of the extraction method for ST leaves increased the concentration of 20HE in the crude extract, and a reliable method was successfully developed according to validation requirements and in agreement with current legislation. Copyright © 2018 Elsevier Inc. All rights reserved.
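
    The three-factor, three-level structure of a Box-Behnken design can be generated mechanically: a full ±1 factorial on each pair of factors with the remaining factor at the centre. The sketch below is a generic coded-design generator, not the authors' software; a real design would replicate the centre point several times:

```python
from itertools import combinations

def box_behnken(k):
    # Coded Box-Behnken design for k factors: full +/-1 factorial on each
    # pair of factors with all other factors held at the centre (0),
    # plus a single centre point (replicated in practice).
    runs = []
    for i, j in combinations(range(k), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                row = [0] * k
                row[i], row[j] = a, b
                runs.append(row)
    runs.append([0] * k)
    return runs

# Three factors as in the abstract: particle size, time, plant:solvent ratio
design = box_behnken(3)
print(len(design))  # -> 13 (12 edge runs + 1 centre point)
```

    Each coded row is then mapped to the physical factor ranges before running the extractions.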

  5. Chemometric approach for development, optimization, and validation of different chromatographic methods for separation of opium alkaloids.

    Science.gov (United States)

    Acevska, J; Stefkov, G; Petkovska, R; Kulevanova, S; Dimitrovska, A

    2012-05-01

    The excessive and continuously growing interest in the simultaneous determination of poppy alkaloids imposes the development and optimization of convenient high-throughput methods for assessing the qualitative and quantitative profile of alkaloids in poppy straw. Systematic optimization of two chromatographic methods (gas chromatography (GC)/flame ionization detector (FID)/mass spectrometry (MS) and reversed-phase (RP)-high-performance liquid chromatography (HPLC)/diode array detector (DAD)) for the separation of alkaloids from Papaver somniferum L. (Papaveraceae) was carried out. The effects of various conditions on the predefined chromatographic descriptors were investigated using chemometrics. A full factorial linear design of experiments was used to determine the relationship between chromatographic conditions and the retention behavior of the analytes. Central composite circumscribed design was utilized for the final method optimization. By conducting the optimization of the methods in a very rational manner, a great deal of excessive and unproductive laboratory work was avoided. The developed chromatographic methods were validated and compared in terms of resolving power, sensitivity, accuracy, speed, cost, ecological aspects, and compatibility with the poppy straw extraction procedure. The separation of the opium alkaloids using the GC/FID/MS method was achieved within 10 min, avoiding any derivatization step. This method has stronger resolving power, shorter analysis time, and a better cost-effectiveness factor than the RP-HPLC/DAD method, and is in line with the "green trend" of analysis. The RP-HPLC/DAD method, on the other hand, displayed better sensitivity for all tested alkaloids. The proposed methods provide both fast screening and an accurate content assessment of the six alkaloids in poppy samples obtained from the selection program of Papaver strains.

  6. Optimal sampling plan for clean development mechanism energy efficiency lighting projects

    International Nuclear Information System (INIS)

    Ye, Xianming; Xia, Xiaohua; Zhang, Jiangfeng

    2013-01-01

    Highlights: • A metering cost minimisation model is built to assist the sampling plan for CDM projects. • The model minimises the total metering cost by the determination of optimal sample size. • The required 90/10 criterion sampling accuracy is maintained. • The proposed metering cost minimisation model is applicable to other CDM projects as well. - Abstract: Clean development mechanism (CDM) project developers are always interested in achieving required measurement accuracies with the least metering cost. In this paper, a metering cost minimisation model is proposed for the sampling plan of a specific CDM energy efficiency lighting project. The problem arises from the particular CDM sampling requirement of 90% confidence and 10% precision for small-scale CDM energy efficiency projects, known as the 90/10 criterion. The 90/10 criterion can be met by solving the metering cost minimisation problem. All the lights in the project are classified into different groups according to the uncertainties of their lighting energy consumption, which are characterised by their statistical coefficient of variance (CV). Samples from each group are randomly selected for the installation of power meters. These meters include less expensive ones with less functionality and more expensive ones with greater functionality. The metering cost minimisation model minimises the total metering cost through the determination of the optimal sample size for each group. The 90/10 criterion is formulated as a set of constraints on the metering cost objective. The optimal solution to the minimisation problem therefore minimises the metering cost whilst meeting the 90/10 criterion, and this is verified by a case study. Relationships between the optimal metering cost and the population sizes of the groups, the CV values and the meter equipment costs are further explored in three simulations. The metering cost minimisation model proposed for lighting systems is applicable to other CDM projects as well.
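
    The 90/10 criterion translates into a standard survey-sampling sample-size formula per group (90% confidence z-score, 10% relative precision, finite-population correction). The sketch below uses that textbook formula rather than the authors' full cost-minimisation model, and the group data are illustrative:

```python
import math

def sample_size_90_10(cv, population):
    # Minimum sample size meeting the 90/10 criterion (90% confidence,
    # 10% relative precision) for a group with coefficient of variance cv,
    # with finite-population correction. Textbook formula, not the paper's
    # exact optimisation model.
    z = 1.645                       # z-score for 90% confidence
    n0 = (z * cv / 0.10) ** 2       # infinite-population sample size
    n = n0 / (1 + n0 / population)  # finite-population correction
    return math.ceil(n)

# Illustrative groups: (CV, population, per-meter cost) -- assumed values
groups = [(0.3, 5000, 50.0), (0.8, 2000, 120.0)]
sizes = [sample_size_90_10(cv, pop) for cv, pop, _ in groups]
total_cost = sum(n * c for n, (_, _, c) in zip(sizes, groups))
print(sizes, total_cost)  # -> [25, 160] 20450.0
```

    The optimisation in the paper goes further by choosing which meter type to assign within each group so that the summed metering cost is minimal subject to these accuracy constraints.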

  7. Development of Legal Norms on Marriage and Divorce in Cambodia : The Civil Code Between Foreign Inputs and Local Growth (2)

    OpenAIRE

    KUONG, Teilee

    2016-01-01

    This second part of the research focuses on the development of the law on divorce in Cambodia, aiming to identify continuity and change in Cambodia's legislative regulation of divorce and post-divorce relationships. It starts with a brief overview of legislative development from the early 20th century up to the latest codification in the 2007 Civil Code. Along this line of historical narrative, the research then examines two important elements in the legal effects...

  8. Development of Decision-Making Automated System for Optimal Placement of Physical Access Control System’s Elements

    Science.gov (United States)

    Danilova, Olga; Semenova, Zinaida

    2018-04-01

    The objective of this study is a detailed analysis of the development of physical protection systems for information resources. Optimization theory and the mathematical apparatus of decision-making are used to correctly formulate and create an algorithm for selecting the optimal configuration of a security system, considering the location of the secured object's access points and zones. The result of this study is a software implementation scheme of a decision-making system for optimal placement of the physical access control system's elements.

  9. Depot injectable atorvastatin biodegradable in situ gel: development, optimization, in vitro, and in vivo evaluation

    Directory of Open Access Journals (Sweden)

    Ahmed TA

    2016-01-01

    Full Text Available: Tarek A Ahmed,1,2 Yasser A Alharby,1 Abdel-Rahim M El-Helw,1 Khaled M Hosny,1,3 Khalid M El-Say1,2. 1Department of Pharmaceutics and Industrial Pharmacy, Faculty of Pharmacy, King Abdulaziz University, Jeddah, Saudi Arabia; 2Department of Pharmaceutics and Industrial Pharmacy, Faculty of Pharmacy, Al-Azhar University, Cairo, Egypt; 3Department of Pharmaceutics and Industrial Pharmacy, Faculty of Pharmacy, Beni Suef University, Beni Suef, Egypt. Abstract: This study aimed to develop an optimized depot injectable atorvastatin (ATR) biodegradable in situ gel (ISG) system with minimum initial burst using a central composite design. The factors selected were poly(D,L-lactide-co-glycolide) (PLGA) concentration (X1), molecular weight of polyethylene glycol (PEG) (X2), and PEG concentration (X3). The dependent variables were the initial burst of ATR after 2 (Y1) and 24 hours (Y2). The optimized formulation was investigated using scanning electron microscopy, Fourier transform infrared spectroscopy, and in vitro drug release in phosphate-buffered saline of pH 7.4 for 72 hours. The in vivo pharmacokinetic study of the optimized ATR-ISG and the corresponding PEG-free ATR-ISG was conducted by intramuscular injection of a single dose (2 mg/kg) of ATR in male New Zealand White rabbits. A double-blind, randomized, parallel design was used in comparison with the marketed ATR tablet. Statistical analysis revealed that PLGA concentration and the molecular weight of PEG have pronounced effects on both Y1 and Y2. The optimized formulation was composed of 36.10% PLGA, PEG 6000, and 15.69% PEG, and exhibited a characteristic in vitro release pattern with minimal initial burst. Incorporation of PEG in the formulation causes a slight decrease in the glass transition temperature of PLGA, leading to a slight change in the Fourier transform infrared spectrum due to possible interaction. Moreover, the scanning electron microscopy photomicrograph showed smooth

  10. Engaging scientists and policy stakeholders using a land use modelling and regional scenario exercise: an input to the development of sustainability indicators for European regions

    DEFF Research Database (Denmark)

    Petrov, Laura Oana; Shahumyan, Harutyun; Williams, Brendan

    2015-01-01

    (Williams, Hughes, & Redmond, 2010; Kitchen, 2002; Hourihan, 1989). This paper investigates the Greater Dublin Region (GDR) of Ireland, where urban development has been poorly controlled, leading to changes in its spatial configuration and particularly the preponderance of a sprawl pattern of development … for a methodology for practical action to be used by scientists and stakeholders to ensure effective on-going collaborations. They also allow us to grasp crucial ideas about urban development processes, sustainable growth management and their possible consequences in the regional context in Europe and worldwide.

  11. Development of GEM detector for plasma diagnostics application: simulations addressing optimization of its performance

    Science.gov (United States)

    Chernyshova, M.; Malinowski, K.; Kowalska-Strzęciwilk, E.; Czarski, T.; Linczuk, P.; Wojeński, A.; Krawczyk, R. D.

    2017-12-01

    The advanced soft X-ray (SXR) diagnostics setup devoted to studies of SXR plasma emissivity is currently highly relevant and important for ITER/DEMO applications, especially in the energy range of tungsten emission lines, since plasma contamination by W and its transport in the plasma must be understood and monitored for W plasma-facing materials. The Gas Electron Multiplier (GEM) based SXR radiation detection system under development by our group, with a spatially and energy-resolved photon-detecting chamber, may become such a diagnostic setup once many physical, technical and technological aspects are considered and solved. This work presents the results of simulations aimed at optimizing the design of the detector's internal chamber and its performance. The study of the effect of electrode alignment allowed choosing the gap distances that maximize electron transmission and the optimal magnitudes of the applied electric fields. Finally, an optimal readout structure design was identified, suitable for collecting the total formed charge effectively, based on the range of the simulated electron cloud at the readout plane, which was on the order of ~2 mm.

  12. Super-capacitors fuel-cell hybrid electric vehicle optimization and control strategy development

    International Nuclear Information System (INIS)

    Paladini, Vanessa; Donateo, Teresa; De Risi, Arturo; Laforgia, Domenico

    2007-01-01

    In recent decades, due to emissions reduction policies, research has focused on alternative powertrains, among which hybrid electric vehicles (HEVs) powered by fuel cells are becoming an attractive solution. One of the main issues with these vehicles is energy management to improve overall fuel economy. The present investigation aims at identifying the best hybrid vehicle configuration and control strategy to reduce fuel consumption. The study focuses on a car powered by a fuel cell and equipped with two secondary energy storage devices: batteries and super-capacitors. To model the powertrain behavior, a purpose-built simulation program called ECoS has been developed in the Matlab/Simulink environment. The fuel cell model is based on the Amphlett theory. The battery and super-capacitor models account for charge/discharge efficiency. The analyzed powertrain is also equipped with an energy regeneration system to recover braking energy. The numerical optimization of the vehicle configuration and control strategy has been carried out with a multi-objective genetic algorithm. The goal of the optimization is the reduction of hydrogen consumption while sustaining the battery state of charge. By applying the algorithm to different driving cycles, several optimized configurations have been identified and discussed.
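
    The core of such an evolutionary optimization is a loop of selection and mutation against a simulated objective. The sketch below is a minimal single-objective stand-in: the ECoS simulation is replaced by a toy surrogate in which battery assist reduces hydrogen use but excessive assist is penalized for failing to sustain the state of charge. All functional forms and constants are assumptions for illustration only:

```python
import random

def fitness(params):
    # Toy surrogate for the ECoS simulation (lower is better): hydrogen use
    # falls with battery assist, but SOC depletion beyond 50% assist is
    # penalized. Purely illustrative, not the paper's model.
    assist = params[0]                              # battery power fraction
    h2 = 1.0 - 0.4 * assist                         # normalized H2 consumption
    soc_penalty = 4.0 * max(0.0, assist - 0.5) ** 2 # SOC not sustained
    return h2 + soc_penalty

random.seed(1)
pop = [[random.random()] for _ in range(20)]        # initial population
for _ in range(50):                                 # simple evolutionary loop
    pop.sort(key=fitness)
    parents = pop[:10]                              # truncation selection
    pop = parents + [[min(1.0, max(0.0, p[0] + random.gauss(0, 0.05)))]
                     for p in parents]              # Gaussian mutation
best = min(pop, key=fitness)
```

    A true multi-objective version would keep a Pareto front over (hydrogen consumption, SOC deviation) instead of collapsing them into one penalized scalar.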

  13. Analytical development and optimization of a graphene–solution interface capacitance model

    Directory of Open Access Journals (Sweden)

    Hediyeh Karimi

    2014-05-01

    Full Text Available: Graphene, a new carbon material showing great potential for a range of applications because of its exceptional electronic and mechanical properties, has attracted much attention in recent years. The use of graphene in nanoscale devices plays an important role in achieving more accurate and faster devices. Although there are many experimental studies in this area, there is a lack of analytical models. Quantum capacitance, one of the important properties of field-effect transistors (FETs), is our focus. The quantum capacitance of electrolyte-gated transistors (EGFETs), along with a relevant equivalent circuit, is formulated in terms of Fermi velocity, carrier density, and fundamental physical quantities. The analytical model is compared with experimental data, and the mean absolute percentage error (MAPE) is calculated to be 11.82. To decrease the error, a new function of E composed of α and β parameters is suggested. In another attempt, the ant colony optimization (ACO) algorithm is implemented to develop and optimize the analytical model and obtain a more accurate capacitance model. Based on the given results, the accuracy of the optimized model is more than 97%, which is within an acceptable range.
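
    The MAPE figure quoted above is a standard fit metric, easily reproduced. The sketch below shows the definition with illustrative numbers, not the paper's data:

```python
def mape(actual, predicted):
    # Mean absolute percentage error (in %): the metric used above to score
    # the analytical capacitance model against experimental data.
    return 100.0 / len(actual) * sum(
        abs((a - p) / a) for a, p in zip(actual, predicted))

# Toy check with illustrative values (errors of 10% and 5% average to 7.5%):
print(round(mape([100.0, 200.0], [90.0, 210.0]), 6))  # -> 7.5
```

    The ACO step then searches the α and β parameter space to minimize exactly this quantity.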

  14. Development of Optimal Water-Resources Management Strategies for Kaidu-Kongque Watershed under Multiple Uncertainties

    Directory of Open Access Journals (Sweden)

    Y. Zhou

    2013-01-01

    Full Text Available: In this study, an interval-stochastic fractile optimization (ISFO) model is advanced for developing optimal water-resources management strategies under multiple uncertainties. The ISFO model can not only handle uncertainties presented in terms of probability distributions and intervals with possibility distribution boundaries, but can also quantify subjective information (i.e., expected system benefit preference and risk-averse attitude) from different decision makers. The ISFO model is then applied to a real case of water-resources systems planning in the Kaidu-Kongque watershed, China, and a number of scenarios with different ecological water-allocation policies under varied p-necessity fractiles are analyzed. Results indicate that different policies for ecological water allocation can lead to varied water supplies, economic penalties, and system benefits. The solutions obtained can help decision makers identify optimized water-allocation alternatives, alleviate the water supply-demand conflict, and achieve socioeconomic and ecological sustainability, particularly when limited water resources are available for multiple competing users.

  15. Development and optimization of operational parameters of a gas-fired baking oven

    OpenAIRE

    Afolabi Tunde MORAKINYO; Babatunde OMIDIJI; Hakeem OWOLABI

    2017-01-01

    This study presents the development and optimization of the operational parameters of an indigenous gas-fired bread-baking oven for small-scale entrepreneurs. It is an insulated rectangular box-like chamber, made of galvanized-steel sheets, with overall dimensions of 920 mm × 650 mm × 600 mm. The oven consists of two baking compartments and three combustion chambers. The oven characteristics were evaluated in terms of baking capacity, baking efficiency and weight loss of the baked bread. The ph...

  16. Development of a graphical interface computer code for reactor fuel reloading optimization

    International Nuclear Information System (INIS)

    Do Quang Binh; Nguyen Phuoc Lan; Bui Xuan Huy

    2007-01-01

    This report presents the results of a project performed in 2007. The aim of the project is to develop a graphical-interface computer code that allows refueling engineers to design fuel reloading patterns for a research reactor using a simulated graphical model of the reactor core. In addition, the code can perform refueling optimization calculations based on genetic algorithms as well as simulated annealing. The computer code was verified on a sample problem based on operational and experimental data of the Dalat research reactor. This code can play a significant role in in-core fuel management practice at nuclear research reactor centers and in training. (author)
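
    Simulated annealing over reloading patterns typically means perturbing a permutation of fuel assemblies and accepting worse patterns with a temperature-dependent probability. The sketch below is a generic annealer of that shape; the real neutronics objective is replaced by a toy cost function, and all parameter values are illustrative:

```python
import math
import random

def anneal(perm, cost, temp=1.0, cooling=0.995, steps=4000):
    # Generic simulated annealing over loading-pattern permutations:
    # swap two fuel positions, accept worse patterns with Boltzmann
    # probability exp(-delta/temp), and cool geometrically.
    random.seed(2)
    best, cur = perm[:], perm[:]
    for _ in range(steps):
        i, j = random.sample(range(len(cur)), 2)
        cand = cur[:]
        cand[i], cand[j] = cand[j], cand[i]
        delta = cost(cand) - cost(cur)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            cur = cand
            if cost(cur) < cost(best):
                best = cur[:]
        temp *= cooling
    return best

# Toy objective (NOT a neutronics model): prefer high-index (fresh) fuel
# far from the core centre position.
cost = lambda p: sum(v / (abs(i - len(p) // 2) + 1) for i, v in enumerate(p))
best = anneal(list(range(8)), cost)
print(cost(best) <= cost(list(range(8))))  # never worse than the start
```

    A genetic-algorithm variant would instead maintain a population of patterns and recombine them, which is the other search method the code supports.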

  17. Developing a Model for Optimizing Inventory of Repairable Items at Single Operating Base

    OpenAIRE

    Le, Tin

    2016-01-01

    The use of the EOQ model in inventory management is popular. However, EOQ models have many disadvantages, especially when applied to the management of repairable items. To deal with high-cost, repairable items, Craig C. Sherbrooke introduced a model in his book "Optimal Inventory Modeling of Systems: Multi-Echelon Techniques". The research focus is to implement and develop a program to execute the single-site inventory model for repairable items. The model helps to significantl...

  18. Mechanistic Models for Process Development and Optimization of Fed-batch Fermentation Systems

    DEFF Research Database (Denmark)

    Mears, Lisa; Stocks, Stuart M.; Albæk, Mads O.

    2016-01-01

    This work discusses the application of mechanistic models to pilot-scale filamentous fungal fermentation systems operated at Novozymes A/S. For on-line applications, a state estimator model is developed based on a stoichiometric balance in order to predict the biomass and product concentrations. This is based on on-line gas measurements and ammonia addition flow rate measurements. Additionally, a mechanistic model is applied off-line as a tool for batch planning, based on definition of the process back pressure, aeration rate and stirrer speed. This allows the batch starting fill to be planned, taking into account the oxygen transfer conditions as well as the evaporation rates of the system. Mechanistic models are valuable tools applicable to both process development and optimization. The state estimator described will be a valuable tool for future work as part of control strategy development.
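
    In its simplest form, a stoichiometric state estimator of this kind converts a measured feed (here, cumulative ammonia) into a biomass estimate through an assumed elemental composition. The sketch below is a bare-bones illustration of the principle only; the 8% nitrogen mass fraction is an assumed textbook-style value, not a Novozymes-calibrated parameter, and the real estimator also uses off-gas O2/CO2 balances:

```python
def biomass_from_ammonia(ammonia_mol, biomass_n_fraction=0.08):
    # Assume all nitrogen fed as ammonia is incorporated into biomass with
    # a fixed nitrogen mass fraction (illustrative value). Returns grams.
    nitrogen_g = ammonia_mol * 14.0   # 14 g N per mol NH3
    return nitrogen_g / biomass_n_fraction

# 2 mol of ammonia fed -> 28 g nitrogen -> about 350 g biomass:
print(round(biomass_from_ammonia(2.0), 1))  # -> 350.0
```

    In practice the nitrogen balance is combined with the oxygen uptake and carbon dioxide evolution rates so that product formation can be separated from growth.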

  19. Ask the experts: the challenges and benefits of flow chemistry to optimize drug development.

    Science.gov (United States)

    Anderson, Neal; Gernaey, Krist V; Jamison, Timothy F; Kircher, Manfred; Wiles, Charlotte; Leadbeater, Nicholas E; Sandford, Graham; Richardson, Paul

    2012-09-01

    Against the backdrop of a struggling economic and regulatory climate, pharmaceutical companies have recently been forced to develop new ways to provide more efficient technology to meet the demands of a competitive drug industry. This issue, coupled with an increase in patent legislation and a rising generics market, has made these themes common concerns in drug development. As a consequence, the importance of process chemistry and scale-up has never been more in the spotlight. Future Medicinal Chemistry wishes to share the thoughts and opinions of a variety of experts from this field, discussing the use of flow chemistry to optimize drug development, the potential regulatory and environmental challenges it faces, and whether the academic and industrial sectors could benefit from a more harmonized system for process chemistry.

  20. The development of a safety analysis methodology for the optimized power reactor 1000

    International Nuclear Information System (INIS)

    Hwang-Yong, Jun; Yo-Han, Kim

    2005-01-01

    Korea Electric Power Research Institute (KEPRI) has been developing an in-house safety analysis methodology, based on the codes available to KEPRI, to overcome the problems arising from the currently used vendor-oriented methodologies. For Loss of Coolant Accident (LOCA) analysis, the KREM (KEPRI Realistic Evaluation Methodology) has been developed based on the RELAP-5 code. The methodology was approved for Westinghouse 3-loop plants by the Korean regulatory organization, and the project to extend the methodology to the Optimized Power Reactor 1000 (OPR1000) has been ongoing since 2001. For Non-LOCA analysis, the KNAP (Korea Non-LOCA Analysis Package) has been developed using the UNICORN-TM code system. To demonstrate the feasibility of these code systems and methodologies, some typical design basis accidents described in the final safety analysis report (FSAR) were analyzed. (author)

  1. Railway optimal network simulation for the development of regional transport-logistics system

    Directory of Open Access Journals (Sweden)

    Mikhail Borisovich Petrov

    2013-12-01

    Full Text Available: The dependence of logistics on mineral fuel is a stable tendency in regional development; however, when making strategic logistics plans for the regions, it is necessary to provide for alternative power-supply sources together with population density, transport infrastructure peculiarities, and forecast demographic changes. Using the example of the timber processing complex of the Sverdlovsk region, the authors suggest an algorithm for deciding the optimal allocation of logistics infrastructure. The problem of regional railway network organization, at the stage of slow transition from prolonged stagnation to new development, is addressed. The transport network configurations of Pacific Rim countries that are developing successfully are analyzed. The authors offer some results of regional transport network simulation based on artificial intelligence methods, which allow the task to be solved with incomplete data. Ways to improve the transport network in the Sverdlovsk region are proposed.

  2. Development of Off-take Model, Subcooled Boiling Model, and Radiation Heat Transfer Input Model into the MARS Code for a Regulatory Auditing of CANDU Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, C.; Rhee, B. W.; Chung, B. D. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Ahn, S. H.; Kim, M. W. [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2009-05-15

    Korea currently has four operating units of the CANDU-6 type reactor in Wolsong. However, the safety assessment system for CANDU reactors has not been fully established due to a lack of self-reliance technology. Although the CATHENA code had been introduced from AECL, it is undesirable to use a vendor's code for a regulatory auditing analysis. In Korea, the MARS code has been developed for decades and is being considered by KINS as a thermal hydraulic regulatory auditing tool for nuclear power plants. Before this decision, KINS (Korea Institute of Nuclear Safety) had developed the RELAP5/MOD3/CANDU code for CANDU safety analyses by modifying the model of the existing PWR auditing tool, RELAP5/MOD3. The main purpose of this study is to transplant the CANDU models of the RELAP5/MOD3/CANDU code to the MARS code including a quality assurance of the developed models.

  3. Development of Off-take Model, Subcooled Boiling Model, and Radiation Heat Transfer Input Model into the MARS Code for a Regulatory Auditing of CANDU Reactors

    International Nuclear Information System (INIS)

    Yoon, C.; Rhee, B. W.; Chung, B. D.; Ahn, S. H.; Kim, M. W.

    2009-01-01

    Korea currently has four operating units of the CANDU-6 type reactor in Wolsong. However, the safety assessment system for CANDU reactors has not been fully established due to a lack of self-reliance technology. Although the CATHENA code had been introduced from AECL, it is undesirable to use a vendor's code for a regulatory auditing analysis. In Korea, the MARS code has been developed for decades and is being considered by KINS as a thermal hydraulic regulatory auditing tool for nuclear power plants. Before this decision, KINS (Korea Institute of Nuclear Safety) had developed the RELAP5/MOD3/CANDU code for CANDU safety analyses by modifying the model of the existing PWR auditing tool, RELAP5/MOD3. The main purpose of this study is to transplant the CANDU models of the RELAP5/MOD3/CANDU code to the MARS code including a quality assurance of the developed models

  4. Congressional interest and input

    International Nuclear Information System (INIS)

    Donnelly, W.H.

    1985-01-01

    While congressional interest in nonproliferation policy has been evident since the 1940s, the 1970s were propitious for efforts by Congress to exert influence in this sphere. Its suspicions of the executive branch had been stirred by controversies over Vietnam and Watergate at the beginning of the decade; by the end of the decade, Congress was able to curtail the unrestrained freedom of the executive branch to carry out the vaguely stated policies of the Atomic Energy Act of 1954. Congressional nonproliferation interests were further amplified during the decade by pressures from the expanding environmental movement, which included a strong antinuclear plank. This was to bring down the powerful Atomic Energy Commission (AEC). The Energy Reorganization Act of 1974 abolished the AEC and divided its responsibilities between the new Energy Research and Development Administration (ERDA), later to become the Department of Energy (DOE), and the new Nuclear Regulatory Commission.

  5. Software safety analysis on the model specified by NuSCR and SMV input language at requirements phase of software development life cycle using SMV

    International Nuclear Information System (INIS)

    Koh, Kwang Yong; Seong, Poong Hyun

    2005-01-01

    Safety-critical software processes comprise a development process, a verification and validation (V and V) process, and a safety analysis process. The safety analysis process has often been treated as an additional process and is not found in a conventional software process. However, software safety analysis (SSA) is required if software is applied to a safety system, and the SSA shall be performed independently for the safety software throughout the software development life cycle (SDLC). Of all the phases in software development, requirements engineering is generally considered to play the most critical role in determining the overall software quality. NASA data demonstrate that nearly 75% of failures found in operational software were caused by errors in the requirements. The verification process in the requirements phase checks the correctness of the software requirements specification, and the safety analysis process analyzes the safety-related properties in detail. In this paper, a method for safety analysis at the requirements phase of the software development life cycle using the symbolic model verifier (SMV) is proposed. Hazards are discovered by hazard analysis, and in order to use SMV for the safety analysis, the safety-related properties are expressed in computation tree logic (CTL).
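
    The CTL safety pattern used in such analyses is AG safe ("in all reachable states, the hazard does not hold"), which SMV checks symbolically. On an explicit finite transition system the same check is plain reachability, which the sketch below illustrates; the controller model and state names are hypothetical, not from the paper:

```python
from collections import deque

def holds_AG(init, transitions, safe):
    # Check the CTL safety property AG safe ("no unsafe state is ever
    # reachable") on an explicit finite transition system by breadth-first
    # search. SMV verifies the same property symbolically with BDDs.
    seen, queue = {init}, deque([init])
    while queue:
        s = queue.popleft()
        if not safe(s):
            return False            # counterexample state reached
        for t in transitions.get(s, []):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return True

# Tiny hypothetical controller model with a hazard state 'overdose':
trans = {'idle': ['dosing'], 'dosing': ['idle', 'alarm'], 'alarm': ['idle']}
print(holds_AG('idle', trans, lambda s: s != 'overdose'))  # -> True
```

    When the property fails, a model checker additionally reports the path to the unsafe state as a counterexample, which is what makes it useful for requirements-phase hazard analysis.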

  6. Professional Development Aimed at Increasing the Quality of Language Input during Storybook Interactions: Lessons from One Head Start Teacher's Experiences

    Science.gov (United States)

    O'Keefe, Casey

    2018-01-01

    This article provides a review of problems associated with teachers' talk and indicators of higher quality teachers' talk for use with lower socioeconomic status (SES) Head Start students. Then it shows how one Head Start teacher, called Michele in this article, responded to professional development that was aimed at increasing the quality of…

  7. Effects of input properties, vocabulary size, and L1 on the development of third person singular –s in child L2 English

    NARCIS (Netherlands)

    Blom, W.B.T.|info:eu-repo/dai/nl/140893261; Paradis, J.; Sorenson Duncan, T.

    2012-01-01

    This study was designed to investigate the development of third-person singular (3SG) –s in children who learn English as a second language (L2). Adopting the usage-based perspective on the learning of inflection, we analyzed spontaneous speech samples collected from 15 English L2 children who were

  8. Enhanced Input in LCTL Pedagogy

    Directory of Open Access Journals (Sweden)

    Marilyn S. Manley

    2009-08-01

    Full Text Available Language materials for the more-commonly-taught languages (MCTLs) often include visual input enhancement (Sharwood Smith 1991, 1993), which makes use of typographical cues like bolding and underlining to enhance the saliency of targeted forms. For a variety of reasons, this paper argues that the use of enhanced input, both visual and oral, is especially important as a tool for the less-commonly-taught languages (LCTLs). As there continues to be a scarcity of teaching resources for the LCTLs, individual teachers must take it upon themselves to incorporate enhanced input into their own self-made materials. Specific examples of how to incorporate both visual and oral enhanced input into language teaching are drawn from the author’s own experiences teaching Cuzco Quechua. Additionally, survey results are presented from the author’s Fall 2010 semester Cuzco Quechua language students, supporting the use of both visual and oral enhanced input.

  9. Enhanced Input in LCTL Pedagogy

    Directory of Open Access Journals (Sweden)

    Marilyn S. Manley

    2010-08-01

    Full Text Available Language materials for the more-commonly-taught languages (MCTLs) often include visual input enhancement (Sharwood Smith 1991, 1993), which makes use of typographical cues like bolding and underlining to enhance the saliency of targeted forms. For a variety of reasons, this paper argues that the use of enhanced input, both visual and oral, is especially important as a tool for the less-commonly-taught languages (LCTLs). As there continues to be a scarcity of teaching resources for the LCTLs, individual teachers must take it upon themselves to incorporate enhanced input into their own self-made materials. Specific examples of how to incorporate both visual and oral enhanced input into language teaching are drawn from the author’s own experiences teaching Cuzco Quechua. Additionally, survey results are presented from the author’s Fall 2010 semester Cuzco Quechua language students, supporting the use of both visual and oral enhanced input.

  10. Development of a multi-objective PBIL evolutionary algorithm applied to a nuclear reactor core reload optimization problem

    International Nuclear Information System (INIS)

    Machado, Marcelo D.; Schirru, Roberto

    2005-01-01

    The nuclear reactor core reload optimization problem consists in finding a pattern of partially burned-up and fresh fuels that optimizes the plant's next operation cycle. This optimization problem has been traditionally solved using an expert's knowledge, but recently artificial intelligence techniques have also been applied successfully. The artificial intelligence optimization techniques generally have a single objective. However, most real-world engineering problems, including nuclear core reload optimization, have more than one objective (multi-objective) and these objectives are usually conflicting. The aim of this work is to develop a tool to solve multi-objective problems based on the Population-Based Incremental Learning (PBIL) algorithm. The new tool is applied to solve the Angra 1 PWR core reload optimization problem with the purpose of creating a Pareto surface, so that a pattern selected from this surface can be applied for the plant's next operation cycle. (author)
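
    As an illustration of the underlying machinery, here is a minimal single-objective PBIL sketch on a toy bit-string problem (OneMax); the fitness function, parameters, and problem are illustrative stand-ins, not the authors' multi-objective core-reload tool:

```python
import random

def pbil(fitness, n_bits, pop_size=50, lr=0.1, generations=100, seed=0):
    """Minimal single-objective PBIL: evolve a probability vector
    toward bit patterns with high fitness."""
    rng = random.Random(seed)
    p = [0.5] * n_bits  # probability that each bit is 1
    best, best_fit = None, float("-inf")
    for _ in range(generations):
        # sample a population from the current probability vector
        pop = [[1 if rng.random() < pi else 0 for pi in p]
               for _ in range(pop_size)]
        pop.sort(key=fitness, reverse=True)
        elite = pop[0]
        f = fitness(elite)
        if f > best_fit:
            best, best_fit = elite, f
        # shift the probability vector toward the elite individual
        p = [(1 - lr) * pi + lr * bi for pi, bi in zip(p, elite)]
    return best, best_fit

# toy example: maximize the number of 1-bits (OneMax)
solution, value = pbil(sum, n_bits=20)
```

    A multi-objective variant such as the one developed in the paper would maintain an archive of non-dominated solutions and update the probability vector from Pareto-ranked individuals instead of a single elite.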

  11. TAX OPTIMIZATION AS A DECISIVE FACTOR OF ECONOMIC DEVELOPMENT (THE CASE OF POLAND

    Directory of Open Access Journals (Sweden)

    Mykola Andriyash

    2016-11-01

    Full Text Available The main purpose of the paper is to compare the system of taxation and tax optimization in Poland with solutions in the selected EU Member States and its influence on economic development. The paper presents the system of taxation in Poland compared with fiscal solutions in selected EU countries. It also discusses the typology of tax solutions referring to tax optimization in Europe. Methodology. The author used primary and secondary data from the Central Statistics Office (GUS), PricewaterhouseCoopers’ research, and Eurostat. The research methods used for the purpose of data analysis included economic analysis of legal acts, descriptive statistics, and comparative analysis. Results showed that the level of the tax burden in Poland is more moderate than in other developed countries of the world, while the mechanism of administration of taxes and collections is much more successful. The share of receipts collected by the decentralized administration has been increasing steadily since the major administrative reform of 1999 and the local finance law enacted in 2004. Eliminating the category of special sections of industrial production and including these revenues/income in the proposed form of income tax would be very desirable. This would indicate the practical implementation of Smith’s tax principles concerning tax equity and the taxpayer’s ability to pay levies. Practical implications. Tax optimization is to balance the tax burden by offering benefits and preferences which would stimulate the economy of the country and would not aggravate the problem of shifting the tax burden from one taxpayer to another. Many national states, noticing the phenomena of tax evasion and tax avoidance, are implementing or planning to implement specific reforms which help to improve the system of tax control with the aim of creating conditions that make evasion of taxes and collections impossible. Value/originality. The results of the conducted

  12. Identification of Optimal Reference Genes for Normalization of qPCR Analysis during Pepper Fruit Development

    Directory of Open Access Journals (Sweden)

    Yuan Cheng

    2017-06-01

    Full Text Available Due to its high sensitivity and reproducibility, quantitative real-time PCR (qPCR) is practiced as a useful research tool for targeted gene expression analysis. For qPCR operations, normalization with suitable reference genes (RGs) is a crucial step that ultimately determines the reliability of the obtained results. Although pepper is considered an ideal model plant for the study of non-climacteric fruit development, at present no specific RGs have been developed or validated for qPCR analyses of pepper fruit. Therefore, this study aimed to identify stably expressed genes for their potential use as RGs in pepper fruit studies. Initially, a total of 35 putative RGs were selected by mining the pepper transcriptome data sets derived from the PGP (Pepper Genome Platform) and PGD (Pepper Genome Database). Their expression stabilities were further measured by qPCR in a set of pepper (Capsicum annuum L. var. 007e) fruit samples representing four different fruit developmental stages (IM: Immature; MG: Mature green; B: Break; MR: Mature red). Then, based on the qPCR results, three different statistical algorithms, namely geNorm, NormFinder, and boxplot, were chosen to evaluate the expression stabilities of these putative RGs. Ten genes proved to be qualified as RGs during pepper fruit development, namely CaREV05 (CA00g79660), CaREV08 (CA06g02180), CaREV09 (CA06g05650), CaREV16 (Capana12g002666), CaREV21 (Capana10g001439), CaREV23 (Capana05g000680), CaREV26 (Capana01g002973), CaREV27 (Capana11g000123), CaREV31 (Capana04g002411), and CaREV33 (Capana08g001826). Further analysis based on geNorm suggested that using the two most stably expressed genes (CaREV05 and CaREV08) would provide optimal transcript normalization in the qPCR experiments. Therefore, a new and comprehensive strategy for the identification of optimal RGs was developed. This strategy allowed for the effective normalization of the q
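
    geNorm, one of the algorithms named above, scores each candidate gene by the average standard deviation of its pairwise log-ratios against all other candidates across samples; lower M means more stable expression. A minimal sketch with hypothetical expression values (not the study's data):

```python
import math

def genorm_m(expr):
    """geNorm stability measure M for each candidate reference gene.

    expr: dict mapping gene name -> list of linear-scale expression
    values, one per sample. Lower M means more stable expression."""
    genes = list(expr)
    n_samples = len(next(iter(expr.values())))

    def sd(xs):  # sample standard deviation
        m = sum(xs) / len(xs)
        return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))

    M = {}
    for j in genes:
        sds = []
        for k in genes:
            if k == j:
                continue
            # standard deviation of the pairwise log2 expression ratios
            ratios = [math.log2(expr[j][s] / expr[k][s])
                      for s in range(n_samples)]
            sds.append(sd(ratios))
        M[j] = sum(sds) / len(sds)
    return M

# hypothetical data: geneA/geneB co-vary (stable pair), geneC fluctuates
data = {
    "geneA": [100, 200, 150, 120],
    "geneB": [50, 100, 75, 60],
    "geneC": [80, 500, 30, 900],
}
m = genorm_m(data)   # geneC receives the highest (worst) M value
```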

  13. Optimization of food materials for development of nutritious pasta utilizing groundnut meal and beetroot.

    Science.gov (United States)

    Mridula, D; Gupta, R K; Bhadwal, Sheetal; Khaira, Harjot; Tyagi, S K

    2016-04-01

    The present study was undertaken to optimize the levels of food materials, viz. groundnut meal, beetroot juice, and refined wheat flour, for the development of nutritious pasta using response surface methodology. A Box-Behnken design of experiments was used to design different experimental combinations considering 10 to 20 g groundnut meal, 6 to 18 mL beetroot juice and 80 to 90 g refined wheat flour. Quality attributes such as protein content, antioxidant activity, colour, cooking quality (solid loss, rehydration ratio and cooking time) and sensory acceptability of pasta samples were the dependent variables for the study. The results revealed that pasta samples with higher levels of groundnut meal and beetroot juice were high in antioxidant activity and overall sensory acceptability. The samples with higher content of groundnut meal had higher protein contents. On the other hand, the samples with higher beetroot juice content showed higher rehydration ratios and shorter cooking times, along with lower solid loss in the cooking water. The different levels of the studied food materials significantly affected the colour quality of the pasta samples. The optimized combination for development of nutritious pasta consisted of 20 g groundnut meal, 18 mL beetroot juice and 83.49 g refined wheat flour, with an overall desirability of 0.905. This pasta sample required 5.5 min to cook and showed a solid loss of 1.37 % and a rehydration ratio of 6.28. Pasta prepared following the optimized formulation provided 19.56 % protein content, 23.95 % antioxidant activity and 125.89 mg/100 g total phenols, with an overall sensory acceptability score of 8.71.

  14. Development of integrated system of spatial coordinates for inputting cartographic information obtained as result of scientific researches at exclusion zone area

    International Nuclear Information System (INIS)

    Mazur, A.B.; Ermolenko, A.I.; Shaptala, D.V.; Postil, S.D.; Yachmenev, V.V.

    1999-01-01

    An integrated system of spatial coordinates was developed for modeling the diverse processes occurring in the exclusion zone. A technology for the joint use of CAD and a geoinformation system was worked out, as a result of which the 'Ukryttia' object was fixed to the topographic base in accordance with its geographic coordinates, the nomenclature sheets of the area adjacent to the ChNPP were edited and its relief renovated, and the files made in AutoCAD were converted to ArcView format.

  15. High serotonin levels during brain development alter the structural input-output connectivity of neural networks in the rat somatosensory layer IV

    Directory of Open Access Journals (Sweden)

    Stéphanie Miceli

    2013-06-01

    Full Text Available Homeostatic regulation of serotonin (5-HT) concentration is critical for normal topographical organization and development of thalamocortical (TC) afferent circuits. Down-regulation of the serotonin transporter (SERT) and the consequent impaired reuptake of 5-HT at the synapse results in a reduced terminal branching of developing TC afferents within the primary somatosensory cortex (S1). Despite the presence of multiple genetic models, the effect of high extracellular 5-HT levels on the structure and function of developing intracortical neural networks is far from being understood. Here, using juvenile SERT knockout (SERT-/-) rats we investigated, in vitro, the effect of increased 5-HT levels on the structural organization of (i) the thalamocortical projections of the ventroposteromedial thalamic nucleus towards S1, (ii) the general barrel-field pattern and (iii) the electrophysiological and morphological properties of the excitatory cell population in layer IV of S1 (spiny stellate and pyramidal cells). Our results confirmed previous findings that high levels of 5-HT during development lead to a reduction of the topographical precision of TCA projections towards the barrel cortex. Also, the barrel pattern was altered but not abolished in SERT-/- rats. In layer IV, both excitatory spiny stellate and pyramidal cells showed a significantly reduced intracolumnar organization of their axonal projections. In addition, the layer IV spiny stellate cells gave rise to a prominent projection towards the infragranular layer Vb. Our findings point to a structural and functional reorganization of TCAs, as well as early-stage intracortical microcircuitry, following the disruption of 5-HT reuptake during critical developmental periods. The increased projection pattern of the layer IV neurons suggests that the intracortical network changes are not limited to the main entry layer IV but may also affect the subsequent stages of the canonical circuits of the barrel

  16. Optimized solar-wind-powered drip irrigation for farming in developing countries

    Science.gov (United States)

    Barreto, Carolina M.

    Two billion people produce 80% of all food consumed in the developing world, and 1.3 billion people lack access to electricity. Agricultural production will have to increase by about 70% worldwide by 2050, and to achieve this about 50% more primary energy has to be made available by 2035. Energy-smart agri-food systems can improve productivity in the food sector, reduce energy poverty in rural areas and contribute to achieving food security and sustainable development. Agriculture can help reduce poverty for 75% of the world's poor, who live in rural areas and work mainly in farming. The costs associated with irrigation pumping are directly affected by energy prices and have a strong impact on farmer income. Solar-wind (SW) drip irrigation (DI) is a sustainable method to meet these challenges. This dissertation uses on-site data to show the low cost of SW pumping technologies, correlating water consumption (evapotranspiration) with water production (SW pumping). The author designed, installed, and collected operating data from six SWDI systems in Peru and in the Tohono O'odham Nation in AZ. The author developed and tested a simplified model for solar engineers to size SWDI systems, and developed a business concept to scale up the SWDI technology. The outcome was a simplified design approach for a DI system powered by low-cost SW pumping systems, optimized on the basis of the logged on-site data. The optimization showed that the SWDI system is an income-generating technology and that, by increasing the crop production per unit area, it allowed small farmers to pay for the system. The efficient system resulted in increased yields, sometimes three- to four-fold. The system is a model for smallholder agriculture in developing countries and can bring better nutrition and greater incomes to the world's poor.

  17. Development of a Whole Body Atlas for Radiation Therapy Planning and Treatment Optimization

    International Nuclear Information System (INIS)

    Qatarneh, Sharif

    2006-01-01

    The main objective of radiation therapy is to obtain the highest possible probability of tumor cure while minimizing adverse reactions in healthy tissues. A crucial step in the treatment process is to determine the location and extent of the primary tumor and its locoregional lymphatic spread in relation to adjacent radiosensitive anatomical structures and organs at risk. These volumes must also be accurately delineated with respect to external anatomic reference points, preferably on surrounding bony structures. At the same time, it is essential to have the best possible physical and radiobiological knowledge about the radiation responsiveness of the target tissues and organs at risk in order to achieve a more accurate optimization of the treatment outcome. A computerized whole body Atlas has therefore been developed to serve as a dynamic database, with systematically integrated knowledge, comprising all necessary physical and radiobiological information about common target volumes and normal tissues. The Atlas also contains a database of segmented organs and a lymph node topography, based on the Visible Human dataset, forming a standard reference geometry of organ systems. The reference knowledge base and the standard organ dataset can be utilized for Atlas-based image processing and analysis in radiation therapy planning and for biological optimization of the treatment outcome. Atlas-based segmentation procedures were utilized to transform the reference organ dataset of the Atlas into the geometry of individual patients. The anatomic organs and target volumes of the database can be converted by elastic transformation into those of the individual patient for final treatment planning. Furthermore, a database of reference treatment plans was started by implementing state-of-the-art biologically based radiation therapy planning techniques such as conformal, intensity-modulated, and radiobiologically optimized treatment planning. The computerized Atlas can

  18. Development of andrographolide loaded PLGA microspheres: optimization, characterization and in vitro-in vivo correlation.

    Science.gov (United States)

    Jiang, Yunxia; Wang, Fang; Xu, Hui; Liu, Hui; Meng, Qingguo; Liu, Wanhui

    2014-11-20

    The purpose of this study was to develop a sustained-release drug delivery system based on the injectable PLGA microspheres loaded with andrographolide. The andrographolide loaded PLGA microspheres were prepared by emulsion solvent evaporation method with optimization of formulation using response surface methodology (RSM). Physicochemical characterization, in vitro release behavior and in vivo pharmacokinetics of the optimized formulation were then evaluated. The percent absorbed in vivo was determined by deconvolution using the Loo-Riegelman method, and then the in vitro-in vivo correlation (IVIVC) was established. Results showed that the microspheres were spherical with a smooth surface. Average particle size, entrapment efficiency and drug loading were found to be 53.18±2.11 μm, 75.79±3.02% and 47.06±2.18%, respectively. In vitro release study showed a low initial burst release followed by a prolonged release up to 9 days and the release kinetics followed the Korsmeyer-Peppas model. After a single intramuscular injection, the microspheres maintained relatively high plasma concentration of andrographolide over one week. A good linear relationship was observed between the in vitro and in vivo release behavior (R(2)=0.9951). These results suggest the PLGA microspheres could be developed as a potential delivery system for andrographolide with high drug loading capacity and sustained drug release. Copyright © 2014 Elsevier B.V. All rights reserved.
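
    The Korsmeyer-Peppas model referenced above, Mt/Minf = k*t^n, is conventionally fitted by linear regression in log-log space, usually restricted to the first 60% of release. A minimal sketch on synthetic data (not the study's measurements):

```python
import math

def fit_korsmeyer_peppas(times, fractions):
    """Fit Mt/Minf = k * t**n by linear regression on
    log(Mt/Minf) = log(k) + n*log(t); by convention only points
    with Mt/Minf <= 0.6 are used."""
    pts = [(math.log(t), math.log(f))
           for t, f in zip(times, fractions) if f <= 0.6]
    n_pts = len(pts)
    sx = sum(x for x, _ in pts)
    sy = sum(y for _, y in pts)
    sxx = sum(x * x for x, _ in pts)
    sxy = sum(x * y for x, y in pts)
    n = (n_pts * sxy - sx * sy) / (n_pts * sxx - sx * sx)  # release exponent
    k = math.exp((sy - n * sx) / n_pts)                    # rate constant
    return k, n

# synthetic release data generated from k=0.1, n=0.45
times = [1, 2, 4, 8, 16]
fracs = [0.1 * t ** 0.45 for t in times]
k, n = fit_korsmeyer_peppas(times, fracs)   # recovers k ~ 0.1, n ~ 0.45
```

    An exponent n below about 0.45 for spheres indicates Fickian diffusion; intermediate values indicate the anomalous transport often reported for PLGA microspheres.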

  19. Development of an Analysis and Design Optimization Framework for Marine Propellers

    Science.gov (United States)

    Tamhane, Ashish C.

    In this thesis, a framework for the analysis and design optimization of ship propellers is developed. This framework can be utilized as an efficient synthesis tool in order to determine the main geometric characteristics of the propeller but also to provide the designer with the capability to optimize the shape of the blade sections based on their specific criteria. A hybrid lifting-line method with lifting-surface corrections to account for the three-dimensional flow effects has been developed. The prediction of the correction factors is achieved using Artificial Neural Networks and Support Vector Regression. This approach results in increased approximation accuracy compared to existing methods and allows for extrapolation of the correction factor values. The effect of viscosity is implemented in the framework via the coupling of the lifting line method with the open-source RANSE solver OpenFOAM for the calculation of lift, drag and pressure distribution on the blade sections using a transition k-ω SST turbulence model. Case studies of benchmark high-speed propulsors are utilized in order to validate the proposed framework for propeller operation in open-water conditions but also in a ship's wake.

  20. Development and optimization of operational parameters of a gas-fired baking oven

    Directory of Open Access Journals (Sweden)

    Afolabi Tunde MORAKINYO

    2017-12-01

    Full Text Available This study presented the development and optimization of operational parameters of an indigenous gas-fired bread-baking oven for small-scale entrepreneurs. It is an insulated rectangular box-like chamber, made of galvanized-steel sheets, with overall dimensions of 920 mm × 650 mm × 600 mm. The oven consists of two baking compartments and three combustion chambers. The oven characteristics were evaluated in terms of the baking capacity, baking efficiency and weight loss of the baked bread. The physical properties of the baked breads were measured and analyzed using Duncan's multiple range test of one-way ANOVA at a significance level of p<0.05. These properties were optimized to determine the optimum baking temperature using the 3D response surface plots of Statistica Release 7. The baking capacity, baking efficiency, weight loss and optimum baking temperature were 12.5 kg/hr, 87.8%, 12.5 g and 200–220 °C, respectively. The physical properties of the baked bread dough were found to correspond with those of the imported product (control sample). These results showed that the developed gas-fired baking oven can be adopted for baking bread at domestic and commercial levels.

  1. Bees for development: Brazilian survey reveals how to optimize stingless beekeeping.

    Directory of Open Access Journals (Sweden)

    Rodolfo Jaffé

    Full Text Available Stingless bees are an important asset to assure plant biodiversity in many natural ecosystems, and fulfill the growing agricultural demand for pollination. However, across developing countries stingless beekeeping remains an essentially informal activity, technical knowledge is scarce, and management practices lack standardization. Here we profited from the large diversity of stingless beekeepers found in Brazil to assess the impact of particular management practices on productivity and economic revenues from the commercialization of stingless bee products. Our study represents the first large-scale effort aiming at optimizing stingless beekeeping for honey/colony production based on quantitative data. Survey data from 251 beekeepers scattered across 20 Brazilian States revealed the influence of specific management practices and other confounding factors over productivity and income indicators. Specifically, our results highlight the importance of teaching beekeepers how to inspect and feed their colonies, how to multiply them and keep track of genetic lineages, how to harvest and preserve the honey, how to use vinegar traps to control infestation by parasitic flies, and how to add value by labeling honey containers. Furthermore, beekeeping experience and the network of known beekeepers were found to be key factors influencing productivity and income. Our work provides clear guidelines to optimize stingless beekeeping and help transform the activity into a powerful tool for sustainable development.

  2. Bees for development: Brazilian survey reveals how to optimize stingless beekeeping.

    Science.gov (United States)

    Jaffé, Rodolfo; Pope, Nathaniel; Torres Carvalho, Airton; Madureira Maia, Ulysses; Blochtein, Betina; de Carvalho, Carlos Alfredo Lopes; Carvalho-Zilse, Gislene Almeida; Freitas, Breno Magalhães; Menezes, Cristiano; Ribeiro, Márcia de Fátima; Venturieri, Giorgio Cristino; Imperatriz-Fonseca, Vera Lucia

    2015-01-01

    Stingless bees are an important asset to assure plant biodiversity in many natural ecosystems, and fulfill the growing agricultural demand for pollination. However, across developing countries stingless beekeeping remains an essentially informal activity, technical knowledge is scarce, and management practices lack standardization. Here we profited from the large diversity of stingless beekeepers found in Brazil to assess the impact of particular management practices on productivity and economic revenues from the commercialization of stingless bee products. Our study represents the first large-scale effort aiming at optimizing stingless beekeeping for honey/colony production based on quantitative data. Survey data from 251 beekeepers scattered across 20 Brazilian States revealed the influence of specific management practices and other confounding factors over productivity and income indicators. Specifically, our results highlight the importance of teaching beekeepers how to inspect and feed their colonies, how to multiply them and keep track of genetic lineages, how to harvest and preserve the honey, how to use vinegar traps to control infestation by parasitic flies, and how to add value by labeling honey containers. Furthermore, beekeeping experience and the network of known beekeepers were found to be key factors influencing productivity and income. Our work provides clear guidelines to optimize stingless beekeeping and help transform the activity into a powerful tool for sustainable development.

  3. Optimized Lateral Flow Immunoassay Reader for the Detection of Infectious Diseases in Developing Countries.

    Science.gov (United States)

    Pilavaki, Evdokia; Demosthenous, Andreas

    2017-11-20

    Detection and control of infectious diseases is a major problem, especially in developing countries. Lateral flow immunoassays can be used with great success for the detection of infectious diseases. However, for the quantification of their results an electronic reader is required. This paper presents an optimized handheld electronic reader for developing countries. It features a potentially low-cost, low-power, battery-operated device with no added optical accessories. The operation of this proof of concept device is based on measuring the reflected light from the lateral flow immunoassay and translating it into the concentration of the specific analyte of interest. Characterization of the surface of the lateral flow immunoassay has been performed in order to accurately model its response to the incident light. Ray trace simulations have been performed to optimize the system and achieve maximum sensitivity by placing all the components in optimum positions. A microcontroller enables all the signal processing to be performed on the device and a Bluetooth module allows transmission of the results wirelessly to a mobile phone app. Its performance has been validated using lateral flow immunoassays with influenza A nucleoprotein in the concentration range of 0.5 ng/mL to 200 ng/mL.

  4. Optimized Lateral Flow Immunoassay Reader for the Detection of Infectious Diseases in Developing Countries

    Directory of Open Access Journals (Sweden)

    Evdokia Pilavaki

    2017-11-01

    Full Text Available Detection and control of infectious diseases is a major problem, especially in developing countries. Lateral flow immunoassays can be used with great success for the detection of infectious diseases. However, for the quantification of their results an electronic reader is required. This paper presents an optimized handheld electronic reader for developing countries. It features a potentially low-cost, low-power, battery-operated device with no added optical accessories. The operation of this proof of concept device is based on measuring the reflected light from the lateral flow immunoassay and translating it into the concentration of the specific analyte of interest. Characterization of the surface of the lateral flow immunoassay has been performed in order to accurately model its response to the incident light. Ray trace simulations have been performed to optimize the system and achieve maximum sensitivity by placing all the components in optimum positions. A microcontroller enables all the signal processing to be performed on the device and a Bluetooth module allows transmission of the results wirelessly to a mobile phone app. Its performance has been validated using lateral flow immunoassays with influenza A nucleoprotein in the concentration range of 0.5 ng/mL to 200 ng/mL.

  5. Input-constrained model predictive control via the alternating direction method of multipliers

    DEFF Research Database (Denmark)

    Sokoler, Leo Emil; Frison, Gianluca; Andersen, Martin S.

    2014-01-01

    This paper presents an algorithm, based on the alternating direction method of multipliers, for the convex optimal control problem arising in input-constrained model predictive control. We develop an efficient implementation of the algorithm for the extended linear quadratic control problem (LQCP) with input and input-rate limits. The algorithm alternates between solving an extended LQCP and a highly structured quadratic program. These quadratic programs are solved using a Riccati iteration procedure, and a structure-exploiting interior-point method, respectively. The computational cost per iteration is quadratic in the dimensions of the controlled system, and linear in the length of the prediction horizon. Simulations show that the approach proposed in this paper is more than an order of magnitude faster than several state-of-the-art quadratic programming algorithms, and that the difference in computation...
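
    The alternating scheme described above can be illustrated on a toy box-constrained quadratic program; this sketch stands in for the paper's structured LQCP subproblems (a dense linear solve replaces the Riccati procedure, and the problem data are illustrative):

```python
import numpy as np

def admm_box_qp(Q, q, lo, hi, rho=1.0, iters=200):
    """ADMM sketch for min 0.5 x'Qx + q'x  s.t.  lo <= x <= hi.
    Splitting: the x-update handles the quadratic objective, the
    z-update projects onto the box, and u is the scaled dual."""
    n = len(q)
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)
    K = Q + rho * np.eye(n)   # formed once, reused every iteration
    for _ in range(iters):
        x = np.linalg.solve(K, rho * (z - u) - q)  # quadratic subproblem
        z = np.clip(x + u, lo, hi)                 # projection onto the box
        u += x - z                                 # dual ascent step
    return z

# toy problem: unconstrained optimum [2, -3], clipped to the box [-1, 1]
Q = np.eye(2)
q = np.array([-2.0, 3.0])
x = admm_box_qp(Q, q, -1.0, 1.0)
```

    In the MPC setting, Q carries the stage costs over the prediction horizon, so the x-update has the banded structure that the paper exploits with a Riccati iteration.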

  6. Adaptive RD Optimized Hybrid Sound Coding

    NARCIS (Netherlands)

    Schijndel, N.H. van; Bensa, J.; Christensen, M.G.; Colomes, C.; Edler, B.; Heusdens, R.; Jensen, J.; Jensen, S.H.; Kleijn, W.B.; Kot, V.; Kövesi, B.; Lindblom, J.; Massaloux, D.; Niamut, O.A.; Nordén, F.; Plasberg, J.H.; Vafin, R.; Virette, D.; Wübbolt, O.

    2008-01-01

    Traditionally, sound codecs have been developed with a particular application in mind, their performance being optimized for specific types of input signals, such as speech or audio (music), and application constraints, such as low bit rate, high quality, or low delay. There is, however, an

  7. Development of a stereolithography (STL) input and computer numerical control (CNC) output algorithm for an entry-level 3-D printer

    Directory of Open Access Journals (Sweden)

    Brown, Andrew

    2014-08-01

    Full Text Available This paper presents a prototype Stereolithography (STL) file format slicing and tool-path generation algorithm, which serves as a data front-end for a Rapid Prototyping (RP) entry-level three-dimensional (3-D) printer. Used mainly in Additive Manufacturing (AM), 3-D printers are devices that apply plastic, ceramic, and metal, layer by layer, in all three dimensions on a flat surface (X, Y, and Z axes). 3-D printers, unfortunately, cannot print an object without a special algorithm that is required to create the Computer Numerical Control (CNC) instructions for printing. An STL algorithm therefore forms a critical component for Layered Manufacturing (LM), also referred to as RP. The purpose of this study was to develop an algorithm that is capable of processing and slicing an STL file or multiple files, resulting in a tool-path, and finally compiling a CNC file for an entry-level 3-D printer. The prototype algorithm was implemented for an entry-level 3-D printer that utilises the Fused Deposition Modelling (FDM) process, or Solid Freeform Fabrication (SFF) process, an AM technology. Following an experimental method, the full data flow path for the prototype algorithm was developed, starting with STL data files, and then processing the STL data file into a G-code file format by slicing the model and creating a tool-path. This layering method is used by most 3-D printers to turn a 2-D object into a 3-D object. The STL algorithm developed in this study presents innovative opportunities for LM, since it allows engineers and architects to transform their ideas easily into a solid model in a fast, simple, and cheap way. This is accomplished by allowing STL models to be sliced rapidly, effectively, and without error, and finally to be processed and prepared into a G-code print file.
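
    The slicing step at the core of such an algorithm reduces to intersecting each STL facet with a horizontal plane; collecting the resulting segments per layer yields the contours from which a tool-path and G-code are generated. A minimal sketch of the per-triangle intersection (contour stitching and G-code emission are omitted, and degenerate facets lying in the plane are not handled):

```python
def slice_triangle(tri, z):
    """Intersect one triangle (three (x, y, z) vertices) with the
    plane Z = z, returning the 2-D intersection segment or None.
    A slicer runs this over every facet of an STL mesh at each
    layer height to build the layer contours."""
    points = []
    for i in range(3):
        (x1, y1, z1), (x2, y2, z2) = tri[i], tri[(i + 1) % 3]
        if (z1 - z) * (z2 - z) < 0:     # edge strictly crosses the plane
            t = (z - z1) / (z2 - z1)    # linear interpolation parameter
            points.append((x1 + t * (x2 - x1), y1 + t * (y2 - y1)))
        elif z1 == z:                   # vertex lies exactly on the plane
            points.append((x1, y1))
    return tuple(points[:2]) if len(points) >= 2 else None

# a triangle spanning z = 0..1, sliced at mid-height
tri = ((0.0, 0.0, 0.0), (1.0, 0.0, 1.0), (0.0, 1.0, 1.0))
segment = slice_triangle(tri, 0.5)   # segment across the facet at z = 0.5
```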

  8. The Use of D-Optimal Mixture Design in Optimizing Development of Okara Tablet Formulation as a Dietary Supplement

    Science.gov (United States)

    Mohamad Zen, Nur Izzati; Shamsudin, Rosnah

    2015-01-01

    The usage of soy is increasing year by year, which aggravates cost problems due to the limited sources of soybeans. Therefore, the production of oral tablets containing the nutritious leftover of soymilk production, called okara, as the main ingredient was investigated. The okara tablets were produced using the direct compression method. The percentages of okara, guar gum, microcrystalline cellulose (Avicel PH-101), and maltodextrin influenced the tablets' hardness and friability, which were analyzed using a D-optimal mixture design. The composition of Avicel PH-101 had positive effects on both the hardness and friability of the tablets. Maltodextrin and okara composition had a significant positive effect on tablet hardness, but not on the percentage friability of the tablets. However, guar gum had a negative effect in both physical tests. The optimum tablet formulation was obtained as: 47.0% okara, 2.0% guar gum, 35.0% Avicel PH-101, and 14.0% maltodextrin. PMID:26171418
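The modelling step behind such a D-optimal mixture design can be sketched by fitting a Scheffé quadratic model (linear blending terms plus pairwise interactions) to component proportions. The compositions and hardness values below are invented for illustration and are not the okara study's measurements.

```python
# Fit hardness ~ sum(b_i x_i) + sum(b_ij x_i x_j) over four mixture components.
import numpy as np

# proportions of (okara, guar gum, Avicel PH-101, maltodextrin); rows sum to 1
X = np.array([
    [0.47, 0.02, 0.35, 0.16],
    [0.60, 0.05, 0.20, 0.15],
    [0.40, 0.10, 0.40, 0.10],
    [0.50, 0.02, 0.30, 0.18],
    [0.45, 0.08, 0.32, 0.15],
    [0.55, 0.03, 0.27, 0.15],
    [0.42, 0.06, 0.38, 0.14],
    [0.48, 0.04, 0.33, 0.15],
    [0.52, 0.07, 0.25, 0.16],
    [0.44, 0.05, 0.36, 0.15],
])
hardness = np.array([6.1, 4.8, 6.5, 5.6, 5.9, 5.2, 6.3, 6.0, 5.1, 6.2])  # hypothetical

def scheffe_quadratic(X):
    """Design matrix: the component columns plus all pairwise products."""
    n, q = X.shape
    cross = [X[:, i] * X[:, j] for i in range(q) for j in range(i + 1, q)]
    return np.column_stack([X] + cross)

A = scheffe_quadratic(X)
coef, *_ = np.linalg.lstsq(A, hardness, rcond=None)   # least-squares fit
print("fitted coefficients:", np.round(coef, 2))
print("max abs residual:", float(np.max(np.abs(hardness - A @ coef))))
```

In a real D-optimal workflow the candidate points are chosen to maximize det(AᵀA) before any experiment is run; here only the fitting step is shown.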

  9. The Use of D-Optimal Mixture Design in Optimizing Development of Okara Tablet Formulation as a Dietary Supplement

    Directory of Open Access Journals (Sweden)

    Nur Izzati Mohamad Zen

    2015-01-01

    Full Text Available The usage of soy is increasing year by year, which aggravates cost problems due to the limited sources of soybeans. Therefore, the production of oral tablets containing the nutritious leftover of soymilk production, called okara, as the main ingredient was investigated. The okara tablets were produced using the direct compression method. The percentages of okara, guar gum, microcrystalline cellulose (Avicel PH-101), and maltodextrin influenced the tablets’ hardness and friability, which were analyzed using a D-optimal mixture design. The composition of Avicel PH-101 had positive effects on both the hardness and friability of the tablets. Maltodextrin and okara composition had a significant positive effect on tablet hardness, but not on the percentage friability of the tablets. However, guar gum had a negative effect in both physical tests. The optimum tablet formulation was obtained as: 47.0% okara, 2.0% guar gum, 35.0% Avicel PH-101, and 14.0% maltodextrin.

  10. Evaluation of Building Energy Saving Through the Development of Venetian Blinds' Optimal Control Algorithm According to the Orientation and Window-to-Wall Ratio

    Science.gov (United States)

    Kwon, Hyuk Ju; Yeon, Sang Hun; Lee, Keum Ho; Lee, Kwang Ho

    2018-02-01

    As various studies focusing on building energy saving have been continuously conducted, studies utilizing renewable energy sources instead of fossil fuels are needed. In particular, studies regarding solar energy are being carried out in the field of building science; to utilize solar energy effectively, the solar radiation entering the indoor space should be admitted or blocked appropriately. Blinds are a typical solar radiation control device capable of controlling indoor thermal and light environments. However, slat-type blinds are usually manually controlled, which works against building energy saving. In this regard, studies on the automatic control of slat-type blinds have been carried out over the last couple of decades. This study therefore aims to provide preliminary data for optimal control research by controlling the slat angle of slat-type blinds while comprehensively considering various input variables. The window-to-wall ratio and orientation were selected as input variables. It was found that the optimal control algorithm differed for each window-to-wall ratio and window orientation. In addition, by applying the developed algorithms in simulations and comparing the building energy saving performance under each condition, up to 20.7% energy saving was shown in the cooling period and up to 12.3% in the heating period. The building energy saving effect was greater as the window-to-wall ratio increased for the same orientation, and the effects of the window-to-wall ratio were higher in the cooling period than in the heating period.

  11. Characterization of seven United States coal regions. The development of optimal terrace pit coal mining systems

    Energy Technology Data Exchange (ETDEWEB)

    Wimer, R.L.; Adams, M.A.; Jurich, D.M.

    1981-02-01

    This report characterizes seven United States coal regions in the Northern Great Plains, Rocky Mountain, Interior, and Gulf Coast coal provinces. Descriptions include those of the Fort Union, Powder River, Green River, Four Corners, Lower Missouri, Illinois Basin, and Texas Gulf coal resource regions. The resource characterizations describe geologic, geographic, hydrologic, environmental and climatological conditions of each region, coal ranks and qualities, extent of reserves, reclamation requirements, and current mining activities. The report was compiled as a basis for the development of hypothetical coal mining situations for comparison of conventional and terrace pit surface mining methods, under contract to the Department of Energy, Contract No. DE-AC01-79ET10023, entitled The Development of Optimal Terrace Pit Coal Mining Systems.

  12. Technology development of maintenance optimization and reliability analysis for safety features in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Tae Woon; Choi, Seong Soo; Lee, Dong Gue; Kim, Young Il

    1999-12-01

    The reliability data management system (RDMS) for safety systems of PHWR-type plants has been developed and utilized in the reliability analysis of the special safety systems of Wolsong Units 1 and 2, whose plant overhaul period has been lengthened. The RDMS was developed for periodic, efficient reliability analysis of the safety systems of Wolsong Units 1 and 2. In addition, this system provides the function of analyzing the effects on safety system unavailability when the test period of a test procedure changes, as well as the function of optimizing the test periods of safety-related test procedures. The RDMS can be utilized to respond actively to requests from the regulatory body regarding the reliability validation of safety systems. (author)

  13. Promoting the energy structure optimization around Chinese Beijing-Tianjin area by developing biomass energy

    Science.gov (United States)

    Zhao, Li; Sun, Du; Wang, Shi-Yu; Zhao, Feng-Qing

    2017-06-01

    In recent years, remarkable achievements in the utilization of biomass energy have been made in China. However, there are still problems, such as an irrational industry layout, an immature market survival mechanism and a lack of core competitiveness. On the basis of investigation and research, recommendations and strategies are proposed for the development of biomass energy around the Chinese Beijing-Tianjin area: scientific planning and precise layout of the biomass industry; rationalizing the relationship between government and enterprises and promoting the establishment of a market-oriented survival mechanism; combining the ‘supply side’ with the ‘demand side’ to optimize the product structure; extending the industrial chain to promote industry upgrading and sustainable development; and comprehensively coordinating the various types of biomass resources and extending the product chain to achieve better economic benefits.

  14. Optimization of the weekly operation of a multipurpose hydroelectric development, including a pumped storage plant

    International Nuclear Information System (INIS)

    Popa, R; Popa, B; Popa, F; Zachia-Zlatea, D

    2010-01-01

    An optimization model based on genetic algorithms is presented for the operation of a multipurpose hydroelectric power development consisting of a pumped storage plant (PSP) with a weekly operation cycle. The lower reservoir of the PSP is supplied from upstream by a peak hydropower plant (HPP) with a large reservoir, and in turn supplies its own HPP, which provides the required discharges downstream. Under these conditions, the optimal operation of the assembly of 3 reservoirs and hydropower plants becomes a difficult problem once the following restrictions are considered: the gradients allowed for reservoir filling/emptying; compliance with a long-term policy for the upper reservoir of the hydroelectric development and with the weekly cycle of the PSP upper reservoir; correspondence between the power output/consumption and the weekly load schedule; use of the water resource at maximum overall efficiency; etc. Maximization of the net energy value (generated minus consumed) was selected as the performance function of the model, considering the differentiated price of electric energy over the week (working or weekend days; peak, half-peak or base hours). The analysis time step was required to be 3 hours, resulting in a weekly horizon of 56 steps and 168 decision variables for the 3 HPPs of the system. The decision variables are the flows turbined at the HPPs and the number of working hydro-units at the PSP on each time step. The numerical application considered the guiding data of the Fantanele-Tarnita-Lapustesti hydroelectric development. Results of various simulations proved the qualities of the proposed optimization model, which will allow its use within a decision-support program for such a development.
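A toy version of the genetic-algorithm search described above can be sketched for a single plant: choose turbined flows for the 56 three-hour steps so as to maximize energy value under a weekly water budget, with differentiated peak/base prices. All numbers (prices, efficiency, budget) are invented, and a penalty term stands in for the model's full set of constraints.

```python
import random
random.seed(1)

STEPS = 56                                # 3-hour steps in a week
PRICE = [60 if 2 <= t % 8 <= 5 else 30 for t in range(STEPS)]  # peak vs base price
Q_MAX, BUDGET, K = 100.0, 3000.0, 0.85    # flow cap per step, weekly volume, MWh per flow unit

def fitness(q):
    value = sum(K * qi * p for qi, p in zip(q, PRICE))
    overdraw = max(0.0, sum(q) - BUDGET)
    return value - 1e4 * overdraw          # heavy penalty for exceeding the water budget

def mutate(q):
    child = q[:]
    i = random.randrange(STEPS)
    child[i] = min(Q_MAX, max(0.0, child[i] + random.uniform(-20.0, 20.0)))
    return child

pop = [[random.uniform(0, Q_MAX) for _ in range(STEPS)] for _ in range(30)]
init_best = max(map(fitness, pop))
for _ in range(300):                       # elitist GA: keep the 10 best, mutate them
    pop.sort(key=fitness, reverse=True)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(20)]

best = max(pop, key=fitness)
print("weekly flow used:", round(sum(best), 1), "of budget", BUDGET)
```

The elitist selection guarantees the best fitness never decreases across generations; the real model adds reservoir gradient limits, a second price tier, and integer unit-commitment decisions at the PSP.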

  15. Optimization of the weekly operation of a multipurpose hydroelectric development, including a pumped storage plant

    Energy Technology Data Exchange (ETDEWEB)

    Popa, R; Popa, B [Faculty of Power Engineering, University Politehnica of Bucharest, 313 Spl. Independentei, sect. 6, Bucharest, 060042 (Romania); Popa, F [Institute for Hydropower Studies and Design, 5-7 Vasile Lascar, sect. 2, Bucharest, 020491 (Romania); Zachia-Zlatea, D, E-mail: bogdan.popa@rosha.r [Hidroelectrica S.A., 3 Constantin Nacu, sect. 2, Bucharest, 020995 (Romania)

    2010-08-15

    An optimization model based on genetic algorithms is presented for the operation of a multipurpose hydroelectric power development consisting of a pumped storage plant (PSP) with a weekly operation cycle. The lower reservoir of the PSP is supplied from upstream by a peak hydropower plant (HPP) with a large reservoir, and in turn supplies its own HPP, which provides the required discharges downstream. Under these conditions, the optimal operation of the assembly of 3 reservoirs and hydropower plants becomes a difficult problem once the following restrictions are considered: the gradients allowed for reservoir filling/emptying; compliance with a long-term policy for the upper reservoir of the hydroelectric development and with the weekly cycle of the PSP upper reservoir; correspondence between the power output/consumption and the weekly load schedule; use of the water resource at maximum overall efficiency; etc. Maximization of the net energy value (generated minus consumed) was selected as the performance function of the model, considering the differentiated price of electric energy over the week (working or weekend days; peak, half-peak or base hours). The analysis time step was required to be 3 hours, resulting in a weekly horizon of 56 steps and 168 decision variables for the 3 HPPs of the system. The decision variables are the flows turbined at the HPPs and the number of working hydro-units at the PSP on each time step. The numerical application considered the guiding data of the Fantanele-Tarnita-Lapustesti hydroelectric development. Results of various simulations proved the qualities of the proposed optimization model, which will allow its use within a decision-support program for such a development.

  16. [Development of an optimized formulation of damask marmalade with low energy level using Taguchi methodology].

    Science.gov (United States)

    Villarroel, Mario; Castro, Ruth; Junod, Julio

    2003-06-01

    The goal of the present study was the development of an optimized formula for damask marmalade low in calories, applying the Taguchi methodology to improve the quality of this product. The selection of this methodology lies in the fact that, under real conditions, the result of an experiment frequently depends on the influence of several variables; therefore, one expedient way to address this problem is to use factorial designs. The influence of acid, thickener, sweetener and aroma additives, as well as time of cooking, and possible interactions among some of them, were studied to find the combination of these factors that optimizes the sensorial quality of an experimental formulation of dietetic damask marmalade. An L8 (2^7) orthogonal array was applied, and level average analysis was carried out according to the Taguchi methodology to determine the suitable working levels of the previously chosen design factors to achieve the desired product quality. A trained sensory panel analyzed the marmalade samples using a composite scoring test with a descriptive quantitative scale ranging from 1 = Bad to 5 = Good. It was demonstrated that the design factors sugar/aspartame, pectin and damask aroma had a significant effect (p < 0.05) on the sensory quality of the marmalade, with an 82% contribution to the response. The optimal combination turned out to be: citric acid 0.2%; pectin 1%; 30 g sugar/16 mg aspartame/100 g; damask aroma 0.5 ml/100 g; time of cooking 5 minutes. Regarding chemical composition, the most important results were the decrease in carbohydrate content compared with traditional marmalade, with a reduction of 56% in caloric value, and an amount of dietary fiber greater than that of similar commercial products. Storage stability assays were carried out on marmalade samples held at different temperatures in plastic bags of different density. 
    No perceptible sensorial, microbiological and chemical changes
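The L8 (2^7) orthogonal array and level-average analysis mentioned above can be sketched as follows. The column-to-factor assignment and the panel scores are hypothetical stand-ins, not the study's data; only the array itself is the standard Taguchi L8.

```python
# Standard L8(2^7) orthogonal array: 8 runs, 7 two-level columns.
L8 = [
    [1, 1, 1, 1, 1, 1, 1],
    [1, 1, 1, 2, 2, 2, 2],
    [1, 2, 2, 1, 1, 2, 2],
    [1, 2, 2, 2, 2, 1, 1],
    [2, 1, 2, 1, 2, 1, 2],
    [2, 1, 2, 2, 1, 2, 1],
    [2, 2, 1, 1, 2, 2, 1],
    [2, 2, 1, 2, 1, 1, 2],
]
score = [3.8, 4.1, 4.6, 4.4, 3.2, 3.5, 4.0, 3.7]  # hypothetical panel means (1-5 scale)

def level_averages(col):
    """Mean response at each level of one array column."""
    by_level = {1: [], 2: []}
    for run, y in zip(L8, score):
        by_level[run[col]].append(y)
    return {lvl: sum(v) / len(v) for lvl, v in by_level.items()}

# Assume the first five columns carry the five factors (an illustrative assignment).
for col, factor in enumerate(["acid", "thickener", "sweetener", "aroma", "cooking time"]):
    avg = level_averages(col)
    best = max(avg, key=avg.get)       # "bigger is better" for a sensory score
    print(f"{factor:12s} level means {avg} -> choose level {best}")
```

Because each column is balanced (each level appears four times), the level averages isolate each factor's main effect from a mere eight runs.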

  17. Nuclear reaction inputs based on effective interactions

    Energy Technology Data Exchange (ETDEWEB)

    Hilaire, S.; Peru, S.; Dubray, N.; Dupuis, M.; Bauge, E. [CEA, DAM, DIF, Arpajon (France); Goriely, S. [Universite Libre de Bruxelles, Institut d' Astronomie et d' Astrophysique, CP-226, Brussels (Belgium)

    2016-11-15

    Extensive nuclear structure studies have been performed for decades using effective interactions as the sole input. They have shown a remarkable ability to describe many types of nuclear properties rather accurately. In the early 2000s, a major effort was engaged to produce nuclear reaction input data from the Gogny interaction, in order to challenge its quality with respect to nuclear reaction observables as well. The status of this project, well advanced today thanks to the use of modern computers as well as modern nuclear reaction codes, is reviewed and future developments are discussed. (orig.)

  18. Dynamics and controls of urban heat sink and island phenomena in a desert city: Development of a local climate zone scheme using remotely-sensed inputs

    Science.gov (United States)

    Nassar, Ahmed K.; Blackburn, G. Alan; Whyatt, J. Duncan

    2016-09-01

    This study aims to determine the dynamics and controls of Surface Urban Heat Sinks (SUHS) and Surface Urban Heat Islands (SUHI) in desert cities, using Dubai as a case study. A Local Climate Zone (LCZ) schema was developed to subdivide the city into different zones based on similarities in land cover and urban geometry. Proximity to the Gulf Coast was also determined for each LCZ. The LCZs were then used to sample seasonal and daily imagery from the MODIS thermal sensor to determine Land Surface Temperature (LST) variations relative to desert sand. Canonical correlation techniques were then applied to determine which factors explained the variability between urban and desert LST. Our results indicate that the daytime SUHS effect is greatest during the summer months (typically ∼3.0 °C) with the strongest cooling effects in open high-rise zones of the city. In contrast, the night-time SUHI effect is greatest during the winter months (typically ∼3.5 °C) with the strongest warming effects in compact mid-rise zones of the city. Proximity to the Arabian Gulf had the largest influence on both SUHS and SUHI phenomena, promoting daytime cooling in the summer months and night-time warming in the winter months. However, other parameters associated with the urban environment such as building height had an influence on daytime cooling, with larger buildings promoting shade and variations in airflow. Likewise, other parameters such as sky view factor contributed to night-time warming, with higher temperatures associated with limited views of the sky.

  19. The trans-Golgi SNARE syntaxin 10 is required for optimal development of Chlamydia trachomatis

    Directory of Open Access Journals (Sweden)

    Andrea L Lucas

    2015-09-01

    Full Text Available Chlamydia trachomatis, an obligate intracellular pathogen, grows inside of a vacuole, termed the inclusion. Within the inclusion, the organisms differentiate from the infectious elementary body (EB into the reticulate body (RB. The RB communicates with the host cell through the inclusion membrane to obtain the nutrients necessary to divide, thus expanding the chlamydial population. At late time points within the developmental cycle, the RBs respond to unknown molecular signals to redifferentiate into infectious EBs to perpetuate the infection cycle. One strategy for Chlamydia to obtain necessary nutrients and metabolites from the host is to intercept host vesicular trafficking pathways. In this study we demonstrate that a trans-Golgi soluble N-ethylmaleimide–sensitive factor attachment protein (SNARE, syntaxin 10, and/or syntaxin10-associated Golgi elements colocalize with the chlamydial inclusion. We hypothesized that Chlamydia utilizes the molecular machinery of syntaxin 10 at the inclusion membrane to intercept specific vesicular trafficking pathways in order to create and maintain an optimal intra-inclusion environment. To test this hypothesis, we used siRNA knockdown of syntaxin 10 to examine the impact of the loss of syntaxin 10 on chlamydial growth and development. Our results demonstrate that loss of syntaxin 10 leads to defects in normal chlamydial maturation including: variable inclusion size with fewer chlamydial organisms per inclusion, fewer infectious progeny, and delayed or halted RB-EB differentiation. These defects in chlamydial development correlate with an overabundance of NBD-lipid retained by inclusions cultured in syntaxin 10 knockdown cells. Overall, loss of syntaxin 10 at the inclusion membrane negatively affects Chlamydia. Understanding host machinery involved in maintaining an optimal inclusion environment to support chlamydial growth and development is critical towards understanding the molecular signals involved in

  20. Material input of nuclear fuel

    International Nuclear Information System (INIS)

    Rissanen, S.; Tarjanne, R.

    2001-01-01

    The Material Input (MI) of nuclear fuel, expressed in terms of the total amount of natural material needed for manufacturing a product, is examined. The suitability of the MI method for assessing the environmental impacts of fuels is also discussed. Material input is expressed as a Material Input Coefficient (MIC), equal to the total mass of natural material divided by the mass of the completed product. The material input coefficient is, however, only an intermediate result, which should not be used as such for the comparison of different fuels, because the energy content of nuclear fuel is about 100 000-fold that of fossil fuels. As a final result, the material input is expressed in proportion to the amount of generated electricity, which is called MIPS (Material Input Per Service unit). Material input is a simplified and commensurable indicator of the use of natural material, but because it takes into account neither the harmfulness of materials nor the way the residual material is processed, it does not by itself express the amount of environmental impact. Examining the amount alone does not differentiate between, for example, coal, natural gas, or waste rock usually consisting of just sand. Natural gas is, however, substantially more harmful to the ecosystem than sand. Therefore, other methods should also be used to assess the environmental load of a product. The material input coefficient of nuclear fuel is calculated using data from different types of mines. The calculations are made, among others, using the data of an open-pit mine (Key Lake, Canada), an underground mine (McArthur River, Canada) and a by-product mine (Olympic Dam, Australia). Furthermore, the coefficient is calculated for nuclear fuel corresponding to the nuclear fuel supply of the Teollisuuden Voima (TVO) company in 2001. Because there is some uncertainty in the initial data, the inaccuracy of the final results can be as much as 20-50 per cent. 
    The value
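The two quantities defined above reduce to simple ratios; a numeric sketch with round illustrative figures (not the study's mine-specific data):

```python
def material_input_coefficient(natural_material_kg, product_kg):
    """MIC: total mass of natural material per mass of completed product."""
    return natural_material_kg / product_kg

def mips(natural_material_kg, electricity_kwh):
    """MIPS: Material Input Per Service unit, here kg of material per kWh generated."""
    return natural_material_kg / electricity_kwh

# e.g. 1 kg of finished fuel requiring 1000 kg of moved natural material
# (ore, waste rock, process inputs) and yielding 360 000 kWh of electricity:
mic = material_input_coefficient(1000.0, 1.0)
print("MIC :", mic, "kg material per kg fuel")
print("MIPS:", mips(1000.0, 360_000), "kg material per kWh")
```

Relating material input to the service delivered (MIPS) rather than to product mass (MIC) is what makes nuclear and fossil fuels comparable, given their roughly 100 000-fold difference in energy content.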

  1. Integrated systems optimization model for biofuel development: The influence of environmental constraints

    Science.gov (United States)

    Housh, M.; Ng, T.; Cai, X.

    2012-12-01

    The environmental impact is one of the major concerns of biofuel development. While many other studies have examined the impact of biofuel expansion on stream flow and water quality, this study examines the problem from the other side: whether, and how, a biofuel production target is affected by given environmental constraints. For this purpose, an integrated model comprising sub-systems for biofuel refineries, transportation, agriculture, water resources, and crop/ethanol markets has been developed. The sub-systems are integrated into one large-scale model to guide the optimal development plan, considering the interdependencies between the sub-systems. The optimal development plan includes biofuel refinery locations and capacities, refinery operation, land allocation between biofuel and food crops, and the corresponding stream flow and nitrate load in the watershed. The watershed is modeled as a network flow, in which the nodes represent sub-watersheds and the arcs are defined as the linkages between the sub-watersheds. The runoff contribution of each sub-watershed is determined based on the land cover and the water uses in that sub-watershed. Thus, decisions of other sub-systems, such as the land allocation in the land use sub-system and the water use in the refinery sub-system, define the sources and the sinks of the network. Environmental policies are addressed in the integrated model by imposing stream flow and nitrate load constraints. These constraints can be specified by location and time in the watershed to reflect the spatial and temporal variation of the regulations. Preliminary results show that imposing monthly water flow constraints and yearly nitrate load constraints changes the biofuel development plan dramatically. Sensitivity analysis is performed to examine how the environmental constraints and their spatial and temporal distribution influence the overall biofuel development plan and the performance of each of the sub
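The land-allocation piece of such an integrated model can be illustrated with a two-variable linear program: split a fixed land area between a biofuel crop and a food crop to maximize biofuel output subject to a nitrate-load cap. A two-variable LP attains its optimum at a vertex of the feasible region, so enumerating candidate vertices suffices. All coefficients are invented for illustration.

```python
from itertools import combinations

A_TOTAL = 100.0              # hectares available
N_CAP = 950.0                # allowed nitrate load, kg/yr
N_BIO, N_FOOD = 12.0, 6.0    # nitrate load per hectare of each crop
YIELD_BIO = 3.0              # biofuel feedstock tonnes per hectare

# constraints as (a, b, rhs) meaning a*x_bio + b*x_food <= rhs
cons = [(1, 1, A_TOTAL), (N_BIO, N_FOOD, N_CAP), (-1, 0, 0), (0, -1, 0)]

def feasible(x, y, eps=1e-9):
    return all(a * x + b * y <= r + eps for a, b, r in cons)

def vertices():
    """Intersect every pair of constraint lines; keep the feasible points."""
    for (a1, b1, r1), (a2, b2, r2) in combinations(cons, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) > 1e-12:
            x = (r1 * b2 - r2 * b1) / det   # Cramer's rule
            y = (a1 * r2 - a2 * r1) / det
            if feasible(x, y):
                yield x, y

best = max(vertices(), key=lambda v: YIELD_BIO * v[0])
print("biofuel ha:", round(best[0], 1), "food ha:", round(best[1], 1))
```

Here the nitrate cap, not the land area, is the binding constraint, so tightening the environmental limit directly shrinks the attainable biofuel target; the full model plays out this trade-off over a whole network of sub-watersheds and time periods.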

  2. Development and optimization of press coated tablets of release engineered valsartan for pulsatile delivery.

    Science.gov (United States)

    Shah, Sunny; Patel, Romik; Soniwala, Moinuddin; Chavda, Jayant

    2015-01-01

    The present work aims to develop and optimize a pulsatile delivery formulation of valsartan to coordinate drug release with the circadian rhythm. Preliminary studies suggested that β cyclodextrin could improve the solubility of valsartan, showing an AL-type solubility curve. A 1:1 stoichiometric ratio of valsartan to β cyclodextrin was revealed by phase solubility studies and Job's plot. The prepared complex showed significantly better dissolution efficiency (p < 0.05); dissolution from the valsartan β cyclodextrin complex was significantly higher (p < 0.05) than from valsartan alone. Press coated tablets of the valsartan β cyclodextrin complex were subsequently prepared, and application of the Plackett-Burman screening design revealed that HPMC K4M and EC had a significant effect on lag time. A 3^2 full factorial design was used to measure the response of HPMC K4M and EC on lag time and the time taken for 90% drug release (T90). The optimized batch, prepared according to the levels obtained from the desirability function, had a lag time of 6 h, consisted of HPMC K4M:ethylcellulose in a 1:1.5 ratio with 180 mg of coating, and revealed a close agreement between observed and predicted values (R² = 0.9694).

  3. Mixture experiment methods in the development and optimization of microemulsion formulations.

    Science.gov (United States)

    Furlanetto, S; Cirri, M; Piepel, G; Mennini, N; Mura, P

    2011-06-25

    Microemulsion formulations represent an interesting delivery vehicle for lipophilic drugs, allowing for improvement of their solubility and dissolution properties. This work developed effective microemulsion formulations using glyburide (a very poorly water-soluble hypoglycaemic agent) as a model drug. First, the area of stable microemulsion (ME) formation was identified using a new approach based on mixture experiment methods. A 13-run mixture design was carried out in an experimental region defined by constraints on three components: aqueous, oil and surfactant/cosurfactant. The transmittance percentage (at 550 nm) of ME formulations (indicative of their transparency and thus of their stability) was chosen as the response variable. The results obtained using the mixture experiment approach corresponded well with those obtained using the traditional approach based on pseudo-ternary phase diagrams. However, the mixture experiment approach required far less experimental effort than the traditional approach. A subsequent 13-run mixture experiment, in the region of stable MEs, was then performed to identify the optimal formulation (i.e., having the best glyburide dissolution properties). Percent drug dissolved and dissolution efficiency were selected as the responses to be maximized. The ME formulation optimized via the mixture experiment approach consisted of 78% surfactant/cosurfactant (a mixture of Tween 20 and Transcutol, 1:1, v/v), 5% oil (Labrafac Hydro) and 17% aqueous phase (water). The stable region of MEs was identified using mixture experiment methods for the first time. Copyright © 2011 Elsevier B.V. All rights reserved.

  4. Development, Characterization, and Optimization of Protein Level in Date Bars Using Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    Muhammad Nadeem

    2012-01-01

    Full Text Available This project was designed to produce a nourishing date bar with commercial value, especially for school-going children, to meet their body development requirements. The protein level of the date bars was optimized using response surface methodology (RSM). Economical and underutilized sources, that is, whey protein concentrate and vetch protein isolates, were explored for protein supplementation. Fourteen date bar treatments were produced using a central composite design (CCD) with 2 variables and 3 levels for each variable. The date bars were then analyzed for nutritional profile. Proximate composition revealed that the addition of whey protein concentrate and vetch protein isolates improved the nutritional profile of the date bars. Protein level, texture, and taste were considerably improved by incorporating 6.05% whey protein concentrate and 4.35% vetch protein isolates in the date bar without affecting any sensory characteristics during storage. Response surface methodology was observed to be an economical and effective tool to optimize the ingredient level and to discriminate the interactive effects of the independent variables.
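The design-generation step behind a two-factor central composite design can be sketched as below. The rotatable axial distance and the number of replicated center points (chosen here to reach 14 runs, matching the study's treatment count) are illustrative assumptions, not details stated in the abstract.

```python
import itertools

alpha = 2 ** 0.5                           # rotatable axial distance for 2 factors
factorial = list(itertools.product([-1.0, 1.0], repeat=2))   # 4 corner runs
axial = [(-alpha, 0.0), (alpha, 0.0), (0.0, -alpha), (0.0, alpha)]  # 4 axial runs
center = [(0.0, 0.0)] * 6                  # replicated center runs (assumed count)

design = factorial + axial + center        # coded levels for (WPC, vetch isolate)
print(len(design), "runs")
for run in design[:5]:
    print(run)
```

Each coded level is then mapped to an actual ingredient percentage, and a quadratic response surface fitted to the measured responses yields the optimum (here 6.05% whey protein concentrate and 4.35% vetch protein isolate).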

  5. Development of a fuzzy optimization model, supporting global warming decision-making

    International Nuclear Information System (INIS)

    Leimbach, M.

    1996-01-01

    An increasing number of models have been developed to support global warming response policies. The model constructors face many uncertainties that limit the evidential value of these models. Support of climate policy decision-making is only possible in a semi-quantitative way, as presented by a Fuzzy model. The model design is based on an optimization approach, integrated in a bounded-risk decision-making framework. Given some regional emission-related and impact-related restrictions, optimal emission paths can be calculated. The focus is not only on carbon dioxide but on other greenhouse gases too. In the paper, the components of the model are described. Cost coefficients, emission boundaries and impact boundaries are represented as Fuzzy parameters. The Fuzzy model is transformed into a computational one by using an approach of Rommelfanger. In the second part, some problems of applying the model in computations are discussed. This includes discussion of the data situation and the presentation, as well as interpretation, of results of sensitivity analyses. The advantage of the Fuzzy approach is that the requirements regarding data precision are not so strong. Hence, the effort for data acquisition can be reduced and computations can be started earlier. 9 figs., 3 tabs., 17 refs., 1 appendix

  6. The Optimization of the Local Public Policies’ Development Process Through Modeling And Simulation

    Directory of Open Access Journals (Sweden)

    Minodora URSĂCESCU

    2012-06-01

    Full Text Available The development of local public policies in Romania is an empirically driven exercise, as strategic management practices in this domain are not based on a scientific instrument capable of anticipating and evaluating the results of implementing a local public policy in a needs-policies-effects logic. Starting from this motivation, the purpose of the paper is to reconceptualize the public policy process on the functioning principles of dynamic systems with feedback, by means of mathematical modeling and simulation techniques. The research is therefore oriented towards developing an optimization method for the local public policy development process, using mathematical modeling and simulation techniques as instruments. The main results of the research are, on the one hand, a new concept of the local public policy process and, on the other hand, the conceptual model of a complex software product that will permit parameterized modeling, in a virtual environment, of the policy development process. The purpose of this software product is to model and simulate each type of local public policy, taking into account the characteristics of the respective policy as well as the values of the parameters of its application environment at a given moment.

  7. Isometric Scaling in Developing Long Bones Is Achieved by an Optimal Epiphyseal Growth Balance.

    Science.gov (United States)

    Stern, Tomer; Aviram, Rona; Rot, Chagai; Galili, Tal; Sharir, Amnon; Kalish Achrai, Noga; Keller, Yosi; Shahar, Ron; Zelzer, Elazar

    2015-08-01

    One of the major challenges that developing organs face is scaling, that is, the adjustment of physical proportions during the massive increase in size. Although organ scaling is fundamental for development and function, little is known about the mechanisms that regulate it. Bone superstructures are projections that typically serve for tendon and ligament insertion or articulation and, therefore, their position along the bone is crucial for musculoskeletal functionality. As bones are rigid structures that elongate only from their ends, it is unclear how superstructure positions are regulated during growth to end up in the right locations. Here, we document the process of longitudinal scaling in developing mouse long bones and uncover the mechanism that regulates it. To that end, we performed a computational analysis of hundreds of three-dimensional micro-CT images, using a newly developed method for recovering the morphogenetic sequence of developing bones. Strikingly, analysis revealed that the relative position of all superstructures along the bone is highly preserved during more than a 5-fold increase in length, indicating isometric scaling. It has been suggested that during development, bone superstructures are continuously reconstructed and relocated along the shaft, a process known as drift. Surprisingly, our results showed that most superstructures did not drift at all. Instead, we identified a novel mechanism for bone scaling, whereby each bone exhibits a specific and unique balance between proximal and distal growth rates, which accurately maintains the relative position of its superstructures. Moreover, we show mathematically that this mechanism minimizes the cumulative drift of all superstructures, thereby optimizing the scaling process. Our study reveals a general mechanism for the scaling of developing bones. More broadly, these findings suggest an evolutionary mechanism that facilitates variability in bone morphology by controlling the activity of
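The growth-balance mechanism described above has a simple numeric consequence: if a bone elongates only at its ends, a landmark at distance x0 from the proximal end keeps its relative position x/L exactly when the proximal:distal growth split equals x0 : (L0 - x0). A quick check with illustrative numbers (not measurements from the study):

```python
L0, x0 = 10.0, 3.0                   # initial bone length and landmark position
ratio = x0 / L0                      # landmark sits at 30% of bone length

for growth in (5.0, 20.0, 40.0):     # total elongation, split 30:70 proximal:distal
    P, D = ratio * growth, (1 - ratio) * growth
    x, L = x0 + P, L0 + P + D        # landmark moves only with proximal growth
    print(f"L = {L:5.1f}  relative position = {x / L:.3f}")
```

Algebraically, x/L = (x0 + r*g) / (L0 + g) with r = x0/L0, which simplifies to r for every elongation g, so the relative position stays fixed without any drift, which is the isometric scaling the micro-CT analysis observed.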

  8. Isometric Scaling in Developing Long Bones Is Achieved by an Optimal Epiphyseal Growth Balance

    Science.gov (United States)

    Stern, Tomer; Aviram, Rona; Rot, Chagai; Galili, Tal; Sharir, Amnon; Kalish Achrai, Noga; Keller, Yosi; Shahar, Ron; Zelzer, Elazar

    2015-01-01

    One of the major challenges that developing organs face is scaling, that is, the adjustment of physical proportions during the massive increase in size. Although organ scaling is fundamental for development and function, little is known about the mechanisms that regulate it. Bone superstructures are projections that typically serve for tendon and ligament insertion or articulation and, therefore, their position along the bone is crucial for musculoskeletal functionality. As bones are rigid structures that elongate only from their ends, it is unclear how superstructure positions are regulated during growth to end up in the right locations. Here, we document the process of longitudinal scaling in developing mouse long bones and uncover the mechanism that regulates it. To that end, we performed a computational analysis of hundreds of three-dimensional micro-CT images, using a newly developed method for recovering the morphogenetic sequence of developing bones. Strikingly, analysis revealed that the relative position of all superstructures along the bone is highly preserved during more than a 5-fold increase in length, indicating isometric scaling. It has been suggested that during development, bone superstructures are continuously reconstructed and relocated along the shaft, a process known as drift. Surprisingly, our results showed that most superstructures did not drift at all. Instead, we identified a novel mechanism for bone scaling, whereby each bone exhibits a specific and unique balance between proximal and distal growth rates, which accurately maintains the relative position of its superstructures. Moreover, we show mathematically that this mechanism minimizes the cumulative drift of all superstructures, thereby optimizing the scaling process. Our study reveals a general mechanism for the scaling of developing bones. More broadly, these findings suggest an evolutionary mechanism that facilitates variability in bone morphology by controlling the activity of
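The growth-balance argument admits a compact numerical illustration. In the sketch below (relative positions and growth increments are invented, not the paper's measurements), a superstructure at relative position r keeps that position only if the proximal share of elongation equals r; when several superstructures must share a single growth balance, the fraction minimizing cumulative drift can be found by a simple scan:

```python
def drift(rel_positions, proximal_fraction, growth, length=1.0):
    """Total absolute drift (change in relative position) of superstructures
    when the bone elongates by `growth`, a fraction of it at the proximal end."""
    new_len = length + growth
    total = 0.0
    for r in rel_positions:
        # the distance from the proximal end grows by the proximal increment only
        new_r = (r * length + proximal_fraction * growth) / new_len
        total += abs(new_r - r)
    return total

# hypothetical relative positions of three superstructures along one bone
positions = [0.2, 0.5, 0.9]

# scan proximal growth fractions for the one minimizing cumulative drift
best = min((drift(positions, f / 100, growth=4.0), f / 100) for f in range(101))
print(best)  # the optimal fraction lies at the median superstructure position
```

A single superstructure drifts not at all when the proximal fraction equals its relative position; with several, the optimum is a compromise among their positions, which mirrors the paper's claim that each bone's specific proximal/distal balance minimizes cumulative drift.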

  9. Tool development for organ dose optimization taking into account the image quality in Computed Tomography

    International Nuclear Information System (INIS)

    Adrien-Decoene, Camille

    2015-01-01

    Due to the significant rise in computed tomography (CT) exams in the past few years and the resulting increase in the collective dose from medical exams, dose estimation in CT imaging has become a major public health issue. However, dose optimization cannot be considered without taking into account the image quality, which has to be good enough for radiologists. In clinical practice, optimization relies on empirical indices and on image-quality measurements performed on specific phantoms like the CATPHAN. Based on this kind of information, it is difficult to correctly optimize protocols with regard to organ doses and radiologist criteria. Therefore our goal is to develop a tool allowing the optimization of the patient dose while preserving the image quality needed for diagnosis. The work is divided into two main parts: (i) the development of a Monte Carlo dose simulator based on the PENELOPE code, and (ii) the assessment of an objective image quality criterion. For that purpose, the GE Lightspeed VCT 64 CT tube was modelled using information provided by the manufacturer's technical note and by adapting the method proposed by Turner et al (Med. Phys. 36: 2154-2164). The axial and helical movements of the X-ray tube were then implemented in the MC tool. To improve the efficiency of the simulation, two variance reduction techniques were used: a circular and a translational splitting. The splitting algorithms produce a uniform particle distribution along the gantry path, simulating the continuous gantry motion in a discrete way. Validations were performed in homogeneous conditions using a home-made phantom and the well-known CTDI phantoms. Then, dose values were measured in the CIRS ATOM anthropomorphic phantom using both optically stimulated luminescence dosimeters for point doses and XR-QA Gafchromic films for relative dose maps. Comparisons between measured and simulated values enabled us to validate the MC tool for dosimetric purposes. Finally, organ doses for
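The combined circular and translational splitting, i.e. spreading split source particles uniformly along the helix traced by the tube, can be sketched as follows (the geometry values are placeholders, not the actual scanner parameters):

```python
import math
import random

def helical_source_samples(n, pitch_mm=39.375, rotations=4, radius_mm=541.0):
    """Distribute n 'split' source particles uniformly along a helical gantry
    path, approximating the continuous tube motion by stratified discrete
    positions (exactly one particle per slab of the path parameter)."""
    samples = []
    for i in range(n):
        t = (i + random.random()) / n         # stratified path parameter in [0, 1)
        angle = 2 * math.pi * rotations * t   # circular (rotational) component
        z = pitch_mm * rotations * t          # translational (table feed) component
        samples.append((radius_mm * math.cos(angle),
                        radius_mm * math.sin(angle),
                        z))
    return samples

pts = helical_source_samples(1000)  # source positions covering the full helix
```

Stratifying the path parameter (rather than sampling it independently) is what guarantees the uniform particle distribution along the gantry path that the abstract describes.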

  10. Chinese version of the Optimism and Pessimism Scale: Psychometric properties in mainland China and development of a short form.

    Science.gov (United States)

    Xia, Jie; Wu, Daxing; Zhang, Jibiao; Xu, Yuanchao; Xu, Yunxuan

    2016-06-01

    This study aimed to validate the Chinese version of the Optimism and Pessimism Scale in a sample of 730 adult Chinese individuals. Confirmatory factor analyses confirmed the bidimensionality of the scale with two factors, optimism and pessimism. The total scale and optimism and pessimism factors demonstrated satisfactory reliability and validity. Population-based normative data and mean values for gender, age, and education were determined. Furthermore, we developed a 20-item short form of the Chinese version of the Optimism and Pessimism Scale with structural validity comparable to the full form. In summary, the Chinese version of the Optimism and Pessimism Scale is an appropriate and practical tool for epidemiological research in mainland China. © The Author(s) 2014.
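Reliability figures like those reported here are conventionally based on Cronbach's alpha, which is simple to compute from item-level scores; the sketch below uses invented item scores, not the study's data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha from item-score columns (one list per item):
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k, n = len(items), len(items[0])
    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# three perfectly correlated items give the maximum alpha of 1.0
print(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]))
```

Satisfactory reliability in scale-validation studies is usually taken to mean alpha values around 0.7 or higher for each factor.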

  11. Fumed silica nanoparticle mediated biomimicry for optimal cell-material interactions for artificial organ development.

    Science.gov (United States)

    de Mel, Achala; Ramesh, Bala; Scurr, David J; Alexander, Morgan R; Hamilton, George; Birchall, Martin; Seifalian, Alexander M

    2014-03-01

    Replacement of irreversibly damaged organs due to chronic disease with suitable tissue-engineered implants is now a familiar area of interest to clinicians and multidisciplinary scientists. Ideal tissue engineering approaches require scaffolds to be tailor-made to mimic physiological environments of interest, with specific surface topographical and biological properties for optimal cell-material interactions. This study demonstrates a single-step procedure for inducing biomimicry in a novel nanocomposite base material scaffold, to re-create the extracellular matrix that is required for stem cell integration and differentiation to mature cells. This fumed silica nanoparticle mediated procedure of scaffold functionalization can potentially be adapted with multiple bioactive molecules to induce cellular biomimicry in the development of human organs. The proposed nanocomposite materials are already in patients in a number of implants, including the world's first synthetic trachea, tear ducts and vascular bypass graft. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Development of a new concrete pipe molding machine using topology optimization

    International Nuclear Information System (INIS)

    Park, Hong Seok; Dahal, Prakash; Nguyen, Trung Thanh

    2016-01-01

    Sulfur polymer concrete (SPC) is a relatively new material used to replace Portland cement for manufacturing sewer pipes. The objective of this work is to develop an efficient molding machine with an inner rotating die to mix, compress and shape the SPC pipe. First, the alternative concepts were generated based on the TRIZ principles to overcome the drawbacks of existing machines. Then, the concept scoring technique was used to identify the best design in terms of machine structure and product quality. Finally, topology optimization was applied with the support of the density method to reduce mass and to displace the inner die. Results showed that the die volume can be reduced by approximately 9% and the displacement can be decreased by approximately 3% when compared with the initial design. This work is expected to improve the manufacturing efficiency of the concrete pipe molding machine
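The density method referenced above penalizes intermediate densities (the SIMP interpolation) and typically advances the design with an optimality-criteria update. A toy sketch, with hypothetical sensitivities standing in for a finite element analysis:

```python
def simp_modulus(rho, E0=1.0, Emin=1e-9, p=3.0):
    """SIMP interpolation of the density method: intermediate densities are
    penalized (p > 1), pushing the optimizer toward solid/void designs."""
    return Emin + rho ** p * (E0 - Emin)

def oc_update(rho, dc, dv, vol_frac, move=0.2):
    """One optimality-criteria design update for minimum compliance.
    dc: compliance sensitivities (<= 0); dv: volume sensitivities (> 0).
    Bisect the Lagrange multiplier until the volume constraint is met."""
    lo, hi = 0.0, 1e9
    while hi - lo > 1e-10:
        lam = 0.5 * (lo + hi)
        new = []
        for r, c, v in zip(rho, dc, dv):
            cand = r * (-c / (lam * v)) ** 0.5         # fixed-point scaling
            cand = max(r - move, min(r + move, cand))  # move limit
            new.append(max(0.0, min(1.0, cand)))       # physical bounds [0, 1]
        if sum(new) / len(new) > vol_frac:
            lo = lam   # too much material: raise the multiplier
        else:
            hi = lam
    return new

rho = oc_update([0.5] * 4, dc=[-1.0, -2.0, -3.0, -4.0], dv=[1.0] * 4, vol_frac=0.5)
print(rho)  # more material flows to the most sensitive (last) element
```

In a real run the sensitivities come from an FE compliance analysis of the die, and the update loop repeats until the density field converges; the sketch shows only the redistribution step that removes mass where it contributes least.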

  13. Optimization of upstream and development of cellulose hydrolysis process for cellulosic bio-ethanol production

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Hyun Jong; Wi, Seung Gon; Kim, Su Bae; Shin, You Jung; Yi, Ju Hui [Chonnam National University, Bio-Energy Research Institute, Gwangju (Korea, Republic of)

    2010-10-15

    The purpose of this project is the optimization of upstream processing and the development of a cellulose hydrolysis process for cellulosic bio-ethanol production. The research scope includes 1) screening of various microorganisms from decayed biomass in order to search for more efficient lignocellulose-degrading microorganisms, 2) identification and verification of new cell-wall-degrading cellulases for application in the cellulose bioconversion process, and 3) identification and characterization of novel genes involved in cellulose degradation. To find good candidate microorganisms for lignocellulose degradation, 75 decayed samples from different areas were assayed in triplicate and analyzed. For cloning new cell-wall-degrading enzymes, we selected microorganisms with very good lignocellulose degradation ability. From these microorganisms, we cloned ten new cellulase genes. We are now applying the cloned cellulase genes to characterize the lignocellulose degradation steps that are most important for cellulosic biofuel production.

  14. Optimization of upstream and development of cellulose hydrolysis process for cellulosic bio-ethanol production

    International Nuclear Information System (INIS)

    Bae, Hyun Jong; Wi, Seung Gon; Kim, Su Bae; Shin, You Jung; Yi, Ju Hui

    2010-10-01

    The purpose of this project is the optimization of upstream processing and the development of a cellulose hydrolysis process for cellulosic bio-ethanol production. The research scope includes 1) screening of various microorganisms from decayed biomass in order to search for more efficient lignocellulose-degrading microorganisms, 2) identification and verification of new cell-wall-degrading cellulases for application in the cellulose bioconversion process, and 3) identification and characterization of novel genes involved in cellulose degradation. To find good candidate microorganisms for lignocellulose degradation, 75 decayed samples from different areas were assayed in triplicate and analyzed. For cloning new cell-wall-degrading enzymes, we selected microorganisms with very good lignocellulose degradation ability. From these microorganisms, we cloned ten new cellulase genes. We are now applying the cloned cellulase genes to characterize the lignocellulose degradation steps that are most important for cellulosic biofuel production.

  15. Supply Chain Optimization of Integrated Glycerol Biorefinery: GlyThink Model Development and Application

    DEFF Research Database (Denmark)

    Loureiro da Costa Lira Gargalo, Carina; Carvalho, Ana; Gernaey, Krist

    2017-01-01

    To further advance the development and implementation of glycerol-based biorefinery concepts, it is critical to analyze the glycerol conversion into high value-added products in a holistic manner, considering both production as well as the logistics aspects related to the supply chain structure...... is able to identify operational decisions, including locations, capacity levels, technologies, and product portfolio, as well as strategic decisions such as inventory levels, production amounts, and transportation to the final markets. Several technologies are considered for the glycerol valorization...... to high value-added products. Existing countries with major production and consumption of biodiesel in Europe are considered as candidates for the facility sites and demand markets, and their spatial distribution is also carefully studied. The results showed that (i) the optimal solution that provides...
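For small instances, the site-selection decision such a model makes can be illustrated by exhaustive search (all locations and costs below are invented, and a real supply chain model of this kind would use a MILP solver):

```python
from itertools import combinations

def best_sites(candidates, markets, open_cost, unit_cost, max_open=2):
    """Exhaustively choose facility sites minimizing opening plus transport
    cost: a toy stand-in for the strategic location decision in a supply
    chain MILP (each market served from its cheapest open site)."""
    best = (float("inf"), None)
    for k in range(1, max_open + 1):
        for sites in combinations(candidates, k):
            cost = sum(open_cost[s] for s in sites)
            cost += sum(min(unit_cost[s][m] for s in sites) for m in markets)
            best = min(best, (cost, sites))
    return best

cost, sites = best_sites(
    candidates=["DE", "FR"], markets=["ES", "IT"],
    open_cost={"DE": 10, "FR": 12},
    unit_cost={"DE": {"ES": 8, "IT": 5}, "FR": {"ES": 3, "IT": 4}})
print(cost, sites)  # -> 19 ('FR',)
```

The full formulation additionally carries capacity levels, technology choices, inventories, and multi-period production amounts, which is what pushes the problem into MILP territory rather than simple enumeration.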

  16. Development of a new concrete pipe molding machine using topology optimization

    Energy Technology Data Exchange (ETDEWEB)

    Park, Hong Seok; Dahal, Prakash [School of Mechanical and Automotive Engineering, University of Ulsan, Ulsan (Korea, Republic of); Nguyen, Trung Thanh [Faculty of Mechanical Engineering, Le Quy Don Technical University, Hanoi (Viet Nam)

    2016-08-15

    Sulfur polymer concrete (SPC) is a relatively new material used to replace Portland cement for manufacturing sewer pipes. The objective of this work is to develop an efficient molding machine with an inner rotating die to mix, compress and shape the SPC pipe. First, the alternative concepts were generated based on the TRIZ principles to overcome the drawbacks of existing machines. Then, the concept scoring technique was used to identify the best design in terms of machine structure and product quality. Finally, topology optimization was applied with the support of the density method to reduce mass and to displace the inner die. Results showed that the die volume can be reduced by approximately 9% and the displacement can be decreased by approximately 3% when compared with the initial design. This work is expected to improve the manufacturing efficiency of the concrete pipe molding machine.

  17. Water Use Optimization Toolset Project: Development and Demonstration Phase Draft Report

    Energy Technology Data Exchange (ETDEWEB)

    Gasper, John R. [Argonne National Laboratory; Veselka, Thomas D. [Argonne National Laboratory; Mahalik, Matthew R. [Argonne National Laboratory; Hayse, John W. [Argonne National Laboratory; Saha, Samrat [Argonne National Laboratory; Wigmosta, Mark S. [PNNL; Voisin, Nathalie [PNNL; Rakowski, Cynthia [PNNL; Coleman, Andre [PNNL; Lowry, Thomas S. [SNL

    2014-05-19

    This report summarizes the results of the development and demonstration phase of the Water Use Optimization Toolset (WUOT) project. It identifies the objective and goals that guided the project, as well as demonstrating potential benefits that could be obtained by applying the WUOT in different geo-hydrologic systems across the United States. A major challenge facing conventional hydropower plants is to operate more efficiently while dealing with an increasingly uncertain water-constrained environment and complex electricity markets. The goal of this 3-year WUOT project, which is funded by the U.S. Department of Energy (DOE), is to improve water management, resulting in more energy, revenues, and grid services from available water, and to enhance environmental benefits from improved hydropower operations and planning while maintaining institutional water delivery requirements. The long-term goal is for the WUOT to be used by environmental analysts and deployed by hydropower schedulers and operators to assist in market, dispatch, and operational decisions.

  18. Automation of Geometry Input for Building Code Compliance Check

    DEFF Research Database (Denmark)

    Petrova, Ekaterina Aleksandrova; Johansen, Peter Lind; Jensen, Rasmus Lund

    2017-01-01

    Documentation of compliance with the energy performance regulations at the end of the detailed design phase is mandatory for building owners in Denmark. Therefore, besides multidisciplinary input, the building design process requires various iterative analyses, so that the optimal solutions can...... be identified amongst multiple alternatives. However, meeting performance criteria is often associated with manual data inputs and retroactive modifications of the design. Due to poor interoperability between the authoring tools and the compliance check program, the processes are redundant and inefficient...... from building geometry created in Autodesk Revit and its translation to input for compliance check analysis....

  19. Development of a neuro-fuzzy technique for automated parameter optimization of inverse treatment planning

    International Nuclear Information System (INIS)

    Stieler, Florian; Yan, Hui; Lohr, Frank; Wenz, Frederik; Yin, Fang-Fang

    2009-01-01

    Parameter optimization in the process of inverse treatment planning for intensity modulated radiation therapy (IMRT) is mainly conducted by human planners in order to create a plan with the desired dose distribution. To automate this tedious process, an artificial intelligence (AI) guided system was developed and examined. The AI system can automatically accomplish the optimization process based on prior knowledge operated by several fuzzy inference systems (FIS). Prior knowledge, which was collected from human planners during their routine trial-and-error process of inverse planning, has first to be 'translated' to a set of 'if-then rules' for driving the FISs. To minimize subjective error which could be costly during this knowledge acquisition process, it is necessary to find a quantitative method to automatically accomplish this task. A well-developed machine learning technique, based on an adaptive neuro fuzzy inference system (ANFIS), was introduced in this study. Based on this approach, prior knowledge of a fuzzy inference system can be quickly collected from observation data (clinically used constraints). The learning capability and the accuracy of such a system were analyzed by generating multiple FIS from data collected from an AI system with known settings and rules. Multiple analyses showed good agreements of FIS and ANFIS according to rules (error of the output values of ANFIS based on the training data from FIS of 7.77 ± 0.02%) and membership functions (3.9%), thus suggesting that the 'behavior' of an FIS can be propagated to another, based on this process. The initial experimental results on a clinical case showed that ANFIS is an effective way to build FIS from practical data, and analysis of ANFIS and FIS with clinical cases showed good planning results provided by ANFIS. OAR volumes encompassed by characteristic percentages of isodoses were reduced by a mean of between 0 and 28%. The study demonstrated a feasible way
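The fuzzy inference step can be illustrated with a minimal Mamdani-style sketch: two if-then rules map the degree by which an organ-at-risk exceeds its dose constraint to a priority adjustment (the membership breakpoints and output values are invented for illustration, not taken from the paper's rule base):

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def adjust_priority(dose_excess):
    """Tiny Mamdani-style FIS: rule 1 'if excess is small then keep priority',
    rule 2 'if excess is large then raise priority'. Output is the
    membership-weighted average of the rule consequents (0 and +50)."""
    small = tri(dose_excess, -10.0, 0.0, 10.0)
    large = tri(dose_excess, 0.0, 20.0, 40.0)
    if small + large == 0.0:
        return 50.0  # far outside both sets: saturate at the maximum raise
    return (small * 0.0 + large * 50.0) / (small + large)

print(adjust_priority(0.0))   # -> 0.0  (constraint met, no change)
print(adjust_priority(20.0))  # -> 50.0 (clearly violated, full raise)
```

ANFIS then does the reverse of hand-writing such rules: it fits the membership breakpoints and consequents to observed planner behavior, which is why the learned FIS can reproduce another system's "behavior".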

  20. X-ray backscatter imaging for radiography by selective detection and snapshot: Evolution, development, and optimization

    Science.gov (United States)

    Shedlock, Daniel

    Compton backscatter imaging (CBI) is a single-sided imaging technique that uses the penetrating power of radiation and unique interaction properties of radiation with matter to image subsurface features. CBI has a variety of applications that include non-destructive interrogation, medical imaging, security and military applications. Radiography by selective detection (RSD), lateral migration radiography (LMR) and shadow aperture backscatter radiography (SABR) are different CBI techniques that are being optimized and developed. Radiography by selective detection (RSD) is a pencil beam Compton backscatter imaging technique that falls between highly collimated and uncollimated techniques. Radiography by selective detection uses a combination of single- and multiple-scatter photons from a projected area below a collimation plane to generate an image. As a result, the image has a combination of first- and multiple-scatter components. RSD techniques offer greater subsurface resolution than uncollimated techniques, at speeds at least an order of magnitude faster than highly collimated techniques. RSD scanning systems have evolved from a prototype into near market-ready scanning devices for use in a variety of single-sided imaging applications. The design has changed to incorporate state-of-the-art detectors and electronics optimized for backscatter imaging with an emphasis on versatility, efficiency and speed. The RSD system has become more stable, about 4 times faster, and 60% lighter while maintaining or improving image quality and contrast over the past 3 years. A new snapshot backscatter radiography (SBR) CBI technique, shadow aperture backscatter radiography (SABR), has been developed from concept and proof-of-principle to a functional laboratory prototype. SABR radiography uses digital detection media and shaded aperture configurations to generate near-surface Compton backscatter images without scanning, similar to how transmission radiographs are taken. 
Finally, a

  1. Development of a neuro-fuzzy technique for automated parameter optimization of inverse treatment planning

    Directory of Open Access Journals (Sweden)

    Wenz Frederik

    2009-09-01

    Full Text Available Abstract Background Parameter optimization in the process of inverse treatment planning for intensity modulated radiation therapy (IMRT) is mainly conducted by human planners in order to create a plan with the desired dose distribution. To automate this tedious process, an artificial intelligence (AI) guided system was developed and examined. Methods The AI system can automatically accomplish the optimization process based on prior knowledge operated by several fuzzy inference systems (FIS). Prior knowledge, which was collected from human planners during their routine trial-and-error process of inverse planning, has first to be "translated" to a set of "if-then rules" for driving the FISs. To minimize subjective error which could be costly during this knowledge acquisition process, it is necessary to find a quantitative method to automatically accomplish this task. A well-developed machine learning technique, based on an adaptive neuro fuzzy inference system (ANFIS), was introduced in this study. Based on this approach, prior knowledge of a fuzzy inference system can be quickly collected from observation data (clinically used constraints). The learning capability and the accuracy of such a system were analyzed by generating multiple FIS from data collected from an AI system with known settings and rules. Results Multiple analyses showed good agreements of FIS and ANFIS according to rules (error of the output values of ANFIS based on the training data from FIS of 7.77 ± 0.02%) and membership functions (3.9%), thus suggesting that the "behavior" of an FIS can be propagated to another, based on this process. The initial experimental results on a clinical case showed that ANFIS is an effective way to build FIS from practical data, and analysis of ANFIS and FIS with clinical cases showed good planning results provided by ANFIS. OAR volumes encompassed by characteristic percentages of isodoses were reduced by a mean of between 0 and 28%. Conclusion The

  2. Automation of Geometry Input for Building Code Compliance Check

    DEFF Research Database (Denmark)

    Petrova, Ekaterina Aleksandrova; Johansen, Peter Lind; Jensen, Rasmus Lund

    2017-01-01

    Documentation of compliance with the energy performance regulations at the end of the detailed design phase is mandatory for building owners in Denmark. Therefore, besides multidisciplinary input, the building design process requires various iterative analyses, so that the optimal solutions can....... That has left the industry in constant pursuit of possibilities for integration of the tool within the Building Information Modelling environment so that the potential provided by the latter can be harvested and the processed can be optimized. This paper presents a solution for automated data extraction...... from building geometry created in Autodesk Revit and its translation to input for compliance check analysis....

  3. Development of an Optimal Power Control Scheme for Wave-Offshore Hybrid Generation Systems

    Directory of Open Access Journals (Sweden)

    Seungmin Jung

    2015-08-01

    Full Text Available Integration technology of various distribution systems for improving renewable energy utilization has been receiving attention in the power system industry. The wave-offshore hybrid generation system (HGS), which has a capacity of over 10 MW, was recently developed by adopting several voltage source converters (VSC), while a control method for the adopted power conversion systems had not yet been configured in spite of the unique system characteristics of the designated structure. This paper deals with a reactive power assignment method for the developed hybrid system to improve the power transfer efficiency of the entire system. Through the development and application of an optimization algorithm utilizing the real-time active power profiles of each generator, a feasibility confirmation of power transmission loss reduction was implemented. To assess the practical effect of the proposed control scheme, real system information from the demonstration process was applied in case studies. Also, the improvement rate in transmission loss was evaluated.

  4. Development of an Empirical Model for Optimization of Machining Parameters to Minimize Power Consumption

    Science.gov (United States)

    Kant Garg, Girish; Garg, Suman; Sangwan, K. S.

    2018-04-01

    The manufacturing sector has a huge energy demand, and the machine tools used in this sector have very low energy efficiency. Selection of the optimum machining parameters for machine tools is significant for energy saving and for the reduction of environmental emissions. In this work, an empirical model is developed to minimize the power consumption using response surface methodology. The experiments are performed on a lathe machine tool during the turning of AISI 6061 Aluminum with coated tungsten inserts. The relationship between the power consumption and the machining parameters is adequately modeled. This model is used for the formulation of a minimum power consumption criterion as a function of the optimal machining parameters using the desirability function approach. The influence of the machining parameters on the energy consumption was determined using analysis of variance. The developed empirical model was validated using confirmation experiments. The results indicate that the developed model is effective and has the potential to be adopted by industry for minimum power consumption of machine tools.
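The modeling step, fitting a response surface to measured power and then searching it for the minimizing parameters, can be sketched with synthetic data (the coefficients, ranges, and two-parameter model below are invented for illustration, not the paper's measurements):

```python
import numpy as np

# synthetic turning data: cutting speed v (m/min), feed f (mm/rev), and
# power P (kW) generated from a known surface plus measurement noise
rng = np.random.default_rng(0)
v = rng.uniform(100, 300, 40)
f = rng.uniform(0.05, 0.3, 40)
P = 0.4 + 0.004 * v + 2.0 * f + 0.01 * v * f + rng.normal(0, 0.01, 40)

# least-squares fit of a response-surface model with an interaction term
X = np.column_stack([np.ones_like(v), v, f, v * f])
coef, *_ = np.linalg.lstsq(X, P, rcond=None)

# grid search for the (v, f) minimizing predicted power; for a single
# response, the desirability-function step reduces to this
grid_v, grid_f = np.meshgrid(np.linspace(100, 300, 50), np.linspace(0.05, 0.3, 50))
pred = coef[0] + coef[1] * grid_v + coef[2] * grid_f + coef[3] * grid_v * grid_f
i = np.unravel_index(pred.argmin(), pred.shape)
print(grid_v[i], grid_f[i])  # minimum power at the low-speed, low-feed corner
```

In the paper the surface is fitted to designed experiments rather than random samples, and desirability combines several responses; the sketch isolates the fit-then-optimize structure.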

  5. Optimizing real time fMRI neurofeedback for therapeutic discovery and development

    Science.gov (United States)

    Stoeckel, L.E.; Garrison, K.A.; Ghosh, S.; Wighton, P.; Hanlon, C.A.; Gilman, J.M.; Greer, S.; Turk-Browne, N.B.; deBettencourt, M.T.; Scheinost, D.; Craddock, C.; Thompson, T.; Calderon, V.; Bauer, C.C.; George, M.; Breiter, H.C.; Whitfield-Gabrieli, S.; Gabrieli, J.D.; LaConte, S.M.; Hirshberg, L.; Brewer, J.A.; Hampson, M.; Van Der Kouwe, A.; Mackey, S.; Evins, A.E.

    2014-01-01

    While reducing the burden of brain disorders remains a top priority of organizations like the World Health Organization and National Institutes of Health, the development of novel, safe and effective treatments for brain disorders has been slow. In this paper, we describe the state of the science for an emerging technology, real time functional magnetic resonance imaging (rtfMRI) neurofeedback, in clinical neurotherapeutics. We review the scientific potential of rtfMRI and outline research strategies to optimize the development and application of rtfMRI neurofeedback as a next generation therapeutic tool. We propose that rtfMRI can be used to address a broad range of clinical problems by improving our understanding of brain–behavior relationships in order to develop more specific and effective interventions for individuals with brain disorders. We focus on the use of rtfMRI neurofeedback as a clinical neurotherapeutic tool to drive plasticity in brain function, cognition, and behavior. Our overall goal is for rtfMRI to advance personalized assessment and intervention approaches to enhance resilience and reduce morbidity by correcting maladaptive patterns of brain function in those with brain disorders. PMID:25161891

  6. Development of an improved genetic algorithm and its application in the optimal design of ship nuclear power system

    International Nuclear Information System (INIS)

    Jia Baoshan; Yu Jiyang; You Songbo

    2005-01-01

    This article focuses on the development of an improved genetic algorithm and its application in the optimal design of a ship nuclear reactor system, whose goal is to find a combination of system parameter values that minimizes the mass or volume of the system given the power capacity requirement and safety criteria. An improved genetic algorithm (IGA) was developed using an 'average fitness value' grouping + 'specified survival probability' rank selection method and a 'separate-recombine' duplication operator. Combined with a simulated annealing algorithm (SAA) that continues the local search after the IGA reaches a satisfactory point, the algorithm gave satisfactory optimization results from both the search efficiency and accuracy perspectives. This IGA-SAA algorithm successfully solved the design optimization problem of the ship nuclear power system. It is an advanced and efficient methodology that can be applied to similar optimization problems in other areas. (authors)
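The GA-plus-SA hybrid can be sketched on a toy one-dimensional objective: a genetic phase with rank-based survival hands its best point to a simulated-annealing local search (the population size, cooling schedule, operators, and objective are all illustrative choices, not the paper's):

```python
import math
import random

random.seed(1)
f = lambda x: (x - 2.0) ** 2 + 1.0   # toy mass/volume objective, minimum at x = 2

# --- genetic phase: rank-based survival + recombination (sketch of the IGA idea) ---
pop = [random.uniform(-10, 10) for _ in range(20)]
for _ in range(60):
    pop.sort(key=f)                              # rank individuals by fitness
    parents = pop[:10]                           # survival determined by rank
    children = [0.5 * (random.choice(parents) + random.choice(parents))
                + random.gauss(0, 0.5) for _ in range(10)]
    pop = parents + children

# --- simulated-annealing phase: local search from the GA's best point ---
x = min(pop, key=f)
T = 1.0
while T > 1e-4:
    cand = x + random.gauss(0, T)
    if f(cand) < f(x) or random.random() < math.exp((f(x) - f(cand)) / T):
        x = cand
    T *= 0.95
print(round(x, 2))  # close to the true optimum x = 2
```

The division of labor matches the abstract: the global genetic search supplies a satisfactory starting point, and the annealer refines it locally as the temperature (and step size) cools.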

  7. Development of a parallel genetic algorithm using MPI and its application in a nuclear reactor core design optimization

    International Nuclear Information System (INIS)

    Waintraub, Marcel; Pereira, Claudio M.N.A.; Baptista, Rafael P.

    2005-01-01

    This work presents the development of a distributed parallel genetic algorithm applied to a nuclear reactor core design optimization. For the implementation of the parallelism, the 'Message Passing Interface' (MPI) library, a standard for parallel computation on distributed-memory platforms, has been used. Another important characteristic of MPI is its portability across various architectures. The main objectives of this paper are: validation of the results obtained by the application of this algorithm to a nuclear reactor core optimization problem, through comparisons with previous results presented by Pereira et al.; and a performance test of the Brazilian Nuclear Engineering Institute (IEN) cluster on reactor physics optimization problems. The experiments demonstrated that the developed parallel genetic algorithm using the MPI library produced significant gains in the obtained results and a marked reduction in processing time. Such results confirm the suitability of parallel genetic algorithms for the solution of nuclear reactor core optimization problems. (author)
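The pattern behind the speed-up, farming out the expensive fitness evaluations while the genetic operators stay on the master, can be sketched as follows. Worker threads are used here only to keep the example self-contained and testable; the paper's MPI implementation spreads the same map across cluster nodes (and a real speed-up for CPU-bound work requires processes or MPI ranks, not CPython threads):

```python
from concurrent.futures import ThreadPoolExecutor

def fitness(x):
    """Toy stand-in for the expensive reactor-core evaluation of one individual."""
    return (x - 3.0) ** 2

def evaluate_population(pop, workers=4):
    """Distribute fitness evaluations over a worker pool; results come back
    in population order, so selection and crossover on the master are
    unchanged. This is the master/slave step that MPI parallelizes."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(fitness, pop))

print(evaluate_population([0.0, 1.0, 2.0, 3.0, 4.0]))  # -> [9.0, 4.0, 1.0, 0.0, 1.0]
```

Because each evaluation is independent, the generation time shrinks roughly with the number of workers as long as one evaluation dominates communication cost, which is the regime the IEN cluster experiments probe.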

  8. Turn customer input into innovation.

    Science.gov (United States)

    Ulwick, Anthony W

    2002-01-01

    It's difficult to find a company these days that doesn't strive to be customer-driven. Too bad, then, that most companies go about the process of listening to customers all wrong--so wrong, in fact, that they undermine innovation and, ultimately, the bottom line. What usually happens is this: Companies ask their customers what they want. Customers offer solutions in the form of products or services. Companies then deliver these tangibles, and customers just don't buy. The reason is simple--customers aren't expert or informed enough to come up with solutions. That's what your R&D team is for. Rather, customers should be asked only for outcomes--what they want a new product or service to do for them. The form the solutions take should be up to you, and you alone. Using Cordis Corporation as an example, this article describes, in fine detail, a series of effective steps for capturing, analyzing, and utilizing customer input. First come indepth interviews, in which a moderator works with customers to deconstruct a process or activity in order to unearth "desired outcomes." Addressing participants' comments one at a time, the moderator rephrases them to be both unambiguous and measurable. Once the interviews are complete, researchers then compile a comprehensive list of outcomes that participants rank in order of importance and degree to which they are satisfied by existing products. Finally, using a simple mathematical formula called the "opportunity calculation," researchers can learn the relative attractiveness of key opportunity areas. These data can be used to uncover opportunities for product development, to properly segment markets, and to conduct competitive analysis.
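The opportunity calculation mentioned above is commonly given as importance plus the unsatisfied gap, floored at zero, so important-but-poorly-satisfied outcomes rank highest. A sketch with hypothetical outcomes and ratings (not Cordis data):

```python
def opportunity(importance, satisfaction):
    """Opportunity score: importance plus the importance-satisfaction gap,
    floored at zero (both inputs on the usual 1-10 rating scale)."""
    return importance + max(importance - satisfaction, 0)

outcomes = {
    # hypothetical desired outcomes with (importance, satisfaction) ratings
    "minimize restenosis rate": (9.5, 3.2),
    "minimize placement time":  (8.0, 7.5),
    "minimize device cost":     (6.0, 6.5),
}
ranked = sorted(outcomes, key=lambda o: opportunity(*outcomes[o]), reverse=True)
print(ranked[0])  # -> minimize restenosis rate
```

The floor matters: an outcome that is already over-satisfied scores no higher than its importance, which keeps well-served areas from masquerading as opportunities.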

  9. Optimizing the multimodal approach to pancreatic cyst fluid diagnosis: developing a volume-based triage protocol.

    Science.gov (United States)

    Chai, Siaw Ming; Herba, Karl; Kumarasinghe, M Priyanthi; de Boer, W Bastiaan; Amanuel, Benhur; Grieu-Iacopetta, Fabienne; Lim, Ee Mun; Segarajasingam, Dev; Yusoff, Ian; Choo, Chris; Frost, Felicity

    2013-02-01

    The objective of this study was to develop a triage algorithm to optimize diagnostic yield from cytology, carcinoembryonic antigen (CEA), and v-Ki-ras2 Kirsten rat sarcoma viral oncogene homolog (KRAS) testing on different components of a single pancreatic cyst fluid specimen. The authors also sought to determine whether cell block supernatant was suitable for CEA and KRAS testing. Fifty-four pancreatic cysts were triaged according to a volume-dependent protocol to generate fluid (neat and supernatant) and cell block specimens for cytology, comparative CEA, and KRAS testing. Follow-up histology, diagnostic cytology, or a combined clinicopathologic interpretation was recorded as the final diagnosis. There were 26 mucinous cystic lesions and 28 nonmucinous cystic lesions with volumes ranging from 0.3 mL to 55 mL. Testing different components of the specimens (cell block, neat, and/or supernatant) enabled all laboratory investigations to be performed on 50 of 54 cyst fluids (92.6%). Interpretive concordance was observed in 17 of 17 cases (100%) and in 35 of 40 cases (87.5%) that had multiple components tested for CEA and KRAS mutations, respectively. An elevated CEA level (>192 ng/mL) was the most sensitive test for the detection of a mucinous cystic lesion (62.5%) versus KRAS mutation (56%) and "positive" cytology (61.5%). KRAS mutations were identified in 2 of 25 mucinous cystic lesions (8%) in which cytology and CEA levels were not contributory. A volume-based protocol using different components of the specimen was able to optimize diagnostic yield in pancreatic cyst fluids. KRAS mutation testing increased diagnostic yield when combined with cytology and CEA analysis. The current results demonstrated that supernatant is comparable to neat fluid and cell block material for CEA and KRAS testing. Copyright © 2012 American Cancer Society.

  10. Development of non-bonded interaction parameters between graphene and water using particle swarm optimization.

    Science.gov (United States)

    Bejagam, Karteek K; Singh, Samrendra; Deshmukh, Sanket A

    2018-05-05

    New Lennard-Jones parameters have been developed to describe the interactions between an atomistic model of graphene, represented by the REBO potential, and five commonly used all-atom water models, namely SPC, SPC/E, SPC/Fw, SPC/Fd, and TIP3P/Fs, by employing the particle swarm optimization (PSO) method. These new parameters were optimized to reproduce the macroscopic contact angle of water on a graphene sheet. The calculated line tension was on the order of 10⁻¹¹ J/m for the droplets of all water models. Our molecular dynamics simulations indicate a preferential orientation of water molecules near the graphene-water interface, with one OH bond pointing toward the graphene surface. Detailed analysis of simulation trajectories reveals the presence of water molecules with ≤∼1, ∼2, and ∼4 hydrogen bonds at the air-water interface, the graphene-water interface, and the bulk region of the water droplet, respectively. The presence of water molecules with ≤∼1 and ∼2 hydrogen bonds suggests the existence of water clusters of different sizes at these interfaces. The trends observed in the libration, bending, and stretching bands of the vibrational spectra are closely associated with these structural features of water. The inhomogeneity in the hydrogen bond network of water at the air-water and graphene-water interfaces is manifested by broadening of the peaks in the libration band for water present at these interfaces. The stretching band for the molecules in the water droplet shows a blue shift compared to pure bulk water, suggesting a weaker hydrogen bond network in the droplet. © 2017 Wiley Periodicals, Inc.
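The functional form being parameterized here is the 12-6 Lennard-Jones pair potential, which can be written in a few lines. The ε and σ values below are placeholders for illustration, not the fitted graphene-water parameters:

```python
def lj_energy(r, epsilon, sigma):
    """12-6 Lennard-Jones pair energy: 4*eps*((sigma/r)**12 - (sigma/r)**6)."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 ** 2 - sr6)

eps, sig = 0.1, 3.2  # hypothetical well depth and size parameter
zero_crossing = lj_energy(sig, eps, sig)              # energy is 0 at r = sigma
well_depth = lj_energy(2 ** (1 / 6) * sig, eps, sig)  # -epsilon at the minimum
```

PSO in this context searches the (ε, σ) space so that simulations with these pair energies reproduce the target contact angle.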

  11. Optimization and development of antidiabetic phytosomes by the Box-Behnken design.

    Science.gov (United States)

    Rathee, Sushila; Kamboj, Anjoo

    2018-06-01

    Herbs and natural products have been extensively reviewed in recent years for their marked clinical efficacy; however, most newly discovered bioactive constituents show poor bioavailability, due either to their large molecular size or to their poor miscibility with oils and lipids, which limits their ability to pass across the lipid-rich outer membranes of the enterocytes of the small intestine. Phytosomes are more bioavailable than herbal extracts owing to their enhanced capacity to cross bio-membranes and thus reach the systemic circulation. This study aimed to investigate the development and optimization of antidiabetic phytosomes using a three-factor, three-level Box-Behnken design (17 batches). The fruits of Citrullus colocynthis (L.), Momordica balsamina, and Momordica dioica were extracted using a Soxhlet apparatus. The phytochemical fingerprint profile of the combined methanolic extracts was obtained by high-performance thin layer chromatography (HPTLC). A polynomial quadratic equation analysis was designed to study the responses (entrapment efficiency (EE) and % yield) of the independent significant factors at different levels. Phytosomes were characterized in terms of drug content, particle size, EE, zeta potential, and in vitro dissolution. TEM analysis revealed good stability and a spherical, self-closed structure of the phytosomes in complex formulations. Average particle size was found to be 450 nm. Total flavonoid content was found to be 10.0 ± 0.002 μg/g. The optimized formulation was prepared using A (1:3), B (60 °C), and C (2.5 h) to give maximum yield and entrapment efficiency (72% and 92.1 ± 5.1%, respectively). The phytosomes were found to have antidiabetic activity comparable to low-dose metformin. HPTLC showed the presence of the phyto-constituent quercetin.
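A coded three-factor Box-Behnken design can be generated in a few lines. Assuming the 17 batches correspond to the standard 12 edge runs plus 5 replicated center points (an assumption; the abstract does not state the split), the design matrix in coded units (-1, 0, +1) is:

```python
from itertools import combinations, product

def box_behnken(n_factors, n_center):
    """Coded Box-Behnken design: every pair of factors at +/-1 with the
    remaining factors held at 0, plus replicated center points."""
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a, b in product((-1, 1), repeat=2):
            run = [0] * n_factors
            run[i], run[j] = a, b
            runs.append(run)
    runs.extend([0] * n_factors for _ in range(n_center))
    return runs

design = box_behnken(3, 5)  # 12 edge runs + 5 center replicates = 17 batches
```

Each row maps onto real factor levels (here, hypothetically: A the extract:phospholipid ratio, B the temperature, C the reaction time) by linear scaling from the coded range.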

  12. Automation of RELAP5 input calibration and code validation using genetic algorithm

    International Nuclear Information System (INIS)

    Phung, Viet-Anh; Kööp, Kaspar; Grishchenko, Dmitry; Vorobyev, Yury; Kudinov, Pavel

    2016-01-01

    Highlights: • Automated input calibration and code validation using genetic algorithm is presented. • Predictions generally overlap experiments for individual system response quantities (SRQs). • It was not possible to predict simultaneously experimental maximum flow rate and oscillation period. • Simultaneous consideration of multiple SRQs is important for code validation. - Abstract: Validation of system thermal-hydraulic codes is an important step in application of the codes to reactor safety analysis. The goal of the validation process is to determine how well a code can represent physical reality. This is achieved by comparing predicted and experimental system response quantities (SRQs) taking into account experimental and modelling uncertainties. Parameters which are required for the code input but not measured directly in the experiment can become an important source of uncertainty in the code validation process. Quantification of such parameters is often called input calibration. Calibration and uncertainty quantification may become challenging tasks when the number of calibrated input parameters and SRQs is large and dependencies between them are complex. If only engineering judgment is employed in the process, the outcome can be prone to so called “user effects”. The goal of this work is to develop an automated approach to input calibration and RELAP5 code validation against data on two-phase natural circulation flow instability. Multiple SRQs are used in both calibration and validation. In the input calibration, we used genetic algorithm (GA), a heuristic global optimization method, in order to minimize the discrepancy between experimental and simulation data by identifying optimal combinations of uncertain input parameters in the calibration process. We demonstrate the importance of the proper selection of SRQs and respective normalization and weighting factors in the fitness function. In the code validation, we used maximum flow rate as the
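The role of normalization and weighting factors in the fitness function can be illustrated with a minimal sketch (not the authors' implementation); the SRQ names and values below are hypothetical:

```python
def calibration_fitness(simulated, measured, norms, weights):
    """Weighted sum of squared, normalized discrepancies over several
    system response quantities (SRQs); lower is better. Without the
    norms, the SRQ with the largest magnitude would dominate the GA."""
    return sum(
        w * ((s - m) / n) ** 2
        for s, m, n, w in zip(simulated, measured, norms, weights)
    )

# Hypothetical SRQs: maximum flow rate (kg/s) and oscillation period (s).
sim, exp = [0.42, 11.8], [0.40, 12.5]
norms = [0.40, 12.5]    # normalize each SRQ by its measured value
weights = [1.0, 1.0]
score = calibration_fitness(sim, exp, norms, weights)
```

A GA would minimize this score over the uncertain input parameters; the paper's observation that both SRQs could not be matched simultaneously corresponds to no parameter combination driving both terms to zero at once.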

  13. Automation of RELAP5 input calibration and code validation using genetic algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Phung, Viet-Anh, E-mail: vaphung@kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden); Kööp, Kaspar, E-mail: kaspar@safety.sci.kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden); Grishchenko, Dmitry, E-mail: dmitry@safety.sci.kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden); Vorobyev, Yury, E-mail: yura3510@gmail.com [National Research Center “Kurchatov Institute”, Kurchatov square 1, Moscow 123182 (Russian Federation); Kudinov, Pavel, E-mail: pavel@safety.sci.kth.se [Division of Nuclear Power Safety, Royal Institute of Technology, Roslagstullsbacken 21, 10691 Stockholm (Sweden)

    2016-04-15

    Highlights: • Automated input calibration and code validation using genetic algorithm is presented. • Predictions generally overlap experiments for individual system response quantities (SRQs). • It was not possible to predict simultaneously experimental maximum flow rate and oscillation period. • Simultaneous consideration of multiple SRQs is important for code validation. - Abstract: Validation of system thermal-hydraulic codes is an important step in application of the codes to reactor safety analysis. The goal of the validation process is to determine how well a code can represent physical reality. This is achieved by comparing predicted and experimental system response quantities (SRQs) taking into account experimental and modelling uncertainties. Parameters which are required for the code input but not measured directly in the experiment can become an important source of uncertainty in the code validation process. Quantification of such parameters is often called input calibration. Calibration and uncertainty quantification may become challenging tasks when the number of calibrated input parameters and SRQs is large and dependencies between them are complex. If only engineering judgment is employed in the process, the outcome can be prone to so called “user effects”. The goal of this work is to develop an automated approach to input calibration and RELAP5 code validation against data on two-phase natural circulation flow instability. Multiple SRQs are used in both calibration and validation. In the input calibration, we used genetic algorithm (GA), a heuristic global optimization method, in order to minimize the discrepancy between experimental and simulation data by identifying optimal combinations of uncertain input parameters in the calibration process. We demonstrate the importance of the proper selection of SRQs and respective normalization and weighting factors in the fitness function. In the code validation, we used maximum flow rate as the

  14. Optimization of Heat Exchangers

    International Nuclear Information System (INIS)

    Catton, Ivan

    2010-01-01

    The objective of this research is to develop tools to design and optimize heat exchangers (HE) and compact heat exchangers (CHE) for the intermediate-loop heat transport systems found in the very high temperature reactor (VHTR) and other Generation IV designs, by addressing heat transfer surface augmentation and conjugate modeling. To optimize a heat exchanger, a fast-running model must be created that allows multiple designs to be compared quickly. The heat exchanger is modeled using volume averaging theory (VAT), which allows the conservation equations for mass, momentum, and energy to be solved point by point in a three-dimensional computer model of the heat exchanger. The end product of this project is a computer code that can predict an optimal configuration for a heat exchanger given only a few constraints (input fluids, size, cost, etc.). Because the VAT-based code can model the characteristics of heat exchangers (pumping power, temperatures, and cost) more quickly than traditional CFD or experiment, every geometric parameter can be optimized simultaneously. Using design of experiments (DOE) and genetic algorithms (GA) to optimize the results of the computer code will further improve heat exchanger design.
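The "fast model plus exhaustive comparison" idea can be illustrated with a toy surrogate: a cheap scoring function is swept over candidate geometries, something infeasible when each evaluation is a full CFD run. The parameters and coefficients below are invented for illustration and do not come from the VAT model:

```python
def score(fin_pitch_mm, channel_height_mm):
    """Toy surrogate: denser fins (smaller pitch) raise heat transfer
    but also raise pumping power. Coefficients are illustrative only."""
    heat_transfer = 1.0 / fin_pitch_mm + 0.05 * channel_height_mm
    pumping_power = 0.3 / fin_pitch_mm ** 2 + 0.01 * channel_height_mm
    return heat_transfer - 2.0 * pumping_power  # maximize net benefit

# A fast model makes brute-force sweeps over the geometry affordable.
candidates = [(p, h) for p in (1.0, 2.0, 4.0) for h in (5.0, 10.0)]
best = max(candidates, key=lambda c: score(*c))
```

With more parameters, this grid sweep is exactly where DOE (to sample the space efficiently) and a GA (to search it) replace exhaustive enumeration.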

  15. Development and parameter optimization of maize flat bread supplemented with asparagus bean flour

    Directory of Open Access Journals (Sweden)

    Tajamul Rouf SHAH

    Full Text Available Abstract The purpose of this study was to develop maize flat bread supplemented with asparagus bean flour (ABF). A preliminary study was conducted to determine the maximum supplementation of ABF on the basis of sensory attributes, and it was found that 15% ABF can be supplemented. A composite flour containing 85% maize flour (MF) and 15% ABF was then used for the preparation of flat bread. The effects of baking temperature (200 to 235 °C) and baking time [time 1 (surface 1) and time 2 (surface 2); 70 to 120 sec] on product responses such as sensory characteristics (overall color, appearance, flavor, taste, mouth feel, overall acceptability), texture (shear value), and moisture content were studied. Results indicated that baking temperature and baking time had significant (p < 0.05) positive effects on the sensory characteristics and shear value, and a significant (p < 0.05) negative effect on moisture content. Numerical optimization resulted in a baking temperature of 225 °C, with baking time 1 of 120 sec for surface 1 and time 2 of 116 sec for surface 2, to develop a flat bread of the best quality.

  16. Optimization of controlled processes in combined-cycle plant (new developments and researches)

    Science.gov (United States)

    Tverskoy, Yu S.; Muravev, I. K.

    2017-11-01

    All modern complex technical systems, including the power units of thermal and nuclear power plants, operate within the system-forming structure of a multifunctional automated process control system (APCS). Advances in the mathematical support of modern APCS make it possible to extend automation to the solution of complex optimization problems for equipment heat- and mass-exchange processes in real time. The difficulty of efficiently managing a binary power unit stems from the need to solve at least three problems jointly. The first is related to the physical issues of combined-cycle technologies. The second is determined by the sensitivity of CCGT operation to changes in regime and climatic factors. The third is related to a precise description of the vector of controlled coordinates of a complex technological object. To obtain a joint solution of this set of interconnected problems, the methodology of generalized thermodynamic analysis, methods of automatic control theory, and mathematical modeling are used. This report presents the results of new developments and studies that improve the principles of process control and the structural synthesis of automatic control systems for power units with combined-cycle plants, providing attainable technical and economic efficiency and operational reliability of equipment.

  17. Process of optimization of retail trade spatial development with application of location-allocation models

    Directory of Open Access Journals (Sweden)

    Kukrika Milan

    2008-01-01

    Full Text Available This article gives a brief overview of the structure and usage of location-allocation models in the territorial planning of retail networks, pointing out the main shortcomings of existing models and the primary directions for their future improvement. We review their main applications and explain the basic factors that the models take into consideration during the process of demand allocation. Location-allocation models are an important part of the process of optimizing the spatial development of a retail network. Their future improvement is moving toward their approximation to and integration with spatial-interaction models, which would yield a much better methodology for planning and directing the spatial development of trade in general. The methodology used in this research is based on the literature and research projects in the area. Applying it to the analysis of parts of Serbian territory with location-allocation models showed the need to create special software for calculating matrices with recursions. Since the integration of location-allocation models with GIS has not yet occurred, all results obtained from the meta-formula calculations were imported into the ArcGIS 9.2 software and presented as maps.
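A location-allocation model in its simplest form is the p-median problem: choose p facility sites so that total customer travel distance is minimized, with each customer allocated to the nearest chosen site. A brute-force sketch (illustrative only, with a hypothetical distance matrix) shows the allocation logic:

```python
from itertools import combinations

def p_median(distances, p):
    """Brute-force location-allocation: pick p candidate sites minimizing
    the total customer-to-nearest-facility distance. distances[i][j] is
    the distance from customer i to candidate site j."""
    n_sites = len(distances[0])

    def cost(sites):
        # Demand allocation: each customer goes to its nearest open site.
        return sum(min(row[j] for j in sites) for row in distances)

    return min(combinations(range(n_sites), p), key=cost)

# 4 customers x 3 candidate sites (hypothetical distance matrix)
d = [[1, 5, 9],
     [2, 4, 8],
     [9, 5, 1],
     [8, 4, 2]]
best_sites = p_median(d, 2)
```

Real planning instances are far too large for enumeration, which is why the specialized software and recursive matrix calculations mentioned in the article are needed.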

  18. Development of a hydraulic analysis code for optimizing thermochemical IS process reactors

    International Nuclear Information System (INIS)

    Terada, Atsuhiko; Hino, Ryutaro; Hirayama, Toshio; Nakajima, Norihiro; Sugiyama, Hitoshi

    2007-01-01

    The Japan Atomic Energy Agency has been studying the thermochemical IS process for water-splitting hydrogen production. Based on the test results and know-how obtained through the bench-scale test, a pilot test plant with a hydrogen production capacity of 30 Nm³/h is being designed conceptually as the next step of the IS process development. In the design of the IS pilot plant, it is important to make the chemical reactors compact and high-performing from the viewpoint of plant cost reduction. A new hydraulic analysis code has been developed for optimizing the mixing performance of multi-phase flows involving chemical reactions, especially in the Bunsen reactor, which is characterized by complex flow patterns with gas-liquid chemical interaction and flow instability. Preliminary analytical results obtained with the code, especially the flow patterns induced by swirling flow, agreed well with those measured in water experiments, which showed a vortex breakdown pattern in a simplified Bunsen reactor. (author)

  19. Systematic development and optimization of chemically defined medium supporting high cell density growth of Bacillus coagulans.

    Science.gov (United States)

    Chen, Yu; Dong, Fengqing; Wang, Yonghong

    2016-09-01

    With their defined components and experimental reproducibility, chemically defined media (CDM) and minimal chemically defined media (MCDM) are used in many metabolism and regulation studies. This research aimed to develop a chemically defined medium supporting high-cell-density growth of Bacillus coagulans, a promising producer of lactic acid and other bio-chemicals. A systematic methodology combining experimental techniques with flux balance analysis (FBA) was proposed to design and simplify a CDM. The single-omission and single-addition techniques were employed to determine the essential and stimulatory compounds, before their concentrations were optimized by statistical methods. In addition, to improve growth rationally, in silico omission and addition were performed by FBA based on a medium-size metabolic model constructed for B. coagulans 36D1. Thus, CDMs were developed that gave considerable biomass production for at least five B. coagulans strains, including the two model strains B. coagulans 36D1 and ATCC 7050.
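Flux balance analysis itself reduces to a linear program: maximize a biomass flux subject to steady-state mass balance (S·v = 0) and flux bounds. A toy three-reaction network (not the B. coagulans model, and assuming SciPy is available) illustrates the computation:

```python
from scipy.optimize import linprog

# Toy network: uptake -> A -> B -> biomass, at steady state S.v = 0.
S = [[1, -1, 0],   # metabolite A: produced by v1, consumed by v2
     [0, 1, -1]]   # metabolite B: produced by v2, consumed by v3
bounds = [(0, 10), (0, 100), (0, 100)]  # nutrient uptake v1 capped at 10
c = [0, 0, -1]                          # linprog minimizes, so negate v3
res = linprog(c, A_eq=S, b_eq=[0, 0], bounds=bounds, method="highs")
max_growth = -res.fun                   # biomass flux, limited by uptake
```

In silico omission of a medium component corresponds to setting the matching uptake bound to zero and re-solving; a drop in the achievable biomass flux flags the component as essential or stimulatory.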

  20. SAT, a flexible and optimized Web application for SSR marker development

    Directory of Open Access Journals (Sweden)

    Rami Jean-François

    2007-11-01

    Full Text Available Abstract Background Simple Sequence Repeats (SSRs), or microsatellites, are among the most powerful genetic markers known. A common method for the development of SSR markers is the construction of genomic DNA libraries enriched for SSR sequences, followed by DNA sequencing. However, designing optimal SSR markers from bulk sequence data is a laborious and time-consuming process. Results SAT (SSR Analysis Tool) is a user-friendly Web application developed to minimize tedious manual operations and reduce errors. This tool facilitates the integration, analysis and display of sequence data from SSR-enriched libraries. SAT is designed to successively perform base calling and quality evaluation of chromatograms, eliminate cloning vectors, adaptors and low-quality sequences, detect chimeric or partially digested sequences, search for SSR motifs, cluster and assemble the redundant sequences, and design SSR primer pairs. An additional virtual PCR step establishes primer specificity. Users may modify the different parameters of each step of the SAT analysis. Although certain steps are compulsory, such as the SSR motif search and sequence assembly, users do not have to run the entire pipeline, and they can choose selectively which steps to perform. A database allows users to store and query results, and to redo individual steps of the workflow. Conclusion The SAT Web application is available at http://sat.cirad.fr/sat, and a standalone command-line version is also freely downloadable. Users must send an email to the SAT administrator tropgene@cirad.fr to request a login and password.