WorldWideScience

Sample records for robust process integration

  1. Conceptual information processing: A robust approach to KBS-DBMS integration

    Science.gov (United States)

    Lazzara, Allen V.; Tepfenhart, William; White, Richard C.; Liuzzi, Raymond

    1987-01-01

    Integrating the respective functionality and architectural features of knowledge base and data base management systems is a topic of considerable interest. Several aspects of this topic and associated issues are addressed. The significance of integration and the problems associated with accomplishing that integration are discussed. The shortcomings of current approaches to integration and the need to fuse the capabilities of both knowledge base and data base management systems motivate the investigation of information processing paradigms. One such paradigm is concept-based processing, i.e., processing based on concepts and conceptual relations. An approach to robust knowledge and data base system integration is discussed by addressing progress made in the development of an experimental model for conceptual information processing.

  2. An optimization methodology for identifying robust process integration investments under uncertainty

    International Nuclear Information System (INIS)

    Svensson, Elin; Berntsson, Thore; Stroemberg, Ann-Brith; Patriksson, Michael

    2009-01-01

    Uncertainties in future energy prices and policies strongly affect decisions on investments in process integration measures in industry. In this paper, we present a five-step methodology for the identification of robust investment alternatives that explicitly incorporates such uncertainties in the optimization model. Methods for optimization under uncertainty (i.e., stochastic programming) are thus combined with a deep understanding of process integration and process technology in order to achieve a framework for decision-making concerning the investment planning of process integration measures under uncertainty. The proposed methodology enables the optimization of investments in energy efficiency with respect to their net present value or an environmental objective. In particular, as a result of the optimization approach, complex investment alternatives, allowing for combinations of energy efficiency measures, can be analyzed. Uncertainties as well as time-dependent parameters, such as energy prices and policies, are modelled using a scenario-based approach, enabling the identification of robust investment solutions. The methodology is primarily an aid for decision-makers in industry, but it will also provide insight for policy-makers into how uncertainties regarding future price levels and policy instruments affect decisions on investments in energy efficiency measures. (author)
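    To make the scenario-based idea concrete, the sketch below screens two hypothetical process-integration investments against a handful of price scenarios and reports expected and worst-case net present value. All alternatives, cash flows, probabilities and prices are illustrative assumptions, not figures from the paper, and the simple enumeration stands in for the full stochastic programming formulation.

```python
# Minimal sketch (not the authors' model): scenario-based screening of
# process-integration investments. All numbers below are illustrative.
from dataclasses import dataclass

@dataclass
class Scenario:
    name: str
    probability: float
    electricity_price: float   # EUR/MWh (assumed)
    co2_charge: float          # EUR/tonne (assumed)

@dataclass
class Investment:
    name: str
    capex: float               # EUR
    electricity_saved: float   # MWh/yr
    co2_avoided: float         # tonne/yr

def npv(inv: Investment, sc: Scenario, years: int = 10, rate: float = 0.08) -> float:
    """Net present value of one alternative under one price scenario."""
    annual = inv.electricity_saved * sc.electricity_price + inv.co2_avoided * sc.co2_charge
    discount = sum(1.0 / (1.0 + rate) ** t for t in range(1, years + 1))
    return annual * discount - inv.capex

scenarios = [
    Scenario("low prices", 0.3, 35.0, 10.0),
    Scenario("reference", 0.4, 55.0, 25.0),
    Scenario("high prices", 0.3, 80.0, 60.0),
]
alternatives = [
    Investment("heat integration only", 1.5e6, 8_000, 2_000),
    Investment("integration + new evaporation", 4.0e6, 20_000, 6_000),
]

for inv in alternatives:
    values = [npv(inv, sc) for sc in scenarios]
    expected = sum(sc.probability * v for sc, v in zip(scenarios, values))
    worst = min(values)
    print(f"{inv.name}: E[NPV] = {expected/1e6:.2f} MEUR, worst case = {worst/1e6:.2f} MEUR")
```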

  3. An optimization methodology for identifying robust process integration investments under uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Elin; Berntsson, Thore [Department of Energy and Environment, Division of Heat and Power Technology, Chalmers University of Technology, SE-412 96 Goeteborg (Sweden); Stroemberg, Ann-Brith [Fraunhofer-Chalmers Research Centre for Industrial Mathematics, Chalmers Science Park, SE-412 88 Gothenburg (Sweden); Patriksson, Michael [Department of Mathematical Sciences, Chalmers University of Technology and Department of Mathematical Sciences, University of Gothenburg, SE-412 96 Goeteborg (Sweden)

    2009-02-15

    Uncertainties in future energy prices and policies strongly affect decisions on investments in process integration measures in industry. In this paper, we present a five-step methodology for the identification of robust investment alternatives that explicitly incorporates such uncertainties in the optimization model. Methods for optimization under uncertainty (i.e., stochastic programming) are thus combined with a deep understanding of process integration and process technology in order to achieve a framework for decision-making concerning the investment planning of process integration measures under uncertainty. The proposed methodology enables the optimization of investments in energy efficiency with respect to their net present value or an environmental objective. In particular, as a result of the optimization approach, complex investment alternatives, allowing for combinations of energy efficiency measures, can be analyzed. Uncertainties as well as time-dependent parameters, such as energy prices and policies, are modelled using a scenario-based approach, enabling the identification of robust investment solutions. The methodology is primarily an aid for decision-makers in industry, but it will also provide insight for policy-makers into how uncertainties regarding future price levels and policy instruments affect decisions on investments in energy efficiency measures. (author)

  4. A 2-Dof LQR based PID controller for integrating processes considering robustness/performance tradeoff.

    Science.gov (United States)

    Srivastava, Saurabh; Pandit, V S

    2017-11-01

    This paper focuses on the analytical design of a Proportional Integral and Derivative (PID) controller together with a unique set point filter that makes the overall control system a Two-Degree-of-Freedom (2-Dof) scheme for integrating processes with time delay. The PID controller tuning is based on the Linear Quadratic Regulator (LQR) using a dominant pole placement approach to obtain good regulatory response. The set point filter is designed from the calculated PID parameters and a single filter time constant (λ) to precisely control the servo response. The effectiveness of the proposed methodology is demonstrated through a series of illustrative examples using real industrial integrating process models. The whole range of PID parameters is obtained for each case as a tradeoff between the robustness of the closed loop system, measured in terms of Maximum Sensitivity (Ms), and the load disturbance performance, measured in terms of the Integral of Absolute Error (IAE). Results show improved closed loop response in terms of regulatory and servo responses with less control effort when compared with recent PID tuning methods for integrating systems. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
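    As a minimal illustration of the 2-Dof structure described above, the sketch below simulates a PID controller with a first-order set point filter acting on an integrating-plus-dead-time process and reports the IAE. The process and controller parameters are arbitrary illustrative values, not the LQR-based tuning from the paper.

```python
import numpy as np

# Minimal sketch: PID with a first-order set-point filter on an
# integrating-plus-dead-time process  G(s) = K * exp(-L s) / s.
K, L = 1.0, 2.0                  # process gain and dead time (assumed)
Kp, Ki, Kd = 0.35, 0.02, 0.8     # PID gains (assumed, illustration only)
lam = 4.0                        # set-point filter time constant (lambda)

dt, t_end = 0.01, 120.0
n = int(t_end / dt)
delay = int(L / dt)

y = 0.0                          # process output
integ = 0.0                      # integral of the error
y_prev = 0.0
r_f = 0.0                        # filtered set point
u_buffer = [0.0] * (delay + 1)   # dead-time buffer for the control signal
iae = 0.0

for k in range(n):
    t = k * dt
    r = 1.0 if t >= 1.0 else 0.0         # unit step set point at t = 1 s
    d = 0.05 if t >= 60.0 else 0.0       # load disturbance at t = 60 s

    r_f += dt * (r - r_f) / lam          # first-order set-point filter
    e = r_f - y
    integ += e * dt
    dy = (y - y_prev) / dt               # derivative acts on the measurement
    u = Kp * e + Ki * integ - Kd * dy

    u_buffer.append(u)
    u_delayed = u_buffer.pop(0)          # apply the dead time L to the input

    y_prev = y
    y += dt * K * (u_delayed + d)        # integrating process dynamics
    iae += abs(r - y) * dt

print(f"IAE over the run: {iae:.2f}")
```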

  5. Benefits of using an optimization methodology for identifying robust process integration investments under uncertainty-A pulp mill example

    International Nuclear Information System (INIS)

    Svensson, Elin; Berntsson, Thore; Stroemberg, Ann-Brith

    2009-01-01

    This paper presents a case study on the optimization of process integration investments in a pulp mill considering uncertainties in future electricity and biofuel prices and CO2 emissions charges. The work follows the methodology described in Svensson et al. [Svensson, E., Berntsson, T., Stroemberg, A.-B., Patriksson, M., 2008b. An optimization methodology for identifying robust process integration investments under uncertainty. Energy Policy, in press, (doi:10.1016/j.enpol.2008.10.023)] where a scenario-based approach is proposed for the modelling of uncertainties. The results show that the proposed methodology provides a way to handle the time dependence and the uncertainties of the parameters. For the analyzed case, a robust solution is found which turns out to be a combination of two opposing investment strategies. The difference between short-term and strategic views for the investment decision is analyzed and it is found that uncertainties are increasingly important to account for as a more strategic view is employed. Furthermore, the results imply that the obvious effect of policy instruments aimed at decreasing CO2 emissions is, in applications like this, an increased profitability for all energy efficiency investments, and not as much a shift between different alternatives.

  6. Benefits of using an optimization methodology for identifying robust process integration investments under uncertainty-A pulp mill example

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Elin [Department of Energy and Environment, Division of Heat and Power Technology, Chalmers University of Technology, SE-412 96 Goeteborg (Sweden)], E-mail: elin.svensson@chalmers.se; Berntsson, Thore [Department of Energy and Environment, Division of Heat and Power Technology, Chalmers University of Technology, SE-412 96 Goeteborg (Sweden); Stroemberg, Ann-Brith [Fraunhofer-Chalmers Research Centre for Industrial Mathematics, Chalmers Science Park, SE-412 88 Gothenburg (Sweden)

    2009-03-15

    This paper presents a case study on the optimization of process integration investments in a pulp mill considering uncertainties in future electricity and biofuel prices and CO2 emissions charges. The work follows the methodology described in Svensson et al. [Svensson, E., Berntsson, T., Stroemberg, A.-B., Patriksson, M., 2008b. An optimization methodology for identifying robust process integration investments under uncertainty. Energy Policy, in press, (doi:10.1016/j.enpol.2008.10.023)] where a scenario-based approach is proposed for the modelling of uncertainties. The results show that the proposed methodology provides a way to handle the time dependence and the uncertainties of the parameters. For the analyzed case, a robust solution is found which turns out to be a combination of two opposing investment strategies. The difference between short-term and strategic views for the investment decision is analyzed and it is found that uncertainties are increasingly important to account for as a more strategic view is employed. Furthermore, the results imply that the obvious effect of policy instruments aimed at decreasing CO2 emissions is, in applications like this, an increased profitability for all energy efficiency investments, and not as much a shift between different alternatives.

  7. Benefits of using an optimization methodology for identifying robust process integration investments under uncertainty. A pulp mill example

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Elin; Berntsson, Thore [Department of Energy and Environment, Division of Heat and Power Technology, Chalmers University of Technology, SE-412 96 Goeteborg (Sweden); Stroemberg, Ann-Brith [Fraunhofer-Chalmers Research Centre for Industrial Mathematics, Chalmers Science Park, SE-412 88 Gothenburg (Sweden)

    2009-03-15

    This paper presents a case study on the optimization of process integration investments in a pulp mill considering uncertainties in future electricity and biofuel prices and CO2 emissions charges. The work follows the methodology described in Svensson et al. [Svensson, E., Berntsson, T., Stroemberg, A.-B., Patriksson, M., 2008b. An optimization methodology for identifying robust process integration investments under uncertainty. Energy Policy, in press, doi:10.1016/j.enpol.2008.10.023] where a scenario-based approach is proposed for the modelling of uncertainties. The results show that the proposed methodology provides a way to handle the time dependence and the uncertainties of the parameters. For the analyzed case, a robust solution is found which turns out to be a combination of two opposing investment strategies. The difference between short-term and strategic views for the investment decision is analyzed and it is found that uncertainties are increasingly important to account for as a more strategic view is employed. Furthermore, the results imply that the obvious effect of policy instruments aimed at decreasing CO2 emissions is, in applications like this, an increased profitability for all energy efficiency investments, and not as much a shift between different alternatives. (author)

  8. Integrated process development-a robust, rapid method for inclusion body harvesting and processing at the microscale level.

    Science.gov (United States)

    Walther, Cornelia; Kellner, Martin; Berkemeyer, Matthias; Brocard, Cécile; Dürauer, Astrid

    2017-10-21

    Escherichia coli stores large amounts of highly pure product within inclusion bodies (IBs). To take advantage of this beneficial feature, the first step to optimal product recovery after cell disintegration is efficient IB preparation. This step is also important in evaluating upstream optimization and process development, due to the potential impact of bioprocessing conditions on product quality and on the nanoscale properties of IBs. Proper IB preparation is often neglected, because laboratory-scale methods require large amounts of material and labor. Miniaturization and parallelization can accelerate analyses of individual processing steps and provide a deeper understanding of up- and downstream processing interdependencies. Consequently, reproducible, predictive microscale methods are in demand. In the present study, we complemented a recently established high-throughput cell disruption method with a microscale method for preparing purified IBs. This preparation provided results comparable to laboratory-scale IB processing regarding impurity depletion and product loss. Furthermore, with this method, we performed a "design of experiments" study to demonstrate the influence of fermentation conditions on the performance of subsequent downstream steps and product quality. We showed that this approach provided a 300-fold reduction in material consumption for each fermentation condition and a 24-fold reduction in processing time for 24 samples.
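    The "design of experiments" step mentioned above can be set up with a simple factorial grid. The sketch below generates such a grid in Python; the factor names and levels are purely hypothetical and stand in for whatever fermentation conditions are screened in practice.

```python
# Minimal sketch (hypothetical factors and levels): generating a small
# full-factorial "design of experiments" grid over fermentation conditions,
# as one might do before running microscale IB preparations in parallel.
from itertools import product

factors = {
    "induction_temperature_C": [25, 30, 37],
    "inducer_concentration_mM": [0.1, 0.5, 1.0],
    "feed_rate_relative": [0.5, 1.0],
}

runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(f"{len(runs)} fermentation conditions to test in parallel:")
for i, run in enumerate(runs, start=1):
    print(f"  run {i:2d}: {run}")
```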

  9. Integrated hot-melt extrusion - injection molding continuous tablet manufacturing platform: Effects of critical process parameters and formulation attributes on product robustness and dimensional stability.

    Science.gov (United States)

    Desai, Parind M; Hogan, Rachael C; Brancazio, David; Puri, Vibha; Jensen, Keith D; Chun, Jung-Hoon; Myerson, Allan S; Trout, Bernhardt L

    2017-10-05

    This study provides a framework for robust tablet development using an integrated hot-melt extrusion-injection molding (IM) continuous manufacturing platform. Griseofulvin, maltodextrin, xylitol and lactose were employed as drug, carrier, plasticizer and reinforcing agent respectively. A pre-blended drug-excipient mixture was fed from a loss-in-weight feeder to a twin-screw extruder. The extrudate was subsequently injected directly into the integrated IM unit and molded into tablets. Tablets were stored in different storage conditions up to 20 weeks to monitor physical stability and were evaluated by polarized light microscopy, DSC, SEM, XRD and dissolution analysis. Optimized injection pressure provided robust tablet formulations. Tablets manufactured at low and high injection pressures exhibited the flaws of sink marks and flashing respectively. Higher solidification temperature during IM process reduced the thermal induced residual stress and prevented chipping and cracking issues. Polarized light microscopy revealed a homogeneous dispersion of crystalline griseofulvin in an amorphous matrix. DSC underpinned the effect of high tablet residual moisture on maltodextrin-xylitol phase separation that resulted in dimensional instability. Tablets with low residual moisture demonstrated long term dimensional stability. This study serves as a model for IM tablet formulations for mechanistic understanding of critical process parameters and formulation attributes required for optimal product performance. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Robust Reliability or reliable robustness? - Integrated consideration of robustness and reliability aspects

    DEFF Research Database (Denmark)

    Kemmler, S.; Eifler, Tobias; Bertsche, B.

    2015-01-01

    products are and vice versa. For a comprehensive understanding and to use existing synergies between both domains, this paper discusses the basic principles of Reliability- and Robust Design theory. The development of a comprehensive model will enable an integrated consideration of both domains...

  11. Assessment of Process Robustness for Mass Customization

    DEFF Research Database (Denmark)

    Nielsen, Kjeld; Brunø, Thomas Ditlev

    2013-01-01

    robustness and their capability to develop it. Through literature study and analysis of robust process design characteristics a number of metrics are described which can be used for assessment. The metrics are evaluated and analyzed to be applied as KPI’s to help MC companies prioritize efforts in business...

  12. Integrated robust controller for vehicle path following

    Energy Technology Data Exchange (ETDEWEB)

    Mashadi, Behrooz; Ahmadizadeh, Pouyan, E-mail: p-ahmadizadeh@iust.ac.ir; Majidi, Majid, E-mail: m-majidi@iust.ac.ir [Iran University of Science and Technology, School of Automotive Engineering (Iran, Islamic Republic of); Mahmoodi-Kaleybar, Mehdi, E-mail: m-mahmoodi-k@iust.ac.ir [Iran University of Science and Technology, School of Mechanical Engineering (Iran, Islamic Republic of)

    2015-02-15

    The design of an integrated 4WS+DYC control system to guide a vehicle on a desired path is presented. The lateral dynamics of the path-following vehicle is formulated by considering the important parameters. To reduce the effect of uncertainties in vehicle parameters, a robust controller is designed based on a μ-synthesis approach. Numerical simulations are performed using a nonlinear vehicle model in the MATLAB environment in order to investigate the effectiveness of the designed controller. Results of the simulations show that the controller has a profound ability to make the vehicle track the desired path in the presence of uncertainties.

  13. Integrated robust controller for vehicle path following

    International Nuclear Information System (INIS)

    Mashadi, Behrooz; Ahmadizadeh, Pouyan; Majidi, Majid; Mahmoodi-Kaleybar, Mehdi

    2015-01-01

    The design of an integrated 4WS+DYC control system to guide a vehicle on a desired path is presented. The lateral dynamics of the path-following vehicle is formulated by considering the important parameters. To reduce the effect of uncertainties in vehicle parameters, a robust controller is designed based on a μ-synthesis approach. Numerical simulations are performed using a nonlinear vehicle model in the MATLAB environment in order to investigate the effectiveness of the designed controller. Results of the simulations show that the controller has a profound ability to make the vehicle track the desired path in the presence of uncertainties.

  14. Hybrid robust model based on an improved functional link neural network integrating with partial least square (IFLNN-PLS) and its application to predicting key process variables.

    Science.gov (United States)

    He, Yan-Lin; Xu, Yuan; Geng, Zhi-Qiang; Zhu, Qun-Xiong

    2016-03-01

    In this paper, a hybrid robust model based on an improved functional link neural network integrated with partial least squares (IFLNN-PLS) is proposed. Firstly, an improved functional link neural network with a small norm of expanded weights and high input-output correlation (SNEWHIOC-FLNN) was proposed for enhancing the generalization performance of the FLNN. Unlike the traditional FLNN, the expanded variables of the original inputs are not directly used as the inputs in the proposed SNEWHIOC-FLNN model. The original inputs are attached to some small norm of expanded weights. As a result, the correlation coefficient between some of the expanded variables and the outputs is enhanced. The larger the correlation coefficient is, the more relevant the expanded variables tend to be. In the end, the expanded variables with larger correlation coefficients are selected as the inputs to improve the performance of the traditional FLNN. In order to test the proposed SNEWHIOC-FLNN model, three UCI (University of California, Irvine) regression datasets named Housing, Concrete Compressive Strength (CCS), and Yacht Hydrodynamics (YHD) were selected. Then a hybrid model based on the improved FLNN integrated with partial least squares (IFLNN-PLS) was built. In the IFLNN-PLS model, the connection weights are calculated using the partial least squares method rather than the error back-propagation algorithm. Lastly, IFLNN-PLS was developed as an intelligent measurement model for accurately predicting the key variables in the Purified Terephthalic Acid (PTA) process and the High Density Polyethylene (HDPE) process. Simulation results illustrated that IFLNN-PLS could significantly improve the prediction performance. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
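    A rough, self-contained sketch of the functional-link idea is given below: inputs are expanded with trigonometric basis functions, the expanded features most correlated with the target are kept, and the output weights are then fitted by ordinary least squares as a simple stand-in for the partial least squares step. The data, expansion order and selection size are all illustrative assumptions, not the authors' configuration.

```python
import numpy as np

# Minimal sketch of a functional-link network with correlation-based feature
# selection (synthetic data; not the authors' exact IFLNN-PLS model).
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 3))
y = np.sin(np.pi * X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.standard_normal(200)

def functional_link_expand(X: np.ndarray) -> np.ndarray:
    """Trigonometric functional-link expansion of each input column."""
    parts = [X]
    for k in (1, 2):
        parts.append(np.sin(k * np.pi * X))
        parts.append(np.cos(k * np.pi * X))
    return np.hstack(parts)

Phi = functional_link_expand(X)

# Rank expanded features by absolute correlation with the output, keep the best.
corr = np.array([abs(np.corrcoef(Phi[:, j], y)[0, 1]) for j in range(Phi.shape[1])])
keep = np.argsort(corr)[::-1][:8]
Phi_sel = np.hstack([Phi[:, keep], np.ones((len(y), 1))])   # add a bias column

# Ordinary least squares stands in here for the PLS weight calculation.
w, *_ = np.linalg.lstsq(Phi_sel, y, rcond=None)
pred = Phi_sel @ w
rmse = np.sqrt(np.mean((pred - y) ** 2))
print(f"kept features: {keep.tolist()}, training RMSE: {rmse:.3f}")
```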

  15. Robustness studies on coal gasification process variables

    African Journals Online (AJOL)

    … coal before feeding to the gasification process [1] … to-control variables will make up the terms in the response surface model for the … Montgomery (1999) explained that all the Taguchi engineering objectives for a robust … software [3].

  16. Robustness of multimodal processes itineraries

    DEFF Research Database (Denmark)

    Bocewicz, G.; Banaszak, Z.; Nielsen, Izabela Ewa

    2013-01-01

    This paper concerns multimodal transport systems (MTS) represented by supernetworks in which several unimodal networks are connected by transfer links, and focuses on the scheduling problems encountered in these systems. Assuming unimodal networks are modeled as cyclic lines, i.e. the routes … itineraries for an assumed (O-D) trip. Since the itinerary planning problem constitutes a common routing and scheduling decision faced by travelers, the main question concerns itinerary replanning, and particularly a method aimed at prototyping mode sequences and path selections. The declarative model of the multimodal-process-driven itinerary planning problem is our main contribution. Illustrative examples providing alternative itineraries in some cases of MTS malfunction are presented.

  17. Robust control charts in statistical process control

    NARCIS (Netherlands)

    Nazir, H.Z.

    2014-01-01

    The presence of outliers and contaminations in the output of the process highly affects the performance of the design structures of commonly used control charts and hence makes them of less practical use. One of the solutions to deal with this problem is to use control charts which are robust to such outliers and contaminations.

  18. Stochastic simulation and robust design optimization of integrated photonic filters

    Directory of Open Access Journals (Sweden)

    Weng Tsui-Wei

    2016-07-01

    Full Text Available Manufacturing variations are becoming an unavoidable issue in modern fabrication processes; therefore, it is crucial to be able to include stochastic uncertainties in the design phase. In this paper, integrated photonic coupled ring resonator filters are considered as an example of significant interest. The sparsity structure in photonic circuits is exploited to construct a sparse combined generalized polynomial chaos model, which is then used to analyze related statistics and perform robust design optimization. Simulation results show that the optimized circuits are more robust to fabrication process variations and achieve a reduction of 11%–35% in the mean square errors of the 3 dB bandwidth compared to unoptimized nominal designs.

  19. Robust processing of mining subsidence monitoring data

    Energy Technology Data Exchange (ETDEWEB)

    Mingzhong, Wang; Guogang, Huang [Pingdingshan Mining Bureau (China); Yunjia, Wang; Guogangli, [China Univ. of Mining and Technology, Xuzhou (China)

    1997-12-31

    Since China began research on mining subsidence in the 1950s, more than one thousand observation lines have been surveyed. Yet monitoring data sometimes contain quite a lot of outliers because of the limitations of the observations and of the geological and mining conditions. In China, nowadays, the processing of mining subsidence monitoring data is based on the principle of least squares. This can produce lower accuracy, less reliability, or even errors. For the reasons given above, the authors, according to the actual situation in China, have carried out research on the robust processing of mining subsidence monitoring data with respect to obtaining prediction parameters. The authors have derived the related formulas, designed computational programmes, performed a great quantity of calculation and simulation, and achieved good results. (orig.)
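    The contrast between least squares and robust estimation that motivates this work can be illustrated with a generic sketch: fit a line to synthetic monitoring data containing gross errors, once by ordinary least squares and once by an iteratively reweighted Huber-type fit. This is a textbook robust-estimation illustration, not the authors' algorithm or data.

```python
import numpy as np

# Minimal sketch (synthetic data, generic method): ordinary least squares vs.
# an iteratively reweighted Huber-type robust fit in the presence of outliers.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 40)
y = 2.0 + 0.8 * x + 0.2 * rng.standard_normal(x.size)
y[[5, 17, 30]] += np.array([4.0, -5.0, 6.0])       # gross errors (outliers)

A = np.column_stack([np.ones_like(x), x])

def huber_fit(A, y, k=1.345, iters=20):
    """Iteratively reweighted least squares with Huber weights."""
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    for _ in range(iters):
        r = y - A @ beta
        s = np.median(np.abs(r)) / 0.6745 + 1e-12   # robust scale estimate
        w = np.clip(k * s / np.maximum(np.abs(r), 1e-12), None, 1.0)
        sw = np.sqrt(w)
        beta, *_ = np.linalg.lstsq(sw[:, None] * A, sw * y, rcond=None)
    return beta

beta_ls, *_ = np.linalg.lstsq(A, y, rcond=None)
beta_rob = huber_fit(A, y)
print("least squares  intercept/slope:", np.round(beta_ls, 3))
print("robust (Huber) intercept/slope:", np.round(beta_rob, 3))
```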

  20. Robust processing of mining subsidence monitoring data

    Energy Technology Data Exchange (ETDEWEB)

    Wang Mingzhong; Huang Guogang [Pingdingshan Mining Bureau (China); Wang Yunjia; Guogangli [China Univ. of Mining and Technology, Xuzhou (China)

    1996-12-31

    Since China began research on mining subsidence in the 1950s, more than one thousand observation lines have been surveyed. Yet monitoring data sometimes contain quite a lot of outliers because of the limitations of the observations and of the geological and mining conditions. In China, nowadays, the processing of mining subsidence monitoring data is based on the principle of least squares. This can produce lower accuracy, less reliability, or even errors. For the reasons given above, the authors, according to the actual situation in China, have carried out research on the robust processing of mining subsidence monitoring data with respect to obtaining prediction parameters. The authors have derived the related formulas, designed computational programmes, performed a great quantity of calculation and simulation, and achieved good results. (orig.)

  1. Robust MPC with Output Feedback of Integrating Systems

    Directory of Open Access Journals (Sweden)

    J. M. Perez

    2012-01-01

    Full Text Available In this work, a new contribution to the design of a robust MPC with output feedback, input constraints, and an uncertain model is presented. Multivariable predictive controllers have been used in industry to reduce the variability of the process output and to allow operation of the system near the constraints, where the optimum operating point is usually located. For this reason, new controllers have been developed with the objective of achieving better performance, a simpler control structure, and robustness with respect to model uncertainty. In this work, a model predictive controller based on a non-minimal state-space model, in which the state is perfectly known, is proposed. It is an infinite prediction horizon controller, and it is assumed that there is uncertainty in the stable part of the model, which may also include integrating modes that are frequently present in process plants. The method is illustrated with a simulation example from the process industry using linear models based on a real process.

  2. Adaptive integral robust control and application to electromechanical servo systems.

    Science.gov (United States)

    Deng, Wenxiang; Yao, Jianyong

    2017-03-01

    This paper proposes a continuous adaptive integral robust control with robust integral of the sign of the error (RISE) feedback for a class of uncertain nonlinear systems, in which the RISE feedback gain is adapted online to ensure robustness against disturbances without prior knowledge of the bound of the additive disturbances. In addition, an adaptive compensation term integrated with the proposed adaptive RISE feedback is constructed to further reduce design conservatism when the system also contains parametric uncertainties. Lyapunov analysis reveals that the proposed controllers guarantee that the tracking errors asymptotically converge to zero with continuous control efforts. To illustrate the high-performance nature of the developed controllers, numerical simulations are provided. Finally, an application case of an actual electromechanical servo system driven by a motor is also studied, with some specific design considerations, and comparative experimental results are obtained to verify the effectiveness of the proposed controllers. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  3. The Integrated Renovation Process

    DEFF Research Database (Denmark)

    Galiotto, Nicolas; Heiselberg, Per; Knudstrup, Mary-Ann

    The Integrated Renovation Process (IRP) is a user customized methodology based on judiciously selected constructivist and interactive multi-criteria decision making methods (Galiotto, Heiselberg, & Knudstrup, 2014 (expected)). When applied for home renovation, the Integrated Renovation Process...

  4. Silicon integrated circuit process

    International Nuclear Information System (INIS)

    Lee, Jong Duck

    1985-12-01

    This book introduces the silicon integrated circuit process. It is composed of seven parts: the oxidation process; the diffusion process; the ion implantation process, covering ion implantation equipment, damage, annealing, and the influence on the manufacture of integrated circuits and devices; the chemical vapor deposition process, such as silicon epitaxy, LPCVD and PECVD; the photolithography process, including sensitizers, spin coating, hard bake, light reflection and related problems, infrared bake, wet etch, dry etch, special etch and etching problems; the metallization process, including metal-silicon contacts, the aluminum process and the reliability of aluminum; and the test process.

  5. Silicon integrated circuit process

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Duck

    1985-12-15

    This book introduces the silicon integrated circuit process. It is composed of seven parts: the oxidation process; the diffusion process; the ion implantation process, covering ion implantation equipment, damage, annealing, and the influence on the manufacture of integrated circuits and devices; the chemical vapor deposition process, such as silicon epitaxy, LPCVD and PECVD; the photolithography process, including sensitizers, spin coating, hard bake, light reflection and related problems, infrared bake, wet etch, dry etch, special etch and etching problems; the metallization process, including metal-silicon contacts, the aluminum process and the reliability of aluminum; and the test process.

  6. Robust digital processing of speech signals

    CERN Document Server

    Kovacevic, Branko; Veinović, Mladen; Marković, Milan

    2017-01-01

    This book focuses on speech signal phenomena, presenting a robustification of the usual speech generation models with regard to the presumed types of excitation signals, which is equivalent to the introduction of a class of nonlinear models and the corresponding criterion functions for parameter estimation. Compared to the general class of nonlinear models, such as various neural networks, these models possess good properties of controlled complexity, the option of working in “online” mode, as well as a low information volume for efficient speech encoding and transmission. Providing comprehensive insights, the book is based on the authors’ research, which has already been published, supplemented by additional texts discussing general considerations of speech modeling, linear predictive analysis and robust parameter estimation.

  7. Integrated Renovation Process

    DEFF Research Database (Denmark)

    Galiotto, Nicolas; Heiselberg, Per; Knudstrup, Mary-Ann

    2016-01-01

    renovation to be overcome. The homeowners were better integrated and their preferences and immaterial values were better taken into account. To keep the decision-making process economically viable and timely, the process as known today still needs to be improved, and new tools need to be developed....... This paper presents a new scheme: the integrated renovation process. One successful case study is introduced, and recommendations for future developments needed in the field are provided....

  8. Deep Coupled Integration of CSAC and GNSS for Robust PNT

    Directory of Open Access Journals (Sweden)

    Lin Ma

    2015-09-01

    Full Text Available Global navigation satellite systems (GNSS are the most widely used positioning, navigation, and timing (PNT technology. However, a GNSS cannot provide effective PNT services in physical blocks, such as in a natural canyon, canyon city, underground, underwater, and indoors. With the development of micro-electromechanical system (MEMS technology, the chip scale atomic clock (CSAC gradually matures, and performance is constantly improved. A deep coupled integration of CSAC and GNSS is explored in this thesis to enhance PNT robustness. “Clock coasting” of CSAC provides time synchronized with GNSS and optimizes navigation equations. However, errors of clock coasting increase over time and can be corrected by GNSS time, which is stable but noisy. In this paper, weighted linear optimal estimation algorithm is used for CSAC-aided GNSS, while Kalman filter is used for GNSS-corrected CSAC. Simulations of the model are conducted, and field tests are carried out. Dilution of precision can be improved by integration. Integration is more accurate than traditional GNSS. When only three satellites are visible, the integration still works, whereas the traditional method fails. The deep coupled integration of CSAC and GNSS can improve the accuracy, reliability, and availability of PNT.

  9. Deep Coupled Integration of CSAC and GNSS for Robust PNT.

    Science.gov (United States)

    Ma, Lin; You, Zheng; Li, Bin; Zhou, Bin; Han, Runqi

    2015-09-11

    Global navigation satellite systems (GNSS) are the most widely used positioning, navigation, and timing (PNT) technology. However, a GNSS cannot provide effective PNT services in physical blocks, such as in a natural canyon, canyon city, underground, underwater, and indoors. With the development of micro-electromechanical system (MEMS) technology, the chip scale atomic clock (CSAC) gradually matures, and performance is constantly improved. A deep coupled integration of CSAC and GNSS is explored in this thesis to enhance PNT robustness. "Clock coasting" of CSAC provides time synchronized with GNSS and optimizes navigation equations. However, errors of clock coasting increase over time and can be corrected by GNSS time, which is stable but noisy. In this paper, weighted linear optimal estimation algorithm is used for CSAC-aided GNSS, while Kalman filter is used for GNSS-corrected CSAC. Simulations of the model are conducted, and field tests are carried out. Dilution of precision can be improved by integration. Integration is more accurate than traditional GNSS. When only three satellites are visible, the integration still works, whereas the traditional method fails. The deep coupled integration of CSAC and GNSS can improve the accuracy, reliability, and availability of PNT.
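    The complementary character of the two clocks described above (CSAC coasting drifts slowly, GNSS time is noisy but stable) is exactly the situation a small Kalman filter handles well. The sketch below tracks clock bias and drift from simulated noisy GNSS time measurements; all noise levels and the simulated clock behaviour are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Minimal sketch: a two-state Kalman filter tracking CSAC clock bias and
# drift, with the noisy-but-stable GNSS time difference as the measurement.
rng = np.random.default_rng(2)
dt = 1.0                                    # update interval, s
F = np.array([[1.0, dt], [0.0, 1.0]])       # bias/drift state transition
H = np.array([[1.0, 0.0]])                  # GNSS observes the clock bias only
Q = np.diag([1e-18, 1e-20])                 # process noise (assumed)
R = np.array([[1e-15]])                     # GNSS time-measurement noise (assumed)

x = np.zeros(2)                             # estimated [bias (s), drift (s/s)]
P = np.diag([1e-12, 1e-16])

true_bias, true_drift = 0.0, 5e-10          # simulated CSAC behaviour (assumed)
for k in range(600):
    true_bias += true_drift * dt + np.sqrt(1e-18) * rng.standard_normal()
    z = true_bias + np.sqrt(R[0, 0]) * rng.standard_normal()   # noisy GNSS time

    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ (np.array([z]) - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P

print(f"estimated bias {x[0]:.3e} s (true {true_bias:.3e} s), "
      f"estimated drift {x[1]:.2e} s/s (true {true_drift:.2e} s/s)")
```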

  10. Robust collaborative process interactions under system crash and network failures

    NARCIS (Netherlands)

    Wang, Lei; Wombacher, Andreas; Ferreira Pires, Luis; van Sinderen, Marten J.; Chi, Chihung

    2013-01-01

    With the possibility of system crashes and network failures, the design of robust client/server interactions for collaborative process execution is a challenge. If a business process changes its state, it sends messages to the relevant processes to inform about this change. However, server crashes

  11. Mechanisms and coherences of robust design methodology: a robust design process proposal

    DEFF Research Database (Denmark)

    Göhler, Simon Moritz; Christensen, Martin Ebro; Howard, Thomas J.

    2016-01-01

    Although robust design (RD) methods are recognised as a way of developing mechanical products with consistent and predictable performance and quality, they do not experience widespread success in industry. One reason being the lack of a coherent RD process (RDP). In this contribution we analyse...

  12. Robust adaptive multichannel SAR processing based on covariance matrix reconstruction

    Science.gov (United States)

    Tan, Zhen-ya; He, Feng

    2018-04-01

    With the combination of digital beamforming (DBF) processing, multichannel synthetic aperture radar (SAR) systems with multiple azimuth channels show great promise for high-resolution and wide-swath imaging, whereas conventional processing methods do not take the nonuniformity of the scattering coefficient into consideration. This paper proposes a robust adaptive multichannel SAR processing method which first utilizes the Capon spatial spectrum estimator to obtain the spatial spectrum distribution over all ambiguous directions, and then reconstructs the interference-plus-noise covariance matrix from its definition to obtain the multichannel SAR processing filter. The novel method improves processing performance under nonuniform scattering coefficients and is robust against array errors. Experiments with real measured data demonstrate the effectiveness and robustness of the proposed method.
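    A generic numpy sketch of the two steps named in the abstract (Capon spectrum estimation, then covariance reconstruction outside the look direction) is given below for a simulated uniform linear array. The array geometry, signal angles and powers are illustrative assumptions, and the example is deliberately simplified relative to the multichannel SAR setting.

```python
import numpy as np

# Minimal, generic sketch: Capon spatial spectrum + interference-plus-noise
# covariance reconstruction for a simulated uniform linear array.
rng = np.random.default_rng(3)
M, snapshots = 8, 400
d = 0.5                                    # element spacing in wavelengths

def steering(theta_deg):
    theta = np.deg2rad(theta_deg)
    return np.exp(2j * np.pi * d * np.arange(M) * np.sin(theta))

# Simulated snapshots: desired signal at 0 deg, interference at 30 deg, noise.
signals = [(0.0, 1.0), (30.0, 5.0)]
X = np.zeros((M, snapshots), dtype=complex)
for angle, amp in signals:
    wave = amp * (rng.standard_normal(snapshots) + 1j * rng.standard_normal(snapshots))
    X += np.outer(steering(angle), wave)
X += 0.1 * (rng.standard_normal((M, snapshots)) + 1j * rng.standard_normal((M, snapshots)))

R = X @ X.conj().T / snapshots
R_inv = np.linalg.inv(R)

# Capon spectrum over an angle grid; accumulate covariance outside the look sector.
look, sector = 0.0, 5.0
grid = np.arange(-90.0, 90.0, 0.5)
R_in = np.zeros((M, M), dtype=complex)
for ang in grid:
    a = steering(ang)
    p = 1.0 / np.real(a.conj() @ R_inv @ a)        # Capon power at this angle
    if abs(ang - look) > sector:                   # exclude the look direction
        R_in += p * np.outer(a, a.conj())
R_in += 1e-6 * np.eye(M)                           # diagonal loading for stability

a0 = steering(look)
w = np.linalg.solve(R_in, a0)
w /= a0.conj() @ w                                 # distortionless response at 0 deg
print("output interference-plus-noise power:", np.real(w.conj() @ R_in @ w))
```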

  13. Multivariable robust control of an integrated nuclear power reactor

    Directory of Open Access Journals (Sweden)

    A. Etchepareborda

    2002-12-01

    Full Text Available The design of the main control system of the CAREM nuclear power plant is presented. This plant is an inherently safe low-power nuclear reactor with natural convection on the primary coolant circuit and is self-pressurized with a steam dome on the top of the pressure vessel (PV. It is an integrated reactor as the whole primary coolant circuit is within the PV. The primary circuit transports the heat to the secondary circuit through once-through steam generators (SG. There is a feedwater valve at the inlet of the SG and a turbine valve at the outlet of the SG. The manipulated variables are the aperture of these valves and the reactivity of the control rods. The control target is to regulate the primary and secondary pressures and to monitor steam flow reference ramps on a range of nominal flow from 100% to 40%. The requirements for the control system are robust stability, low-order simple controllers and transient/permanent error bounding. The controller design is based on a detailed RETRAN plant model, from which linear perturbed open-loop dynamic models at different powers are identified. Two low-order nominal models with their associated uncertainties are chosen for two different power ranges. Robust controllers with acceptable performances are designed for each range. Numerical optimization based on the loop-shaping method is used for the controller design. The designed controllers are implemented in the RETRAN model and tested in simulations achieving successful results.

  14. Robust output LQ optimal control via integral sliding modes

    CERN Document Server

    Fridman, Leonid; Bejarano, Francisco Javier

    2014-01-01

    Featuring original research from well-known experts in the field of sliding mode control, this monograph presents new design schemes for implementing LQ control solutions in situations where the system output is the only information provided about the state of the plant. This new design works under the restriction of matched disturbances without losing its desirable features. On the cutting edge of optimal control research, Robust Output LQ Optimal Control via Integral Sliding Modes is an excellent resource for both graduate students and professionals involved in linear systems, optimal control, observation of systems with unknown inputs, and automatization. In the theory of optimal control, the linear quadratic (LQ) optimal problem plays an important role due to its physical meaning, and its solution is easily given by an algebraic Riccati equation. This solution turns out to be restrictive, however, because of two assumptions: the system must be free from disturbances and the entire state vector must be known.

  15. Integrating Globality and Locality for Robust Representation Based Classification

    Directory of Open Access Journals (Sweden)

    Zheng Zhang

    2014-01-01

    Full Text Available The representation based classification method (RBCM) has shown huge potential for face recognition since it first emerged. The linear regression classification (LRC) method and the collaborative representation classification (CRC) method are two well-known RBCMs. LRC and CRC exploit the training samples of each class and all the training samples, respectively, to represent the testing sample, and subsequently conduct classification on the basis of the representation residual. The LRC method can be viewed as a "locality representation" method because it uses only the training samples of each class to represent the testing sample, and it cannot embody the effectiveness of the "globality representation." On the contrary, it seems that the CRC method cannot enjoy the locality benefit of the general RBCM. Thus we propose to integrate CRC and LRC to perform more robust representation based classification. The experimental results on benchmark face databases substantially demonstrate that the proposed method achieves high classification accuracy.
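    The integration idea can be sketched in a few lines of numpy: compute a per-class ("locality", LRC-style) least-squares residual and a per-class share of an all-sample ridge ("globality", CRC-style) residual, then classify by a weighted sum of the two. The synthetic data, ridge parameter and fusion weight are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

# Minimal sketch (synthetic data): combining locality (LRC-style) and
# globality (CRC-style) residuals for representation-based classification.
rng = np.random.default_rng(4)
dim, per_class, classes = 20, 15, 3
centers = rng.standard_normal((classes, dim)) * 3.0
train = {c: centers[c] + rng.standard_normal((per_class, dim)) for c in range(classes)}
test = centers[1] + rng.standard_normal(dim)       # a sample drawn from class 1

def lrc_residual(y, Xc):
    """Residual when y is represented only by class-c training samples."""
    coef, *_ = np.linalg.lstsq(Xc.T, y, rcond=None)
    return np.linalg.norm(y - Xc.T @ coef)

# CRC-style: represent y with all training samples via a ridge-regularised solve.
X_all = np.vstack([train[c] for c in range(classes)])        # shape (N, dim)
lam = 0.01
rho = np.linalg.solve(X_all @ X_all.T + lam * np.eye(len(X_all)), X_all @ test)

scores = {}
alpha = 0.5                                        # fusion weight (assumed)
for c in range(classes):
    mask = np.zeros(len(X_all), dtype=bool)
    mask[c * per_class:(c + 1) * per_class] = True
    crc_res = np.linalg.norm(test - X_all[mask].T @ rho[mask])
    scores[c] = alpha * lrc_residual(test, train[c]) + (1 - alpha) * crc_res

print("per-class combined residuals:", {c: round(v, 3) for c, v in scores.items()})
print("predicted class:", min(scores, key=scores.get))
```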

  16. The Integrated Renovation Process

    DEFF Research Database (Denmark)

    Galiotto, Nicolas

    and constructivist multiple criteria decision-making analysis method is selected for developing the work further. The method is introduced and applied to the renovation of a multi-residential historic building. Furthermore, a new scheme, the Integrated Renovation Process, is presented. Finally, the methodology...... is applied to two single-family homes. In practice, such a scheme allowed most informational barriers to sustainable home renovation to be overcome. The homeowners were better integrated and their preferences and immaterial values were better taken into account. They assimilated the multiple benefits...... to keep the decision making process economically viable and timely, the process still needs to be improved and new tools need to be developed....

  17. Nano integrated circuit process

    International Nuclear Information System (INIS)

    Yoon, Yung Sup

    2004-02-01

    This book contains nine chapters: an introduction to semiconductor chip manufacture; oxidation, covering dry oxidation, wet oxidation, oxidation models and oxide films; diffusion, covering the diffusion process, the diffusion equation, diffusion coefficients and diffusion systems; ion implantation, including ion distribution, channeling, multiple implantation, masking and implantation systems; deposition, including sputtering, CVD and PVD; lithography; wet and dry etching; interconnection and planarization, including metal-silicon contacts, silicides, multilevel metal processes and planarization; and the integrated circuit process, including MOSFET and CMOS.

  18. Nano integrated circuit process

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Yung Sup

    2004-02-15

    This book contains nine chapters: an introduction to semiconductor chip manufacture; oxidation, covering dry oxidation, wet oxidation, oxidation models and oxide films; diffusion, covering the diffusion process, the diffusion equation, diffusion coefficients and diffusion systems; ion implantation, including ion distribution, channeling, multiple implantation, masking and implantation systems; deposition, including sputtering, CVD and PVD; lithography; wet and dry etching; interconnection and planarization, including metal-silicon contacts, silicides, multilevel metal processes and planarization; and the integrated circuit process, including MOSFET and CMOS.

  19. Robust

    DEFF Research Database (Denmark)

    2017-01-01

    Robust – Reflections on Resilient Architecture’, is a scientific publication following the conference of the same name in November of 2017. Researches and PhD-Fellows, associated with the Masters programme: Cultural Heritage, Transformation and Restoration (Transformation), at The Royal Danish...

  20. Integrated biofuels process synthesis

    DEFF Research Database (Denmark)

    Torres-Ortega, Carlo Edgar; Rong, Ben-Guang

    2017-01-01

    Second and third generation bioethanol and biodiesel are more environmentally friendly fuels than gasoline and petrodiesel, and more sustainable than first generation biofuels. However, their production processes are more complex and more expensive. In this chapter, we describe a two-stage synthesis … % used for the bioethanol process), and steam and electricity from combustion (54% used as electricity) in the bioethanol and biodiesel processes. In the second stage, we saved about 5% in equipment costs and 12% in utility costs for bioethanol separation. This dual synthesis methodology, consisting of a top-level screening task followed by a down-level intensification task, proved to be an efficient methodology for integrated biofuel process synthesis. The case study illustrates and provides important insights into the optimal synthesis and intensification of biofuel production processes with the proposed synthesis …

  1. Robust Gaussian Process Regression with a Student-t Likelihood

    NARCIS (Netherlands)

    Jylänki, P.P.; Vanhatalo, J.; Vehtari, A.

    2011-01-01

    This paper considers the robust and efficient implementation of Gaussian process regression with a Student-t observation model, which has a non-log-concave likelihood. The challenge with the Student-t model is the analytically intractable inference, which is why several approximative methods have been proposed.

  2. An Integrated Design Process

    DEFF Research Database (Denmark)

    Petersen, Mads Dines; Knudstrup, Mary-Ann

    2010-01-01

    The present paper is placed in the discussion about how sustainable measures are integrated in the design process by architectural offices. It presents results from interviews with four leading Danish architectural offices working with sustainable architecture and their experiences with it, as well as the requirements they meet in terms of how to approach the design process, especially focused on the early stages like a competition. The interviews focus on their experiences with working in multidisciplinary teams and using digital tools to support their work with sustainable issues. The interviews show … the environmental measures cannot be discarded due to extra costs.

  3. An Integrated Design Process

    DEFF Research Database (Denmark)

    Petersen, Mads Dines; Knudstrup, Mary-Ann

    2010-01-01

    The present paper is placed in the discussion about how sustainable measures are integrated in the design process by architectural offices. It presents results from interviews with four leading Danish architectural offices working with sustainable architecture and their experiences with it, as well as the requirements they meet in terms of how to approach the design process, especially focused on the early stages like a competition. The interviews focus on their experiences with working in multidisciplinary teams and using digital tools to support their work with sustainable issues. The interviews show … the environmental measures cannot be discarded due to extra costs.

  4. The Integrated Renovation Process

    DEFF Research Database (Denmark)

    Galiotto, Nicolas; Heiselberg, Per; Knudstrup, Mary-Ann

    they get more time for the cost optimization and the qualitative analysis of the users’ needs and behaviours. In order to reach a fossil free energy building stock within an acceptable time frame, it is primordial that researchers, politicians and the building industry work hand in hand. Indeed, in order...... to overcome the financial barriers to energy renovation and bring a new type of building experts in the building renovation sector, cost optimization tools for building renovation have been and can be developed but new legislation and politico-economic supports are still much needed. We present in this report...... a new contribution from the research and industry sides and results reached with the newly developed methodology, but without a significant contribution from the politico-economic and legislation sides . The experiences met during application of the Integrated Renovation Process are described...

  5. Buried waste integrated demonstration technology integration process

    International Nuclear Information System (INIS)

    Ferguson, J.S.; Ferguson, J.E.

    1992-04-01

    A Technology Integration Process was developed for the Idaho National Engineering Laboratory (INEL) Buried Waste Integrated Demonstration (BWID) Program to facilitate the transfer of technology and knowledge from industry, universities, and other Federal agencies into the BWID; to successfully transfer demonstrated technology and knowledge from the BWID to industry, universities, and other Federal agencies; and to share demonstrated technologies and knowledge between Integrated Demonstrations and other Department of Energy (DOE) programs spread throughout the DOE Complex. This document also details specific methods and tools for integrating and transferring technologies into or out of the BWID program. The document provides background on the BWID program and its technology development needs, demonstrates the direction of technology transfer, illustrates current processes for this transfer, and lists points of contact for prospective participants in the BWID technology transfer efforts. The Technology Integration Process was prepared to ensure compliance with the requirements of DOE's Office of Technology Development (OTD).

  6. Reactive Robustness and Integrated Approaches for Railway Optimization Problems

    DEFF Research Database (Denmark)

    Haahr, Jørgen Thorlund

    … to absorb or withstand unexpected events such as delays. Making robust plans is central in order to maintain a safe and timely railway operation. This thesis focuses on reactive robustness, i.e., the ability to react once a plan is rendered infeasible in operation due to disruptions. In such time … journeys helps the driver to drive efficiently and enhances robustness in a realistic (dynamic) environment. Four international scientific prizes have been awarded for distinct parts of the research during the course of this PhD project. The first prize was awarded for work during the "2014 RAS Problem Solving Competition", where a freight yard optimization problem was considered. The second, junior (PhD), prize was awarded for the work performed in the "ROADEF/EURO Challenge 2014: Trains don't vanish!", where the planning of rolling stock movements at a large station was considered. An honorable mention …

  7. Integrating robust timetabling in line plan optimization for railway systems

    DEFF Research Database (Denmark)

    Burggraeve, Sofie; Bull, Simon Henry; Vansteenwegen, Pieter

    2017-01-01

    We propose a heuristic algorithm to build a railway line plan from scratch that minimizes passenger travel time and operator cost and for which a feasible and robust timetable exists. A line planning module and a timetabling module work iteratively and interactively. The line planning module creates an initial line plan. The timetabling module evaluates the line plan and identifies a critical line based on minimum buffer times between train pairs. The line planning module then proposes a new line plan in which the time length of the critical line is modified in order to provide more flexibility, but is constrained by limited shunt capacity. While the operator and passenger cost remain close to those of the initially and (for these costs) optimally built line plan, the timetable corresponding to the finally developed robust line plan significantly improves the minimum buffer time, and thus the robustness.

  8. Robust media processing on programmable power-constrained systems

    Science.gov (United States)

    McVeigh, Jeff

    2005-03-01

    To achieve consumer-level quality, media systems must process continuous streams of audio and video data while maintaining exacting tolerances on sampling rate, jitter, synchronization, and latency. While it is relatively straightforward to design fixed-function hardware implementations to satisfy worst-case conditions, there is a growing trend to utilize programmable multi-tasking solutions for media applications. The flexibility of these systems enables support for multiple current and future media formats, which can reduce design costs and time-to-market. This paper provides practical engineering solutions to achieve robust media processing on such systems, with specific attention given to power-constrained platforms. The techniques covered in this article utilize the fundamental concepts of algorithm and software optimization, software/hardware partitioning, stream buffering, hierarchical prioritization, and system resource and power management. A novel enhancement to dynamically adjust processor voltage and frequency based on buffer fullness to reduce system power consumption is examined in detail. The application of these techniques is provided in a case study of a portable video player implementation based on a general-purpose processor running a non real-time operating system that achieves robust playback of synchronized H.264 video and MP3 audio from local storage and streaming over 802.11.
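    The buffer-driven voltage and frequency adjustment described above boils down to a small control policy. The sketch below maps output-buffer fullness to a clock frequency with a hysteresis band; the frequency levels, thresholds and margin are illustrative assumptions rather than the paper's actual policy.

```python
# Minimal sketch of the buffer-fullness idea (thresholds, frequency levels and
# the hysteresis margin are illustrative assumptions, not the paper's policy).
FREQ_LEVELS_MHZ = [200, 400, 600, 800]

def pick_frequency(buffer_fullness: float, current_mhz: int, hysteresis: float = 0.05) -> int:
    """Map output-buffer fullness (0.0-1.0) to a processor frequency.

    A fuller buffer means decoding is ahead of playback, so the clock can be
    lowered to save power; an emptying buffer forces a higher clock. The
    hysteresis margin avoids oscillating between adjacent levels.
    """
    thresholds = [0.75, 0.50, 0.25]          # fullness above which each lower level is allowed
    target = FREQ_LEVELS_MHZ[-1]             # default: fastest level when the buffer is low
    for level, th in zip(FREQ_LEVELS_MHZ, thresholds):
        if buffer_fullness >= th:
            target = level
            break
    # Only step the clock down if the fullness is clearly past the threshold.
    if target < current_mhz and buffer_fullness < thresholds[FREQ_LEVELS_MHZ.index(target)] + hysteresis:
        return current_mhz
    return target

# Example: a draining buffer drives the clock back up.
freq = 800
for fullness in (0.9, 0.8, 0.6, 0.3, 0.1):
    freq = pick_frequency(fullness, freq)
    print(f"buffer {fullness:.0%} full -> {freq} MHz")
```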

  9. Human-Systems Integration Processes

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this project is to baseline a Human-Systems Integration Processes (HSIP) document as a companion to the NASA-STD-3001 and Human Integration Design...

  10. Compactness and robustness: Applications in the solution of integral equations for chemical kinetics and electromagnetic scattering

    Science.gov (United States)

    Zhou, Yajun

    This thesis employs the topological concept of compactness to deduce robust solutions to two integral equations arising from chemistry and physics: the inverse Laplace problem in chemical kinetics and the vector wave scattering problem in dielectric optics. The inverse Laplace problem occurs in the quantitative understanding of biological processes that exhibit complex kinetic behavior: different subpopulations of transition events from the "reactant" state to the "product" state follow distinct reaction rate constants, which results in a weighted superposition of exponential decay modes. Reconstruction of the rate constant distribution from kinetic data is often critical for mechanistic understandings of chemical reactions related to biological macromolecules. We devise a "phase function approach" to recover the probability distribution of rate constants from decay data in the time domain. The robustness (numerical stability) of this reconstruction algorithm builds upon the continuity of the transformations connecting the relevant function spaces that are compact metric spaces. The robust "phase function approach" not only is useful for the analysis of heterogeneous subpopulations of exponential decays within a single transition step, but also is generalizable to the kinetic analysis of complex chemical reactions that involve multiple intermediate steps. A quantitative characterization of the light scattering is central to many meteorological, optical, and medical applications. We give a rigorous treatment to electromagnetic scattering on arbitrarily shaped dielectric media via the Born equation: an integral equation with a strongly singular convolution kernel that corresponds to a non-compact Green operator. By constructing a quadratic polynomial of the Green operator that cancels out the kernel singularity and satisfies the compactness criterion, we reveal the universality of a real resonance mode in dielectric optics. Meanwhile, exploiting the properties of
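    As a simple, generic illustration of the inverse Laplace problem described above (and not of the thesis's phase-function approach), the sketch below recovers a non-negative distribution of rate constants from a noisy two-exponential decay by non-negative least squares on a logarithmic rate grid. The data and grid are synthetic, and because the problem is ill-posed the recovered weights typically spread over neighbouring grid points.

```python
import numpy as np
from scipy.optimize import nnls

# Generic non-negative least-squares reconstruction of a rate-constant
# distribution from a synthetic multi-exponential decay (illustration only).
rng = np.random.default_rng(5)
t = np.linspace(0.01, 10.0, 300)

# True kinetics: rate constants 0.5 and 3.0 1/s with weights 0.3 and 0.7.
decay = 0.3 * np.exp(-0.5 * t) + 0.7 * np.exp(-3.0 * t)
decay += 0.005 * rng.standard_normal(t.size)        # measurement noise

# Discretise the Laplace kernel on a logarithmic grid of candidate rates.
rates = np.logspace(-1.5, 1.5, 60)
A = np.exp(-np.outer(t, rates))                     # A[i, j] = exp(-k_j * t_i)

weights, residual = nnls(A, decay)
top = np.argsort(weights)[::-1][:2]
for j in sorted(top):
    print(f"recovered rate ~{rates[j]:.2f} 1/s with weight {weights[j]:.2f}")
print(f"fit residual norm: {residual:.4f}")
```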

  11. Proscription supports robust perceptual integration by suppression in human visual cortex.

    Science.gov (United States)

    Rideaux, Reuben; Welchman, Andrew E

    2018-04-17

    Perception relies on integrating information within and between the senses, but how does the brain decide which pieces of information should be integrated and which kept separate? Here we demonstrate how proscription can be used to solve this problem: certain neurons respond best to unrealistic combinations of features to provide 'what not' information that drives suppression of unlikely perceptual interpretations. First, we present a model that captures both improved perception when signals are consistent (and thus should be integrated) and robust estimation when signals are conflicting. Second, we test for signatures of proscription in the human brain. We show that concentrations of inhibitory neurotransmitter GABA in a brain region intricately involved in integrating cues (V3B/KO) correlate with robust integration. Finally, we show that perturbing excitation/inhibition impairs integration. These results highlight the role of proscription in robust perception and demonstrate the functional purpose of 'what not' sensors in supporting sensory estimation.

  12. Decentralising Multicell Cooperative Processing: A Novel Robust Framework

    Directory of Open Access Journals (Sweden)

    Gesbert David

    2009-01-01

Full Text Available Multicell cooperative processing (MCP) has the potential to boost spectral efficiency and improve fairness of cellular systems. However, the typical centralised conception of MCP incurs significant infrastructural overheads which increase system costs and hinder the practical implementation of MCP. In Frequency Division Duplexing systems, each user feeds back its Channel State Information (CSI) only to one Base Station (BS). Therefore collaborating BSs need to be interconnected via low-latency backhaul links, and a Control Unit is necessary in order to gather user CSI, perform scheduling, and coordinate transmission. In this paper a new framework is proposed that allows MCP on the downlink while circumventing the aforementioned costly modifications to the existing infrastructure of cellular systems. Each MS feeds back its CSI to all collaborating BSs, and the needed operations of user scheduling and signal processing are performed in a distributed fashion by the involved BSs. Furthermore, the proposed framework is shown to be robust against feedback errors when quantized CSI feedback and linear precoding are employed.

  13. Biochemical Process Development and Integration | Bioenergy | NREL

    Science.gov (United States)

Biochemical process development and integration at NREL spans conversion and separation processes through pilot-scale integrated process development and scale-up. Related publications include accounting for all sugar produced during integrated production of ethanol from lignocellulosic biomass.

  14. Deep Coupled Integration of CSAC and GNSS for Robust PNT

    OpenAIRE

    Ma, Lin; You, Zheng; Li, Bin; Zhou, Bin; Han, Runqi

    2015-01-01

Global navigation satellite systems (GNSS) are the most widely used positioning, navigation, and timing (PNT) technology. However, a GNSS cannot provide effective PNT services in physically blocked environments, such as natural canyons, urban canyons, underground, underwater, and indoors. With the development of micro-electromechanical system (MEMS) technology, the chip scale atomic clock (CSAC) has gradually matured, and its performance is constantly improving. A deep coupled integration of CSAC and GNSS is explo...

  15. Integrated Optical Information Processing

    Science.gov (United States)

    1988-08-01

applications in optical disk memory systems [9]. This device is constructed in a glass/SiO2/Si waveguide. The choice of a Si substrate allows for the...contact mask) were formed in the photoresist deposited on all of the samples, we covered the unwanted gratings on each sample with cover glass slides...processing, let us consider TeO2 (v = 620 m/s) as a potential substrate for applications requiring large time delays. This consideration is despite

  16. Integrated process status overview

    International Nuclear Information System (INIS)

    Gertman, D.I.; Gaudio, P. Jr.

    1986-01-01

    This report summarizes findings to date with the IPSO, a large plant status overview currently under development at the OECD Halden Reactor Project. As part of a joint Halden and Combustion Engineering project, the overview is being tested in part to determine whether the large screen overview concept being entertained for use in the nuclear power plant (NPP) industry will facilitate operator performance. To this end an interactive simulation technique was used to establish a proof-of-principle test for the IPSO. Process control, operations, and human factors experts at Halden participated in the test and evaluation

  17. Adaptive GSA-based optimal tuning of PI controlled servo systems with reduced process parametric sensitivity, robust stability and controller robustness.

    Science.gov (United States)

    Precup, Radu-Emil; David, Radu-Codrut; Petriu, Emil M; Radac, Mircea-Bogdan; Preitl, Stefan

    2014-11-01

    This paper suggests a new generation of optimal PI controllers for a class of servo systems characterized by saturation and dead zone static nonlinearities and second-order models with an integral component. The objective functions are expressed as the integral of time multiplied by absolute error plus the weighted sum of the integrals of output sensitivity functions of the state sensitivity models with respect to two process parametric variations. The PI controller tuning conditions applied to a simplified linear process model involve a single design parameter specific to the extended symmetrical optimum (ESO) method which offers the desired tradeoff to several control system performance indices. An original back-calculation and tracking anti-windup scheme is proposed in order to prevent the integrator wind-up and to compensate for the dead zone nonlinearity of the process. The minimization of the objective functions is carried out in the framework of optimization problems with inequality constraints which guarantee the robust stability with respect to the process parametric variations and the controller robustness. An adaptive gravitational search algorithm (GSA) solves the optimization problems focused on the optimal tuning of the design parameter specific to the ESO method and of the anti-windup tracking gain. A tuning method for PI controllers is proposed as an efficient approach to the design of resilient control systems. The tuning method and the PI controllers are experimentally validated by the adaptive GSA-based tuning of PI controllers for the angular position control of a laboratory servo system.
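
    To make the kind of objective described above concrete, the hedged sketch below evaluates an ITAE-style cost for a PI controller acting on a second-order process with an integral component and actuator saturation. The plant parameters and gains are illustrative; the paper's sensitivity-model terms, anti-windup scheme and GSA search are not reproduced.

    ```python
    import numpy as np

    # Hedged sketch: ITAE-style cost of a PI controller on K/(s*(T*s+1)) with a
    # saturated actuator, integrated by forward Euler.  Gains and plant values are
    # illustrative; sensitivity terms, anti-windup and the GSA search are omitted.

    def itae_cost(kp, ki, K=1.0, T=0.5, r=1.0, dt=1e-3, t_end=10.0, u_max=2.0):
        n = int(t_end / dt)
        x1 = x2 = ui = 0.0            # plant states and PI integrator
        cost = 0.0
        for i in range(n):
            t = i * dt
            e = r - x1                # control error
            ui += ki * e * dt
            u = np.clip(kp * e + ui, -u_max, u_max)   # saturated actuator
            x1 += x2 * dt                             # integral component
            x2 += (-x2 + K * u) / T * dt              # first-order lag
            cost += t * abs(e) * dt                   # ITAE increment
        return cost

    print(itae_cost(kp=2.0, ki=1.0))   # candidate gains, e.g. tuned by a GSA
    ```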

  18. Robust Methods for Image Processing in Anthropology and Biomedicine

    Czech Academy of Sciences Publication Activity Database

    Kalina, Jan

    -, č. 86 (2011), s. 53-53 ISSN 0926-4981 Institutional research plan: CEZ:AV0Z10300504 Keywords : image analysis * robust estimation * forensic anthropology Subject RIV: BB - Applied Statistics, Operational Research

  19. Integrated stationary Ornstein-Uhlenbeck process, and double integral processes

    Science.gov (United States)

    Abundo, Mario; Pirozzi, Enrica

    2018-03-01

We find a representation of the integral of the stationary Ornstein-Uhlenbeck (ISOU) process in terms of Brownian motion B_t; moreover, we show that, under certain conditions on the functions f and g, the double integral process (DIP) D(t) = ∫_β^t g(s) (∫_α^s f(u) dB_u) ds can be thought of as the integral of a suitable Gauss-Markov process. Some theoretical and application details are given; among them we provide a simulation formula based on that representation by which sample paths, probability densities and first passage times of the ISOU process are obtained; the first-passage times of the DIP are also studied.
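
    For illustration only, the following Euler-Maruyama sketch simulates one sample path of a stationary Ornstein-Uhlenbeck process and of its time integral (the ISOU process); it uses invented parameters and is not the representation formula of the paper.

    ```python
    import numpy as np

    # Illustrative Euler-Maruyama sketch (not the paper's representation): sample
    # paths of a stationary OU process X and of its time integral
    # Y(t) = int_0^t X(s) ds, the ISOU process.

    rng = np.random.default_rng(1)

    theta, sigma = 1.0, 0.5          # mean-reversion rate and noise intensity
    dt, n = 1e-3, 10_000             # step size and number of steps

    x = np.empty(n); y = np.empty(n)
    x[0] = rng.normal(0.0, sigma / np.sqrt(2 * theta))   # start in the stationary law
    y[0] = 0.0
    for i in range(1, n):
        dB = rng.normal(0.0, np.sqrt(dt))                # Brownian increment
        x[i] = x[i - 1] - theta * x[i - 1] * dt + sigma * dB
        y[i] = y[i - 1] + x[i - 1] * dt                  # integrated OU path

    print(y[-1])   # one realization of the ISOU process at t = n*dt
    ```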

  20. Integrated direct/indirect adaptive robust motion trajectory tracking control of pneumatic cylinders

    Science.gov (United States)

    Meng, Deyuan; Tao, Guoliang; Zhu, Xiaocong

    2013-09-01

This paper studies the precision motion trajectory tracking control of a pneumatic cylinder driven by a proportional-directional control valve. An integrated direct/indirect adaptive robust controller is proposed. The controller employs a physical-model-based indirect-type parameter estimation to obtain reliable estimates of unknown model parameters, and utilises a robust control method with dynamic-compensation-type fast adaptation to attenuate the effects of parameter estimation errors, unmodelled dynamics and disturbances. Due to the use of projection mapping, the robust control law and the parameter adaptation algorithm can be designed separately. Since the system model uncertainties are unmatched, the recursive backstepping technique is adopted to design the robust control law. Extensive comparative experimental results are presented to illustrate the effectiveness of the proposed controller and its performance robustness to parameter variations and sudden disturbances.

  1. Teaching Process Design through Integrated Process Synthesis

    Science.gov (United States)

    Metzger, Matthew J.; Glasser, Benjamin J.; Patel, Bilal; Hildebrandt, Diane; Glasser, David

    2012-01-01

    The design course is an integral part of chemical engineering education. A novel approach to the design course was recently introduced at the University of the Witwatersrand, Johannesburg, South Africa. The course aimed to introduce students to systematic tools and techniques for setting and evaluating performance targets for processes, as well as…

  2. The LEAN Payload Integration Process

    Science.gov (United States)

    Jordan, Lee P.; Young, Yancy; Rice, Amanda

    2011-01-01

It is recognized that payload development and integration with the International Space Station (ISS) can be complex. This streamlined integration approach is a first step toward simplifying payload integration, making it easier to fly payloads on ISS and thereby increasing feasibility and interest for more research and commercial organizations to sponsor ISS payloads and take advantage of the ISS as a National Laboratory asset. The streamlined integration approach was addressed from the perspective of the initial payload types most likely to evolve from the National Lab Pathfinder program. Payloads to be accommodated by the Expedite the Processing of Experiments for Space Station (EXPRESS) Racks and Microgravity Sciences Glovebox (MSG) pressurized facilities have been addressed. It is hoped that the streamlined principles applied to these types of payloads will be analyzed and implemented in the future for other host facilities as well as unpressurized payloads to be accommodated by the EXPRESS Logistics Carrier (ELC). Further, a payload does not have to be classified as a National Lab payload in order to be processed according to the lean payload integration process; any payload that meets certain criteria can follow the lean payload integration process.

  3. Robust integration schemes for junction-based modulators in a 200mm CMOS compatible silicon photonic platform (Conference Presentation)

    Science.gov (United States)

    Szelag, Bertrand; Abraham, Alexis; Brision, Stéphane; Gindre, Paul; Blampey, Benjamin; Myko, André; Olivier, Segolene; Kopp, Christophe

    2017-05-01

Silicon photonics is becoming a reality for next-generation communication systems, addressing the increasing needs of HPC (High Performance Computing) systems and datacenters. CMOS-compatible photonic platforms integrating passive and active devices are developed in many foundries. The use of existing and qualified microelectronics processes guarantees cost-efficient and mature photonic technologies. Meanwhile, photonic devices have their own fabrication constraints, not similar to those of CMOS devices, which can affect their performance. In this paper, we address the integration of PN-junction Mach-Zehnder modulators in a 200mm CMOS-compatible photonic platform. Implantation-based device characteristics are affected by many process variations, among which are screening layer thickness, dopant diffusion, and implantation mask overlay. CMOS devices are generally quite robust with respect to these processes thanks to dedicated design rules. For photonic devices, the situation is different since, most of the time, doped areas must be carefully located within waveguides and CMOS solutions like self-alignment to the gate cannot be applied. In this work, we present different robust integration solutions for junction-based modulators. A simulation setup has been built in order to optimize the process conditions. It consists of a Matlab interface coupling process and device electro-optic simulators in order to run many iterations. Illustrations of modulator characteristic variations with process parameters are given using this simulation setup. Parameters under study are, for instance, X- and Y-direction lithography shifts, and screening oxide and slab thicknesses. A robust process and design approach leading to a PN-junction Mach-Zehnder modulator insensitive to lithography misalignment is then proposed. Simulation results are compared with experimental data. Indeed, various modulators have been fabricated with different process conditions and integration schemes. Extensive

  4. Integrated Process Modeling-A Process Validation Life Cycle Companion.

    Science.gov (United States)

    Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph

    2017-10-17

During the regulatory-requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we want to present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is furthermore demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters of multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data, which underlines the lifecycle concept. These applications will be shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore allows anticipation of out-of-specification (OOS) events, identification of critical process parameters, and risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
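
    The hedged toy sketch below illustrates the Monte Carlo idea behind an integrated process model: random variation in process parameters is propagated through two stacked unit operations and the probability of an out-of-specification CQA is estimated. The transfer functions, parameter distributions and specification limits are invented for illustration and are not from the cited study.

    ```python
    import numpy as np

    # Toy sketch of the IPM/Monte-Carlo idea (unit-operation models and limits are
    # invented): propagate process-parameter variation through two stacked unit
    # operations and estimate the probability of an out-of-spec CQA.

    rng = np.random.default_rng(2)
    n = 100_000

    # Unit operation 1 (e.g. fermentation): titer depends on pH and temperature.
    ph   = rng.normal(7.0, 0.05, n)
    temp = rng.normal(37.0, 0.3, n)
    titer = 5.0 - 2.0 * (ph - 7.0) ** 2 - 0.05 * (temp - 37.0) ** 2   # g/L

    # Unit operation 2 (e.g. capture chromatography): purity depends on load ratio,
    # which couples the two operations.
    load = titer / rng.normal(40.0, 1.0, n)
    purity = 98.0 - 50.0 * np.maximum(load - 0.12, 0.0)

    spec_ok = (purity >= 97.0) & (titer >= 4.5)
    print("estimated OOS probability:", 1.0 - spec_ok.mean())
    ```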

  5. Human Integration Design Processes (HIDP)

    Science.gov (United States)

    Boyer, Jennifer

    2014-01-01

    The purpose of the Human Integration Design Processes (HIDP) document is to provide human-systems integration design processes, including methodologies and best practices that NASA has used to meet human systems and human rating requirements for developing crewed spacecraft. HIDP content is framed around human-centered design methodologies and processes in support of human-system integration requirements and human rating. NASA-STD-3001, Space Flight Human-System Standard, is a two-volume set of National Aeronautics and Space Administration (NASA) Agency-level standards established by the Office of the Chief Health and Medical Officer, directed at minimizing health and performance risks for flight crews in human space flight programs. Volume 1 of NASA-STD-3001, Crew Health, sets standards for fitness for duty, space flight permissible exposure limits, permissible outcome limits, levels of medical care, medical diagnosis, intervention, treatment and care, and countermeasures. Volume 2 of NASASTD- 3001, Human Factors, Habitability, and Environmental Health, focuses on human physical and cognitive capabilities and limitations and defines standards for spacecraft (including orbiters, habitats, and suits), internal environments, facilities, payloads, and related equipment, hardware, and software with which the crew interfaces during space operations. The NASA Procedural Requirements (NPR) 8705.2B, Human-Rating Requirements for Space Systems, specifies the Agency's human-rating processes, procedures, and requirements. The HIDP was written to share NASA's knowledge of processes directed toward achieving human certification of a spacecraft through implementation of human-systems integration requirements. Although the HIDP speaks directly to implementation of NASA-STD-3001 and NPR 8705.2B requirements, the human-centered design, evaluation, and design processes described in this document can be applied to any set of human-systems requirements and are independent of reference

  6. Integration process and logistics results

    International Nuclear Information System (INIS)

    2004-01-01

The Procurement and Logistics functions have gone through a process of integration since the beginning of integrated management of Asco and Vandellos II up to the present. These are functions that are likely to be designed for delivering a single product to the rest of the organization, defined from a high level of expectations, and that admit simplifications and materialization of synergies as they are approached from an integrated perspective. The analyzed functions are as follows: Service and Material Purchasing, Warehouse and Material Management, and Documentation and General Services Management. In all cases, to accomplish the integration, objectives, procedures and information systems were unified. As for the organization, a decision was made in each case on whether or not to outsource. The decisive corporate strategy to integrate, resulting in actions such as moving corporate headquarters to Vandellos II, corporate consolidation, regulation of employment and implementation of the ENDESA Group Economic Information System (SIE), has shaped this process, which at present can be considered as practically complete. (Author)

  7. Multi-Hypothesis Modelling Capabilities for Robust Data-Model Integration

    Science.gov (United States)

    Walker, A. P.; De Kauwe, M. G.; Lu, D.; Medlyn, B.; Norby, R. J.; Ricciuto, D. M.; Rogers, A.; Serbin, S.; Weston, D. J.; Ye, M.; Zaehle, S.

    2017-12-01

Large uncertainty is often inherent in model predictions due to imperfect knowledge of how to describe the mechanistic processes (hypotheses) that a model is intended to represent. Yet this model hypothesis uncertainty (MHU) is often overlooked or informally evaluated, as methods to quantify and evaluate MHU are limited. MHU increases as models become more complex, because each additional process added to a model comes with inherent MHU as well as parametric uncertainty. With the current trend of adding more processes to Earth System Models (ESMs), we are adding uncertainty that can be quantified for parameters but not for MHU. Model inter-comparison projects do allow for some consideration of hypothesis uncertainty, but in an ad hoc and non-independent fashion. This has stymied efforts to evaluate ecosystem models against data and to interpret the results mechanistically: because such models combine sub-models of many sub-systems and processes, each of which may be conceptualised and represented mathematically in various ways, it is not simple to identify exactly why a model produces the results it does or which model assumptions are key. We present a novel modelling framework, the multi-assumption architecture and testbed (MAAT), that automates the combination, generation, and execution of a model ensemble built with different representations of process. We will present the argument that multi-hypothesis modelling needs to be considered in conjunction with other capabilities (e.g. the Predictive Ecosystem Analyser, PecAn) and statistical methods (e.g. sensitivity analysis, data assimilation) to aid efforts in robust data-model integration and to enhance our predictive understanding of biological systems.
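
    In the spirit of such multi-assumption ensembles (this is a hedged toy, not the MAAT code), the sketch below enumerates every combination of alternative process representations and runs each resulting model variant; the two processes and their alternative formulations are invented for illustration.

    ```python
    from itertools import product

    # Toy multi-hypothesis ensemble: run every combination of alternative process
    # representations so hypothesis uncertainty can be examined alongside
    # parametric uncertainty.  Functions and parameter values are illustrative.

    # Two alternative hypotheses for photosynthetic response to light:
    light_response = {
        "rectangular_hyperbola": lambda par: 20.0 * par / (par + 200.0),
        "linear_saturating":     lambda par: min(0.05 * par, 15.0),
    }

    # Two alternative hypotheses for respiration response to temperature:
    respiration = {
        "q10":       lambda t: 1.0 * 2.0 ** ((t - 25.0) / 10.0),
        "arrhenius": lambda t: 1.0 * 2.718 ** (5000.0 * (1 / 298.15 - 1 / (t + 273.15))),
    }

    par, temp = 800.0, 20.0   # example forcing: light (umol m-2 s-1), temperature (C)

    for (a_name, a_fn), (r_name, r_fn) in product(light_response.items(), respiration.items()):
        nee = a_fn(par) - r_fn(temp)          # net exchange under this hypothesis pair
        print(f"{a_name} + {r_name}: NEE = {nee:.2f}")
    ```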

  8. Robust fractional-order proportional-integral observer for synchronization of chaotic fractional-order systems

    KAUST Repository

N'Doye, Ibrahima

    2018-02-13

In this paper, we propose a robust fractional-order proportional-integral (FOPI) observer for the synchronization of nonlinear fractional-order chaotic systems. The convergence of the observer is proved, and sufficient conditions are derived in terms of a linear matrix inequalities (LMIs) approach by using an indirect Lyapunov method. The proposed FOPI observer is robust against Lipschitz additive nonlinear uncertainty. It is also compared to the fractional-order proportional (FOP) observer and its performance is illustrated through simulations done on the fractional-order chaotic Lorenz system.

  9. Robust fractional-order proportional-integral observer for synchronization of chaotic fractional-order systems

    KAUST Repository

N'Doye, Ibrahima; Salama, Khaled N.; Laleg-Kirati, Taous-Meriem

    2018-01-01

In this paper, we propose a robust fractional-order proportional-integral (FOPI) observer for the synchronization of nonlinear fractional-order chaotic systems. The convergence of the observer is proved, and sufficient conditions are derived in terms of a linear matrix inequalities (LMIs) approach by using an indirect Lyapunov method. The proposed FOPI observer is robust against Lipschitz additive nonlinear uncertainty. It is also compared to the fractional-order proportional (FOP) observer and its performance is illustrated through simulations done on the fractional-order chaotic Lorenz system.

  10. Robust Control of Underactuated Systems: Higher Order Integral Sliding Mode Approach

    Directory of Open Access Journals (Sweden)

    Sami ud Din

    2016-01-01

Full Text Available This paper presents a robust control design for the class of underactuated uncertain nonlinear systems. Either the nonlinear model of the underactuated system is transformed into an input-output form and an integral manifold is then devised for the control design, or an integral manifold is defined directly for the class concerned. Having defined the integral manifolds, discontinuous control laws are designed which are capable of maintaining sliding mode from the very beginning. The closed-loop stability of these systems is established. The effectiveness and performance of the designed control laws are verified via simulation and experimental results for a ball and beam system.
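
    As a hedged toy illustration of the integral-manifold idea (a fully actuated double integrator, not the paper's underactuated ball-and-beam design), the sketch below defines a sliding variable with an integral term and applies a discontinuous law that rejects a bounded matched disturbance; all gains are invented.

    ```python
    import numpy as np

    # Toy integral sliding-mode sketch on a double integrator (not the paper's
    # design): the sliding variable s = v + c1*x + c2*int(x) defines an integral
    # manifold, and a discontinuous term with gain k > |d| keeps the state on it.

    dt, n = 1e-3, 8000
    x, v, xi = 1.0, 0.0, 0.0        # position error, velocity, integral of error
    c1, c2, k = 2.0, 1.0, 2.0       # manifold coefficients and switching gain

    for i in range(n):
        s = v + c1 * x + c2 * xi               # integral sliding variable
        u = -c1 * v - c2 * x - k * np.sign(s)  # equivalent part + discontinuous part
        d = 0.5 * np.sin(0.005 * i)            # bounded matched disturbance
        xi += x * dt
        x  += v * dt
        v  += (u + d) * dt                     # double-integrator dynamics

    print(f"final error {x:.4f}, velocity {v:.4f}")
    ```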

  11. Using adaptive processes and adverse outcome pathways to develop meaningful, robust, and actionable environmental monitoring programs.

    Science.gov (United States)

    Arciszewski, Tim J; Munkittrick, Kelly R; Scrimgeour, Garry J; Dubé, Monique G; Wrona, Fred J; Hazewinkel, Rod R

    2017-09-01

The primary goals of environmental monitoring are to indicate whether unexpected changes related to development are occurring in the physical, chemical, and biological attributes of ecosystems and to inform meaningful management intervention. Although achieving these objectives is conceptually simple, varying scientific and social challenges often result in their breakdown. Conceptualizing, designing, and operating programs that better delineate monitoring, management, and risk assessment processes supported by hypothesis-driven approaches, strong inference, and adverse outcome pathways can overcome many of the challenges. Generally, a robust monitoring program is characterized by hypothesis-driven questions associated with potential adverse outcomes and feedback loops informed by data. Specifically, key and basic features are predictions of future observations (triggers) and mechanisms to respond to success or failure of those predictions (tiers). The adaptive processes accelerate or decelerate the effort to highlight and overcome ignorance while preventing the potentially unnecessary escalation of unguided monitoring and management. The deployment of the mutually reinforcing components can allow for more meaningful and actionable monitoring programs that better associate activities with consequences. Integr Environ Assess Manag 2017;13:877-891. © 2017 The Authors. Integrated Environmental Assessment and Management Published by Wiley Periodicals, Inc. on behalf of Society of Environmental Toxicology & Chemistry (SETAC).

  12. Robust parameter design for integrated circuit fabrication procedure with respect to categorical characteristic

    International Nuclear Information System (INIS)

    Sohn, S.Y.

    1999-01-01

    We consider a robust parameter design of the process for forming contact windows in complementary metal-oxide semiconductor circuits. Robust design is often used to find the optimal levels of process conditions which would provide the output of consistent quality as close to a target value. In this paper, we analyze the results of the fractional factorial design of nine factors: mask dimension, viscosity, bake temperature, spin speed, bake time, aperture, exposure time, developing time, etch time, where the outcome of the experiment is measured in terms of a categorized window size with five categories. Random effect analysis is employed to model both the mean and variance of categorized window size as functions of some controllable factors as well as random errors. Empirical Bayes' procedures are then utilized to fit both the models, and to eventually find the robust design of CMOS circuit process by means of a Bootstrap resampling approach

  13. Robust parameter design for integrated circuit fabrication procedure with respect to categorical characteristic

    Energy Technology Data Exchange (ETDEWEB)

    Sohn, S.Y

    1999-12-01

    We consider a robust parameter design of the process for forming contact windows in complementary metal-oxide semiconductor circuits. Robust design is often used to find the optimal levels of process conditions which would provide the output of consistent quality as close to a target value. In this paper, we analyze the results of the fractional factorial design of nine factors: mask dimension, viscosity, bake temperature, spin speed, bake time, aperture, exposure time, developing time, etch time, where the outcome of the experiment is measured in terms of a categorized window size with five categories. Random effect analysis is employed to model both the mean and variance of categorized window size as functions of some controllable factors as well as random errors. Empirical Bayes' procedures are then utilized to fit both the models, and to eventually find the robust design of CMOS circuit process by means of a Bootstrap resampling approach.

  14. Product integration rules at Clenshaw-Curtis and related points: A robust implementation

    International Nuclear Information System (INIS)

    Adam, G.; Nobile, A.

    1989-12-01

Product integration rules generalizing the Fejér, Clenshaw-Curtis and Filippi quadrature rules, respectively, are derived for integrals with trigonometric and hyperbolic weight factors. The study puts in evidence the existence of well-conditioned, fully analytic solutions in terms of hypergeometric functions 0F1. An a priori error estimator is discussed which is shown both to avoid wasteful invocation of the integration rule and to increase significantly the robustness of the automatic quadrature procedure. Then, specializing to extended Clenshaw-Curtis (ECC) rules, three types of a posteriori error estimates are considered, and large-scale validation tests put into evidence a considerable risk of their failure. An empirical error estimator, superseding them for slowly varying integrands, is found to result in a spectacular increase in output reliability. Finally, enhancements in the control of the interval subdivision strategy aimed at increasing code robustness are discussed. Comparison with the code DQAWO of QUADPACK, extending over a statistics of about one hundred thousand solved integrals, illustrates the increased robustness and error estimate reliability of our computer code implementation of the ECC rules. (author). 19 refs, 8 tabs
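
    For orientation, the hedged sketch below implements a plain Clenshaw-Curtis rule on [-1, 1] (the classical rule that the report's extended product rules generalize); the product rules with trigonometric and hyperbolic weight factors and the error estimators are not reproduced here, and the test integrand is invented.

    ```python
    import numpy as np

    # Plain Clenshaw-Curtis rule on [-1, 1] for even n: Chebyshev points and
    # weights from the standard cosine-series formula.

    def clenshaw_curtis(n):
        k = np.arange(n + 1)
        x = np.cos(k * np.pi / n)                     # Chebyshev (practical) points
        c = np.where((k == 0) | (k == n), 1.0, 2.0)
        w = np.ones(n + 1)
        for j in range(1, n // 2 + 1):
            b = 1.0 if j == n // 2 else 2.0
            w -= b / (4.0 * j * j - 1.0) * np.cos(2.0 * j * k * np.pi / n)
        w *= c / n
        return x, w

    x, w = clenshaw_curtis(16)
    approx = np.sum(w * np.exp(x) * np.cos(4.0 * x))  # integrand with oscillatory factor
    print(approx)   # compare with the exact value of int_{-1}^{1} e^x cos(4x) dx
    ```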

  15. A climate robust integrated modelling framework for regional impact assessment of climate change

    Science.gov (United States)

    Janssen, Gijs; Bakker, Alexander; van Ek, Remco; Groot, Annemarie; Kroes, Joop; Kuiper, Marijn; Schipper, Peter; van Walsum, Paul; Wamelink, Wieger; Mol, Janet

    2013-04-01

    Decision making towards climate proofing the water management of regional catchments can benefit greatly from the availability of a climate robust integrated modelling framework, capable of a consistent assessment of climate change impacts on the various interests present in the catchments. In the Netherlands, much effort has been devoted to developing state-of-the-art regional dynamic groundwater models with a very high spatial resolution (25x25 m2). Still, these models are not completely satisfactory to decision makers because the modelling concepts do not take into account feedbacks between meteorology, vegetation/crop growth, and hydrology. This introduces uncertainties in forecasting the effects of climate change on groundwater, surface water, agricultural yields, and development of groundwater dependent terrestrial ecosystems. These uncertainties add to the uncertainties about the predictions on climate change itself. In order to create an integrated, climate robust modelling framework, we coupled existing model codes on hydrology, agriculture and nature that are currently in use at the different research institutes in the Netherlands. The modelling framework consists of the model codes MODFLOW (groundwater flow), MetaSWAP (vadose zone), WOFOST (crop growth), SMART2-SUMO2 (soil-vegetation) and NTM3 (nature valuation). MODFLOW, MetaSWAP and WOFOST are coupled online (i.e. exchange information on time step basis). Thus, changes in meteorology and CO2-concentrations affect crop growth and feedbacks between crop growth, vadose zone water movement and groundwater recharge are accounted for. The model chain WOFOST-MetaSWAP-MODFLOW generates hydrological input for the ecological prediction model combination SMART2-SUMO2-NTM3. The modelling framework was used to support the regional water management decision making process in the 267 km2 Baakse Beek-Veengoot catchment in the east of the Netherlands. Computations were performed for regionalized 30-year climate change

  16. Advanced process monitoring and feedback control to enhance cell culture process production and robustness.

    Science.gov (United States)

    Zhang, An; Tsang, Valerie Liu; Moore, Brandon; Shen, Vivian; Huang, Yao-Ming; Kshirsagar, Rashmi; Ryll, Thomas

    2015-12-01

    It is a common practice in biotherapeutic manufacturing to define a fixed-volume feed strategy for nutrient feeds, based on historical cell demand. However, once the feed volumes are defined, they are inflexible to batch-to-batch variations in cell growth and physiology and can lead to inconsistent productivity and product quality. In an effort to control critical quality attributes and to apply process analytical technology (PAT), a fully automated cell culture feedback control system has been explored in three different applications. The first study illustrates that frequent monitoring and automatically controlling the complex feed based on a surrogate (glutamate) level improved protein production. More importantly, the resulting feed strategy was translated into a manufacturing-friendly manual feed strategy without impact on product quality. The second study demonstrates the improved process robustness of an automated feed strategy based on online bio-capacitance measurements for cell growth. In the third study, glucose and lactate concentrations were measured online and were used to automatically control the glucose feed, which in turn changed lactate metabolism. These studies suggest that the auto-feedback control system has the potential to significantly increase productivity and improve robustness in manufacturing, with the goal of ensuring process performance and product quality consistency. © 2015 Wiley Periodicals, Inc.

  17. Securing a robust electrical discharge drilling process by means of flow rate control

    Science.gov (United States)

    Risto, Matthias; Munz, Markus; Haas, Ruediger; Abdolahi, Ali

    2017-10-01

This paper deals with increasing the robustness of the process of drilling cemented carbide using electrical discharge machining (EDM). A demand for high consistency of the resulting diameter is equivalent to a demand for high robustness of the EDM drilling process. Analyses were done to investigate the process robustness (standard deviation of the borehole diameter) when drilling cemented carbide. The investigation has shown that the dielectric flow rate changes over the drilling process. In this case the flow rate decreased with a shorter tool electrode due to uneven wear of the tool electrode's cross-section. Using a controlled flow rate during the drilling process led to a reduced standard deviation of the borehole diameter, and thus to a higher process robustness when drilling cemented carbide.

  18. Data Integration Tool: From Permafrost Data Translation Research Tool to A Robust Research Application

    Science.gov (United States)

    Wilcox, H.; Schaefer, K. M.; Jafarov, E. E.; Strawhacker, C.; Pulsifer, P. L.; Thurmes, N.

    2016-12-01

The United States National Science Foundation funded PermaData project, led by the National Snow and Ice Data Center (NSIDC) with a team from the Global Terrestrial Network for Permafrost (GTN-P), aimed to improve permafrost data access and discovery. We developed a Data Integration Tool (DIT) to significantly reduce the manual processing time needed to translate inconsistent, scattered historical permafrost data into files ready to ingest directly into the GTN-P. We leverage these data to support science research and policy decisions. DIT is a workflow manager that divides data preparation and analysis into a series of steps or operations called widgets. Each widget does a specific operation, such as read, multiply by a constant, sort, plot, and write data. DIT allows the user to select and order the widgets as desired to meet their specific needs. Originally it was written to capture a scientist's personal, iterative data-manipulation and quality-control process of visually and programmatically iterating through inconsistent input data, examining it to find problems, adding operations to address the problems, and rerunning until the data could be translated into the GTN-P standard format. Iterative development of this tool led first to a Fortran/Python hybrid and then, with consideration of users, licensing, version control, packaging, and workflow, to a publicly available, robust, usable application. Transitioning to Python allowed the use of open source frameworks for the workflow core and integration with a JavaScript graphical workflow interface. DIT is targeted to automatically handle 90% of the data processing for field scientists, modelers, and non-discipline scientists. It is available as an open source tool in GitHub, packaged for a subset of Mac, Windows, and UNIX systems as a desktop application with a graphical workflow manager. DIT was used to completely translate one dataset (133 sites) that was successfully added to GTN-P, nearly translate three datasets
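
    As a hedged, hypothetical sketch of the widget/workflow idea described above (the names and operations here are invented for illustration and are not the NSIDC code), each widget performs one operation on tabular records and a workflow runs them in the order the user selects.

    ```python
    # Toy widget workflow: each widget is one operation; run_workflow chains them
    # in user-chosen order.  Data and operations are invented for illustration.

    def scale_widget(column, factor):
        """Multiply one column by a constant (e.g. unit conversion)."""
        def run(rows):
            return [{**r, column: r[column] * factor} for r in rows]
        return run

    def sort_widget(column):
        """Sort records by one column."""
        def run(rows):
            return sorted(rows, key=lambda r: r[column])
        return run

    def drop_missing_widget(column):
        """Simple quality control: drop records with a missing value."""
        def run(rows):
            return [r for r in rows if r[column] is not None]
        return run

    def run_workflow(widgets, rows):
        for widget in widgets:                      # user-chosen order of operations
            rows = widget(rows)
        return rows

    # Hypothetical borehole temperature records: depth in cm, temperature in C.
    records = [{"depth": 250, "temp": -1.2}, {"depth": 50, "temp": None},
               {"depth": 100, "temp": -0.4}]
    workflow = [drop_missing_widget("temp"), scale_widget("depth", 0.01),
                sort_widget("depth")]
    print(run_workflow(workflow, records))
    ```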

  19. A fast and robust hepatocyte quantification algorithm including vein processing

    Directory of Open Access Journals (Sweden)

    Homeyer André

    2010-03-01

Full Text Available Abstract. Background: Quantification of different types of cells is often needed for analysis of histological images. In our project, we compute the relative number of proliferating hepatocytes for the evaluation of the regeneration process after partial hepatectomy in normal rat livers. Results: Our presented automatic approach for hepatocyte (HC) quantification is suitable for the analysis of an entire digitized histological section given in the form of a series of images. It is the main part of an automatic hepatocyte quantification tool that allows for the computation of the ratio between the number of proliferating HC-nuclei and the total number of all HC-nuclei for a series of images in one processing run. The processing pipeline allows us to obtain desired and valuable results for a wide range of images with different properties without additional parameter adjustment. Comparing the obtained segmentation results with a manually retrieved segmentation mask which is considered to be the ground truth, we achieve results with sensitivity above 90% and false positive fraction below 15%. Conclusions: The proposed automatic procedure gives results with high sensitivity and low false positive fraction and can be applied to process entire stained sections.

  20. An Integrated Environmental Assessment of Green and Gray Infrastructure Strategies for Robust Decision Making.

    Science.gov (United States)

    Casal-Campos, Arturo; Fu, Guangtao; Butler, David; Moore, Andrew

    2015-07-21

    The robustness of a range of watershed-scale "green" and "gray" drainage strategies in the future is explored through comprehensive modeling of a fully integrated urban wastewater system case. Four socio-economic future scenarios, defined by parameters affecting the environmental performance of the system, are proposed to account for the uncertain variability of conditions in the year 2050. A regret-based approach is applied to assess the relative performance of strategies in multiple impact categories (environmental, economic, and social) as well as to evaluate their robustness across future scenarios. The concept of regret proves useful in identifying performance trade-offs and recognizing states of the world most critical to decisions. The study highlights the robustness of green strategies (particularly rain gardens, resulting in half the regret of most options) over end-of-pipe gray alternatives (surface water separation or sewer and storage rehabilitation), which may be costly (on average, 25% of the total regret of these options) and tend to focus on sewer flooding and CSO alleviation while compromising on downstream system performance (this accounts for around 50% of their total regret). Trade-offs and scenario regrets observed in the analysis suggest that the combination of green and gray strategies may still offer further potential for robustness.

  1. A Robust WLS Power System State Estimation Method Integrating a Wide-Area Measurement System and SCADA Technology

    Directory of Open Access Journals (Sweden)

    Tao Jin

    2015-04-01

Full Text Available With the development of modern society, the scale of power systems has increased rapidly, and the framework and mode of operation of power systems are trending towards greater complexity. It is therefore increasingly important for dispatchers to know the state parameters of the power network accurately through state estimation. This paper proposes a robust power system WLS state estimation method integrating a wide-area measurement system (WAMS) and SCADA technology, incorporating phasor measurements and the results of the traditional state estimator in a post-processing estimator, which greatly reduces the scale of the non-linear estimation problem as well as the number of iterations and the processing time per iteration. This paper first analyzes the wide-area state estimation model in detail; then, to address the fact that least squares does not account for bad data and outliers, it proposes a robust weighted least squares (WLS) method that combines a robust estimation principle with least squares through equivalent weights. The performance assessment is discussed through setting up mathematical models of the distribution network. The proposed method was shown to be accurate and reliable by simulations and experiments.
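
    To illustrate the "equivalent weight" idea on a deliberately simple case (a linear measurement model rather than the paper's nonlinear power-flow formulation), the hedged sketch below iteratively re-solves a WLS problem while shrinking the weights of measurements with large standardized residuals, so a gross error has little influence on the estimate; all data are synthetic.

    ```python
    import numpy as np

    # Toy robust WLS via iteratively reweighted (Huber-style "equivalent") weights
    # on a linear model z = H x + e; one measurement carries a gross error.

    rng = np.random.default_rng(3)

    x_true = np.array([1.0, -2.0])
    H = rng.normal(size=(20, 2))                  # measurement matrix
    sigma = 0.05
    z = H @ x_true + sigma * rng.normal(size=20)
    z[4] += 3.0                                   # one gross error (bad datum)

    W = np.full(20, 1.0 / sigma**2)               # initial WLS weights
    x = np.zeros(2)
    for _ in range(10):
        x = np.linalg.solve(H.T @ (W[:, None] * H), H.T @ (W * z))
        r = (z - H @ x) / sigma                   # standardized residuals
        # Equivalent weights: down-weight residuals beyond a threshold.
        W = (1.0 / sigma**2) * np.minimum(1.0, 1.5 / np.maximum(np.abs(r), 1e-12))

    print("robust estimate:", x, " true:", x_true)
    ```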

  2. MPLEx: a Robust and Universal Protocol for Single-Sample Integrative Proteomic, Metabolomic, and Lipidomic Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Nakayasu, Ernesto S.; Nicora, Carrie D.; Sims, Amy C.; Burnum-Johnson, Kristin E.; Kim, Young-Mo; Kyle, Jennifer E.; Matzke, Melissa M.; Shukla, Anil K.; Chu, Rosalie K.; Schepmoes, Athena A.; Jacobs, Jon M.; Baric, Ralph S.; Webb-Robertson, Bobbie-Jo; Smith, Richard D.; Metz, Thomas O.; Chia, Nicholas

    2016-05-03

Integrative multi-omics analyses can empower more effective investigation and complete understanding of complex biological systems. Despite recent advances in a range of omics analyses, multi-omic measurements of the same sample are still challenging and current methods have not been well evaluated in terms of reproducibility and broad applicability. Here we adapted a solvent-based method, widely applied for extracting lipids and metabolites, to add proteomics to mass spectrometry-based multi-omics measurements. The metabolite, protein, and lipid extraction (MPLEx) protocol proved to be robust and applicable to a diverse set of sample types, including cell cultures, microbial communities, and tissues. To illustrate the utility of this protocol, an integrative multi-omics analysis was performed using a lung epithelial cell line infected with Middle East respiratory syndrome coronavirus, which showed the impact of this virus on the host glycolytic pathway and also suggested a role for lipids during infection. The MPLEx method is a simple, fast, and robust protocol that can be applied for integrative multi-omic measurements from diverse sample types (e.g., environmental, in vitro, and clinical).

IMPORTANCE In systems biology studies, the integration of multiple omics measurements (i.e., genomics, transcriptomics, proteomics, metabolomics, and lipidomics) has been shown to provide a more complete and informative view of biological pathways. Thus, the prospect of extracting different types of molecules (e.g., DNAs, RNAs, proteins, and metabolites) and performing multiple omics measurements on single samples is very attractive, but such studies are challenging due to the fact that the extraction conditions differ according to the molecule type. Here, we adapted an organic solvent-based extraction method that demonstrated

  3. Mutual Information Based Dynamic Integration of Multiple Feature Streams for Robust Real-Time LVCSR

    Science.gov (United States)

    Sato, Shoei; Kobayashi, Akio; Onoe, Kazuo; Homma, Shinichi; Imai, Toru; Takagi, Tohru; Kobayashi, Tetsunori

We present a novel method of integrating the likelihoods of multiple feature streams, representing different acoustic aspects, for robust speech recognition. The integration algorithm dynamically calculates a frame-wise stream weight so that a higher weight is given to a stream that is robust to a variety of noisy environments or speaking styles. Such a robust stream is expected to show discriminative ability. A conventional method proposed for the recognition of spoken digits calculates the weights from the entropy of the whole set of HMM states. This paper extends the dynamic weighting to a real-time large-vocabulary continuous speech recognition (LVCSR) system. The proposed weight is calculated in real time from mutual information between an input stream and the active HMM states in the search space, without an additional likelihood calculation. Furthermore, the mutual information takes the width of the search space into account by calculating the marginal entropy from the number of active states. In this paper, we integrate three features that are extracted through auditory filters by taking into account the human auditory system's ability to extract amplitude and frequency modulations. As a result, features representing energy, amplitude drift, and resonant frequency drifts are integrated. These features are expected to provide complementary clues for speech recognition. Speech recognition experiments on field reports and spontaneous commentary from Japanese broadcast news showed that the proposed method reduced word errors by 9.2% in field reports and 4.7% in spontaneous commentaries, relative to the best result obtained from a single stream.
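
    A hedged toy sketch of frame-wise stream weighting in the spirit of the entropy/mutual-information idea above (not the paper's decoder): each stream's weight for one frame is derived from how peaked its normalized likelihoods are over the active states, with the maximum entropy set by the number of active states; the streams and their statistics are synthetic.

    ```python
    import numpy as np

    # Toy frame-wise stream weighting: a stream whose likelihoods over the active
    # HMM states are sharply peaked (low entropy, high discriminative ability)
    # receives a larger weight when per-stream log-likelihoods are combined.

    rng = np.random.default_rng(4)
    n_states = 50                                     # active states in the search space

    def stream_weight(loglik, n_active):
        p = np.exp(loglik - loglik.max())
        p /= p.sum()                                  # posterior over active states
        entropy = -np.sum(p * np.log(p + 1e-12))
        max_entropy = np.log(n_active)                # reflects the search-space width
        return 1.0 - entropy / max_entropy            # 0 = uninformative, 1 = peaked

    # Synthetic per-stream log-likelihoods for one frame (three feature streams).
    streams = [rng.normal(0.0, s, n_states) for s in (3.0, 1.0, 0.2)]

    w = np.array([stream_weight(ll, n_states) for ll in streams])
    w /= w.sum()                                      # normalized frame-wise weights
    combined = sum(wi * ll for wi, ll in zip(w, streams))
    print("stream weights:", np.round(w, 3))
    ```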

  4. Robust modelling and simulation integration of SIMIO with coloured petri nets

    CERN Document Server

    De La Mota, Idalia Flores; Mujica Mota, Miguel; Angel Piera, Miquel

    2017-01-01

This book presents for the first time a methodology that combines the power of a modelling formalism such as coloured petri nets with the flexibility of a discrete event program such as SIMIO. Industrial practitioners have seen the growth of simulation as a methodology for tackling problems in which variability is the common denominator. Practically all industrial systems, from manufacturing to aviation, are considered stochastic systems. Different modelling techniques have been developed, as well as mathematical techniques for formalizing the cause-effect relationships in industrial and complex systems. The methodology in this book illustrates how complexity in modelling can be tackled by the use of coloured petri nets, while at the same time the variability present in systems is integrated in a robust fashion. The book can be used as a concise guide for developing robust models, which are able to efficiently simulate the cause-effect relationships present in complex industrial systems without losing the simulat...

  5. Complete achromatic and robustness electro-optic switch between two integrated optical waveguides

    Science.gov (United States)

    Huang, Wei; Kyoseva, Elica

    2018-01-01

In this paper, we present a novel design of an electro-optic modulator and optical switching device based on current integrated-optics technology. The advantages of our optical switching device are its broad input wavelength bandwidth and its robustness against variations in device length and operating voltage, compared with previous designs. Consistent with the results of our previous paper [Huang et al., Phys. Lett. A, 90, 053837], the coupling of the waveguides has a hyperbolic-secant shape while the detuning has a sign flip at maximum coupling; we refer to this as the sign-flip phase-mismatch model. This model can produce complete, robust population transfer. In this paper, we extend this device to switch the light intensity controllably by tuning the external electric field through the electro-optic effect.

  6. Leveraging the Cloud for Robust and Efficient Lunar Image Processing

    Science.gov (United States)

    Chang, George; Malhotra, Shan; Wolgast, Paul

    2011-01-01

The Lunar Mapping and Modeling Project (LMMP) is tasked to aggregate lunar data, from the Apollo era to the latest instruments on the LRO spacecraft, into a central repository accessible by scientists and the general public. A critical function of this task is to provide users with the best solution for browsing the vast amounts of imagery available. The image files LMMP manages range from a few gigabytes to hundreds of gigabytes in size with new data arriving every day. Despite this ever-increasing amount of data, LMMP must make the data readily available in a timely manner for users to view and analyze. This is accomplished by tiling large images into smaller images using Hadoop, a distributed computing software platform implementation of the MapReduce framework, running on a small cluster of machines locally. Additionally, the software is implemented to use Amazon's Elastic Compute Cloud (EC2) facility. We also developed a hybrid solution to serve images to users by leveraging cloud storage using Amazon's Simple Storage Service (S3) for public data while keeping private information on our own data servers. By using Cloud Computing, we improve upon our local solution by reducing the need to manage our own hardware and computing infrastructure, thereby reducing costs. Further, by using a hybrid of local and cloud storage, we are able to provide data to our users more efficiently and securely. This paper examines the use of a distributed approach with Hadoop to tile images, an approach that provides significant improvements in image processing time, from hours to minutes. This paper describes the constraints imposed on the solution and the resulting techniques developed for the hybrid solution of a customized Hadoop infrastructure over local and cloud resources in managing this ever-growing data set. It examines the performance trade-offs of using the more plentiful resources of the cloud, such as those provided by S3, against the bandwidth limitations such use

  7. Robust client/server shared state interactions of collaborative process with system crash and network failures

    NARCIS (Netherlands)

    Wang, Lei; Wombacher, Andreas; Ferreira Pires, Luis; van Sinderen, Marten J.; Chi, Chihung

    With the possibility of system crashes and network failures, the design of robust client/server interactions for collaborative process execution is a challenge. If a business process changes state, it sends messages to relevant processes to inform about this change. However, server crashes and

  8. Robust localisation of automated guided vehicles for computer-integrated manufacturing environments

    Directory of Open Access Journals (Sweden)

    Dixon, R. C.

    2013-05-01

Full Text Available As industry moves toward an era of complete automation and mass customisation, automated guided vehicles (AGVs) are used as material handling systems. However, the current techniques that provide navigation, control, and manoeuvrability of automated guided vehicles threaten to create bottlenecks and inefficiencies in manufacturing environments that strive towards the optimisation of part production. This paper proposes a decentralised localisation technique for an automated guided vehicle without any non-holonomic constraints. Incorporation of these vehicles into the material handling system of a computer-integrated manufacturing environment would improve robustness, efficiency, flexibility, and manoeuvrability.

  9. New results on the robust stability of PID controllers with gain and phase margins for UFOPTD processes.

    Science.gov (United States)

    Jin, Q B; Liu, Q; Huang, B

    2016-03-01

This paper considers the problem of determining all the robust PID (proportional-integral-derivative) controllers in terms of the gain and phase margins (GPM) for open-loop unstable first-order plus time-delay (UFOPTD) processes. It is the first time that the feasible ranges of the GPM specifications provided by a PID controller are given for UFOPTD processes. A gain and phase margin tester is used to modify the original model, and the ranges of the margin specifications are derived such that the modified model can be stabilized by a stabilizing PID controller, based on the Hermite-Biehler Theorem. Furthermore, we obtain all the controllers satisfying a given margin specification. Simulation studies show how to use the results to design a robust PID controller. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
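
    For context, the hedged numerical sketch below evaluates the open-loop frequency response of a PID-controlled UFOPTD process, G(s) = K exp(-L s)/(T s - 1), and reads off the phase margin and gain-margin candidates; the plant parameters and gains are invented and are not the paper's design conditions, and for an open-loop unstable plant the margins must be interpreted together with the Nyquist encirclement condition, which the GPM-tester formalism accounts for.

    ```python
    import numpy as np

    # Frequency-response check of gain/phase margins for a PID-controlled UFOPTD
    # process (illustrative parameters; not a stability proof for unstable plants).

    K, T, L = 1.0, 4.0, 0.5            # unstable pole at 1/T = 0.25 rad/s, delay L
    Kp, Ki, Kd = 3.0, 0.8, 1.0         # candidate PID gains

    w = np.logspace(-2, 2, 20_000)
    s = 1j * w
    G = K * np.exp(-L * s) / (T * s - 1.0)
    C = Kp + Ki / s + Kd * s
    Lw = C * G                         # open-loop frequency response

    # Phase margin at the gain-crossover frequency (|L(jw)| = 1).
    i_gc = np.argmin(np.abs(np.abs(Lw) - 1.0))
    pm = (np.degrees(np.angle(Lw[i_gc])) + 180.0) % 360.0
    pm = pm - 360.0 if pm > 180.0 else pm

    # Gain-margin candidates: crossings of the negative real axis.
    cross = np.where((np.diff(np.sign(Lw.imag)) != 0) & (Lw.real[:-1] < 0.0))[0]
    gm = -1.0 / Lw.real[cross]

    print(f"phase margin ~ {pm:.1f} deg at w = {w[i_gc]:.2f} rad/s")
    print("gain-margin candidates:", np.round(gm, 2))
    ```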

  10. Logistics integration processes in the food industry

    OpenAIRE

    Giménez, Cristina

    2003-01-01

    This paper analyses the integration process that firms follow to implement Supply Chain Management (SCM). This study has been inspired in the integration model proposed by Stevens (1989). He suggests that companies internally integrate first and then extend integration to other supply chain members, such as customers and suppliers. To analyse the integration process a survey was conducted among Spanish food manufacturers. The results show that there are companies in three different integratio...

  11. Practical Robust Optimization Method for Unit Commitment of a System with Integrated Wind Resource

    Directory of Open Access Journals (Sweden)

    Yuanchao Yang

    2017-01-01

Full Text Available Unit commitment, one of the significant tasks in power system operations, faces new challenges as system uncertainty increases dramatically due to the integration of time-varying resources such as wind. To address these challenges, we propose the formulation and solution of a generalized unit commitment problem for a system with integrated wind resources. Given prespecified interval information, acquired from a real central wind forecasting system, for the uncertainty representation of nodal wind injections together with their correlation information, the proposed unit commitment solution is computationally tractable and robust against all realizations of the uncertain wind power injections. We provide a solution approach to tackle this mathematically complex problem and illustrate the capabilities of the proposed mixed-integer solution approach on the large-scale power system of the Northwest China Grid. The numerical results demonstrate that the approach is realistic and not overly conservative in terms of the resulting dispatch cost outcomes.

  12. Design process robustness: A bi-partite network analysis reveals the central importance of people

    DEFF Research Database (Denmark)

    Piccolo, Sebastiano; Jørgensen, Sune Lehmann; Maier, Anja

    2018-01-01

    , reducing the risk of rework and delays. Although there has been much progress in modelling and understanding design processes, little is known about the interplay between people and the activities they perform and its influence on design process robustness. To analyse this interplay, we model a large...

  13. A robust method for processing scanning probe microscopy images and determining nanoobject position and dimensions

    NARCIS (Netherlands)

    Silly, F.

    2009-01-01

Processing of scanning probe microscopy (SPM) images is essential to explore nanoscale phenomena. Image processing and pattern recognition techniques are developed to improve the accuracy and consistency of nanoobject and surface characterization. We present a robust and versatile method to

  14. Low-Power Photoplethysmogram Acquisition Integrated Circuit with Robust Light Interference Compensation

    Directory of Open Access Journals (Sweden)

    Jongpal Kim

    2015-12-01

Full Text Available To overcome light interference, including a large DC offset and ambient light variation, a robust photoplethysmogram (PPG) readout chip is fabricated using a 0.13-μm complementary metal–oxide–semiconductor (CMOS) process. Against the large DC offset, a saturation detection and current feedback circuit is proposed to compensate for an offset current of up to 30 μA. For robustness against optical path variation, an automatic emitted light compensation method is adopted. To prevent ambient light interference, an alternating sampling and charge redistribution technique is also proposed. In the proposed technique, no additional power is consumed, and only three differential switches and one capacitor are required. The PPG readout channel consumes 26.4 μW and has an input-referred current noise of 260 pA rms.

  15. Low-Power Photoplethysmogram Acquisition Integrated Circuit with Robust Light Interference Compensation.

    Science.gov (United States)

    Kim, Jongpal; Kim, Jihoon; Ko, Hyoungho

    2015-12-31

To overcome light interference, including a large DC offset and ambient light variation, a robust photoplethysmogram (PPG) readout chip is fabricated using a 0.13-μm complementary metal-oxide-semiconductor (CMOS) process. Against the large DC offset, a saturation detection and current feedback circuit is proposed to compensate for an offset current of up to 30 μA. For robustness against optical path variation, an automatic emitted light compensation method is adopted. To prevent ambient light interference, an alternating sampling and charge redistribution technique is also proposed. In the proposed technique, no additional power is consumed, and only three differential switches and one capacitor are required. The PPG readout channel consumes 26.4 μW and has an input-referred current noise of 260 pA rms.

  16. Integrated modelling in materials and process technology

    DEFF Research Database (Denmark)

    Hattel, Jesper Henri

    2008-01-01

Integrated modelling of entire process sequences and the subsequent in-service conditions, and multiphysics modelling of the single process steps, are areas that increasingly support optimisation of manufactured parts. In the present paper, three different examples of modelling manufacturing processes from the viewpoint of combined materials and process modelling are presented: solidification of thin-walled ductile cast iron, integrated modelling of spray forming, and multiphysics modelling of friction stir welding. The fourth example describes integrated modelling applied to a failure analysis

  17. Quantifying the robustness of process manufacturing concept – A medical product case study

    DEFF Research Database (Denmark)

    Boorla, Srinivasa Murthy; Troldtoft, M.E.; Eifler, Tobias

    2017-01-01

    Product robustness refers to the consistency of performance of all of the units produced. It is often the case that process manufactured products are not designed concurrently, so by the end of the product design phase the Process Manufacturing Concept (PMC) has yet to be decided. Allocating process-capable tolerances to the product during the design phase is therefore not possible. The robustness of the concept (how capable it is to achieve the product specification) only becomes clear at this late stage and thus after testing and iteration. In this article, a method for calculating the unit-to-unit robustness of an early-stage PMC is proposed. The method uses variability and adjustability information from the manufacturing concept in combination with sensitivity information from the product's design to predict its functional performance variation. A Technology maturation factor......

  18. Determining the optimum process parameter for grinding operations using robust process

    Energy Technology Data Exchange (ETDEWEB)

    Neseli, Suleyman; Asilturk, Ilhan; Celik, Levent [Univ. of Selcuk, Konya (Turkey)]

    2012-11-15

    We applied combined response surface methodology (RSM) and Taguchi methodology (TM) to determine optimum parameters for minimum surface roughness (Ra) and vibration (Vb) in external cylindrical grinding. First, an experiment was conducted in a CNC cylindrical grinding machine. The TM using L{sup 27} orthogonal array was applied to the design of the experiment. The three input parameters were workpiece revolution, feed rate and depth of cut; the outputs were vibrations and surface roughness. Second, to minimize wheel vibration and surface roughness, two optimized models were developed using computer aided single objective optimization. The experimental and statistical results revealed that the most significant grinding parameter for surface roughness and vibration is workpiece revolution followed by the depth of cut. The predicted values and measured values were fairly close (R{sup 2} values of 94.99 for Ra and 92.73 for Vb), which indicates that the developed models can be effectively used to predict surface roughness and vibration in grinding. The established model for determination of optimal operating conditions shows that a hybrid approach can lead to success of a robust process.

  19. Determining the optimum process parameter for grinding operations using robust process

    International Nuclear Information System (INIS)

    Neseli, Suleyman; Asilturk, Ilhan; Celik, Levent

    2012-01-01

    We applied combined response surface methodology (RSM) and Taguchi methodology (TM) to determine optimum parameters for minimum surface roughness (Ra) and vibration (Vb) in external cylindrical grinding. First, an experiment was conducted in a CNC cylindrical grinding machine. The TM using L 27 orthogonal array was applied to the design of the experiment. The three input parameters were workpiece revolution, feed rate and depth of cut; the outputs were vibrations and surface roughness. Second, to minimize wheel vibration and surface roughness, two optimized models were developed using computer aided single objective optimization. The experimental and statistical results revealed that the most significant grinding parameter for surface roughness and vibration is workpiece revolution followed by the depth of cut. The predicted values and measured values were fairly close (R 2 values of 94.99 for Ra and 92.73 for Vb), which indicates that the developed models can be effectively used to predict surface roughness and vibration in grinding. The established model for determination of optimal operating conditions shows that a hybrid approach can lead to success of a robust process.
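
    The record above combines a Taguchi L27 design with a response-surface fit and a single-objective search. The sketch below illustrates that general workflow in Python with synthetic data; the coded factor levels, the stand-in Ra values and the quadratic model are hypothetical placeholders, not the paper's measurements or software.

```python
# Illustrative RSM fit and single-objective minimisation in the spirit of the
# combined RSM/Taguchi approach above. All data below are synthetic.
import numpy as np
from itertools import product
from scipy.optimize import minimize

levels = {                      # three coded levels per factor (L27-style grid)
    "workpiece_rev": [-1, 0, 1],
    "feed_rate":     [-1, 0, 1],
    "depth_of_cut":  [-1, 0, 1],
}
X = np.array(list(product(*levels.values())), dtype=float)

# Synthetic surface roughness observations (stand-in for measured Ra values).
rng = np.random.default_rng(0)
true = lambda x: 1.2 + 0.5*x[:, 0] + 0.3*x[:, 2] + 0.2*x[:, 0]*x[:, 2] + 0.15*x[:, 0]**2
Ra = true(X) + rng.normal(0, 0.02, len(X))

def design_matrix(x):
    """Full quadratic model: intercept, linear, two-way interactions, squares."""
    x1, x2, x3 = x[:, 0], x[:, 1], x[:, 2]
    return np.column_stack([np.ones(len(x)), x1, x2, x3,
                            x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2])

beta, *_ = np.linalg.lstsq(design_matrix(X), Ra, rcond=None)
predict = lambda x: design_matrix(np.atleast_2d(x)) @ beta

# Coefficient of determination of the fitted surface.
ss_res = np.sum((Ra - predict(X).ravel())**2)
ss_tot = np.sum((Ra - Ra.mean())**2)
print("R^2 =", 1 - ss_res/ss_tot)

# Single-objective optimisation of Ra within the coded factor ranges.
res = minimize(lambda x: float(predict(x)), x0=[0, 0, 0],
               bounds=[(-1, 1)]*3, method="L-BFGS-B")
print("optimal coded settings:", res.x, "predicted Ra:", res.fun)
```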

  20. Gear hot forging process robust design based on finite element method

    International Nuclear Information System (INIS)

    Xuewen, Chen; Won, Jung Dong

    2008-01-01

    During the hot forging process, the shaping property and forging quality will fluctuate because of die wear, manufacturing tolerance, dimensional variation caused by temperature and the different friction conditions, etc. In order to control this variation in performance and to optimize the process parameters, a robust design method is proposed in this paper, based on the finite element method for the hot forging process. During the robust design process, the Taguchi method is the basic robust theory. The finite element analysis is incorporated in order to simulate the hot forging process. In addition, in order to calculate the objective function value, an orthogonal design method is selected to arrange experiments and collect sample points. The ANOVA method is employed to analyze the relationships of the design parameters and design objectives and to find the best parameters. Finally, a case study for the gear hot forging process is conducted. With the objective to reduce the forging force and its variation, the robust design mathematical model is established. The optimal design parameters obtained from this study indicate that the forging force has been reduced and its variation has been controlled
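
    For readers unfamiliar with the Taguchi/ANOVA step referred to above, the following sketch computes smaller-the-better signal-to-noise ratios over an orthogonal array and ranks factor main effects. The L9 array, the dummy "forging force" function standing in for the finite element runs, and the noise model are illustrative assumptions only.

```python
# Sketch of the Taguchi analysis step of a robust-design loop: smaller-the-
# better S/N ratios and main-effect ranking over an orthogonal array.
import numpy as np

# L9(3^4) orthogonal array, factor levels coded 0/1/2.
L9 = np.array([[0,0,0,0],[0,1,1,1],[0,2,2,2],
               [1,0,1,2],[1,1,2,0],[1,2,0,1],
               [2,0,2,1],[2,1,0,2],[2,2,1,0]])

def forging_force(run, noise):
    """Stand-in for the FE simulation: returns one forging-force sample."""
    a, b, c, d = run
    return 100 + 8*a - 5*b + 2*c + noise

rng = np.random.default_rng(1)
# Three noise repeats per run (e.g. die wear / friction variation).
y = np.array([[forging_force(run, rng.normal(0, 3)) for _ in range(3)] for run in L9])

# Smaller-the-better S/N ratio: -10 log10(mean(y^2)).
sn = -10*np.log10(np.mean(y**2, axis=1))

# Main effect of each factor = mean S/N at each of its levels.
for f in range(L9.shape[1]):
    effects = [sn[L9[:, f] == lvl].mean() for lvl in range(3)]
    print(f"factor {f}: level means {np.round(effects, 2)}, "
          f"range {max(effects)-min(effects):.2f}")
# The factor with the largest range dominates; its best level maximises S/N,
# i.e. minimises the forging force and its variation.
```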

  1. Combining shallow and deep processing for a robust, fast, deep-linguistic dependency parser

    OpenAIRE

    Schneider, G

    2004-01-01

    This paper describes Pro3Gres, a fast, robust, broad-coverage parser that delivers deep-linguistic grammatical relation structures as output, which are closer to predicate-argument structures and more informative than pure constituency structures. The parser stays as shallow as is possible for each task, combining shallow and deep-linguistic methods by integrating chunking and by expressing the majority of long-distance dependencies in a context-free way. It combines statistical and rule-base...

  2. A preferential design approach for energy-efficient and robust implantable neural signal processing hardware.

    Science.gov (United States)

    Narasimhan, Seetharam; Chiel, Hillel J; Bhunia, Swarup

    2009-01-01

    For implantable neural interface applications, it is important to compress data and analyze spike patterns across multiple channels in real time. Such a computational task for online neural data processing requires an innovative circuit-architecture level design approach for low-power, robust and area-efficient hardware implementation. Conventional microprocessor or Digital Signal Processing (DSP) chips would dissipate too much power and are too large in size for an implantable system. In this paper, we propose a novel hardware design approach, referred to as "Preferential Design" that exploits the nature of the neural signal processing algorithm to achieve a low-voltage, robust and area-efficient implementation using nanoscale process technology. The basic idea is to isolate the critical components with respect to system performance and design them more conservatively compared to the noncritical ones. This allows aggressive voltage scaling for low power operation while ensuring robustness and area efficiency. We have applied the proposed approach to a neural signal processing algorithm using the Discrete Wavelet Transform (DWT) and observed significant improvement in power and robustness over conventional design.
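
    The abstract above centres on a DWT-based neural data compression algorithm realised in hardware. As a software-level illustration of the underlying signal-processing idea only (not of the preferential-design circuit itself), the sketch below compresses a synthetic spike train by thresholding wavelet coefficients; the wavelet family, decomposition level and threshold are assumptions.

```python
# Software sketch of DWT-based spike data compression: decompose, discard
# small coefficients, reconstruct, and report retention and error.
import numpy as np
import pywt

rng = np.random.default_rng(0)
t = np.arange(4096)
signal = 5*np.exp(-((t[:, None] - rng.integers(100, 4000, 20))/6.0)**2).sum(axis=1)
signal += rng.normal(0, 0.3, t.size)          # background noise

# 4-level discrete wavelet transform (Daubechies-4).
coeffs = pywt.wavedec(signal, 'db4', level=4)

# Keep only large-magnitude coefficients (the compression step).
thr = 0.8
compressed = [pywt.threshold(c, thr, mode='hard') for c in coeffs]
kept = sum(int(np.count_nonzero(c)) for c in compressed)
total = sum(c.size for c in compressed)
print(f"retained {kept}/{total} coefficients ({100*kept/total:.1f}%)")

# Reconstruction error after discarding small coefficients.
recon = pywt.waverec(compressed, 'db4')[:signal.size]
print("RMS error:", np.sqrt(np.mean((signal - recon)**2)))
```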

  3. A robust, efficient and accurate β- pdf integration algorithm in nonpremixed turbulent combustion

    International Nuclear Information System (INIS)

    Liu, H.; Lien, F.S.; Chui, E.

    2005-01-01

    Among many presumed-shape pdf approaches, the presumed β-function pdf is widely used in nonpremixed turbulent combustion models in the literature. However, singularity difficulties at Z = 0 and 1, Z being the mixture fraction, may be encountered in the numerical integration of the β-function pdf and there are few publications addressing this issue to date. The present study proposes an efficient, robust and accurate algorithm to overcome these numerical difficulties. The present treatment of the β-pdf integration is first used in the Burke-Schumann solution in conjunction with the k - ε turbulence model in the case of CH 4 /H 2 bluff-body jets and flames. Afterward it is extended to a more complex model, the laminar flamelet model, for the same flow. Numerical results obtained by using the proposed β-pdf integration method are compared to experimental values of the velocity field, temperature and constituent mass fraction to illustrate the efficiency and accuracy of the present method. (author)
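
    One standard way to tame the Z = 0 and Z = 1 singularities mentioned above is to fold the β-pdf kernel into a Gauss-Jacobi quadrature weight rather than evaluating it pointwise. The sketch below demonstrates that generic technique (it is not the specific algorithm proposed in the paper) and checks it against known beta moments, including a case with a, b < 1 where the pdf is unbounded at the endpoints.

```python
# Integrate E[f(Z)] for Z ~ Beta(a, b) by absorbing Z^(a-1)(1-Z)^(b-1)
# into a Gauss-Jacobi quadrature weight, avoiding the endpoint singularities.
import numpy as np
from scipy.special import roots_jacobi, beta as beta_fn

def beta_pdf_mean(f, a, b, n=32):
    """Return E[f(Z)] with Z ~ Beta(a, b), robust even for a, b < 1."""
    # Gauss-Jacobi nodes/weights for weight (1-x)^alpha (1+x)^beta on [-1, 1];
    # with alpha = b-1 and beta = a-1 this matches the beta-pdf kernel.
    x, w = roots_jacobi(n, b - 1.0, a - 1.0)
    z = 0.5*(x + 1.0)                      # map [-1, 1] -> [0, 1]
    integral = 2.0**(1.0 - a - b) * np.sum(w * f(z))
    return integral / beta_fn(a, b)

# Sanity checks against the known mean a/(a+b), including a singular case.
for a, b in [(0.3, 0.7), (2.0, 5.0)]:
    print(a, b, beta_pdf_mean(lambda z: z, a, b), "exact:", a/(a + b))

# Typical flamelet-style use: pdf-weighted mean of a tabulated property T(Z).
T = lambda z: 300 + 1700*np.sin(np.pi*z)   # toy temperature profile
print("mean T:", beta_pdf_mean(T, 0.5, 0.5))
```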

  4. Variational and robust density fitting of four-center two-electron integrals in local metrics

    Science.gov (United States)

    Reine, Simen; Tellgren, Erik; Krapp, Andreas; Kjærgaard, Thomas; Helgaker, Trygve; Jansik, Branislav; Høst, Stinne; Salek, Paweł

    2008-09-01

    Density fitting is an important method for speeding up quantum-chemical calculations. Linear-scaling developments in Hartree-Fock and density-functional theories have highlighted the need for linear-scaling density-fitting schemes. In this paper, we present a robust variational density-fitting scheme that allows for solving the fitting equations in local metrics instead of the traditional Coulomb metric, as required for linear scaling. Results of fitting four-center two-electron integrals in the overlap and the attenuated Gaussian damped Coulomb metric are presented, and we conclude that density fitting can be performed in local metrics at little loss of chemical accuracy. We further propose to use this theory in linear-scaling density-fitting developments.

  5. Final report for the Integrated and Robust Security Infrastructure (IRSI) laboratory directed research and development project

    Energy Technology Data Exchange (ETDEWEB)

    Hutchinson, R.L.; Hamilton, V.A.; Istrail, G.G.; Espinoza, J.; Murphy, M.D.

    1997-11-01

    This report describes the results of a Sandia-funded laboratory-directed research and development project titled "Integrated and Robust Security Infrastructure" (IRSI). IRSI was to provide a broad range of commercial-grade security services to any software application. IRSI has two primary goals: application transparency and manageable public key infrastructure. IRSI must provide its security services to any application without the need to modify the application to invoke the security services. Public key mechanisms are well suited for a network with many end users and systems. There are many issues that make it difficult to deploy and manage a public key infrastructure. IRSI addressed some of these issues to create a more manageable public key infrastructure.

  6. A robust direct-integration method for rotorcraft maneuver and periodic response

    Science.gov (United States)

    Panda, Brahmananda

    1992-01-01

    The Newmark-Beta method and the Newton-Raphson iteration scheme are combined to develop a direct-integration method for evaluating the maneuver and periodic-response expressions for rotorcraft. The method requires the generation of Jacobians and includes higher derivatives in the formulation of the geometric stiffness matrix to enhance the convergence of the system. The method leads to effective convergence with nonlinear structural dynamics and aerodynamic terms. Singularities in the matrices can be addressed with the method as they arise from a Lagrange multiplier approach for coupling equations with nonlinear constraints. The method is also shown to be general enough to handle singularities from quasisteady control-system models. The method is shown to be more general and robust than the similar 2GCHAS method for analyzing rotorcraft dynamics.
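
    As a minimal illustration of the Newmark-Beta/Newton-Raphson combination described above, the following sketch integrates a single-degree-of-freedom Duffing-type oscillator, a toy stand-in for the rotorcraft structural and aerodynamic equations; the parameters, forcing and tolerances are hypothetical.

```python
# Toy SDOF sketch: Newmark-Beta time stepping with a Newton-Raphson solve of
# the nonlinear dynamic residual at each step.
import numpy as np

m, c, k, k3 = 1.0, 0.05, 1.0, 0.5          # mass, damping, linear & cubic stiffness
p = lambda t: 0.3*np.cos(1.2*t)            # external forcing
fs = lambda u: k*u + k3*u**3               # nonlinear restoring force
dfs = lambda u: k + 3*k3*u**2              # its tangent stiffness

beta, gamma, dt, nsteps = 0.25, 0.5, 0.01, 5000
u, v = 0.0, 0.0
a = (p(0.0) - c*v - fs(u)) / m             # consistent initial acceleration

for n in range(nsteps):
    t1 = (n + 1)*dt
    u1 = u                                  # Newton-Raphson on u_{n+1}
    for _ in range(20):
        a1 = (u1 - u - dt*v)/(beta*dt**2) - (1/(2*beta) - 1)*a
        v1 = v + dt*((1 - gamma)*a + gamma*a1)
        R = m*a1 + c*v1 + fs(u1) - p(t1)    # dynamic residual
        if abs(R) < 1e-10:
            break
        # Jacobian dR/du_{n+1} (effective tangent stiffness)
        Kt = m/(beta*dt**2) + c*gamma/(beta*dt) + dfs(u1)
        u1 -= R/Kt
    u, v, a = u1, v1, a1

print("displacement after", nsteps*dt, "s:", u)
```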

  7. Robust Low Cost Liquid Rocket Combustion Chamber by Advanced Vacuum Plasma Process

    Science.gov (United States)

    Holmes, Richard; Elam, Sandra; Ellis, David L.; McKechnie, Timothy; Hickman, Robert; Rose, M. Franklin (Technical Monitor)

    2001-01-01

    Next-generation, regeneratively cooled rocket engines will require materials that can withstand high temperatures while retaining high thermal conductivity. Fabrication techniques must be cost efficient so that engine components can be manufactured within the constraints of shrinking budgets. Three technologies have been combined to produce an advanced liquid rocket engine combustion chamber at NASA-Marshall Space Flight Center (MSFC) using relatively low-cost, vacuum-plasma-spray (VPS) techniques. Copper alloy NARloy-Z was replaced with a new high performance Cu-8Cr-4Nb alloy developed by NASA-Glenn Research Center (GRC), which possesses excellent high-temperature strength, creep resistance, and low cycle fatigue behavior combined with exceptional thermal stability. Functional gradient technology, developed in building composite cartridges for space furnaces, was incorporated to add oxidation resistant and thermal barrier coatings as an integral part of the hot wall of the liner during the VPS process. NiCrAlY, utilized to produce a durable protective coating for the space shuttle high pressure fuel turbopump (HPFTP) turbine blades, was used as the functional gradient material coating (FGM). The FGM not only serves as a protection from oxidation or blanching, the main cause of engine failure, but also serves as a thermal barrier because of its lower thermal conductivity, reducing the temperature of the combustion liner by 200 F, from 1000 F to 800 F, producing longer life. The objective of this program was to develop and demonstrate the technology to fabricate high-performance, robust, inexpensive combustion chambers for advanced propulsion systems (such as Lockheed-Martin's VentureStar and NASA's Reusable Launch Vehicle, RLV) using the low-cost VPS process. VPS-formed combustion chamber test articles have been produced with the FGM hot wall built in and hot-fire tested, demonstrating for the first time a coating that will remain intact through the hot firing test, and with

  8. The Integrated Design Process (IDP)

    DEFF Research Database (Denmark)

    Hansen, Hanne Tine Ring; Knudstrup, Mary-Ann

    2005-01-01

    the different parameters and products can interact, and which consequences this would have on a project. The IDP does not ensure aesthetic or sustainable solutions, but it enables the designer to control the many parameters that must be considered and integrated in the project when creating more holistic...

  9. Integrating deep and shallow natural language processing components : representations and hybrid architectures

    OpenAIRE

    Schäfer, Ulrich

    2006-01-01

    We describe basic concepts and software architectures for the integration of shallow and deep (linguistics-based, semantics-oriented) natural language processing (NLP) components. The main goal of this novel, hybrid integration paradigm is improving robustness of deep processing. After an introduction to constraint-based natural language parsing, we give an overview of typical shallow processing tasks. We introduce XML standoff markup as an additional abstraction layer that eases integration ...

  10. Integrated control system for electron beam processes

    Science.gov (United States)

    Koleva, L.; Koleva, E.; Batchkova, I.; Mladenov, G.

    2018-03-01

    The ISO/IEC 62264 standard is widely used for integration of the business systems of a manufacturer with the corresponding manufacturing control systems based on hierarchical equipment models, functional data and manufacturing operations activity models. In order to achieve the integration of control systems, formal object communication models must be developed, together with manufacturing operations activity models, which coordinate the integration between different levels of control. In this article, the development of integrated control system for electron beam welding process is presented as part of a fully integrated control system of an electron beam plant, including also other additional processes: surface modification, electron beam evaporation, selective melting and electron beam diagnostics.

  11. Robust multi-site MR data processing: iterative optimization of bias correction, tissue classification, and registration.

    Science.gov (United States)

    Young Kim, Eun; Johnson, Hans J

    2013-01-01

    A robust multi-modal tool for automated registration, bias correction, and tissue classification has been implemented for large-scale heterogeneous multi-site longitudinal MR data analysis. This work focused on improving an iterative optimization framework between bias correction, registration, and tissue classification, inspired by previous work. The primary contributions are robustness improvements from the incorporation of four elements: (1) use of multi-modal and repeated scans, (2) incorporation of highly deformable registration, (3) an extended set of tissue definitions, and (4) multi-modal-aware intensity-context priors. The benefits of these enhancements were investigated through a series of experiments with both a simulated brain data set (BrainWeb) and highly heterogeneous data from a 32-site imaging study, with quality assessed through expert visual inspection. The implementation of this tool is tailored for, but not limited to, large-scale data processing with great data variation, and offers a flexible interface. In this paper, we describe enhancements to joint registration, bias correction, and tissue classification that improve the generalizability and robustness of processing multi-modal longitudinal MR scans collected at multiple sites. The tool was evaluated using both simulated and human subject MRI images. With these enhancements, the results showed improved robustness for large-scale heterogeneous MRI processing.

  12. Resilient and Robust High Performance Computing Platforms for Scientific Computing Integrity

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Yier [Univ. of Central Florida, Orlando, FL (United States)

    2017-07-14

    As technology advances, computer systems are subject to increasingly sophisticated cyber-attacks that compromise both their security and integrity. High performance computing platforms used in commercial and scientific applications involving sensitive, or even classified data, are frequently targeted by powerful adversaries. This situation is made worse by a lack of fundamental security solutions that both perform efficiently and are effective at preventing threats. Current security solutions fail to address the threat landscape and ensure the integrity of sensitive data. As challenges rise, both private and public sectors will require robust technologies to protect their computing infrastructure. The research outcomes from this project address these challenges. For example, we present LAZARUS, a novel technique to harden kernel Address Space Layout Randomization (KASLR) against paging-based side-channel attacks. In particular, our scheme allows for fine-grained protection of the virtual memory mappings that implement the randomization. We demonstrate the effectiveness of our approach by hardening a recent Linux kernel with LAZARUS, mitigating all of the previously presented side-channel attacks on KASLR. Our extensive evaluation shows that LAZARUS incurs only 0.943% overhead for standard benchmarks, and is therefore highly practical. We also introduced HA2lloc, a hardware-assisted allocator that is capable of leveraging an extended memory management unit to detect memory errors in the heap. We also perform testing using HA2lloc in a simulation environment and find that the approach is capable of preventing common memory vulnerabilities.

  13. A Robust Photogrammetric Processing Method of Low-Altitude UAV Images

    Directory of Open Access Journals (Sweden)

    Mingyao Ai

    2015-02-01

    Full Text Available Low-altitude Unmanned Aerial Vehicle (UAV) images, which include distortion, illumination variance, and large rotation angles, pose multiple challenges for image orientation and image processing. In this paper, a robust and convenient photogrammetric approach is proposed for processing low-altitude UAV images, involving a strip management method to automatically build a standardized regional aerial triangle (AT) network, a parallel inner orientation algorithm, a ground control points (GCPs) predicting method, and an improved Scale Invariant Feature Transform (SIFT) method to produce a large number of evenly distributed, reliable tie points for bundle adjustment (BA). A multi-view matching approach is improved to produce Digital Surface Models (DSM) and Digital Orthophoto Maps (DOM) for 3D visualization. Experimental results show that the proposed approach is robust and feasible for photogrammetric processing of low-altitude UAV images and 3D visualization of products.
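
    For orientation, the generic building blocks that the improved SIFT pipeline above refines look roughly like the OpenCV sketch below: SIFT keypoints, Lowe's ratio test and a RANSAC geometric check to retain reliable tie points. The frame file names are hypothetical, and this is not the paper's improved matcher (SIFT is available in the main opencv-python module from version 4.4 onward).

```python
# Generic tie-point generation between two overlapping UAV frames.
import cv2
import numpy as np

img1 = cv2.imread("uav_frame_001.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("uav_frame_002.jpg", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create(nfeatures=5000)
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Lowe's ratio test on the two nearest neighbours.
matcher = cv2.BFMatcher(cv2.NORM_L2)
good = [m for m, n in matcher.knnMatch(des1, des2, k=2) if m.distance < 0.75*n.distance]

pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# RANSAC homography as a coarse geometric filter; the surviving inliers are
# the tie points handed on to bundle adjustment.
H, mask = cv2.findHomography(pts1, pts2, cv2.RANSAC, ransacReprojThreshold=3.0)
inliers1, inliers2 = pts1[mask.ravel() == 1], pts2[mask.ravel() == 1]
print(f"{len(good)} ratio-test matches, {len(inliers1)} RANSAC inliers")
```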

  14. Integrating operation design into infrastructure planning to foster robustness of planned water systems

    Science.gov (United States)

    Bertoni, Federica; Giuliani, Matteo; Castelletti, Andrea

    2017-04-01

    Over the past years, many studies have looked at the planning and management of water infrastructure systems as two separate problems, where the dynamic component (i.e., operations) is considered only after the static problem (i.e., planning) has been resolved. Most recent works have started to investigate planning and management as two strictly interconnected faces of the same problem, where the former is solved jointly with the latter in an integrated framework. This brings advantages to multi-purpose water reservoir systems, where several optimal operating strategies exist and similar system designs might perform differently in the long term depending on the considered short-term operating tradeoff. An operationally robust design will therefore be one that performs well across multiple feasible tradeoff operating policies. This work aims at studying the interaction between short-term operating strategies and their impacts on long-term structural decisions, when long-lived infrastructures with complex ecological impacts and multi-sectoral demands to satisfy (i.e., reservoirs) are considered. A parametric reinforcement learning approach is adopted for nesting optimization and control, yielding both an optimal reservoir design and optimal operational policies for water reservoir systems. The method is demonstrated on a synthetic reservoir that must be designed and operated for ensuring reliable water supply to downstream users. At first, the optimal design capacity derived is compared with the 'no-fail storage' computed through Rippl, a capacity design function that returns the minimum storage needed to satisfy specified water demands without allowing supply shortfall. Then, the optimal reservoir volume is used to simulate the simplified case study under other operating objectives than water supply, in order to assess whether and how the system performance changes. The more robust the infrastructural design, the smaller the difference between the performances of

  15. An integrated framework of agent-based modelling and robust optimization for microgrid energy management

    International Nuclear Information System (INIS)

    Kuznetsova, Elizaveta; Li, Yan-Fu; Ruiz, Carlos; Zio, Enrico

    2014-01-01

    Highlights: • Microgrid composed of a train station, wind power plant and district is investigated. • Each player is modeled as an individual agent aiming at a particular goal. • Prediction Intervals quantify the uncertain operational and environmental parameters. • Optimal goal-directed actions planning is achieved with robust optimization. • Optimization framework improves system reliability and decreases power imbalances. - Abstract: A microgrid energy management framework for the optimization of individual objectives of microgrid stakeholders is proposed. The framework is exemplified by way of a microgrid that is connected to an external grid via a transformer and includes the following players: a middle-size train station with integrated photovoltaic power production system, a small energy production plant composed of urban wind turbines, and a surrounding district including residences and small businesses. The system is described by Agent-Based Modelling (ABM), in which each player is modelled as an individual agent aiming at a particular goal, (i) decreasing its expenses for power purchase or (ii) increasing its revenues from power selling. The context in which the agents operate is uncertain due to the stochasticity of operational and environmental parameters, and the technical failures of the renewable power generators. The uncertain operational and environmental parameters of the microgrid are quantified in terms of Prediction Intervals (PIs) by a Non-dominated Sorting Genetic Algorithm (NSGA-II) – trained Neural Network (NN). Under these uncertainties, each agent is seeking for optimal goal-directed actions planning by Robust Optimization (RO). The developed framework is shown to lead to an increase in system performance, evaluated in terms of typical reliability (adequacy) indicators for energy systems, such as Loss of Load Expectation (LOLE) and Loss of Expected Energy (LOEE), in comparison with optimal planning based on expected values of

  16. Integrated lunar materials manufacturing process

    Science.gov (United States)

    Gibson, Michael A. (Inventor); Knudsen, Christian W. (Inventor)

    1990-01-01

    A manufacturing plant and process for production of oxygen on the moon uses lunar minerals as feed and a minimum of earth-imported, process materials. Lunar feed stocks are hydrogen-reducible minerals, ilmenite and lunar agglutinates occurring in numerous, explored locations mixed with other minerals in the pulverized surface layer of lunar soil known as regolith. Ilmenite (FeTiO3) and agglutinates contain ferrous (Fe+2) iron reducible by hydrogen to yield H2O and metallic Fe at about 700-1,200 °C. The H2O is electrolyzed in gas phase to yield H2 for recycle and O2 for storage and use. Hydrogen losses to lunar vacuum are minimized, with no net hydrogen (or any other earth-derived reagent) consumption except for small leaks. Feed minerals are surface-mined by front shovels and transported in trucks to the processing area. The machines are manned or robotic. Ilmenite and agglutinates occur mixed with silicate minerals which are not hydrogen-reducible at 700-1,200 °C and consequently are separated and concentrated before feeding to the oxygen generation process. Solids rejected from the separation step and reduced solids from the oxygen process are returned to the mine area. The plant is powered by nuclear or solar power generators. Vapor-phase water electrolysis, a staged, countercurrent, fluidized bed reduction reactor and a radio-frequency-driven ceramic gas heater are used to improve thermal efficiency.

  17. Developing engineering processes through integrated modelling of product and process

    DEFF Research Database (Denmark)

    Nielsen, Jeppe Bjerrum; Hvam, Lars

    2012-01-01

    This article aims at developing an operational tool for integrated modelling of product assortments and engineering processes in companies making customer specific products. Integrating a product model in the design of engineering processes will provide a deeper understanding of the engineering...... activities as well as insight into how product features affect the engineering processes. The article suggests possible ways of integrating models of products with models of engineering processes. The models have been tested and further developed in an action research study carried out in collaboration...... with a major international engineering company....

  18. An Integrated Environment for Batch Process Development - From Recipe to Manufacture

    DEFF Research Database (Denmark)

    Batch process development involves the process of converting a chemical synthesis into an optimum, safe, robust, and economical process for manufacturing the chemical of desired quality at the ultimate desired scale. In this paper we describe a strategy for developing a set of integrated decision...

  19. Robust path planning for flexible needle insertion using Markov decision processes.

    Science.gov (United States)

    Tan, Xiaoyu; Yu, Pengqian; Lim, Kah-Bin; Chui, Chee-Kong

    2018-05-11

    The flexible needle has the potential to accurately navigate to a treatment region in the least invasive manner. We propose a new planning method using Markov decision processes (MDPs) for flexible needle navigation that can perform robust path planning and steering under complex tissue-needle interactions. This method enhances the robustness of flexible needle steering from three different perspectives. First, the method considers the problem caused by soft tissue deformation. The method then resolves the common needle penetration failure caused by patterns of targets, while the last solution addresses the uncertainty issues in flexible needle motion due to complex and unpredictable tissue-needle interaction. Computer simulation and phantom experimental results show that the proposed method can perform robust planning and generate a secure control policy for flexible needle steering. Compared with a traditional method using MDPs, the proposed method achieves higher accuracy and probability of success in avoiding obstacles under complicated and uncertain tissue-needle interactions. Future work will involve experiments with biological tissue in vivo. The proposed robust path planning method can securely steer a flexible needle within soft phantom tissues and achieve high adaptability in computer simulation.
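
    As a rough illustration of MDP-based planning under motion uncertainty, the grid-world sketch below runs value iteration for a "needle" whose intended step succeeds with probability 0.8 and deviates sideways otherwise. The grid layout, obstacles, rewards and transition probabilities are illustrative assumptions, not the tissue-needle interaction model of the paper.

```python
# Value iteration on a small grid MDP with uncertain steering.
import numpy as np

H, W = 8, 10
obstacles = {(3, 4), (3, 5), (4, 5), (5, 2)}
target = (7, 9)
actions = [(1, 0), (0, 1), (0, -1)]        # advance, steer left, steer right
p_success = 0.8

def step(s, a):
    r, c = s[0] + a[0], s[1] + a[1]
    return s if (r, c) in obstacles or not (0 <= r < H and 0 <= c < W) else (r, c)

def expected_value(s, a, V):
    """Expected next-state value: intended move with p_success, else a deviation."""
    others = [b for b in actions if b != a]
    ev = p_success*V[step(s, a)]
    ev += sum((1 - p_success)/len(others)*V[step(s, b)] for b in others)
    return ev

V = np.zeros((H, W))
V[target] = 100.0                           # terminal reward kept fixed
gamma = 0.95
for _ in range(500):                        # value iteration sweeps
    V_new = V.copy()
    for r in range(H):
        for c in range(W):
            s = (r, c)
            if s == target or s in obstacles:
                continue
            V_new[r, c] = max(-1 + gamma*expected_value(s, a, V) for a in actions)
    if np.max(np.abs(V_new - V)) < 1e-6:
        break
    V = V_new

best_first_step = max(actions, key=lambda a: expected_value((0, 0), a, V))
print("value at entry:", round(V[0, 0], 2), "first steering action:", best_first_step)
```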

  20. Hybrid robust deep and shallow semantic processing for creativity support in document production

    OpenAIRE

    Uszkoreit, Hans; Callmeier, Ulrich; Eisele, Andreas; Schäfer, Ulrich; Siegel, Melanie

    2004-01-01

    The research performed in the DeepThought project (http://www.project-deepthought.net) aims at demonstrating the potential of deep linguistic processing if added to existing shallow methods that ensure robustness. Classical information retrieval is extended by high precision concept indexing and relation detection. We use this approach to demonstrate the feasibility of three ambitious applications, one of which is a tool for creativity support in document production and collective brainstormi...

  1. Integrated durability process in product development

    International Nuclear Information System (INIS)

    Pompetzki, M.; Saadetian, H.

    2002-01-01

    This presentation describes the integrated durability process in product development. Each of the major components of the integrated process is described along with a number of examples of how integrated durability assessment has been used in the ground vehicle industry. The durability process starts with the acquisition of loading information, either physically through loads measurement or virtually through multibody dynamics. The loading information is then processed and characterized for further analysis. Durability assessment was historically test based and completed through field or laboratory evaluation. Today, it is common that both the test and CAE environments are used together in durability assessment. Test-based durability assessment is used for final design sign-off but is also critically important for correlating CAE models, in order to investigate design alternatives. There is also a major initiative today to integrate the individual components into a process, by linking applications and providing a framework to communicate information as well as manage all the data involved in the entire process. Although a single process is presented, the details of the process can vary significantly for different products and applications. Recent applications that highlight different parts of the durability process are given, as well as an example of how the integration of software tools across different disciplines (MBD, FE and fatigue) not only simplifies the process but also significantly improves it. (author)

  2. High-performance integrated virtual environment (HIVE): a robust infrastructure for next-generation sequence data analysis.

    Science.gov (United States)

    Simonyan, Vahan; Chumakov, Konstantin; Dingerdissen, Hayley; Faison, William; Goldweber, Scott; Golikov, Anton; Gulzar, Naila; Karagiannis, Konstantinos; Vinh Nguyen Lam, Phuc; Maudru, Thomas; Muravitskaja, Olesja; Osipova, Ekaterina; Pan, Yang; Pschenichnov, Alexey; Rostovtsev, Alexandre; Santana-Quintero, Luis; Smith, Krista; Thompson, Elaine E; Tkachenko, Valery; Torcivia-Rodriguez, John; Voskanian, Alin; Wan, Quan; Wang, Jing; Wu, Tsung-Jung; Wilson, Carolyn; Mazumder, Raja

    2016-01-01

    The High-performance Integrated Virtual Environment (HIVE) is a distributed storage and compute environment designed primarily to handle next-generation sequencing (NGS) data. This multicomponent cloud infrastructure provides secure web access for authorized users to deposit, retrieve, annotate and compute on NGS data, and to analyse the outcomes using web interface visual environments appropriately built in collaboration with research and regulatory scientists and other end users. Unlike many massively parallel computing environments, HIVE uses a cloud control server which virtualizes services, not processes. It is both very robust and flexible due to the abstraction layer introduced between computational requests and operating system processes. The novel paradigm of moving computations to the data, instead of moving data to computational nodes, has proven to be significantly less taxing for both hardware and network infrastructure. The honeycomb data model developed for HIVE integrates metadata into an object-oriented model. Its distinction from other object-oriented databases is in the additional implementation of a unified application program interface to search, view and manipulate data of all types. This model simplifies the introduction of new data types, thereby minimizing the need for database restructuring and streamlining the development of new integrated information systems. The honeycomb model employs a highly secure hierarchical access control and permission system, allowing determination of data access privileges in a finely granular manner without flooding the security subsystem with a multiplicity of rules. HIVE infrastructure will allow engineers and scientists to perform NGS analysis in a manner that is both efficient and secure. HIVE is actively supported in public and private domains, and project collaborations are welcomed. Database URL: https://hive.biochemistry.gwu.edu. © The Author(s) 2016. Published by Oxford University Press.

  3. Comparison of fMRI paradigms assessing visuospatial processing: Robustness and reproducibility.

    Directory of Open Access Journals (Sweden)

    Verena Schuster

    Full Text Available The development of brain imaging techniques, in particular functional magnetic resonance imaging (fMRI), made it possible to non-invasively study the hemispheric lateralization of cognitive brain functions in large cohorts. Comprehensive models of hemispheric lateralization are, however, still missing and should not only account for the hemispheric specialization of individual brain functions, but also for the interactions among different lateralized cognitive processes (e.g., language and visuospatial processing). This calls for robust and reliable paradigms to study hemispheric lateralization for various cognitive functions. While numerous reliable imaging paradigms have been developed for language, which represents the most prominent left-lateralized brain function, the reliability of imaging paradigms investigating typically right-lateralized brain functions, such as visuospatial processing, has received comparatively less attention. In the present study, we aimed to establish an fMRI paradigm that robustly and reliably identifies right-hemispheric activation evoked by visuospatial processing in individual subjects. In a first study, we therefore compared three frequently used paradigms for assessing visuospatial processing and evaluated their utility to robustly detect right-lateralized brain activity on a single-subject level. In a second study, we then assessed the test-retest reliability of the so-called Landmark task, the paradigm that yielded the most robust results in study 1. At the single-voxel level, we found poor reliability of the brain activation underlying visuospatial attention. This suggests that poor signal-to-noise ratios can become a limiting factor for test-retest reliability. This represents a common detriment of fMRI paradigms investigating visuospatial attention in general and therefore highlights the need for careful considerations of both the possibilities and limitations of the respective fMRI paradigm, in particular

  4. Model Identification of Integrated ARMA Processes

    Science.gov (United States)

    Stadnytska, Tetiana; Braun, Simone; Werner, Joachim

    2008-01-01

    This article evaluates the Smallest Canonical Correlation Method (SCAN) and the Extended Sample Autocorrelation Function (ESACF), automated methods for the Autoregressive Integrated Moving-Average (ARIMA) model selection commonly available in current versions of SAS for Windows, as identification tools for integrated processes. SCAN and ESACF can…

  5. Poisson processes and a Bessel function integral

    NARCIS (Netherlands)

    Steutel, F.W.

    1985-01-01

    The probability of winning a simple game of competing Poisson processes turns out to be equal to the well-known Bessel function integral J(x, y) (cf. Y. L. Luke, Integrals of Bessel Functions, McGraw-Hill, New York, 1962). Several properties of J, some of which seem to be new, follow quite easily

  6. Robust Myocardial Motion Tracking for Echocardiography: Variational Framework Integrating Local-to-Global Deformation

    Directory of Open Access Journals (Sweden)

    Chi Young Ahn

    2013-01-01

    Full Text Available This paper proposes a robust real-time myocardial border tracking algorithm for echocardiography. Commonly, after an initial contour of the LV border is traced at one or two frames from the entire cardiac cycle, LV contour tracking is performed over the remaining frames. Among a variety of tracking techniques, the optical flow method is the most widely used for motion estimation of moving objects. However, when echocardiography data is heavily corrupted in some local regions, the errors bring the tracking point out of the endocardial border, resulting in distorted LV contours. This shape distortion often occurs in practice since the data acquisition is affected by ultrasound artifacts, dropouts, or shadowing phenomena of cardiac walls. The proposed method is designed to deal with this shape distortion problem by integrating local optical flow motion and global deformation into a variational framework. The proposed descent method controls the individual tracking points to follow the local motions of a specific speckle pattern, while their overall motions are confined to the global motion constraint of being approximately an affine transform of the initial tracking points. Many real experiments show that the proposed method achieves better overall performance than conventional methods.
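
    The two ingredients combined by the variational tracker above, local optical flow for each tracking point and a global affine constraint that pulls outliers back toward the overall wall motion, can be sketched with off-the-shelf OpenCV calls as below. The frame file names, the hand-picked border points and the blending weight are illustrative assumptions; this is not the paper's descent scheme.

```python
# Local Lucas-Kanade tracking of border points, softly constrained by a
# robustly fitted global affine motion.
import cv2
import numpy as np

prev_frame = cv2.imread("echo_frame_000.png", cv2.IMREAD_GRAYSCALE)
next_frame = cv2.imread("echo_frame_001.png", cv2.IMREAD_GRAYSCALE)

# Initial LV border points (normally traced on the first frame).
border = np.array([[60, 120], [80, 100], [100, 95], [120, 100], [140, 120]],
                  dtype=np.float32).reshape(-1, 1, 2)

# Local motion: pyramidal Lucas-Kanade optical flow per tracking point.
tracked, status, err = cv2.calcOpticalFlowPyrLK(prev_frame, next_frame, border, None,
                                                winSize=(21, 21), maxLevel=3)

# Global motion: robust affine transform fitted to all point pairs.
A, inliers = cv2.estimateAffinePartial2D(border.reshape(-1, 2), tracked.reshape(-1, 2))
affine_pred = (border.reshape(-1, 2) @ A[:, :2].T) + A[:, 2]

# Blend local and global estimates; points with unreliable local flow lean on
# the affine prediction, which is the spirit of the local-to-global coupling.
w = np.where(status.ravel() == 1, 0.6, 0.0)[:, None]
constrained = w*tracked.reshape(-1, 2) + (1 - w)*affine_pred
print(np.round(constrained, 1))
```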

  7. Rapid, Sensitive, and Reusable Detection of Glucose by a Robust Radiofrequency Integrated Passive Device Biosensor Chip

    Science.gov (United States)

    Kim, Nam-Young; Adhikari, Kishor Kumar; Dhakal, Rajendra; Chuluunbaatar, Zorigt; Wang, Cong; Kim, Eun-Soo

    2015-01-01

    Tremendous demands for sensitive and reliable label-free biosensors have stimulated intensive research into developing miniaturized radiofrequency resonators for a wide range of biomedical applications. Here, we report the development of a robust, reusable radiofrequency resonator-based integrated passive device biosensor chip fabricated on a gallium arsenide substrate for the detection of glucose in water-glucose solutions and sera. As a result of the highly concentrated electromagnetic energy between the two divisions of an intertwined spiral inductor coupled with an interdigital capacitor, the proposed glucose biosensor chip exhibits linear detection ranges with high sensitivity at center frequency. This biosensor, which has a sensitivity of up to 199 MHz/(mg mL−1) and a short response time of less than 2 sec, exhibited an ultralow detection limit of 0.033 μM and a reproducibility of 0.61% relative standard deviation. In addition, the quantities derived from the measured S-parameters, such as the propagation constant (γ), impedance (Z), resistance (R), inductance (L), conductance (G) and capacitance (C), enabled the effective multi-dimensional detection of glucose. PMID:25588958

  8. A generalized integral fluctuation theorem for general jump processes

    International Nuclear Information System (INIS)

    Liu Fei; Ouyang Zhongcan; Luo Yupin; Huang Mingchang

    2009-01-01

    Using the Feynman-Kac and Cameron-Martin-Girsanov formulae, we obtain a generalized integral fluctuation theorem (GIFT) for discrete jump processes by constructing a time-invariable inner product. The existing discrete IFTs can be derived as its specific cases. A connection between our approach and the conventional time-reversal method is also established. Unlike the latter approach that has been extensively employed in the existing literature, our approach can naturally bring out the definition of a time reversal of a Markovian stochastic system. Additionally, we find that the robust GIFT usually does not result in a detailed fluctuation theorem. (fast track communication)

  9. An integrated biotechnology platform for developing sustainable chemical processes.

    Science.gov (United States)

    Barton, Nelson R; Burgard, Anthony P; Burk, Mark J; Crater, Jason S; Osterhout, Robin E; Pharkya, Priti; Steer, Brian A; Sun, Jun; Trawick, John D; Van Dien, Stephen J; Yang, Tae Hoon; Yim, Harry

    2015-03-01

    Genomatica has established an integrated computational/experimental metabolic engineering platform to design, create, and optimize novel high performance organisms and bioprocesses. Here we present our platform and its use to develop E. coli strains for production of the industrial chemical 1,4-butanediol (BDO) from sugars. A series of examples are given to demonstrate how a rational approach to strain engineering, including carefully designed diagnostic experiments, provided critical insights about pathway bottlenecks, byproducts, expression balancing, and commercial robustness, leading to a superior BDO production strain and process.

  10. Path Integral Formulation of Anomalous Diffusion Processes

    OpenAIRE

    Friedrich, Rudolf; Eule, Stephan

    2011-01-01

    We present the path integral formulation of a broad class of generalized diffusion processes. Employing the path integral we derive exact expressions for the path probability densities and joint probability distributions for the class of processes under consideration. We show that Continuous Time Random Walks (CTRWs) are included in our framework. A closed expression for the path probability distribution of CTRWs is found in terms of their waiting time distribution as the solution of a Dyson ...

  11. An Integrated Approach to Single-Leg Airline Revenue Management: The Role of Robust Optimization

    OpenAIRE

    Birbil, S.I.; Frenk, J.B.G.; Gromicho, J.A.S.; Zhang, S.

    2006-01-01

    In this paper we introduce robust versions of the classical static and dynamic single leg seat allocation models as analyzed by Wollmer, and Lautenbacher and Stidham, respectively. These robust models take into account the inaccurate estimates of the underlying probability distributions. As observed by simulation experiments it turns out that for these robust versions the variability compared to their classical counterparts is considerably reduced with a negligible decrease of av...
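
    For context, the classical (non-robust) static two-fare rule that papers like the one above take as a starting point is Littlewood's rule: protect seats for the high fare as long as the expected marginal high-fare revenue exceeds the low fare. The sketch below computes that protection level for an assumed Poisson demand; the fares, capacity and demand distribution are hypothetical, and the robust models in the paper replace the assumed distribution with an uncertainty set.

```python
# Littlewood's rule for the classical static two-fare seat-allocation problem.
from scipy.stats import poisson

capacity = 120
fare_high, fare_low = 400.0, 150.0
high_demand = poisson(mu=35)               # assumed high-fare demand distribution

# Protect the smallest number of seats y such that P(D_high > y) < fare_low/fare_high.
ratio = fare_low / fare_high
protection = next(y for y in range(capacity + 1) if high_demand.sf(y) < ratio)
booking_limit_low = capacity - protection
print(f"protect {protection} seats for the high fare; "
      f"accept at most {booking_limit_low} low-fare bookings")
```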

  12. Integrated Monitoring System of Production Processes

    Directory of Open Access Journals (Sweden)

    Oborski Przemysław

    2016-12-01

    Full Text Available An integrated monitoring system for discrete manufacturing processes is presented in the paper. A multilayer hardware and software reference model was developed. The original research answers industry needs for integrating the information flow in the production process. The reference model corresponds to a proposed data model based on a multilayer data tree that describes orders, products and processes and stores monitoring data. The elaborated models were implemented in the integrated monitoring system demonstrator developed in the project. It was built on multi-agent technology to assure high flexibility and openness in applying intelligent algorithms for data processing. Currently, based on the experience gained, an integrated monitoring system for a real production system is being developed. The article presents the main problems of monitoring integration, including the specificity of discrete production, data processing and the future application of Cyber-Physical Systems. The development of manufacturing systems increasingly relies on applying intelligent solutions to machine and production process control and monitoring. Connecting technical systems, machine tools and manufacturing process monitoring with advanced information processing seems to be one of the most important areas of near-future development. It will play an important role in the efficient operation and competitiveness of the whole production system. It is also an important area for the future application of Cyber-Physical Systems, which can radically improve the functionality of monitoring systems and reduce the cost of their implementation.

  13. On a Robust MaxEnt Process Regression Model with Sample-Selection

    Directory of Open Access Journals (Sweden)

    Hea-Jung Kim

    2018-04-01

    Full Text Available In a regression analysis, a sample-selection bias arises when a dependent variable is partially observed as a result of the sample selection. This study introduces a Maximum Entropy (MaxEnt) process regression model that assumes a MaxEnt prior distribution for its nonparametric regression function and finds that the MaxEnt process regression model includes the well-known Gaussian process regression (GPR) model as a special case. Then, this special MaxEnt process regression model, i.e., the GPR model, is generalized to obtain a robust sample-selection Gaussian process regression (RSGPR) model that deals with non-normal data in the sample selection. Various properties of the RSGPR model are established, including the stochastic representation, distributional hierarchy, and magnitude of the sample-selection bias. These properties are used in the paper to develop a hierarchical Bayesian methodology to estimate the model. This involves a simple and computationally feasible Markov chain Monte Carlo algorithm that avoids analytical or numerical derivatives of the log-likelihood function of the model. The performance of the RSGPR model in terms of the sample-selection bias correction, robustness to non-normality, and prediction, is demonstrated through results in simulations that attest to its good finite-sample performance.
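
    The abstract notes that the MaxEnt process model reduces to standard GPR as a special case. The sketch below shows that special case only, a plain Gaussian process regression with an RBF kernel implemented directly in numpy, with no sample-selection correction; the data and kernel hyperparameters are illustrative.

```python
# Minimal GPR with an RBF kernel (the special case mentioned in the abstract).
import numpy as np

def rbf(X1, X2, lengthscale=0.5, variance=1.0):
    d2 = (X1[:, None] - X2[None, :])**2
    return variance*np.exp(-0.5*d2/lengthscale**2)

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0, 5, 30))
y = np.sin(2*X) + rng.normal(0, 0.1, X.size)
Xs = np.linspace(0, 5, 200)                 # test inputs

noise = 0.1**2
K = rbf(X, X) + noise*np.eye(X.size)
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

Ks = rbf(X, Xs)
mean = Ks.T @ alpha                         # posterior predictive mean
v = np.linalg.solve(L, Ks)
var = np.diag(rbf(Xs, Xs)) - np.sum(v**2, axis=0)
print("predictive mean at x=2.5:", mean[np.argmin(np.abs(Xs - 2.5))])
print("max predictive std:", np.sqrt(var.max()))
```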

  14. An integrated approach to single-leg airline revenue management: The role of robust optimization

    NARCIS (Netherlands)

    S.I. Birbil (Ilker); J.B.G. Frenk (Hans); J.A.S. Gromicho (Joaquim); S. Zhang (Shuzhong)

    2006-01-01

    In this paper we introduce robust versions of the classical static and dynamic single leg seat allocation models as analyzed by Wollmer, and Lautenbacher and Stidham, respectively. These robust models take into account the inaccurate estimates of the underlying probability distributions.

  15. An Integrated Approach to Single-Leg Airline Revenue Management: The Role of Robust Optimization

    NARCIS (Netherlands)

    S.I. Birbil (Ilker); J.B.G. Frenk (Hans); J.A.S. Gromicho (Joaquim); S. Zhang (Shuzhong)

    2006-01-01

    In this paper we introduce robust versions of the classical static and dynamic single leg seat allocation models as analyzed by Wollmer, and Lautenbacher and Stidham, respectively. These robust models take into account the inaccurate estimates of the underlying probability distributions.

  16. Robust Manipulations of Pest Insect Behavior Using Repellents and Practical Application for Integrated Pest Management.

    Science.gov (United States)

    Wallingford, Anna K; Cha, Dong H; Linn, Charles E; Wolfin, Michael S; Loeb, Gregory M

    2017-10-01

    In agricultural settings, examples of effective control strategies using repellent chemicals in integrated pest management (IPM) are relatively scarce compared to those using attractants. This may be partly due to a poor understanding of how repellents affect insect behavior once they are deployed. Here we attempt to identify potential hallmarks of repellent stimuli that are robust enough for practical use in the field. We explore the literature for success stories using repellents in IPM and we investigate the mechanisms of repellency for two chemical oviposition deterrents for controlling Drosophila suzukii Matsumura, a serious pest of small fruit crops. Drosophila suzukii causes injury by laying her eggs in ripening fruit and resulting larvae make fruit unmarketable. In caged choice tests, reduced oviposition was observed in red raspberry fruit treated with volatile 1-octen-3-ol and geosmin at two initial concentrations (10% and 1%) compared to untreated controls. We used video monitoring to observe fly behavior in these caged choice tests and investigate the mode of action for deterrence through the entire behavioral repertoire leading to oviposition. We observed fewer visitors and more time elapsed before flies first landed on 1-octen-3-ol-treated fruits than control fruits and concluded that this odor primarily inhibits behaviors that occur before D. suzukii comes in contact with a potential oviposition substrate (precontact). We observed some qualitative differences in precontact behavior of flies around geosmin-treated fruits; however, we concluded that this odor primarily inhibits behaviors that occur after D. suzukii comes in contact with treated fruits (postcontact). Field trials found reduced oviposition in red raspberry treated with 1-octen-3-ol and a combination of 1-octen-3-ol and geosmin, but no effect of geosmin alone. Recommendations for further study of repellents for practical use in the field are discussed. © The Authors 2017. Published by

  17. Integration thermal processes through Pinch technology

    International Nuclear Information System (INIS)

    Rios H, Carlos Mario; Grisales Rincon, Rogelio; Cardona, Carlos Ariel

    2004-01-01

    This paper presents the techniques of heat integration used for process optimization; their strengths and weaknesses during implementation in several specific processes are also discussed. It focuses on pinch technology, explaining algorithms for applying the method in industry. The paper presents the concepts and models involved in the different types of commercial software that apply this method for energy cost reduction, both in the design of new plants and in the improvement of existing ones. As a complement to the benefits of energy cost reduction, other favourable aspects of process integration are analysed, such as the reduction of waste and emissions and combined heat and power systems.
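
    At the core of the pinch algorithms mentioned above is the problem-table (heat cascade) calculation of minimum utility targets. The sketch below applies it to a small, textbook-style four-stream example chosen for illustration (not data from the paper); for this stream set the targets come out at 20 kW of hot utility and 60 kW of cold utility.

```python
# Problem-table algorithm: minimum hot/cold utility targets for a stream set.
dt_min = 10.0
# (type, supply T [C], target T [C], CP [kW/K])
streams = [("hot", 170, 60, 3.0), ("hot", 150, 30, 1.5),
           ("cold", 20, 135, 2.0), ("cold", 80, 140, 4.0)]

def shifted(s):
    """Shift hot streams down and cold streams up by dt_min/2."""
    kind, ts, tt, cp = s
    d = -dt_min/2 if kind == "hot" else dt_min/2
    return kind, ts + d, tt + d, cp

shifted_streams = [shifted(s) for s in streams]
temps = sorted({t for _, ts, tt, _ in shifted_streams for t in (ts, tt)}, reverse=True)

# Net heat surplus of each shifted-temperature interval.
surpluses = []
for t_hi, t_lo in zip(temps[:-1], temps[1:]):
    net_cp = 0.0
    for kind, ts, tt, cp in shifted_streams:
        if min(ts, tt) <= t_lo and max(ts, tt) >= t_hi:   # stream spans interval
            net_cp += cp if kind == "hot" else -cp
    surpluses.append(net_cp*(t_hi - t_lo))

# Cascade the surpluses; the most negative running total sets Qh,min.
running, most_negative = 0.0, 0.0
for q in surpluses:
    running += q
    most_negative = min(most_negative, running)
q_hot_min = -most_negative
q_cold_min = q_hot_min + sum(surpluses)
print(f"minimum hot utility:  {q_hot_min:.1f} kW")
print(f"minimum cold utility: {q_cold_min:.1f} kW")
```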

  18. Information Integration; The process of integration, evolution and versioning

    NARCIS (Netherlands)

    de Keijzer, Ander; van Keulen, Maurice

    2005-01-01

    At present, many information sources are available wherever you are. Most of the time, the information needed is spread across several of those information sources. Gathering this information is a tedious and time-consuming job. Automating this process would assist the user in this task. Integration

  19. Integrating ergonomic knowledge into engineering design processes

    DEFF Research Database (Denmark)

    Hall-Andersen, Lene Bjerg

    Integrating ergonomic knowledge into engineering design processes has been shown to contribute to healthy and effective designs of workplaces. However, it is also well-recognized that, in practice, ergonomists often have difficulties gaining access to and impacting engineering design processes...... employed in the same company, constituted a supporting factor for the possibilities to integrate ergonomic knowledge into the engineering design processes. However, the integration activities remained discrete and only happened in some of the design projects. A major barrier was related to the business...... to the ergonomic ambitions of the clients. The ergonomists’ ability to navigate, act strategically, and compromise on ergonomic inputs is also important in relation to having an impact in the engineering design processes. Familiarity with the engineering design terminology and the setup of design projects seems...

  20. Decisional tool to assess current and future process robustness in an antibody purification facility.

    Science.gov (United States)

    Stonier, Adam; Simaria, Ana Sofia; Smith, Martin; Farid, Suzanne S

    2012-07-01

    Increases in cell culture titers in existing facilities have prompted efforts to identify strategies that alleviate purification bottlenecks while controlling costs. This article describes the application of a database-driven dynamic simulation tool to identify optimal purification sizing strategies and visualize their robustness to future titer increases. The tool harnessed the benefits of MySQL to capture the process, business, and risk features of multiple purification options and better manage the large datasets required for uncertainty analysis and optimization. The database was linked to a discrete-event simulation engine so as to model the dynamic features of biopharmaceutical manufacture and impact of resource constraints. For a given titer, the tool performed brute force optimization so as to identify optimal purification sizing strategies that minimized the batch material cost while maintaining the schedule. The tool was applied to industrial case studies based on a platform monoclonal antibody purification process in a multisuite clinical scale manufacturing facility. The case studies assessed the robustness of optimal strategies to batch-to-batch titer variability and extended this to assess the long-term fit of the platform process as titers increase from 1 to 10 g/L, given a range of equipment sizes available to enable scale intensification efforts. Novel visualization plots consisting of multiple Pareto frontiers with tie-lines connecting the position of optimal configurations over a given titer range were constructed. These enabled rapid identification of robust purification configurations given titer fluctuations and the facility limit that the purification suites could handle in terms of the maximum titer and hence harvest load. Copyright © 2012 American Institute of Chemical Engineers (AIChE).
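
    The brute-force sizing search and titer-robustness idea described above can be caricatured with a few lines of plain Python: enumerate candidate column sizes and cycle counts over a titer range, discard options that break capacity or scheduling constraints, and keep the cheapest feasible configuration per titer. All capacities, costs and constraints below are hypothetical placeholders, not the tool's database or cost model.

```python
# Toy brute-force purification sizing search across increasing titers.
import itertools
import math

diameters_cm = [60, 80, 100]                 # available column diameters
cycles_options = [1, 2, 3, 4]                # chromatography cycles per batch
binding_capacity = 50.0                      # g product per L resin (assumed)
bed_height_cm = 20.0
resin_cost_per_L = 10000.0                   # illustrative resin cost
max_cycles_in_shift = 3                      # scheduling constraint
batch_volume_L = 2000.0                      # harvest volume per batch

def column_volume_L(diameter_cm):
    return math.pi*(diameter_cm/2)**2*bed_height_cm/1000.0

for titer in [1, 2, 5, 10]:                  # g/L
    mass = titer*batch_volume_L              # g of product per batch
    feasible = []
    for d, n in itertools.product(diameters_cm, cycles_options):
        if n > max_cycles_in_shift:
            continue                          # would break the schedule
        if n*column_volume_L(d)*binding_capacity < mass:
            continue                          # not enough capacity for the harvest
        feasible.append((column_volume_L(d)*resin_cost_per_L, d, n))
    if feasible:
        cost, d, n = min(feasible)
        print(f"titer {titer:>2} g/L: {d} cm column, {n} cycles, resin cost ~{cost:,.0f}")
    else:
        print(f"titer {titer:>2} g/L: no feasible configuration -> facility limit reached")
```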

  1. A Robust Process Analytical Technology (PAT) System Design for Crystallization Processes

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli Bin; Sin, Gürkan; Gernaey, Krist

    2013-01-01

    A generic computer-aided framework for systematic design of a process monitoring and control system for crystallization processes has been developed to study various aspects of crystallization operations. The design framework contains a generic multidimensional modelling framework, a tool for gen...

  2. Designing Robust Process Analytical Technology (PAT) Systems for Crystallization Processes: A Potassium Dichromate Crystallization Case Study

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli Bin; Sin, Gürkan

    2013-01-01

    The objective of this study is to test and validate a Process Analytical Technology (PAT) system design on a potassium dichromate crystallization process in the presence of input uncertainties using uncertainty and sensitivity analysis. To this end a systematic framework for managing uncertaintie...

  3. Robust estimation of autoregressive processes using a mixture-based filter-bank

    Czech Academy of Sciences Publication Activity Database

    Šmídl, V.; Anthony, Q.; Kárný, Miroslav; Guy, Tatiana Valentine

    2005-01-01

    Roč. 54, č. 4 (2005), s. 315-323 ISSN 0167-6911 R&D Projects: GA AV ČR IBS1075351; GA ČR GA102/03/0049; GA ČR GP102/03/P010; GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords : Bayesian estimation * probabilistic mixtures * recursive estimation Subject RIV: BC - Control Systems Theory Impact factor: 1.239, year: 2005 http://library.utia.cas.cz/separaty/historie/karny-robust estimation of autoregressive processes using a mixture-based filter- bank .pdf

  4. Materials issues in silicon integrated circuit processing

    International Nuclear Information System (INIS)

    Wittmer, M.; Stimmell, J.; Strathman, M.

    1986-01-01

    The symposium on "Materials Issues in Integrated Circuit Processing" sought to bring together all of the materials issues pertinent to modern integrated circuit processing. The inherent properties of the materials are becoming an important concern in integrated circuit manufacturing, and accordingly research in materials science is vital for the successful implementation of modern integrated circuit technology. The session on Silicon Materials Science revealed the advanced state of knowledge reached in topics such as point defects, intrinsic and extrinsic gettering, and diffusion kinetics. Adaptation of this knowledge to specific integrated circuit processing technologies is beginning to be addressed. The session on Epitaxy included invited papers on epitaxial insulators and IR detectors. Heteroepitaxy on silicon is receiving great attention, and the results presented in this session suggest that 3-d integrated structures are an increasingly realistic possibility. Progress in low temperature silicon epitaxy and epitaxy of thin films with abrupt interfaces was also reported. Diffusion and Ion Implantation were well represented. Regrowth of implant-damaged layers and the nature of the defects which remain after regrowth were discussed in no less than seven papers. Substantial progress was also reported in the understanding of amorphising boron implants and the use of gallium implants for the formation of shallow p+ layers.

  5. A Robust Multivariable Feedforward/Feedback Controller Design for Integrated Power Control of Boiling Water Reactor Power Plants

    International Nuclear Information System (INIS)

    Shyu, S.-S.; Edwards, Robert M.

    2002-01-01

    In this paper, a methodology for synthesizing a robust multivariable feedforward/feedback control (FF/FBC) strategy is proposed for an integrated control of turbine power, throttle pressure, and reactor water level in a nuclear power plant. In the proposed method, the FBC is synthesized by the robust control approach. The feedforward control, which is generated via nonlinear programming, is added to the robust FBC system to further improve the control performance. The plant uncertainties, including unmodeled dynamics, linearization, and model reduction, are characterized and estimated. The comparisons of simulation responses based on a nonlinear reactor model demonstrate that the proposed controller achieves the specified performance and endurance under uncertainty. It is also important to note that all input variables are manipulated in an orchestrated manner in response to a single output's setpoint change.

  6. Robust Modelling of Heat and Mass Transfer in Processing of Solid Foods

    DEFF Research Database (Denmark)

    Feyissa, Aberham Hailu

    The study is focused on combined heat and mass transfer during processing of solid foods such as baking and frying processes. Modelling of heat and mass transfer during baking and frying is a significant scientific challenge. During baking and frying, the food undergoes several changes...... in microstructure and other physical properties of the food matrix. The heat and water transport inside the food is coupled in a complex way, which for some food systems it is not yet fully understood. A typical example of the latter is roasting of meat in convection oven, where the mechanism of water transport...... is unclear. Establishing the robust mathematical models describing the main mechanisms reliably is of great concern. A quantitative description of the heat and mass transfer during the solid food processing, in the form of mathematical equations, implementation of the solution techniques, and the value...

  7. Data-driven process decomposition and robust online distributed modelling for large-scale processes

    Science.gov (United States)

    Shu, Zhang; Lijuan, Li; Lijuan, Yao; Shipin, Yang; Tao, Zou

    2018-02-01

    With the increasing attention paid to networked control, system decomposition and distributed models are of significant importance in the implementation of model-based control strategies. In this paper, a data-driven system decomposition and online distributed subsystem modelling algorithm is proposed for large-scale chemical processes. The key controlled variables are first partitioned by the affinity propagation clustering algorithm into several clusters. Each cluster can be regarded as a subsystem. Then the inputs of each subsystem are selected by offline canonical correlation analysis between all process variables and its controlled variables. Process decomposition is then realised after the screening of input and output variables. When the system decomposition is finished, the online subsystem modelling can be carried out by recursively renewing the samples block-wise. The proposed algorithm was applied to the Tennessee Eastman process and its validity was verified.
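
    The two decomposition steps described above can be sketched with scikit-learn stand-ins; the synthetic data, the number of retained inputs and the variable names are hypothetical and only illustrate the clustering-then-CCA idea.

```python
# Illustrative sketch of the two decomposition steps: (1) cluster controlled
# variables by their correlation profiles, (2) rank candidate inputs for each
# subsystem by canonical correlation. Data and names are hypothetical.
import numpy as np
from sklearn.cluster import AffinityPropagation
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
latent = rng.standard_normal((500, 2))                     # two hidden process drivers
Y = np.column_stack([latent[:, 0] + 0.1 * rng.standard_normal(500) for _ in range(3)]
                    + [latent[:, 1] + 0.1 * rng.standard_normal(500) for _ in range(3)])
U = np.column_stack([latent + 0.2 * rng.standard_normal((500, 2)),
                     rng.standard_normal((500, 8))])       # candidate inputs

# Step 1: partition the six controlled variables into subsystems.
labels = AffinityPropagation(random_state=0).fit_predict(np.corrcoef(Y.T))

# Step 2: for each subsystem, keep the inputs with the strongest canonical correlation.
for k in np.unique(labels):
    Yk = Y[:, labels == k]
    scores = []
    for j in range(U.shape[1]):
        cca = CCA(n_components=1).fit(U[:, [j]], Yk)
        u_c, y_c = cca.transform(U[:, [j]], Yk)
        scores.append(abs(np.corrcoef(u_c.ravel(), y_c.ravel())[0, 1]))
    top_inputs = np.argsort(scores)[::-1][:3]
    print(f"subsystem {k}: outputs {np.where(labels == k)[0]}, top inputs {top_inputs}")
```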

  8. Pedagogic process modeling: Humanistic-integrative approach

    Directory of Open Access Journals (Sweden)

    Boritko Nikolaj M.

    2007-01-01

    The paper deals with some current problems of modeling the dynamics of the subject-features development of the individual. The term "process" is considered in the context of the humanistic-integrative approach, in which the principles of self education are regarded as criteria for efficient pedagogic activity. Four basic characteristics of the pedagogic process are pointed out: intentionality reflects logicality and regularity of the development of the process; discreteness (stageability) indicates qualitative stages through which the pedagogic phenomenon passes; nonlinearity explains the crisis character of pedagogic processes and reveals inner factors of self-development; situationality requires a selection of pedagogic conditions in accordance with the inner factors, which would enable steering the pedagogic process. Offered are two steps for singling out a particular stage and the algorithm for developing an integrative model for it. The suggested conclusions might be of use for further theoretic research, analyses of educational practices and for realistic predicting of pedagogical phenomena.

  9. Carbon Nanotube Integration with a CMOS Process

    Science.gov (United States)

    Perez, Maximiliano S.; Lerner, Betiana; Resasco, Daniel E.; Pareja Obregon, Pablo D.; Julian, Pedro M.; Mandolesi, Pablo S.; Buffa, Fabian A.; Boselli, Alfredo; Lamagna, Alberto

    2010-01-01

    This work shows the integration of a sensor based on carbon nanotubes using CMOS technology. A chip sensor (CS) was designed and manufactured using a 0.30 μm CMOS process, leaving a free window on the passivation layer that allowed the deposition of SWCNTs over the electrodes. We successfully investigated with the CS the effect of humidity and temperature on the electrical transport properties of SWCNTs. The possibility of a large scale integration of SWCNTs with CMOS process opens a new route in the design of more efficient, low cost sensors with high reproducibility in their manufacture. PMID:22319330

  10. Identification of a robust subpathway-based signature for acute myeloid leukemia prognosis using an miRNA integrated strategy.

    Science.gov (United States)

    Chang, Huijuan; Gao, Qiuying; Ding, Wei; Qing, Xueqin

    2018-01-01

    Acute myeloid leukemia (AML) is a heterogeneous disease, and survival signatures are urgently needed to better monitor treatment. MiRNAs play vital regulatory roles on their target genes and are necessarily involved in this complex disease. We therefore examined the expression levels of miRNAs and genes to identify robust signatures for survival benefit analyses. First, we reconstructed subpathway graphs by embedding miRNA components that were derived from low-throughput miRNA-gene interactions. Then, we randomly divided the data sets from The Cancer Genome Atlas (TCGA) into training and testing sets, and further formed 100 subsets based on the training set. Using each subset, we identified survival-related miRNAs and genes, and identified survival subpathways based on the reconstructed subpathway graphs. After statistical analyses of these survival subpathways, the most robust subpathways with the top three ranks were identified, and risk scores were calculated based on these robust subpathways for AML patient prognoses. Among these robust subpathways, three representative subpathways, path: 05200_10 from Pathways in cancer, path: 04110_20 from Cell cycle, and path: 04510_8 from Focal adhesion, were significantly associated with patient survival in the TCGA training and testing sets based on subpathway risk scores. In conclusion, we performed integrated analyses of miRNAs and genes to identify robust prognostic subpathways, and calculated subpathway risk scores to characterize AML patient survival.

  11. Process Integration Analysis of an Industrial Hydrogen Production Process

    OpenAIRE

    Stolten, Detlef; Grube, Thomas; Tock, Laurence; Maréchal, François; Metzger, Christian; Arpentinier, Philippe

    2010-01-01

    The energy efficiency of an industrial hydrogen production process using steam methane reforming (SMR) combined with the water gas shift reaction (WGS) is analyzed using process integration techniques based on heat cascade calculation and pinch analysis with the aim of identifying potential measures to enhance the process performance. The challenge is to satisfy the high temperature heat demand of the SMR reaction by minimizing the consumption of natural gas to feed the combustion and to expl...

  12. Improved process robustness by using closed loop control in deep drawing applications

    Science.gov (United States)

    Barthau, M.; Liewald, M.; Held, Christian

    2017-09-01

    The production of irregularly shaped deep-drawing parts with high quality requirements, which are common in today’s automotive production, permanently challenges production processes. High requirements on lightweight construction of passenger car bodies following European regulations until 2020 have been substantially increasing the use of high strength steels for years and are also leading to bigger challenges in sheet metal part production. Of course, the more and more complex shapes of today’s car body shells also intensify the issue due to modern and future design criteria. The metal forming technology tries to meet these challenges by developing a highly sophisticated layout of deep drawing dies that considers part quality requirements, process robustness and controlled material flow during the deep or stretch drawing process phase. A new method for controlling material flow using a closed loop system was developed at the IFU Stuttgart. In contrast to previous approaches, this new method allows a control intervention during the deep-drawing stroke. The blank holder force around the outline of the drawn part is used as the control variable. The closed loop is designed as a trajectory follow-up with feedforward control. The command variable used is the part-wall stress, which is measured with a piezo-electric measuring pin. In this paper the control loop used will be described in detail. The experimental tool that was built for testing the new control approach is explained here with its features. A method for gaining the follow-up trajectories from simulation will also be presented. Furthermore, experimental results considering the robustness of the deep drawing process and the gain in process performance with the developed control loop will be shown. Finally, a new procedure for the industrial application of the new control method of deep drawing will be presented by using a new kind of active element to influence the local blank holder pressure onto part
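
    A minimal sketch of the loop structure described in this record, assuming a feedforward blank holder force trajectory plus a proportional feedback correction on the measured part-wall stress; the toy process model, gain and numbers are hypothetical.

```python
# Hypothetical feedforward + feedback loop for blank holder force during a stroke.
import numpy as np

rng = np.random.default_rng(7)
n_steps = 100
stress_ref = np.linspace(50.0, 180.0, n_steps)            # command trajectory (MPa), assumed
force_feedforward = np.linspace(200.0, 350.0, n_steps)    # kN, e.g. from an offline simulation

def plant(force_kN, step):
    """Toy process model: wall stress responds to blank holder force plus drift and noise."""
    return 0.5 * force_kN + 0.2 * step + rng.normal(0.0, 2.0)

k_p = 1.5                                                  # assumed proportional gain
force = np.zeros(n_steps)
stress = np.zeros(n_steps)
for t in range(n_steps):
    applied = force_feedforward[0] if t == 0 else force[t - 1]
    stress[t] = plant(applied, t)                          # measured part-wall stress
    # closed-loop correction applied during the drawing stroke
    force[t] = force_feedforward[t] + k_p * (stress_ref[t] - stress[t])

print("final tracking error:", stress_ref[-1] - stress[-1])
```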

  13. Maximum likelihood estimation for integrated diffusion processes

    DEFF Research Database (Denmark)

    Baltazar-Larios, Fernando; Sørensen, Michael

    We propose a method for obtaining maximum likelihood estimates of parameters in diffusion models when the data is a discrete time sample of the integral of the process, while no direct observations of the process itself are available. The data are, moreover, assumed to be contaminated...... EM-algorithm to obtain maximum likelihood estimates of the parameters in the diffusion model. As part of the algorithm, we use a recent simple method for approximate simulation of diffusion bridges. In simulation studies for the Ornstein-Uhlenbeck process and the CIR process the proposed method works...... by measurement errors. Integrated volatility is an example of this type of observations. Another example is ice-core data on oxygen isotopes used to investigate paleo-temperatures. The data can be viewed as incomplete observations of a model with a tractable likelihood function. Therefore we propose a simulated...
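
    The observation scheme described above (integrals of an unobserved diffusion contaminated by measurement error) can be illustrated for an Ornstein-Uhlenbeck process; this sketch only generates such data and is not the authors' EM estimator. All parameter values are assumed.

```python
# Simulate an OU process on a fine grid, integrate it over sampling intervals,
# and add measurement error -- the data type described in the record above.
import numpy as np

rng = np.random.default_rng(1)
theta, mu, sigma = 0.5, 0.0, 0.3       # assumed OU parameters
dt, n_fine, thin = 0.01, 10_000, 100   # fine grid and observations every `thin` steps
tau = 0.05                             # assumed measurement-error standard deviation

x = np.empty(n_fine)
x[0] = mu
for i in range(1, n_fine):
    # Euler-Maruyama step for dX = theta*(mu - X) dt + sigma dW
    x[i] = x[i - 1] + theta * (mu - x[i - 1]) * dt + sigma * np.sqrt(dt) * rng.standard_normal()

# Observations: integrals of X over each sampling interval, plus noise
integrals = x.reshape(-1, thin).sum(axis=1) * dt
observations = integrals + tau * rng.standard_normal(integrals.size)
print(observations[:5])
```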

  14. Integration Process for the Habitat Demonstration Unit

    Science.gov (United States)

    Gill, Tracy; Merbitz, Jerad; Kennedy, Kriss; Tri, Terry; Howe, A. Scott

    2010-01-01

    The Habitat Demonstration Unit (HDU) is an experimental exploration habitat technology and architecture test platform designed for analog demonstration activities. The HDU project has required a team to integrate a variety of contributions from NASA centers and outside collaborators and poses a challenge in integrating these disparate efforts into a cohesive architecture. To complete the development of the HDU from conception in June 2009 to rollout for operations in July 2010, a cohesive integration strategy has been developed to integrate the various systems of HDU and the payloads, such as the Geology Lab, that those systems will support. The utilization of interface design standards and uniquely tailored reviews have allowed for an accelerated design process. Scheduled activities include early fit-checks and the utilization of a Habitat avionics test bed prior to equipment installation into HDU. A coordinated effort to utilize modeling and simulation systems has aided in design and integration concept development. Modeling tools have been effective in hardware systems layout, cable routing and length estimation, and human factors analysis. Decision processes on the shell development, including the assembly sequence and the transportation, have been fleshed out early on HDU to maximize the efficiency of both integration and field operations. Incremental test operations leading up to an integrated systems test allow for an orderly systems test program. The HDU will begin its journey as an emulation of a Pressurized Excursion Module (PEM) for 2010 field testing and then may evolve to a Pressurized Core Module (PCM) for 2011 and later field tests, depending on agency architecture decisions. The HDU deployment will vary slightly from current lunar architecture plans to include developmental hardware and software items and additional systems called opportunities for technology demonstration. One of the HDU challenges has been designing to be prepared for the integration of

  15. Rigorous, robust and systematic: Qualitative research and its contribution to burn care. An integrative review.

    Science.gov (United States)

    Kornhaber, Rachel Anne; de Jong, A E E; McLean, L

    2015-12-01

    Qualitative methods are progressively being implemented by researchers for exploration within healthcare. However, there has been a longstanding and wide-ranging debate concerning the relative merits of qualitative research within the health care literature. This integrative review aimed to examine the contribution of qualitative research in burns care and subsequent rehabilitation. Studies were identified using an electronic search strategy using the databases PubMed, Cumulative Index of Nursing and Allied Health Literature (CINAHL), Excerpta Medica database (EMBASE) and Scopus of peer reviewed primary research in English between 2009 and April 2014, using Whittemore and Knafl's integrative review method as a guide for analysis. From the 298 papers identified, 26 research papers met the inclusion criteria. Across all studies there was an average of 22 participants involved in each study, with a range of 6-53 participants; the studies were conducted across 12 nations and focussed on burns prevention, paediatric burns, appropriate acquisition and delivery of burns care, pain and psychosocial implications of burns trauma. Careful and rigorous application of qualitative methodologies promotes and enriches the development of burns knowledge. In particular, the key elements in qualitative methodological process and its publication are critical in disseminating credible and methodologically sound qualitative research. Copyright © 2015 Elsevier Ltd and ISBI. All rights reserved.

  16. Integrated biological, chemical and physical processes kinetic ...

    African Journals Online (AJOL)

    ... for C and N removal, only gas and liquid phase processes were considered for this integrated model. ... kLA value for the aeration system, which affects the pH in the anoxic and aerobic reactors through CO2 gas exchange. ... Water SA Vol.

  17. Integrating Leadership Processes: Redefining the Principles Course.

    Science.gov (United States)

    Neff, Bonita Dostal

    2002-01-01

    Revamps the principles of a public relations course, the first professional course in the public relations sequence, by integrating a leadership process and a service-learning component. Finds that more students are reflecting the interpersonal and team skills desired in the 1998 national study on public relations. (SG)

  18. Polycation-mediated integrated cell death processes

    DEFF Research Database (Denmark)

    Parhamifar, Ladan; Andersen, Helene; Wu, Linping

    2014-01-01

    standard. PEIs are highly efficient transfectants, but depending on their architecture and size they induce cytotoxicity through different modes of cell death pathways. Here, we briefly review dynamic and integrated cell death processes and pathways, and discuss considerations in cell death assay design...

  19. The Integration Order of Vector Autoregressive Processes

    DEFF Research Database (Denmark)

    Franchi, Massimo

    We show that the order of integration of a vector autoregressive process is equal to the difference between the multiplicity of the unit root in the characteristic equation and the multiplicity of the unit root in the adjoint matrix polynomial. The equivalence with the standard I(1) and I(2...

  20. Process integration of organic Rankine cycle

    International Nuclear Information System (INIS)

    Desai, Nishith B.; Bandyopadhyay, Santanu

    2009-01-01

    An organic Rankine cycle (ORC) uses an organic fluid as the working medium within a Rankine cycle power plant. ORC offers advantages over the conventional Rankine cycle with water as the working medium, as ORC generates shaft-work from low to medium temperature heat sources with higher thermodynamic efficiency. Dry and isentropic fluids are the most preferred working fluids for the ORC. The basic ORC can be modified by incorporating both regeneration and turbine bleeding to improve its thermal efficiency. In this paper, 16 different organic fluids have been analyzed as working media for the basic as well as modified ORCs. A methodology is also proposed for appropriate integration and optimization of an ORC as a cogeneration process with the background process to generate shaft-work. It has been illustrated that the choice of cycle configuration for appropriate integration with the background process depends on the heat rejection profile of the background process (i.e., the shape of the below-pinch portion of the process grand composite curve). The benefits of integrating ORC with the background process and the applicability of the proposed methodology have been demonstrated through illustrative examples.

  1. A model for optimization of process integration investments under uncertainty

    International Nuclear Information System (INIS)

    Svensson, Elin; Stroemberg, Ann-Brith; Patriksson, Michael

    2011-01-01

    The long-term economic outcome of energy-related industrial investment projects is difficult to evaluate because of uncertain energy market conditions. In this article, a general, multistage, stochastic programming model for the optimization of investments in process integration and industrial energy technologies is proposed. The problem is formulated as a mixed-binary linear programming model where uncertainties are modelled using a scenario-based approach. The objective is to maximize the expected net present value of the investments, which enable heat savings and decreased energy imports or increased energy exports at an industrial plant. The proposed modelling approach enables a long-term planning of industrial, energy-related investments through the simultaneous optimization of immediate and later decisions. The stochastic programming approach is also suitable for modelling possibly complex process integration constraints. The general model formulation presented here is a suitable basis for more specialized case studies dealing with optimization of investments in energy efficiency. -- Highlights: → Stochastic programming approach to long-term planning of process integration investments. → Extensive mathematical model formulation. → Multi-stage investment decisions and scenario-based modelling of uncertain energy prices. → Results illustrate how investments made now affect later investment and operation opportunities. → Approach for evaluation of robustness with respect to variations in probability distribution.
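
    A minimal two-stage, scenario-based sketch in the spirit of this model class (binary investment decisions, scenario-weighted objective) is given below using PuLP; the options, prices, savings and budget are hypothetical and this is not the paper's formulation.

```python
# Hypothetical scenario-based mixed-binary investment model (requires PuLP).
import pulp

options = {"heat_exchanger": 500.0, "heat_pump": 1200.0}           # investment cost
scenarios = {"low_price": 0.3, "high_price": 0.7}                   # scenario probabilities
annual_saving = {                                                   # saving per option and scenario
    ("heat_exchanger", "low_price"): 80.0,  ("heat_exchanger", "high_price"): 160.0,
    ("heat_pump", "low_price"): 150.0,      ("heat_pump", "high_price"): 400.0,
}
years, budget = 10, 1500.0

model = pulp.LpProblem("process_integration_investment", pulp.LpMaximize)
invest = pulp.LpVariable.dicts("invest", options, cat="Binary")

# Objective: expected savings over the horizon minus investment cost (undiscounted for brevity)
model += pulp.lpSum(scenarios[s] * years * annual_saving[(o, s)] * invest[o]
                    for o in options for s in scenarios) \
         - pulp.lpSum(options[o] * invest[o] for o in options)

# Budget constraint on total investment
model += pulp.lpSum(options[o] * invest[o] for o in options) <= budget

model.solve(pulp.PULP_CBC_CMD(msg=False))
print({o: int(invest[o].value()) for o in options})
```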

  2. A Framework for Robust Multivariable Optimization of Integrated Circuits in Space Applications

    Science.gov (United States)

    DuMonthier, Jeffrey; Suarez, George

    2013-01-01

    Application Specific Integrated Circuit (ASIC) design for space applications involves multiple challenges of maximizing performance, minimizing power and ensuring reliable operation in extreme environments. This is a complex multidimensional optimization problem which must be solved early in the development cycle of a system due to the time required for testing and qualification severely limiting opportunities to modify and iterate. Manual design techniques which generally involve simulation at one or a small number of corners with a very limited set of simultaneously variable parameters in order to make the problem tractable are inefficient and not guaranteed to achieve the best possible results within the performance envelope defined by the process and environmental requirements. What is required is a means to automate design parameter variation, allow the designer to specify operational constraints and performance goals, and to analyze the results in a way which facilitates identifying the tradeoffs defining the performance envelope over the full set of process and environmental corner cases. The system developed by the Mixed Signal ASIC Group (MSAG) at the Goddard Space Flight Center is implemented as framework of software modules, templates and function libraries. It integrates CAD tools and a mathematical computing environment, and can be customized for new circuit designs with only a modest amount of effort as most common tasks are already encapsulated. Customization is required for simulation test benches to determine performance metrics and for cost function computation. Templates provide a starting point for both while toolbox functions minimize the code required. Once a test bench has been coded to optimize a particular circuit, it is also used to verify the final design. The combination of test bench and cost function can then serve as a template for similar circuits or be re-used to migrate the design to different processes by re-running it with the

  3. A simulation-based robust biofuel facility location model for an integrated bio-energy logistics network

    Directory of Open Access Journals (Sweden)

    Jae-Dong Hong

    2014-10-01

    Purpose: The purpose of this paper is to propose a simulation-based robust biofuel facility location model for solving an integrated bio-energy logistics network (IBLN) problem, where biomass yield is often uncertain or difficult to determine. Design/methodology/approach: The IBLN considered in this paper consists of four different facilities: farm or harvest site (HS), collection facility (CF), biorefinery (BR), and blending station (BS). The authors propose a mixed integer quadratic modeling approach to simultaneously determine the optimal CF and BR locations and corresponding biomass and bio-energy transportation plans. The authors randomly generate the biomass yield of each HS, find the optimal locations of CFs and BRs for each generated biomass yield, and select the robust locations of CFs and BRs to show the effects of biomass yield uncertainty on the optimality of CF and BR locations. Case studies using data from the State of South Carolina in the United States are conducted to demonstrate the developed model’s capability to better handle the impact of uncertainty of biomass yield. Findings: The results illustrate that the robust location model for BRs and CFs works very well in terms of the total logistics costs. The proposed model would help decision-makers find the most robust locations for biorefineries and collection facilities, which usually require huge investments, and would assist potential investors in identifying the least cost or important facilities to invest in the biomass and bio-energy industry. Originality/value: An optimal biofuel facility location model is formulated for the case of deterministic biomass yield. To improve the robustness of the model for cases with probabilistic biomass yield, the model is evaluated by a simulation approach using case studies. The proposed model and robustness concept would be a very useful tool that helps potential biofuel investors minimize their investment risk.

  4. Robustness of the Process of Nucleoid Exclusion of Protein Aggregates in Escherichia coli

    Science.gov (United States)

    Neeli-Venkata, Ramakanth; Martikainen, Antti; Gupta, Abhishekh; Gonçalves, Nadia; Fonseca, Jose

    2016-01-01

    Escherichia coli segregates protein aggregates to the poles by nucleoid exclusion. Combined with cell divisions, this generates heterogeneous aggregate distributions in subsequent cell generations. We studied the robustness of this process with differing medium richness and antibiotic stress, which affect nucleoid size, using multimodal, time-lapse microscopy of live cells expressing both a fluorescently tagged chaperone (IbpA), which identifies in vivo the location of aggregates, and HupA-mCherry, a fluorescent variant of a nucleoid-associated protein. We find that the relative sizes of the nucleoid's major and minor axes change widely, in a positively correlated fashion, with medium richness and antibiotic stress. The aggregate's distribution along the major cell axis also changes between conditions and in agreement with the nucleoid exclusion phenomenon. Consequently, the fraction of aggregates at the midcell region prior to cell division differs between conditions, which will affect the degree of asymmetries in the partitioning of aggregates between cells of future generations. Finally, from the location of the peak of anisotropy in the aggregate displacement distribution, the nucleoid relative size, and the spatiotemporal aggregate distribution, we find that the exclusion of detectable aggregates from midcell is most pronounced in cells with mid-sized nucleoids, which are most common under optimal conditions. We conclude that the aggregate management mechanisms of E. coli are significantly robust but are not immune to stresses due to the tangible effect that these have on nucleoid size. IMPORTANCE Escherichia coli segregates protein aggregates to the poles by nucleoid exclusion. From live single-cell microscopy studies of the robustness of this process to various stresses known to affect nucleoid size, we find that nucleoid size and aggregate preferential locations change concordantly between conditions. Also, the degree of influence of the nucleoid

  5. Robust electrochemical analysis of As(III) integrating with interference tests: A case study in groundwater

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Zhong-Gang [Nanomaterials and Environmental Detection Laboratory, Hefei Institutes of Physical Science, Chinese Academy of Sciences, Hefei 230031 (China); Department of Chemistry, University of Science and Technology of China, Hefei 230026 (China); Chen, Xing; Liu, Jin-Huai [Nanomaterials and Environmental Detection Laboratory, Hefei Institutes of Physical Science, Chinese Academy of Sciences, Hefei 230031 (China); Huang, Xing-Jiu, E-mail: xingjiuhuang@iim.ac.cn [Nanomaterials and Environmental Detection Laboratory, Hefei Institutes of Physical Science, Chinese Academy of Sciences, Hefei 230031 (China); Department of Chemistry, University of Science and Technology of China, Hefei 230026 (China)

    2014-08-15

    Graphical abstract: - Highlights: • Robust determination of As(III) in Togtoh water samples has been demonstrated. • The results were comparable to those obtained by ICP–AES. • No obvious interference was observed after a series of interference tests. • Robust stability was obtained in long-term measurements. - Abstract: In the Togtoh region of Inner Mongolia, northern China, groundwater contaminated with high concentrations of As (greater than 50 μg L⁻¹) causes increasing concern. This work demonstrates an electrochemical protocol for robust (efficient and accurate) determination of As(III) in Togtoh water samples using a Au microwire electrode without the need for pretreatment or clean-up steps. Considering the complicated conditions of Togtoh water, the efficiency of the Au microwire electrode was systematically evaluated by a series of interference tests and stability and reproducibility measurements. No obvious interference on the determination of As(III) was observed. In particular, the influence of humic acid (HA) was intensively investigated. Electrode stability was also observed in long-term measurements (70 days) in Togtoh water solution and under different temperatures (0–35 °C). Excellent reproducibility (RSD: 1.28%) was observed from different batches of Au microwire electrodes. The results obtained at the Au microwire electrode were comparable to those obtained by inductively coupled plasma atomic emission spectroscopy (ICP–AES), indicating good accuracy. These evaluations (efficiency, robustness, and accuracy) demonstrated that the Au microwire electrode is able to determine As(III) in application to real environmental samples.

  6. Study of robust thin film PT-1000 temperature sensors for cryogenic process control applications

    Science.gov (United States)

    Ramalingam, R.; Boguhn, D.; Fillinger, H.; Schlachter, S. I.; Süßer, M.

    2014-01-01

    In some cryogenic process measurement applications, for example, in hydrogen technology and in high temperature superconductor based generators, there is a need for robust temperature sensors. These sensors should be able to measure the large temperature range of 20-500 K with reasonable resolution and accuracy. Thin film PT 1000 sensors could be a choice to cover this large temperature range. Twenty-one sensors selected from the same production batch were tested for their temperature sensitivity, which was then compared with that of sensors from different batches. Furthermore, the sensors' stability was studied by subjecting the sensors to repeated temperature cycles of 78-525 K. Deviations in the resistance were investigated using ice point calibration and water triple point calibration methods. In addition, studies of directionally oriented intense static magnetic field effects up to 8 Oersted (Oe) were conducted to understand the sensors' magnetoresistance behaviour in the cryogenic temperature range from 77 K to 15 K. This paper reports all investigation results in detail.

  7. Integrating ergonomics into the product development process

    DEFF Research Database (Denmark)

    Broberg, Ole

    1997-01-01

    and production engineers regarding information sources in problem solving, communication pattern, perception of ergonomics, motivation and requests to support tools and methods. These differences and the social and organizational contexts of the development process must be taken into account when considering......A cross-sectional case study was performed in a large company producing electro-mechanical products for industrial application. The purpose was to elucidate conditions and strategies for integrating ergonomics into the product development process thereby preventing ergonomic problems at the time...... of manufacture of new products. In reality the product development process is not a rational problem solving process and does not proceed in a sequential manner as decribed in engineering models. Instead it is a complex organizational process involving uncertainties, iterative elements and negotiation between...

  8. Audiovisual integration in speech perception: a multi-stage process

    DEFF Research Database (Denmark)

    Eskelund, Kasper; Tuomainen, Jyrki; Andersen, Tobias

    2011-01-01

    investigate whether the integration of auditory and visual speech observed in these two audiovisual integration effects are specific traits of speech perception. We further ask whether audiovisual integration is undertaken in a single processing stage or multiple processing stages....

  9. An integrated model for supplier selection process

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    In today's highly competitive manufacturing environment, the supplier selection process becomes one of the crucial activities in supply chain management. In order to select the best supplier(s) it is necessary not only to continuously track and benchmark the performance of suppliers but also to make a tradeoff between tangible and intangible factors, some of which may conflict. In this paper an integration of case-based reasoning (CBR), analytical network process (ANP) and linear programming (LP) is proposed to solve the supplier selection problem.

  10. The Efficient Separations and Processing Integrated Program

    International Nuclear Information System (INIS)

    Kuhn, W.L.; Gephart, J.M.

    1994-08-01

    The Efficient Separations and Processing Integrated Program (ESPIP) was created in 1991 to identify, develop, and perfect separations technologies and processes to treat wastes and address environmental problems throughout the US Department of Energy (DOE) complex. The ESPIP funds several multiyear tasks that address high-priority waste remediation problems involving high-level, low-level, transuranic, hazardous, and mixed (radioactive and hazardous) wastes. The ESPIP supports applied R&D leading to demonstration or use of these separations technologies by other organizations within DOE's Office of Environmental Restoration and Waste Management. Examples of current ESPIP-funded separations technologies are described here.

  11. Robust processing of intracranial CT angiograms for 3D volume rendering

    International Nuclear Information System (INIS)

    Moore, E.A.; Grieve, J.P.; Jaeger, H.R.; Univ. Dept. of Neurosurgery, London

    2001-01-01

    The goal of this study was to develop a robust and simple technique for processing of cranial CT angiograms (CTA) in the clinical setting. The method described in this paper involves segmentation of the bone, then dilation of the skull by adding three or four layers of voxels. This dilated skull is subtracted from the vessels object on a voxel-by-voxel basis, allowing segmentation and subsequent display of the vessels only. For evaluation of the technique, three groups of operators processed one CTA, and the quality of the 3D views obtained and the times taken were compared. One group was given training by an expert and a "recipe" for guidance, the second was given only the "recipe," and the third group consisted of expert operators. All operators were able to produce good or acceptable shaded-surface displays when compared with digital subtraction angiography, within 10 min for experienced users, an average of 17 min for trained operators and 26 min for those using only the recipe sheet. Using a simple scoring system for the appearance of feeding vessels and draining veins, no significant differences were found between the three levels of training and experience. This technique simplifies the processing of CTAs and is quick enough to make such examinations part of a routine clinical service. (orig.)

  12. A robust post-processing workflow for datasets with motion artifacts in diffusion kurtosis imaging.

    Science.gov (United States)

    Li, Xianjun; Yang, Jian; Gao, Jie; Luo, Xue; Zhou, Zhenyu; Hu, Yajie; Wu, Ed X; Wan, Mingxi

    2014-01-01

    The aim of this study was to develop a robust post-processing workflow for motion-corrupted datasets in diffusion kurtosis imaging (DKI). The proposed workflow consisted of brain extraction, rigid registration, distortion correction, artifact rejection, spatial smoothing and tensor estimation. Rigid registration was utilized to correct misalignments. Motion artifacts were rejected by using the local Pearson correlation coefficient (LPCC). The performance of LPCC in characterizing relative differences between artifacts and artifact-free images was compared with that of the conventional correlation coefficient in 10 randomly selected DKI datasets. The influence of rejected artifacts, with information on gradient directions and b values, on the parameter estimation was investigated by using the mean square error (MSE). The variance of noise was used as the criterion for MSEs. The clinical practicality of the proposed workflow was evaluated by the image quality and measurements in regions of interest on 36 DKI datasets, including 18 artifact-free (18 pediatric subjects) and 18 motion-corrupted datasets (15 pediatric subjects and 3 essential tremor patients). The relative difference between artifacts and artifact-free images calculated by LPCC was larger than that calculated by the conventional correlation coefficient. The proposed workflow improved the image quality and reduced the measurement biases significantly on motion-corrupted datasets. The workflow was reliable in improving the image quality and the measurement precision of the derived parameters on motion-corrupted DKI datasets. The workflow provides an effective post-processing method for clinical applications of DKI in subjects with involuntary movements.
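
    A simplified sketch of correlation-based artifact screening: a single global Pearson correlation per volume is used here as a stand-in for the local (patch-wise) coefficient described above, and the threshold and data are hypothetical.

```python
# Flag motion-corrupted slices by their correlation with an artifact-free reference.
import numpy as np

rng = np.random.default_rng(2)
reference = rng.random((64, 64))                         # artifact-free reference slice
volumes = [reference + 0.05 * rng.random((64, 64)) for _ in range(9)]
volumes.append(np.roll(reference, 15, axis=0))           # one simulated motion-corrupted slice

def slice_correlation(a, b):
    return np.corrcoef(a.ravel(), b.ravel())[0, 1]

threshold = 0.9                                          # assumed rejection threshold
for i, vol in enumerate(volumes):
    r = slice_correlation(reference, vol)
    print(f"volume {i}: r = {r:.3f} -> {'reject' if r < threshold else 'keep'}")
```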

  13. Information processing in the transcriptional regulatory network of yeast: Functional robustness

    Directory of Open Access Journals (Sweden)

    Dehmer Matthias

    2009-03-01

    Background: Gene networks are considered to represent various aspects of molecular biological systems meaningfully because they naturally provide a systems perspective of molecular interactions. In this respect, the functional understanding of the transcriptional regulatory network is considered as key to elucidating the functional organization of an organism. Results: In this paper we study the functional robustness of the transcriptional regulatory network of S. cerevisiae. We model the information processing in the network as a first order Markov chain and study the influence of single gene perturbations on the global, asymptotic communication among genes. Modification in the communication is measured by an information theoretic measure, allowing the prediction of genes that are 'fragile' with respect to single gene knockouts. Our results demonstrate that the predicted set of fragile genes contains a statistically significant enrichment of so-called essential genes that are experimentally found to be necessary to ensure vital yeast. Further, a structural analysis of the transcriptional regulatory network reveals that there are significant differences between fragile genes, hub genes and genes with a high betweenness centrality value. Conclusion: Our study not only demonstrates that a combination of graph theoretical, information theoretical and statistical methods leads to meaningful biological results but also that such methods allow the study of information processing in gene networks instead of just their structural properties.
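
    The analysis style described above can be illustrated on a toy network: a first-order Markov chain (random walk) is built from a small directed graph, and the change in its stationary distribution after a single-node knockout serves as a fragility score. The network and the divergence measure (total variation) are illustrative choices, not the authors' actual data or measure.

```python
# Toy single-node knockout analysis on a random walk over a directed network.
import numpy as np

adj = np.array([            # hypothetical regulatory network (rows regulate columns)
    [0, 1, 1, 0, 0],
    [0, 0, 1, 1, 0],
    [0, 0, 0, 1, 1],
    [1, 0, 0, 0, 1],
    [1, 1, 0, 0, 0],
], dtype=float)

def stationary(A):
    P = A / A.sum(axis=1, keepdims=True)          # row-stochastic transition matrix
    vals, vecs = np.linalg.eig(P.T)
    pi = np.abs(np.real(vecs[:, np.argmax(np.real(vals))]))
    return pi / pi.sum()

pi_full = stationary(adj)
for g in range(adj.shape[0]):
    keep = [i for i in range(adj.shape[0]) if i != g]
    sub = adj[np.ix_(keep, keep)]
    if (sub.sum(axis=1) == 0).any():
        print(f"gene {g}: knockout disconnects the walk")
        continue
    pi_sub = stationary(sub)
    # total variation distance between stationary distributions (restricted to kept genes)
    score = 0.5 * np.abs(pi_full[keep] / pi_full[keep].sum() - pi_sub).sum()
    print(f"gene {g}: fragility score {score:.3f}")
```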

  14. Integrated system for automated financial document processing

    Science.gov (United States)

    Hassanein, Khaled S.; Wesolkowski, Slawo; Higgins, Ray; Crabtree, Ralph; Peng, Antai

    1997-02-01

    A system was developed that integrates intelligent document analysis with multiple character/numeral recognition engines in order to achieve high accuracy automated financial document processing. In this system, images are accepted in both their grayscale and binary formats. A document analysis module starts by extracting essential features from the document to help identify its type (e.g. personal check, business check, etc.). These features are also utilized to conduct a full analysis of the image to determine the location of interesting zones such as the courtesy amount and the legal amount. These fields are then made available to several recognition knowledge sources such as courtesy amount recognition engines and legal amount recognition engines through a blackboard architecture. This architecture allows all the available knowledge sources to contribute incrementally and opportunistically to the solution of the given recognition query. Performance results on a test set of machine printed business checks using the integrated system are also reported.

  15. Process Analytical Technology (PAT): batch-to-batch reproducibility of fermentation processes by robust process operational design and control.

    Science.gov (United States)

    Gnoth, S; Jenzsch, M; Simutis, R; Lübbert, A

    2007-10-31

    The Process Analytical Technology (PAT) initiative of the FDA is a reaction to the increasing discrepancy between current possibilities in process supervision and control of pharmaceutical production processes and their current application in industrial manufacturing processes. With rigid approval practices based on standard operational procedures, adaptations of production reactors towards the state of the art were more or less inhibited for many years. Now PAT paves the way for continuous process and product improvements through improved process supervision based on knowledge-based data analysis, "Quality-by-Design" concepts, and, finally, through feedback control. Examples of up-to-date implementations of this concept are presented. They are taken from one key group of processes in recombinant pharmaceutical protein manufacturing, the cultivations of genetically modified Escherichia coli bacteria.

  16. Energy optimization of integrated process plants

    Energy Technology Data Exchange (ETDEWEB)

    Sandvig Nielsen, J

    1996-10-01

    A general approach for viewing process synthesis as an evolutionary process is proposed. Each step is taken according to the present level of information and knowledge. This is formulated in a Process Synthesis Cycle. Initially the synthesis is conducted at a high abstraction level, maximizing the use of heuristics (prior experience, rules of thumb, etc.). When further knowledge and information are available, heuristics will gradually be replaced by exact problem formulations. The principles in the Process Synthesis Cycle are used to develop a general procedure for energy synthesis, based on available tools. The procedure is based on efficient use of process simulators with integrated pinch capabilities (energy targeting). The proposed general procedure is tailored to three specific problems (Humid Air Turbine power plant synthesis, Nitric Acid process synthesis and Sulphuric Acid synthesis). Using the procedure reduces the problem dimension considerably and thus allows for faster evaluation of more alternatives. At a more detailed level, a new framework for the Heat Exchanger Network synthesis problem is proposed. The new framework is object oriented, based on a general functional description of all elements potentially present in the heat exchanger network (streams, exchangers, pumps, furnaces, etc.). (LN) 116 refs.

  17. A Robust Gold Deconvolution Approach for LiDAR Waveform Data Processing to Characterize Vegetation Structure

    Science.gov (United States)

    Zhou, T.; Popescu, S. C.; Krause, K.; Sheridan, R.; Ku, N. W.

    2014-12-01

    Increasing attention has been paid in the remote sensing community to the next generation Light Detection and Ranging (lidar) waveform data systems for extracting information on topography and the vertical structure of vegetation. However, processing waveform lidar data raises some challenges compared to analyzing discrete return data. The overall goal of this study was to present a robust deconvolution algorithm, the Gold algorithm, used to deconvolve waveforms in a lidar dataset acquired within a 60 x 60 m study area located in the Harvard Forest in Massachusetts. The waveform lidar data was collected by the National Ecological Observatory Network (NEON). Specific objectives were to: (1) explore advantages and limitations of various waveform processing techniques to derive topography and canopy height information; (2) develop and implement a novel deconvolution algorithm, the Gold algorithm, to extract elevation and canopy metrics; and (3) compare results and assess accuracy. We modeled lidar waveforms with a mixture of Gaussian functions using the nonlinear least squares (NLS) algorithm implemented in R and derived a Digital Terrain Model (DTM) and canopy height. We compared our waveform-derived topography and canopy height measurements using the Gold deconvolution algorithm to results using the Richardson-Lucy algorithm. Our findings show that the Gold algorithm performed better than the Richardson-Lucy algorithm in terms of recovering the hidden echoes and detecting false echoes for generating a DTM, which indicates that the Gold algorithm could potentially be applied to processing of waveform lidar data to derive information on terrain elevation and canopy characteristics.
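
    An illustrative Python sketch of the Gaussian-mixture waveform fit mentioned above (the study used nonlinear least squares in R); the synthetic two-return waveform and the starting values are hypothetical.

```python
# Fit a sum of two Gaussians to a synthetic lidar waveform with nonlinear least squares.
import numpy as np
from scipy.optimize import curve_fit

def two_gaussians(t, a1, mu1, s1, a2, mu2, s2):
    return (a1 * np.exp(-0.5 * ((t - mu1) / s1) ** 2)
            + a2 * np.exp(-0.5 * ((t - mu2) / s2) ** 2))

t = np.arange(0, 100, 1.0)                       # time bins
rng = np.random.default_rng(3)
waveform = two_gaussians(t, 1.0, 30, 4, 0.6, 55, 6) + 0.02 * rng.standard_normal(t.size)

p0 = [0.8, 25, 5, 0.5, 60, 5]                    # rough starting values
params, _ = curve_fit(two_gaussians, t, waveform, p0=p0)
a1, mu1, s1, a2, mu2, s2 = params
# Echo separation approximates canopy-top-to-ground distance after range scaling.
print(f"echo centres: {mu1:.1f} and {mu2:.1f} bins, separation {abs(mu2 - mu1):.1f}")
```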

  18. Microwave plasmatrons for giant integrated circuit processing

    Energy Technology Data Exchange (ETDEWEB)

    Petrin, A.B.

    2000-02-01

    A method for calculating the interaction of a powerful microwave with a plane layer of magnetoactive low-pressure plasma under conditions of electron cyclotron resonance is presented. In this paper, the plasma layer is situated between a plane dielectric layer and a plane metal screen. The calculation model contains the microwave energy balance, particle balance, and electron energy balance. The equation that expresses the microwave properties of nonuniform magnetoactive plasma is found. The numerical calculations of the microwave-plasma interaction for a one-dimensional model of the problem are considered. Applications of the results for microwave plasmatrons designed for processing giant integrated circuits are suggested.

  19. Testing for Level Shifts in Fractionally Integrated Processes: a State Space Approach

    DEFF Research Database (Denmark)

    Monache, Davide Delle; Grassi, Stefano; Santucci de Magistris, Paolo

    Short memory models contaminated by level shifts have similar long-memory features as fractionally integrated processes. This makes it hard to verify whether the true data generating process is a pure fractionally integrated process when employing standard estimation methods based on the autocorrelation function or the periodogram. In this paper, we propose a robust testing procedure, based on an encompassing parametric specification that allows the level shifts to be disentangled from the fractionally integrated component. The estimation is carried out on the basis of a state-space methodology and it leads to a robust estimate of the fractional integration parameter also in the presence of level shifts. Once the memory parameter is correctly estimated, we use the KPSS test for the presence of level shifts. The Monte Carlo simulations show how this approach produces unbiased estimates of the memory parameter...
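
    The final step described above can be illustrated with an off-the-shelf KPSS test applied to a short series containing a level shift; the series and shift size are hypothetical and the memory-parameter estimation step is omitted.

```python
# Apply a KPSS-type stationarity test to a white-noise series with a mid-sample level shift.
import numpy as np
from statsmodels.tsa.stattools import kpss

rng = np.random.default_rng(4)
n = 400
series = rng.standard_normal(n)
series[n // 2:] += 1.5          # hypothetical level shift half-way through the sample

stat, p_value, lags, crit = kpss(series, regression="c", nlags="auto")
print(f"KPSS statistic = {stat:.3f}, p-value = {p_value:.3f}")
# A small p-value rejects level stationarity, consistent with a shift in mean.
```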

  20. Parallel processing of structural integrity analysis codes

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.

    1996-01-01

    Structural integrity analysis plays an important role in assessing and demonstrating the safety of nuclear reactor components. This analysis is performed using analytical tools such as the Finite Element Method (FEM) with the help of digital computers. The complexity of the problems involved in nuclear engineering demands high speed computation facilities to obtain solutions in a reasonable amount of time. Parallel processing systems such as ANUPAM provide an efficient platform for realising the high speed computation. The development and implementation of software on parallel processing systems is an interesting and challenging task. The data and algorithm structure of the codes plays an important role in exploiting the parallel processing system capabilities. Structural analysis codes based on FEM can be divided into two categories with respect to their implementation on parallel processing systems. The first category codes, such as those used for harmonic analysis and mechanistic fuel performance codes, do not require the parallelisation of individual modules of the codes. The second category of codes, such as conventional FEM codes, require parallelisation of individual modules. In this category, parallelisation of the equation solution module poses major difficulties. Different solution schemes such as the domain decomposition method (DDM), parallel active column solver and substructuring method are currently used on parallel processing systems. Two codes, FAIR and TABS, belonging to each of these categories have been implemented on ANUPAM. The implementation details of these codes and the performance of different equation solvers are highlighted. (author). 5 refs., 12 figs., 1 tab

  1. An integrated approach for prioritized process improvement.

    Science.gov (United States)

    Vanteddu, Gangaraju; McAllister, Charles D

    2014-01-01

    The purpose of this paper is to propose an integrated framework to simultaneously identify and improve healthcare processes that are important from the healthcare provider's and patient's perspectives. A modified quality function deployment (QFD) chart is introduced to the field of healthcare quality assurance. A healthcare service example is used to demonstrate the utility of the proposed chart. The proposed framework is versatile and can be used in a wide variety of healthcare quality improvement contexts wherein two different perspectives need to be considered for identifying and improving critical healthcare processes. The modified QFD chart used in conjunction with the stacked Pareto chart will facilitate the identification of key performance metrics from the patient's and the hospital's perspectives. Subsequently, the chief contributory factors at different levels are identified in a very efficient manner. Healthcare quality improvement professionals will be able to use the proposed modified QFD chart in association with the stacked Pareto chart for effective quality assurance.

  2. An Enhanced Data Integrity Model In Mobile Cloud Environment Using Digital Signature Algorithm And Robust Reversible Watermarking

    Directory of Open Access Journals (Sweden)

    Boukari Souley

    2017-10-01

    The increasing use of handheld devices such as smartphones to access multimedia content in the cloud is growing with the rise of information technology. Mobile cloud computing is increasingly used today because it allows users to access a variety of resources in the cloud, such as images, video, audio and software applications, with minimal use of their inbuilt resources, such as storage memory, by relying on those available in the cloud. The major challenge faced in mobile cloud computing is security. Watermarking and digital signatures are techniques used to provide security and authentication for user data in the cloud. Watermarking is a technique used to embed digital data within multimedia content such as an image, video or audio file in order to prevent unauthorized access to that content by intruders, whereas a digital signature is used to identify and verify user data when accessed. In this work we implemented a digital signature and robust reversible image watermarking in order to enhance mobile cloud computing security and data integrity by providing double authentication techniques. The results obtained show the effectiveness of combining the two techniques, robust reversible watermarking and digital signature, in providing strong authentication that ensures data integrity and allows the original watermarked content to be extracted without changes.
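
    A minimal sketch of the digital-signature half of such a scheme, using the Python cryptography package with RSA-PSS; the paper's exact signature algorithm, key sizes and watermarking step are not assumed here.

```python
# Sign multimedia bytes before upload and verify their integrity on retrieval.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

image_bytes = b"...watermarked multimedia content uploaded to the cloud..."

signature = private_key.sign(
    image_bytes,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

# The receiving device verifies integrity before using the content.
try:
    public_key.verify(
        signature,
        image_bytes,
        padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
        hashes.SHA256(),
    )
    print("signature valid: content integrity confirmed")
except Exception:
    print("signature invalid: content was modified")
```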

  3. Globalization and Integration Processes in Europe

    Directory of Open Access Journals (Sweden)

    Beti Godnič

    2017-03-01

    Research Question (RQ): In the article we highlight the issue of whether integration processes in the European Union are only a manifestation of globalization processes and whether there are differences between the old EU member states (EU-15) and the new EU member states in the changed micro and macro environment. Purpose: We wanted to determine how the old EU member states (EU-15) and the new EU member states adapt to the new circumstances and other changes in the micro and macro environment. Method: Analysing the complexity of the changes in the state of the economic system, and the complex fundamental global processes which have occurred over a long period of time, requires supplementing the pure scientific approach with other types of research work and a more holistic approach, which is commonly used in comparative economics. We have taken such an approach in this article. Results: In the article we studied the geopolitical changes in the micro and macro environment. We found that the development in the old EU member states (EU-15) and in the new EU member states is different. The EU has not adopted a harmonised economic policy that would solve the »North-South« problem, reach cross-state cultural consensus and find a way to operate systemically in the global environment. Organization: The findings can be used to support understanding of the micro and macro environment of companies and contribute to better strategic planning and design of the entire supply chain. Society: The findings can contribute to a better understanding of integrative processes in the EU. Limitations/Future Research: The complexity of the problem and the dynamic changes in the functioning of the global market require in-depth study of changes in the micro and macro environment of logistics companies.

  4. Process-integrated slag treatment; Prozessintegrierte Schlackebehandlung

    Energy Technology Data Exchange (ETDEWEB)

    Koralewska, R.; Faulstich, M. [Technische Univ., Garching (Germany). Lehrstuhl fuer Wasserguete- und Abfallwirtschaft

    1998-09-01

    The present study compares two methods of washing waste incineration slag, one with water only, and one which uses additives during wet deslagging. The presented aggregate offers ideal conditions for process-integrated slag treatment. The paper gives a schematic description of the integrated slag washing process. The washing liquid serves to wash out the readily soluble constituents and remove the fines, while the additives are for immobilising heavy metals in the slag material. The study is based on laboratory and semi-technical trials on the wet chemical treatment of grate slag with addition of carbon dioxide and phosphoric acid.

  5. An Integrated Membrane Process for Butenes Production

    Directory of Open Access Journals (Sweden)

    Leonardo Melone

    2016-11-01

    Full Text Available Iso-butene is an important material for the production of chemicals and polymers. It can take part in various chemical reactions, such as hydrogenation, oxidation and other additions, owing to the presence of a reactive double bond. It is usually obtained as a by-product of a petroleum refinery, by Fluidized Catalytic Cracking (FCC) of naphtha or gas-oil. However, an interesting alternative to iso-butene production is n-butane dehydroisomerization, which allows the direct conversion of n-butane via dehydrogenation and successive isomerization. In this work, a simulation analysis of an integrated membrane system is proposed for the production and recovery of butenes. The dehydroisomerization of n-butane to iso-butene takes place in a membrane reactor where the hydrogen is removed from the reaction side with a Pd/Ag alloy membrane. Afterwards, the retentate and permeate post-processing is performed in membrane separation units for butenes concentration and recovery. Four different process schemes are developed. The performance of each membrane unit is analyzed by appropriately developed performance maps, to identify the operating condition windows and the membrane permeation properties required to maximize the recovery of the iso-butene produced. An analysis of the integrated systems showed a butenes yield higher than that of the other reaction products, with high butenes recovery in the gas separation section and molar concentrations between 75% and 80%.

  6. Manufacturing Process for OLED Integrated Substrate

    Energy Technology Data Exchange (ETDEWEB)

    Hung, Cheng-Hung [Vitro Flat Glass LLC, Cheswick, PA (United States). Glass Technology Center

    2017-03-31

    The main objective of this project was to develop a low-cost integrated substrate for rigid OLED solid-state lighting produced at a manufacturing scale. The integrated substrates could include combinations of a soda lime glass substrate, a light extraction layer, and an anode layer (i.e., Transparent Conductive Oxide, TCO). Over the 3+ year course of the project, the scope of work was revised to focus on the development of a glass substrate with an internal light extraction (IEL) layer. A manufacturing-scale float glass on-line particle embedding process capable of producing an IEL glass substrate having a thickness of less than 1.7 mm and an area larger than 500 mm x 400 mm was demonstrated. Substrates measuring 470 mm x 370 mm were used in the OLED manufacturing process for fabricating OLED lighting panels in single-pixel devices as large as 120.5 mm x 120.5 mm. The measured light extraction efficiency (calculated as external quantum efficiency, EQE) for on-line produced IEL samples (>50%) met the project's initial goal.

  7. Heat integration in multipurpose batch plants using a robust scheduling framework

    CSIR Research Space (South Africa)

    Seid, ER

    2014-07-01

    Full Text Available This case study was taken from the petrochemical plant by Kallrath [46] and used as a benchmark problem in the scheduling environment for multipurpose batch plants. We adapted this case study to incorporate energy integration. The recipe representation...

  8. Robust iterative learning control for multi-phase batch processes: an average dwell-time method with 2D convergence indexes

    Science.gov (United States)

    Wang, Limin; Shen, Yiteng; Yu, Jingxian; Li, Ping; Zhang, Ridong; Gao, Furong

    2018-01-01

    In order to cope with system disturbances in multi-phase batch processes with different dimensions, a hybrid robust control scheme combining iterative learning control with feedback control is proposed in this paper. First, with a hybrid iterative learning control law designed by introducing the state error, the tracking error and the extended information, the multi-phase batch process is converted into a two-dimensional Fornasini-Marchesini (2D-FM) switched system with different dimensions. Second, a switching signal is designed using the average dwell-time method integrated with the related switching conditions to give sufficient conditions ensuring stable running of the system. Finally, the minimum running time of the subsystems and the control law gains are calculated by solving the linear matrix inequalities. Meanwhile, a compound 2D controller with robust performance is obtained, which includes robust extended feedback control ensuring that the steady-state tracking error converges rapidly. An application to an injection molding process demonstrates the effectiveness and superiority of the proposed strategy.
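
    As a minimal illustration of the iterative learning idea this record builds on (not the paper's 2D-FM switched-system design, which requires solving LMIs), the sketch below applies a plain P-type learning update to a hypothetical first-order batch process; all gains and dimensions are assumed values.

```python
import numpy as np

# Minimal sketch (not the paper's 2D-FM/LMI design): a P-type iterative
# learning control update u_{k+1}(t) = u_k(t) + L * e_k(t+1) applied to a
# hypothetical first-order discrete batch process x(t+1) = a*x(t) + b*u(t).
a, b, L = 0.9, 0.5, 1.2          # plant and learning gains (assumed values)
T, n_batches = 50, 30            # batch length and number of batch runs
ref = np.ones(T + 1)             # desired trajectory over one batch
u = np.zeros(T)                  # control input, refined batch after batch

for k in range(n_batches):
    x = np.zeros(T + 1)
    for t in range(T):
        x[t + 1] = a * x[t] + b * u[t]
    e = ref - x                  # tracking error for this batch
    u = u + L * e[1:]            # learning update using next-step error
    print(f"batch {k:2d}  max|error| = {np.abs(e[1:]).max():.4f}")
```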

  9. Direct integration of intensity-level data from Affymetrix and Illumina microarrays improves statistical power for robust reanalysis

    Directory of Open Access Journals (Sweden)

    Turnbull Arran K

    2012-08-01

    Full Text Available Abstract Background Affymetrix GeneChips and Illumina BeadArrays are the most widely used commercial single channel gene expression microarrays. Public data repositories are an extremely valuable resource, providing array-derived gene expression measurements from many thousands of experiments. Unfortunately many of these studies are underpowered and it is desirable to improve power by combining data from more than one study; we sought to determine whether platform-specific bias precludes direct integration of probe intensity signals for combined reanalysis. Results Using Affymetrix and Illumina data from the microarray quality control project, from our own clinical samples, and from additional publicly available datasets we evaluated several approaches to directly integrate intensity level expression data from the two platforms. After mapping probe sequences to Ensembl genes we demonstrate that ComBat and cross-platform normalisation (XPN) significantly outperform mean-centering and distance-weighted discrimination (DWD) in terms of minimising inter-platform variance. In particular we observed that DWD, a popular method used in a number of previous studies, removed systematic bias at the expense of genuine biological variability, potentially reducing legitimate biological differences from integrated datasets. Conclusion Normalised and batch-corrected intensity-level data from Affymetrix and Illumina microarrays can be directly combined to generate biologically meaningful results with improved statistical power for robust, integrated reanalysis.
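
    For context, a minimal sketch of the simplest integration strategy compared in this study, per-platform gene-wise mean-centering, applied to synthetic intensity matrices (ComBat and XPN, which performed better, require dedicated packages); all matrix sizes and values are hypothetical.

```python
import numpy as np

# Sketch of per-platform, gene-wise mean-centering, the simplest of the
# integration methods compared in the study.  Rows are Ensembl genes,
# columns are samples; values are hypothetical log2 intensities after
# mapping probes of both platforms to common genes.
rng = np.random.default_rng(0)
affy = rng.normal(loc=8.0, scale=1.0, size=(1000, 20))   # Affymetrix samples
illu = rng.normal(loc=6.5, scale=1.2, size=(1000, 15))   # Illumina samples

def center_by_gene(x):
    """Subtract each gene's mean across samples within one platform."""
    return x - x.mean(axis=1, keepdims=True)

combined = np.hstack([center_by_gene(affy), center_by_gene(illu)])
platform = np.array(["affy"] * 20 + ["illumina"] * 15)
print(combined.shape, platform.shape)   # (1000, 35) (35,)
```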

  10. A novel robust proportional-integral (PI) adaptive observer design for chaos synchronization

    International Nuclear Information System (INIS)

    Pourgholi Mahdi; Majd Vahid Johari

    2011-01-01

    In this paper, chaos synchronization in the presence of parameter uncertainty, observer gain perturbation and exogenous input disturbance is considered. A nonlinear non-fragile proportional-integral (PI) adaptive observer is designed for the synchronization of chaotic systems, and its stability conditions based on the Lyapunov technique are derived. By converting these conditions into a linear matrix inequality (LMI), the observer proportional and integral gains are optimally selected from the solutions satisfying the stability conditions such that the effect of the disturbance on the synchronization error is minimized. To show the effectiveness of the proposed method, simulation results for the synchronization of a Lorenz chaotic system with unknown parameters in the presence of an exogenous input disturbance and abrupt gain perturbation are reported. (general)

  11. International Political Processes of Integration of Education

    Directory of Open Access Journals (Sweden)

    Marina M. Lebedeva

    2017-09-01

    Full Text Available Introduction: the study of the international dimension of education is usually reduced to a comparative analysis of the characteristics of education in different countries. The situation began to change at the end of the 20th and the beginning of the 21st century due to the rapid development of globalisation processes (the formation of transparency of national borders) and integration (deepening of the cooperation between countries based on intergovernmental agreements). This had an impact on education, which was intensively internationalised (acquired a wide international dimension). Despite possible setbacks in the process of internationalisation of education, the general vector of development is that this process will increase. The purpose of this article is to analyse what new challenges and opportunities are opened up by the internationalisation of education (Russian education in particular). Materials and Methods: the study is based on the principles according to which education, on the one hand, depends on the transformation of the global political organisation of the world and, on the other hand, contributes to this transformation. Materials for the study are based on international agreements, in particular those adopted in the framework of the Bologna process, and the results of scientific works of Russian and foreign scholars. Descriptive and comparative methods of analysis are widely used. Results: the analysis of the processes of internationalisation of education in the world has shown that, along with its traditional directions and aspects, the university begins to play a special role in the current world. It is shown that the specificity of education in Russia, which took shape due to a large territory and historical traditions, should be taken into account when forming a strategy for the development of the internationalisation of education in the country. Discussion and Conclusions: the specificity of Russia creates a risk

  12. Integration of CCS, emissions trading and volatilities of fuel prices into sustainable energy planning, and its robust optimization

    International Nuclear Information System (INIS)

    Koo, Jamin; Han, Kyusang; Yoon, En Sup

    2011-01-01

    In this paper, a new approach has been proposed that allows a robust optimization of sustainable energy planning over a period of years. It is based on the modified energy flow optimization model (EFOM) and minimizes total costs in planning capacities of power plants and CCS to be added, stripped or retrofitted. In the process, it reduces risks due to a high volatility in fuel prices; it also provides robustness against infeasibility with respect to meeting the required emission level by adopting a penalty constant that corresponds to the price level of emission allowances. In this manner, the proposed methodology enables decision makers to determine the optimal capacities of power plants and/or CCS, as well as volumes of emissions trading in the future that will meet the required emission level and satisfy energy demand from various user-sections with minimum costs and maximum robustness. They can also gain valuable insights on the effects that the price of emission allowances has on the competitiveness of RES and CCS technologies; it may be used in, for example, setting appropriate subsidies and tax policies for promoting greater use of these technologies. The proposed methodology is applied to a case based on directions and volumes of energy flows in South Korea during the year 2008. (author)
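
    As an illustration of the kind of cost-minimizing energy-planning problem described here (not the paper's modified EFOM model), the sketch below sets up a toy linear program with an emission cap and purchasable allowances; all technologies, costs and coefficients are invented for the example.

```python
import numpy as np
from scipy.optimize import linprog

# Toy dispatch/planning LP in the spirit of the modified EFOM model (not the
# paper's actual formulation).  Decision variables: generation [MWh] from three
# hypothetical options plus emission allowances bought [tCO2].
cost = np.array([40.0, 55.0, 70.0, 25.0])   # EUR/MWh (coal+CCS, gas, wind), EUR/tCO2 allowance
emis = np.array([0.10, 0.35, 0.00])         # tCO2 per MWh for each generation option
demand, cap = 1000.0, 60.0                  # MWh of demand, tCO2 emission cap

A_eq = [[1.0, 1.0, 1.0, 0.0]]               # total generation meets demand
b_eq = [demand]
A_ub = [[emis[0], emis[1], emis[2], -1.0]]  # emissions minus allowances <= cap
b_ub = [cap]
res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 4, method="highs")
print("generation and allowances:", np.round(res.x, 1), " total cost:", round(res.fun, 1))
```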

  13. The communication process in Telenursing: integrative review.

    Science.gov (United States)

    Barbosa, Ingrid de Almeida; Silva, Karen Cristina da Conceição Dias da; Silva, Vladimir Araújo da; Silva, Maria Júlia Paes da

    2016-01-01

    To identify scientific evidence about the communication process in Telenursing and analyse it. An integrative review was performed in March 2014. The search strategy, structured with the descriptors "telenursing" and "communication", was implemented in the databases Medline, Bireme, Cinahl, Scopus, Web of Science, Scielo, and Cochrane. Ten studies were selected after applying inclusion and exclusion criteria. The main challenges were: the clinical condition of patients, the possibility for inadequate communication to cause misconduct, the absence of visual references in interactions without video, and difficulty understanding nonverbal communication. Distance imposes communicative barriers in all elements: sender, recipient and message; and in both modes of transmission, verbal and nonverbal. The main difficulty is understanding nonverbal communication. To behave properly in this context, nurses must receive specific training to develop abilities and communication skills.

  14. Integrated Circuits for Analog Signal Processing

    CERN Document Server

    2013-01-01

      This book presents theory, design methods and novel applications for integrated circuits for analog signal processing.  The discussion covers a wide variety of active devices, active elements and amplifiers, working in voltage mode, current mode and mixed mode.  This includes voltage operational amplifiers, current operational amplifiers, operational transconductance amplifiers, operational transresistance amplifiers, current conveyors, current differencing transconductance amplifiers, etc.  Design methods and challenges posed by nanometer technology are discussed and applications described, including signal amplification, filtering, data acquisition systems such as neural recording, sensor conditioning such as biomedical implants, actuator conditioning, noise generators, oscillators, mixers, etc.   Presents analysis and synthesis methods to generate all circuit topologies from which the designer can select the best one for the desired application; Includes design guidelines for active devices/elements...

  15. Integrated modeling and robust control for full-envelope flight of robotic helicopters

    Science.gov (United States)

    La Civita, Marco

    Robotic helicopters have attracted a great deal of interest from the university, the industry, and the military world. They are versatile machines and there is a large number of important missions that they could accomplish. Nonetheless, there are only a handful of documented examples of robotic-helicopter applications in real-world scenarios. This situation is mainly due to the poor flight performance that can be achieved and---more important---guaranteed under automatic control. Given the maturity of control theory, and given the large body of knowledge in helicopter dynamics, it seems that the lack of success in flying high-performance controllers for robotic helicopters, especially by academic groups and by small industries, has nothing to do with helicopters or control theory as such. The problem lies instead in the large amount of time and resources needed to synthesize, test, and implement new control systems with the approach normally followed in the aeronautical industry. This thesis attempts to provide a solution by presenting a modeling and control framework that minimizes the time, cost, and both human and physical resources necessary to design high-performance flight controllers. The work is divided in two main parts. The first consists of the development of a modeling technique that allows the designer to obtain a high-fidelity model adequate for both real-time simulation and controller design, with few flight, ground, and wind-tunnel tests and a modest level of complexity in the dynamic equations. The second consists of the exploitation of the predictive capabilities of the model and of the robust stability and performance guarantees of the Hinfinity loop-shaping control theory to reduce the number of iterations of the design/simulated-evaluation/flight-test-evaluation procedure. The effectiveness of this strategy is demonstrated by designing and flight testing a wide-envelope high-performance controller for the Carnegie Mellon University robotic

  16. Robust operation and performance of integrated carbon nanotubes atomic force microscopy probes

    International Nuclear Information System (INIS)

    Rius, G; Clark, I T; Yoshimura, M

    2013-01-01

    We present a complete characterization of carbon nanotubes-atomic force microscopy (CNT-AFM) probes to evaluate the cantilever operation and advanced properties originating from the CNTs. The fabrication consists of silicon probes tip-functionalized with multiwalled CNTs by microwave plasma enhanced chemical vapor deposition. A dedicated methodology has been defined to evaluate the effect of CNT integration into the Si cantilevers. The presence of the CNTs provides enhanced capability for sensing and durability, as demonstrated using dynamic and static modes, e.g. imaging, indentation and force/current characterization.

  17. Integration of Environment Sensing and Control Functions for Robust Rotorcraft UAV (RUAV) Guidance

    Science.gov (United States)

    Dadkhah Tehrani, Navid

    Unmanned Air Vehicles (UAVs) have started supplanting manned aircraft in a broad range of tasks. Vehicles such as miniature rotorcraft with broad maneuvering range and small size can enter remote locations that are hard to reach using other air and ground vehicles. Developing a guidance system which enables a Rotorcraft UAV (RUAV) to perform such tasks involves combining key elements from robotics motion planning, control system design, trajectory optimization as well as dynamics modeling. The focus of this thesis is to integrate a guidance system for a small-scale rotorcraft to enable a high level of performance and situational awareness. We cover major aspects of the system integration, including modeling, control system design, environment sensing as well as motion planning in the presence of uncertainty. The system integration in this thesis is performed around a Blade-CX2 miniature coaxial helicopter. The first part of the thesis focuses on the development of the parameterized model for the Blade-CX2 helicopter with an emphasis on the coaxial rotor configuration. The model explicitly accounts for the dynamics of the lower rotor and uses an implicit lumped parameter model for the upper rotor and stabilizer-bar. The parameterized model was identified using frequency domain system identification. In the second part of the thesis, we use the identified model to design a control law for the Blade-CX2 helicopter. The control augmentation for the Blade-CX2 helicopter was based on a nested attitude-velocity loop control architecture and was designed following classical loop-shaping and dynamic inversion techniques. A path following layer wrapped around the velocity control system enables the rotorcraft to follow reference trajectories specified by a sequence of waypoints and velocity vectors. Such reference paths are common in autonomous guidance systems. Finally, the third part of the thesis addresses the problem of autonomous navigation through a partially known or

  18. Enhanced robust fractional order proportional-plus-integral controller based on neural network for velocity control of permanent magnet synchronous motor.

    Science.gov (United States)

    Zhang, Bitao; Pi, YouGuo

    2013-07-01

    The traditional integer order proportional-integral-differential (IO-PID) controller is sensitive to parameter variation and/or external load disturbance of the permanent magnet synchronous motor (PMSM). A fractional order proportional-integral-differential (FO-PID) control scheme based on a robustness tuning method has been proposed to enhance robustness, but that robustness addresses only the open-loop gain variation of the controlled plant. In this paper, an enhanced robust fractional order proportional-plus-integral (ERFOPI) controller based on a neural network is proposed. The control law of the ERFOPI controller acts on a fractional order implement function (FOIF) of the tracking error rather than on the tracking error directly, which, according to theoretical analysis, can enhance the robust performance of the system. Tuning rules and approaches, based on phase margin, crossover frequency specification and robustness to gain variation, are introduced to obtain the parameters of the ERFOPI controller, and a neural network algorithm is used to adjust the parameter of the FOIF. Simulation and experimental results show that the method proposed in this paper not only achieves favorable tracking performance, but is also robust with regard to external load disturbance and parameter variation. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
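
    The abstract does not give the form of the FOIF; as a generic building block, the sketch below computes a Grünwald-Letnikov approximation of a fractional-order integral of a tracking-error signal, which is the standard numerical ingredient of fractional-order PI controllers. The order, step size and error signal are assumptions.

```python
import numpy as np

def gl_fractional_integral(signal, alpha, h):
    """Gruenwald-Letnikov approximation of the fractional integral of
    order alpha (0 < alpha < 1) of a sampled signal with step h."""
    n = len(signal)
    # Generalized binomial coefficients of (1 - x)^(-alpha) via recurrence
    w = np.empty(n)
    w[0] = 1.0
    for j in range(1, n):
        w[j] = w[j - 1] * (1.0 - (1.0 - alpha) / j)
    out = np.empty(n)
    for k in range(n):
        out[k] = h ** alpha * np.dot(w[: k + 1], signal[k::-1])
    return out

# Example: fractional integral (order 0.5) of a hypothetical unit-step error
t = np.linspace(0.0, 1.0, 101)
err = np.ones_like(t)
fi = gl_fractional_integral(err, 0.5, t[1] - t[0])
print(fi[-1])    # close to 2/sqrt(pi) = 1.128 at t = 1, the analytical value
```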

  19. Robust and Accurate Closed-Loop Control of McKibben Artificial Muscle Contraction with a Linear Single Integral Action

    Directory of Open Access Journals (Sweden)

    Bertrand Tondu

    2014-06-01

    Full Text Available We analyze the possibility of taking advantage of the artificial muscle's own stiffness and damping, and of substituting a single integral (I) controller for a classic proportional-integral-derivative (PID) controller. The advantages are that there would only be one parameter to tune and no need for a dynamic model. A stability analysis is proposed from a simple phenomenological artificial muscle model. Step and sine-wave tracking responses performed with pneumatic McKibben muscles are reported, showing the practical efficiency of the method in combining accuracy and load robustness. In the particular case of the McKibben artificial muscle technology, we suggest that the dynamic performances in stability and load robustness result from the textile nature of its braided sleeve and its internal friction, which do not obey Coulomb's third law, as verified by preliminary reported original friction experiments. Comparisons are reported between three kinds of braided sleeves made of rayon yarns, plastic, and thin metal wires, whose similar closed-loop dynamic performances are highlighted. It is also experimentally shown that a sleeve braided with thin metal wires can give high accuracy performance, in step as well as in tracking responses. This would be due to a low static friction coefficient combined with a kinetic friction exponentially increasing with speed, in accordance with hydrodynamic lubrication theory applied to textile physics.
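
    A minimal sketch of the closed loop the authors advocate, assuming a hypothetical first-order phenomenological muscle model and an arbitrary integral gain; it only illustrates that a single integral term can drive the contraction to a setpoint, not the reported experimental behaviour.

```python
import numpy as np

# Pure integral (I) control of a hypothetical first-order muscle model
#   tau * dx/dt + x = K * u     (x = contraction, u = pressure command)
# All parameter values are assumed for illustration only.
K, tau, Ki = 1.0, 0.15, 8.0      # plant gain, time constant, integral gain
dt, T = 0.001, 2.0
x, integ = 0.0, 0.0
setpoint = 0.3                   # desired contraction (fraction of stroke)

for _ in range(int(T / dt)):
    e = setpoint - x             # tracking error
    integ += e * dt              # integral action (the only tuned term)
    u = Ki * integ
    x += dt * (K * u - x) / tau  # Euler step of the muscle model

print(f"final contraction = {x:.4f} (setpoint {setpoint})")
```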

  20. Service Quality Robust Design by the Integration of Taguchi Experiments and SERVQUAL Approach in a Travel Agency

    Directory of Open Access Journals (Sweden)

    nassibeh janatyan

    2012-02-01

    Full Text Available The main purpose of this research is to address how a robust design of service quality dimensions can be obtained. Service quality robust design has been conducted by integrating Taguchi Design of Experiments with the SERVQUAL approach in an Iranian travel agency. The five basic dimensions of service quality, i.e. reliability, responsiveness, assurance, empathy and tangibles, together with price, have been assumed as control factors. The response factor has been defined in two alternative ways: (i) the sum of customer expectations, and (ii) the sum of service quality gaps. In this investigation it is assumed that no noise factor exists. The advantage of this approach is that the average and the standard deviation are improved simultaneously. The signal-to-noise ratio has been computed and the desired mix of the levels of the service quality dimensions has been identified. The main findings of this research include the desired mix of the levels of service quality dimensions based on the sum of customer expectations and the desired mix of the levels of service quality dimensions based on the sum of service quality gaps. Comparing the two sets of findings helps the agency analyze the cost of attracting new customers or retaining regular customers.
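
    The standard Taguchi signal-to-noise ratios used in such a design can be computed directly; the sketch below shows the smaller-the-better form (for the sum of gaps) and the larger-the-better form (for the sum of expectations) on invented replicate values.

```python
import numpy as np

# Standard Taguchi signal-to-noise ratios.  The values below are hypothetical
# SERVQUAL replicates for one experimental run, not the study's data.
gaps = np.array([0.8, 1.1, 0.6, 0.9])        # sum-of-gaps replicates: smaller is better
expectations = np.array([6.1, 5.8, 6.4])     # sum-of-expectations replicates: larger is better

sn_smaller_better = -10.0 * np.log10(np.mean(gaps ** 2))
sn_larger_better = -10.0 * np.log10(np.mean(1.0 / expectations ** 2))

print(f"S/N (gaps, smaller-the-better): {sn_smaller_better:.2f} dB")
print(f"S/N (expectations, larger-the-better): {sn_larger_better:.2f} dB")
```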

  1. Robust and rapid algorithms facilitate large-scale whole genome sequencing downstream analysis in an integrative framework.

    Science.gov (United States)

    Li, Miaoxin; Li, Jiang; Li, Mulin Jun; Pan, Zhicheng; Hsu, Jacob Shujui; Liu, Dajiang J; Zhan, Xiaowei; Wang, Junwen; Song, Youqiang; Sham, Pak Chung

    2017-05-19

    Whole genome sequencing (WGS) is a promising strategy to unravel variants or genes responsible for human diseases and traits. However, there is a lack of robust platforms for a comprehensive downstream analysis. In the present study, we first proposed three novel algorithms (sequence gap-filled gene feature annotation, bit-block encoded genotypes and sectional fast access to text lines) to address three fundamental problems. The three algorithms then formed the infrastructure of a robust parallel computing framework, KGGSeq, for integrating downstream analysis functions for whole genome sequencing data. KGGSeq has been equipped with a comprehensive set of analysis functions for quality control, filtration, annotation, pathogenic prediction and statistical tests. In tests with whole genome sequencing data from the 1000 Genomes Project, KGGSeq annotated several thousand more reliable non-synonymous variants than other widely used tools (e.g. ANNOVAR and SNPEff). It took only around half an hour on a small server with 10 CPUs to access genotypes of ∼60 million variants of 2504 subjects, while a popular alternative tool required around one day. KGGSeq's bit-block genotype format used 1.5% or less space to flexibly represent phased or unphased genotypes with multiple alleles, and calculated genotypic correlations over 1000 times faster. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
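
    KGGSeq's actual bit-block layout is not described in the abstract; the sketch below shows the generic idea of packing biallelic genotypes at two bits each, which is the kind of compact encoding that makes such speed and space figures possible.

```python
import numpy as np

# Generic sketch of packing biallelic genotypes into 2 bits each (0/1/2 copies
# of the alternate allele, 3 = missing).  This only illustrates the idea of
# compact "bit-block" genotype storage; it is not KGGSeq's actual format.
genotypes = np.array([0, 1, 2, 3, 1, 0, 2, 2], dtype=np.uint8)

def pack_2bit(g):
    """Pack four 2-bit genotypes into each byte."""
    g = g.astype(np.uint8)
    pad = (-len(g)) % 4
    g = np.concatenate([g, np.zeros(pad, dtype=np.uint8)]).reshape(-1, 4)
    return (g[:, 0] | (g[:, 1] << 2) | (g[:, 2] << 4) | (g[:, 3] << 6)).astype(np.uint8)

def unpack_2bit(packed, n):
    """Recover the first n genotypes from the packed bytes."""
    out = np.empty(len(packed) * 4, dtype=np.uint8)
    for i in range(4):
        out[i::4] = (packed >> (2 * i)) & 0b11
    return out[:n]

packed = pack_2bit(genotypes)
assert np.array_equal(unpack_2bit(packed, len(genotypes)), genotypes)
print(f"{len(genotypes)} genotypes stored in {packed.nbytes} bytes")
```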

  2. A robust combination approach for short-term wind speed forecasting and analysis – Combination of the ARIMA (Autoregressive Integrated Moving Average), ELM (Extreme Learning Machine), SVM (Support Vector Machine) and LSSVM (Least Square SVM) forecasts using a GPR (Gaussian Process Regression) model

    International Nuclear Information System (INIS)

    Wang, Jianzhou; Hu, Jianming

    2015-01-01

    With the increasing importance of wind power as a component of power systems, the problems induced by the stochastic and intermittent nature of wind speed have compelled system operators and researchers to search for more reliable techniques to forecast wind speed. This paper proposes a combination model for probabilistic short-term wind speed forecasting. In this proposed hybrid approach, EWT (Empirical Wavelet Transform) is employed to extract meaningful information from a wind speed series by designing an appropriate wavelet filter bank. The GPR (Gaussian Process Regression) model is utilized to combine independent forecasts generated by various forecasting engines (ARIMA (Autoregressive Integrated Moving Average), ELM (Extreme Learning Machine), SVM (Support Vector Machine) and LSSVM (Least Square SVM)) in a nonlinear way rather than the commonly used linear way. The proposed approach provides more probabilistic information for wind speed predictions besides improving the forecasting accuracy for single-value predictions. The effectiveness of the proposed approach is demonstrated with wind speed data from two wind farms in China. The results indicate that the individual forecasting engines do not consistently forecast short-term wind speed for the two sites, and the proposed combination method can generate a more reliable and accurate forecast. - Highlights: • The proposed approach can make probabilistic modeling for wind speed series. • The proposed approach adapts to the time-varying characteristic of the wind speed. • The hybrid approach can extract the meaningful components from the wind speed series. • The proposed method can generate adaptive, reliable and more accurate forecasting results. • The proposed model combines four independent forecasting engines in a nonlinear way.
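
    A minimal sketch of the nonlinear combination step, with synthetic series standing in for the ARIMA/ELM/SVM/LSSVM forecasts and scikit-learn's Gaussian process regressor; the kernel choice and data are assumptions, and the EWT decomposition step is omitted.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# The GPR maps the individual forecasts to the observed wind speed.  The four
# "engine" forecasts below are synthetic stand-ins for ARIMA/ELM/SVM/LSSVM.
rng = np.random.default_rng(1)
truth = 8.0 + 2.0 * np.sin(np.linspace(0, 6 * np.pi, 300))      # wind speed, m/s
engines = np.column_stack([truth + rng.normal(0, s, truth.size)
                           for s in (0.6, 0.8, 1.0, 1.2)])      # 4 noisy forecasts

X_train, y_train = engines[:200], truth[:200]
X_test, y_test = engines[200:], truth[200:]

gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gpr.fit(X_train, y_train)
mean, std = gpr.predict(X_test, return_std=True)                # probabilistic output

rmse = np.sqrt(np.mean((mean - y_test) ** 2))
print(f"combined-forecast RMSE: {rmse:.3f} m/s (plus a predictive std per point)")
```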

  3. Robustness of movement models: can models bridge the gap between temporal scales of data sets and behavioural processes?

    Science.gov (United States)

    Schlägel, Ulrike E; Lewis, Mark A

    2016-12-01

    Discrete-time random walks and their extensions are common tools for analyzing animal movement data. In these analyses, the resolution of the temporal discretization is a critical feature. Ideally, a model both mirrors the relevant temporal scale of the biological process of interest and matches the data sampling rate. Challenges arise when the resolution of the data is too coarse due to technological constraints, or when we wish to extrapolate results or compare results obtained from data with different resolutions. Drawing loosely on the concept of robustness in statistics, we propose a rigorous mathematical framework for studying movement models' robustness against changes in temporal resolution. In this framework, we define varying levels of robustness as formal model properties, focusing on random walk models with a spatially explicit component. With the new framework, we can investigate whether models can validly be applied to data across varying temporal resolutions and how we can account for these different resolutions in statistical inference results. We apply the new framework to movement-based resource selection models, demonstrating both analytical and numerical calculations, as well as a Monte Carlo simulation approach. While exact robustness is rare, the concept of approximate robustness provides a promising new direction for analyzing movement models.
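
    A small numerical illustration of the underlying issue (not the paper's formal robustness definitions): simulating a correlated random walk and subsampling it shows how an apparently simple quantity such as mean step length depends on the temporal resolution.

```python
import numpy as np

# Simulate a simple correlated random walk at unit time steps, then subsample
# the track to coarser resolutions and compare the apparent mean step length.
rng = np.random.default_rng(42)
n = 10_000
turn = rng.normal(0.0, 0.3, n).cumsum()                  # correlated headings
steps = np.column_stack([np.cos(turn), np.sin(turn)])    # unit-length moves
track = np.vstack([[0.0, 0.0], steps.cumsum(axis=0)])    # positions over time

for thin in (1, 2, 5, 10, 20):
    coarse = track[::thin]
    step_len = np.linalg.norm(np.diff(coarse, axis=0), axis=1)
    print(f"resolution x{thin:2d}: mean step = {step_len.mean():6.2f}")
```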

  4. Engineering of biotin-prototrophy in Pichia pastoris for robust production processes.

    Science.gov (United States)

    Gasser, Brigitte; Dragosits, Martin; Mattanovich, Diethard

    2010-11-01

    Biotin plays an essential role as cofactor for biotin-dependent carboxylases involved in essential metabolic pathways. The cultivation of Pichia pastoris, a methylotrophic yeast that is successfully used as host for the production of recombinant proteins, requires addition of high dosage of biotin. As biotin is the only non-salt media component used during P. pastoris fermentation (apart from the carbon source), nonconformities during protein production processes are usually attributed to poor quality of the added biotin. In order to avoid dismissed production runs due to biotin quality issues, we engineered the biotin-requiring yeast P. pastoris to become a biotin-prototrophic yeast. Integration of four genes involved in the biotin biosynthesis from brewing yeast into the P. pastoris genome rendered P. pastoris biotin-prototrophic. The engineered strain has successfully been used as production host for both intracellular and secreted heterologous proteins in fed-batch processes, employing mineral media without vitamins. Another field of application for these truly prototrophic hosts is the production of biochemicals and small metabolites, where defined mineral media leads to easier purification procedures. Copyright © 2010 Elsevier Inc. All rights reserved.

  5. Integrated Site Model Process Model Report

    International Nuclear Information System (INIS)

    Booth, T.

    2000-01-01

    The Integrated Site Model (ISM) provides a framework for discussing the geologic features and properties of Yucca Mountain, which is being evaluated as a potential site for a geologic repository for the disposal of nuclear waste. The ISM is important to the evaluation of the site because it provides 3-D portrayals of site geologic, rock property, and mineralogic characteristics and their spatial variabilities. The ISM is not a single discrete model; rather, it is a set of static representations that provide three-dimensional (3-D), computer representations of site geology, selected hydrologic and rock properties, and mineralogic-characteristics data. These representations are manifested in three separate model components of the ISM: the Geologic Framework Model (GFM), the Rock Properties Model (RPM), and the Mineralogic Model (MM). The GFM provides a representation of the 3-D stratigraphy and geologic structure. Based on the framework provided by the GFM, the RPM and MM provide spatial simulations of the rock and hydrologic properties, and mineralogy, respectively. Functional summaries of the component models and their respective output are provided in Section 1.4. Each of the component models of the ISM considers different specific aspects of the site geologic setting. Each model was developed using unique methodologies and inputs, and the determination of the modeled units for each of the components is dependent on the requirements of that component. Therefore, while the ISM represents the integration of the rock properties and mineralogy into a geologic framework, the discussion of ISM construction and results is most appropriately presented in terms of the three separate components. This Process Model Report (PMR) summarizes the individual component models of the ISM (the GFM, RPM, and MM) and describes how the three components are constructed and combined to form the ISM

  6. Demonstration of an N7 integrated fab process for metal oxide EUV photoresist

    Science.gov (United States)

    De Simone, Danilo; Mao, Ming; Kocsis, Michael; De Schepper, Peter; Lazzarino, Frederic; Vandenberghe, Geert; Stowers, Jason; Meyers, Stephen; Clark, Benjamin L.; Grenville, Andrew; Luong, Vinh; Yamashita, Fumiko; Parnell, Doni

    2016-03-01

    Inpria has developed a directly patternable metal oxide hard-mask as a robust, high-resolution photoresist for EUV lithography. In this paper we demonstrate the full integration of a baseline Inpria resist into an imec N7 BEOL block mask process module. We examine in detail both the lithography and etch patterning results. By leveraging the high differential etch resistance of metal oxide photoresists, we explore opportunities for process simplification and cost reduction. We review the imaging results from the imec N7 block mask patterns and their process windows, as well as routes to maximize process latitude, underlayer integration, etch transfer, cross sections, etch equipment integration from a cross-metal-contamination standpoint, and the selective resist strip process. Finally, initial results from a higher sensitivity Inpria resist are also reported. A dose to size of 19 mJ/cm2 was achieved to print pillars as small as 21nm.

  7. Robust Query Processing for Personalized Information Access on the Semantic Web

    DEFF Research Database (Denmark)

    Dolog, Peter; Stuckenschmidt, Heiner; Wache, Holger

    and user preferences. We describe a framework for information access that combines query refinement and relaxation in order to provide robust, personalized access to heterogeneous RDF data as well as an implementation in terms of rewriting rules and explain its application in the context of e-learning...

  8. The integration of process monitoring for safeguards

    International Nuclear Information System (INIS)

    Cipiti, Benjamin B.; Zinaman, Owen R.

    2010-01-01

    The Separations and Safeguards Performance Model is a reprocessing plant model that has been developed for safeguards analyses of future plant designs. The model has been modified to integrate bulk process monitoring data with traditional plutonium inventory balances to evaluate potential advanced safeguards systems. Taking advantage of the wealth of operator data such as flow rates and mass balances of bulk material, the timeliness of detection of material loss was shown to improve considerably. Four diversion cases were tested, including both abrupt and protracted diversions at early and late times in the run. The first three cases indicated alarms before half of a significant quantity of material was removed. The buildup of error over time prevented detection in the case of a protracted diversion late in the run. Some issues related to the alarm conditions and bias correction will need to be addressed in future work. This work demonstrates the use of the model both for performing diversion scenario analyses and for testing advanced safeguards system designs.
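
    A toy illustration of the kind of bulk material-balance test such a system integrates (not the Separations and Safeguards Performance Model itself): cumulative inventory differences from simulated flow measurements are compared against a growing uncertainty threshold; all flow rates, errors and the diversion size are invented.

```python
import numpy as np

# Cumulative material-balance alarm on simulated bulk Pu flow measurements.
# A small protracted diversion starts at balance period 60; the alarm fires
# when the cumulative inventory difference exceeds 3 sigma of its noise.
rng = np.random.default_rng(7)
periods = 120
true_in = np.full(periods, 10.0)                          # kg entering per period
diversion = np.where(np.arange(periods) >= 60, 0.1, 0.0)  # kg diverted per period
true_out = true_in - diversion                            # material actually leaving
sigma = 0.08                                              # measurement error per stream
meas_in = true_in + rng.normal(0, sigma, periods)
meas_out = true_out + rng.normal(0, sigma, periods)

cum_id = np.cumsum(meas_in - meas_out)                    # cumulative inventory difference
threshold = 3 * sigma * np.sqrt(2 * np.arange(1, periods + 1))
alarm = int(np.argmax(cum_id > threshold)) if np.any(cum_id > threshold) else None
print("first alarm at balance period:", alarm)
```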

  9. Robust on-line monitoring of biogas processes; Robusta maettekniker on-line foer optimerad biogasproduktion

    Energy Technology Data Exchange (ETDEWEB)

    Nordberg, Aake; Hansson, Mikael; Kanerot, Mija; Krozer, Anatol; Loefving, Bjoern; Sahlin, Eskil

    2010-03-15

    Although demand for biomethane in Sweden is higher than ever, many Swedish codigestion plants are presently operated below their designed capacity. Efforts must be made to increase the loading rate and guarantee stable operation and high availability of the plants. There are currently no commercial systems for on-line monitoring, and due to the characteristics of the material, including corrosion and tearing, robust applications have to be developed. The objective of this project was to identify and study different monitoring technologies with potential for on-line monitoring of both substrate mixtures and anaerobic digester content. Based on the prerequisites and demands at Boraas Energi och Miljoe AB's (BEMAB, the municipal energy and waste utility in the city of Boraas, Sweden) biogas plant, the extent of the problems, measurement variables and possible ways of managing these issues have been identified and prioritized. The substrate mixtures in question have a high viscosity and are inhomogeneous with variation in composition, which calls for further homogenization, dilution and filtration to achieve high precision in the necessary analyses. Studies using different mixers and mills showed that the particle size (800 µm) needed for on-line COD measurement could not be achieved. The problem of homogenization can be avoided if indirect measurement methods are used. Laboratory tests with NIR (near-infrared spectroscopy) showed that VS can be predicted (R2 = 0.78) in the interval of 2-9% VS. Furthermore, impedance can give a measurement of soluble components. However, impedance is not sensitive enough to give a good measurement of total TS. Microwave technology was installed at the production plant and showed a faster response to changes in TS than the existing TS-sensor. However, due to technical problems, the evaluation could only be done during a limited period of ten days. BEMAB will continue the measurements and evaluation of the instrument. The

  10. A two-step patterning process increases the robustness of periodic patterning in the fly eye.

    Science.gov (United States)

    Gavish, Avishai; Barkai, Naama

    2016-06-01

    Complex periodic patterns can self-organize through dynamic interactions between diffusible activators and inhibitors. In the biological context, self-organized patterning is challenged by spatial heterogeneities ('noise') inherent to biological systems. How spatial variability impacts the periodic patterning mechanism and how it can be buffered to ensure precise patterning is not well understood. We examine the effect of spatial heterogeneity on the periodic patterning of the fruit fly eye, an organ composed of ∼800 miniature eye units (ommatidia) whose periodic arrangement along a hexagonal lattice self-organizes during early stages of fly development. The patterning follows a two-step process, with an initial formation of evenly spaced clusters of ∼10 cells followed by a subsequent refinement of each cluster into a single selected cell. Using a probabilistic approach, we calculate the rate of patterning errors resulting from spatial heterogeneities in cell size, position and biosynthetic capacity. Notably, error rates were largely independent of the desired cluster size but followed the distributions of signaling speeds. Pre-formation of large clusters therefore greatly increases the reproducibility of the overall periodic arrangement, suggesting that the two-stage patterning process functions to guard the pattern against errors caused by spatial heterogeneities. Our results emphasize the constraints imposed on self-organized patterning mechanisms by the need to buffer stochastic effects. Author summary Complex periodic patterns are common in nature and are observed in physical, chemical and biological systems. Understanding how these patterns are generated in a precise manner is a key challenge. Biological patterns are especially intriguing, as they are generated in a noisy environment; cell position and cell size, for example, are subject to stochastic variations, as are the strengths of the chemical signals mediating cell-to-cell communication. The need

  11. Acquisition, processing, and visualization of big data as applied to robust multivariate impact models

    Science.gov (United States)

    Romeo, L.; Rose, K.; Bauer, J. R.; Dick, D.; Nelson, J.; Bunn, A.; Buenau, K. E.; Coleman, A. M.

    2016-02-01

    Increased offshore oil exploration and production emphasizes the need for environmental, social, and economic impact models that require big data from disparate sources to conduct thorough multi-scale analyses. The National Energy Technology Laboratory's Cumulative Spatial Impact Layers (CSILs) and Spatially Weighted Impact Model (SWIM) are user-driven, flexible suites of GIS-based tools that can efficiently process, integrate, visualize, and analyze a wide variety of big datasets that are acquired to better understand potential impacts for oil spill prevention and response readiness needs. These tools provide solutions to address a range of stakeholder questions and aid in prioritization decisions needed when responding to oil spills. This is particularly true when highlighting ecologically sensitive areas and spatially analyzing which species may be at risk. Model outputs provide unique geospatial visualizations of potential impacts and informational reports based on user preferences. The spatio-temporal capabilities of these tools can be applied to a number of anthropogenic and natural disasters, enabling decision-makers to be better informed about potential impacts and response needs.

  12. Variable-structure approaches analysis, simulation, robust control and estimation of uncertain dynamic processes

    CERN Document Server

    Senkel, Luise

    2016-01-01

    This edited book aims at presenting current research activities in the field of robust variable-structure systems. The scope equally comprises highlighting novel methodological aspects as well as presenting the use of variable-structure techniques in industrial applications including their efficient implementation on hardware for real-time control. The target audience primarily comprises research experts in the field of control theory and nonlinear dynamics but the book may also be beneficial for graduate students.

  13. An integrated workflow for robust alignment and simplified quantitative analysis of NMR spectrometry data.

    Science.gov (United States)

    Vu, Trung N; Valkenborg, Dirk; Smets, Koen; Verwaest, Kim A; Dommisse, Roger; Lemière, Filip; Verschoren, Alain; Goethals, Bart; Laukens, Kris

    2011-10-20

    Nuclear magnetic resonance spectroscopy (NMR) is a powerful technique to reveal and compare quantitative metabolic profiles of biological tissues. However, chemical and physical sample variations make the analysis of the data challenging, and typically require the application of a number of preprocessing steps prior to data interpretation. For example, noise reduction, normalization, baseline correction, peak picking, spectrum alignment and statistical analysis are indispensable components in any NMR analysis pipeline. We introduce a novel suite of informatics tools for the quantitative analysis of NMR metabolomic profile data. The core of the processing cascade is a novel peak alignment algorithm, called hierarchical Cluster-based Peak Alignment (CluPA). The algorithm aligns a target spectrum to the reference spectrum in a top-down fashion by building a hierarchical cluster tree from peak lists of reference and target spectra and then dividing the spectra into smaller segments based on the most distant clusters of the tree. To reduce the computational time to estimate the spectral misalignment, the method makes use of Fast Fourier Transformation (FFT) cross-correlation. Since the method returns a high-quality alignment, we can propose a simple methodology to study the variability of the NMR spectra. For each aligned NMR data point the ratio of the between-group and within-group sum of squares (BW-ratio) is calculated to quantify the difference in variability between and within predefined groups of NMR spectra. This differential analysis is related to the calculation of the F-statistic or a one-way ANOVA, but without distributional assumptions. Statistical inference based on the BW-ratio is achieved by bootstrapping the null distribution from the experimental data. The workflow performance was evaluated using a previously published dataset. Correlation maps, spectral and grey scale plots show clear improvements in comparison to other methods, and the down
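
    A minimal sketch of the BW-ratio computation described above, applied to synthetic spectra with one genuine group difference; group sizes and intensities are assumptions.

```python
import numpy as np

# BW-ratio per aligned data point: between-group sum of squares divided by the
# within-group sum of squares.  Rows are spectra, columns are aligned points.
rng = np.random.default_rng(3)
group_a = rng.normal(1.0, 0.1, size=(20, 500))
group_b = rng.normal(1.0, 0.1, size=(20, 500))
group_b[:, 200:210] += 0.5                      # a genuine group difference

def bw_ratio(a, b):
    grand = np.vstack([a, b]).mean(axis=0)
    between = len(a) * (a.mean(axis=0) - grand) ** 2 \
            + len(b) * (b.mean(axis=0) - grand) ** 2
    within = ((a - a.mean(axis=0)) ** 2).sum(axis=0) \
           + ((b - b.mean(axis=0)) ** 2).sum(axis=0)
    return between / within

ratios = bw_ratio(group_a, group_b)
print("top discriminating points:", np.sort(np.argsort(ratios)[-10:]))
```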

  14. An integrated workflow for robust alignment and simplified quantitative analysis of NMR spectrometry data

    Directory of Open Access Journals (Sweden)

    Dommisse Roger

    2011-10-01

    Full Text Available Abstract Background Nuclear magnetic resonance spectroscopy (NMR) is a powerful technique to reveal and compare quantitative metabolic profiles of biological tissues. However, chemical and physical sample variations make the analysis of the data challenging, and typically require the application of a number of preprocessing steps prior to data interpretation. For example, noise reduction, normalization, baseline correction, peak picking, spectrum alignment and statistical analysis are indispensable components in any NMR analysis pipeline. Results We introduce a novel suite of informatics tools for the quantitative analysis of NMR metabolomic profile data. The core of the processing cascade is a novel peak alignment algorithm, called hierarchical Cluster-based Peak Alignment (CluPA). The algorithm aligns a target spectrum to the reference spectrum in a top-down fashion by building a hierarchical cluster tree from peak lists of reference and target spectra and then dividing the spectra into smaller segments based on the most distant clusters of the tree. To reduce the computational time to estimate the spectral misalignment, the method makes use of Fast Fourier Transformation (FFT) cross-correlation. Since the method returns a high-quality alignment, we can propose a simple methodology to study the variability of the NMR spectra. For each aligned NMR data point the ratio of the between-group and within-group sum of squares (BW-ratio) is calculated to quantify the difference in variability between and within predefined groups of NMR spectra. This differential analysis is related to the calculation of the F-statistic or a one-way ANOVA, but without distributional assumptions. Statistical inference based on the BW-ratio is achieved by bootstrapping the null distribution from the experimental data. Conclusions The workflow performance was evaluated using a previously published dataset. Correlation maps, spectral and grey scale plots show clear

  15. Integrating conceptualizations of experience into the interaction design process

    DEFF Research Database (Denmark)

    Dalsgaard, Peter

    2010-01-01

    From a design perspective, the increasing awareness of experiential aspects of interactive systems prompts the question of how conceptualizations of experience can inform and potentially be integrated into the interaction design process. This paper presents one approach to integrating theoretical...

  16. A unified approach for proportional-integral-derivative controller design for time delay processes

    International Nuclear Information System (INIS)

    Shamsuzzoha, Mohammad

    2015-01-01

    An analytical design method for PI/PID controller tuning is proposed for several types of processes with time delay. A single tuning formula gives enhanced disturbance rejection performance. The design method is based on the IMC approach, which has a single tuning parameter to adjust the performance and robustness of the controller. A simple tuning formula gives consistently better performance compared to several well-known methods at the same degree of robustness for stable and integrating processes. For the unstable process, the performance has been compared with other recently published methods, again showing significant improvement for the proposed method. Furthermore, the robustness of the controller is investigated by inserting a perturbation uncertainty in all parameters simultaneously, again showing results comparable with other methods. An analysis has been performed of the uncertainty margin in the different process parameters for the robust controller design. It gives guidelines for the Ms setting in PI controller design based on the uncertainty of the process parameters. For the selection of the closed-loop time constant (τc), a guideline is provided over a broad range of θ/τ ratios on the basis of the peak of maximum uncertainty (Ms). A comparison of the IAE has been conducted over a wide range of θ/τ ratios for the first order time delay process. The proposed method shows minimum IAE compared to SIMC, while Lee et al. shows poor disturbance rejection for the lag dominant process. In the simulation study, the controllers were tuned to have the same degree of robustness, as measured by Ms, to obtain a reasonable comparison.
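
    For illustration, a widely used IMC-based PI rule of this type (the SIMC form) for a first-order-plus-dead-time process is sketched below; the paper's own single tuning formula may differ, and the example numbers are arbitrary.

```python
# Illustrative IMC-style PI tuning for a first-order-plus-dead-time process
#   G(s) = K * exp(-theta*s) / (tau*s + 1)
# using the common SIMC rule Kc = tau / (K*(tau_c + theta)),
# tau_I = min(tau, 4*(tau_c + theta)); shown only as an example of the IMC idea.
def imc_pi(K, tau, theta, tau_c):
    Kc = tau / (K * (tau_c + theta))
    tau_I = min(tau, 4.0 * (tau_c + theta))
    return Kc, tau_I

# Example process: K = 2, tau = 10, theta = 1, with tau_c chosen equal to theta
Kc, tau_I = imc_pi(K=2.0, tau=10.0, theta=1.0, tau_c=1.0)
print(f"Kc = {Kc:.3f}, tau_I = {tau_I:.3f}")
```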

  17. An integrated approach to process control

    NARCIS (Netherlands)

    Schippers, W.A.J.

    2001-01-01

    The control of production processes is the subject of several disciplines, such as statistical process control (SPC), total productive maintenance (TPM), and automated process control (APC). Although these disciplines are traditionally separated (both in science and in business practice), their

  18. Neuro-fuzzy Control of Integrating Processes

    Directory of Open Access Journals (Sweden)

    Anna Vasičkaninová

    2011-11-01

    Full Text Available Fuzzy technology is adaptive and easily applicable in different areas. Fuzzy logic provides powerful tools to capture the perception of natural phenomena. The paper deals with the tuning of neuro-fuzzy controllers for integrating plants and for integrating plants with time delay. The designed approach is verified on three examples by simulation and compared with classical PID control. The designed fuzzy controllers lead to better closed-loop control responses than classical PID controllers.

  19. Robust modelling of heat-induced reactions in an industrial food production process exemplified by acrylamide generation in breakfast cereals

    DEFF Research Database (Denmark)

    Jensen, Bo Boye Busk; Lennox, Martin; Granby, Kit

    2008-01-01

    Data from an industrial case study of breakfast cereal production indicated that the generated amounts of acrylamide are greatly dependent upon the combined effects of temperature and heating time in a roasting step process. Two approaches to obtain process models for acrylamide generation were... of difficulties in applying multi-parameter models and emphasized the advantages of "classical" approaches to process modelling, especially for use in an industrial context. The study was faced with a significant degree of variability in the data, due to fluctuations in the process, which also emphasized... the importance of robustness in the developed models. The correlations obtained for predicting acrylamide generation in the case study present a useful tool for the food processing industry to minimize acrylamide generation. In the present case it was possible by lowering process temperature and prolonging residence...

  20. Adaptive and robust statistical methods for processing near-field scanning microwave microscopy images.

    Science.gov (United States)

    Coakley, K J; Imtiaz, A; Wallis, T M; Weber, J C; Berweger, S; Kabos, P

    2015-03-01

    Near-field scanning microwave microscopy offers great potential to facilitate characterization, development and modeling of materials. By acquiring microwave images at multiple frequencies and amplitudes (along with the other modalities) one can study material and device physics at different lateral and depth scales. Images are typically noisy and contaminated by artifacts that can vary from scan line to scan line and planar-like trends due to sample tilt errors. Here, we level images based on an estimate of a smooth 2-d trend determined with a robust implementation of a local regression method. In this robust approach, features and outliers which are not due to the trend are automatically downweighted. We denoise images with the Adaptive Weights Smoothing method. This method smooths out additive noise while preserving edge-like features in images. We demonstrate the feasibility of our methods on topography images and microwave |S11| images. For one challenging test case, we demonstrate that our method outperforms alternative methods from the scanning probe microscopy data analysis software package Gwyddion. Our methods should be useful for massive image data sets where manual selection of landmarks or image subsets by a user is impractical. Published by Elsevier B.V.
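
    A minimal sketch of robust image leveling in the spirit described above: an iteratively reweighted least-squares plane fit with Tukey biweights removes the tilt of a synthetic image while downweighting a bright feature. The paper's actual method (robust local regression plus Adaptive Weights Smoothing denoising) is more general; everything here is an assumption for illustration.

```python
import numpy as np

# Fit a plane z = a*x + b*y + c by iteratively reweighted least squares with
# Tukey biweights, so bright features (outliers w.r.t. the tilt) are downweighted.
rng = np.random.default_rng(0)
ny, nx = 128, 128
Y, X = np.mgrid[0:ny, 0:nx]
image = 0.02 * X + 0.01 * Y + rng.normal(0, 0.05, (ny, nx))
image[40:50, 60:80] += 2.0                     # a real feature, not part of the tilt

A = np.column_stack([X.ravel(), Y.ravel(), np.ones(X.size)])
z = image.ravel()
w = np.ones(z.size)
for _ in range(10):
    sw = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(A * sw[:, None], z * sw, rcond=None)
    r = z - A @ coef                                            # residuals
    s = 1.4826 * np.median(np.abs(r - np.median(r))) + 1e-12    # robust scale (MAD)
    u = np.clip(r / (4.685 * s), -1.0, 1.0)
    w = (1.0 - u ** 2) ** 2                                     # Tukey biweight

leveled = (z - A @ coef).reshape(ny, nx)
print("fitted tilt (a, b, c):", np.round(coef, 4))
```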

  1. Total Ore Processing Integration and Management

    Energy Technology Data Exchange (ETDEWEB)

    Leslie Gertsch

    2006-05-15

    This report outlines the technical progress achieved for project DE-FC26-03NT41785 (Total Ore Processing Integration and Management) during the period 01 January through 31 March of 2006. (1) Work in Progress: Minntac Mine--Graphical analysis of drill monitor data moved from two-dimensional horizontal patterns to vertical variations in measured and calculated parameters. The rock quality index and the two dimensionless (π) indices developed by Kewen Yin of the University of Minnesota are used by Minntac Mine to design their blasts, but the drill monitor data from any given pattern is obviously not available for the design of that shot. Therefore, the blast results--which are difficult to quantify in a short time--must be back-analyzed for comparison with the drill monitor data to be useful for subsequent blast designs. π1 indicates the performance of the drill, while π2 is a measure of the rock resistance to drilling. As would be expected, since a drill tends to perform better in rock that offers little resistance, π1 and π2 are strongly inversely correlated; the relationship is a power function rather than simply linear. Low values of each π index tend to be quantized, indicating that these two parameters may be most useful above certain minimum magnitudes. (2) Work in Progress: Hibtac Mine--Statistical examination of a data set from Hibtac Mine (Table 1) shows that incorporating information on the size distribution of material feeding from the crusher to the autogenous mills improves the predictive capability of the model somewhat (43% vs. 44% correlation coefficient), but a more important component is production data from preceding days (26% vs. 44% correlation coefficient), determined using exponentially weighted moving average predictive variables. This lag effect likely reflects the long and varied residence times of the different size fragments in the grinding mills. The rock sizes are also correlated with the geologic

  2. Structural integration of separation and reaction systems: I. Integration of stage-wise processes

    Directory of Open Access Journals (Sweden)

    Mitrović Milan

    2002-01-01

    Full Text Available The structural integration of separation processes, using multifunctional equipment, has been studied on four stage-wise separations (liquid-liquid extraction, absorption, distillation, adsorption) and on some combinations of these processes. It was shown for stage-wise processes that the ultimate aim of equipment integration is 3-way integration (by components, by steps and by stages) and that membrane multiphase contactors present, as far as the equipment is concerned, optimal solutions in many cases. First by using partially integrated equipment and later by developing fully integrated systems, it was experimentally confirmed that structural 3-way integration produces much higher degrees of component separation and component enrichment in compact and safe equipment.

  3. A robust hybrid model integrating enhanced inputs based extreme learning machine with PLSR (PLSR-EIELM) and its application to intelligent measurement.

    Science.gov (United States)

    He, Yan-Lin; Geng, Zhi-Qiang; Xu, Yuan; Zhu, Qun-Xiong

    2015-09-01

    In this paper, a robust hybrid model integrating an enhanced-inputs-based extreme learning machine with partial least square regression (PLSR-EIELM) is proposed. The proposed PLSR-EIELM model can overcome two main flaws of the extreme learning machine (ELM), i.e. the intractable problem of determining the optimal number of hidden layer neurons and the over-fitting phenomenon. First, a traditional extreme learning machine (ELM) is selected. Second, random assignment is applied to the weights between the input layer and the hidden layer, and the nonlinear transformation of the independent variables is obtained from the output of the hidden layer neurons. In particular, the original input variables are regarded as enhanced inputs; the enhanced inputs and the nonlinearly transformed variables are then tied together as the whole set of independent variables. In this way, PLSR can be carried out to identify the PLS components not only from the nonlinearly transformed variables but also from the original input variables, which can remove the correlation among the whole set of independent variables and the expected outputs. Finally, the optimal relationship model of the whole set of independent variables with the expected outputs can be achieved by using PLSR. Thus, the PLSR-EIELM model is developed. The PLSR-EIELM model then served as an intelligent measurement tool for the key variables of the Purified Terephthalic Acid (PTA) process and the High Density Polyethylene (HDPE) process. The experimental results show that the predictive accuracy of PLSR-EIELM is stable, which indicates that PLSR-EIELM has good robustness. Moreover, compared with ELM, PLSR, hierarchical ELM (HELM), and PLSR-ELM, PLSR-EIELM can achieve much smaller predicted relative errors in these two applications. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
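
    A sketch of the core PLSR-EIELM idea on synthetic data: an ELM-style hidden layer with randomly assigned weights generates nonlinear features, the original inputs are appended as enhanced inputs, and PLS regression is fitted on the combined matrix. The layer size, number of components and the data are assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Random hidden layer (ELM-style), enhanced inputs, then PLS regression.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(300, 5))                      # synthetic process variables
y = np.sin(X[:, 0]) + X[:, 1] * X[:, 2] + 0.05 * rng.normal(size=300)

n_hidden = 50
W = rng.normal(size=(X.shape[1], n_hidden))                # randomly assigned input weights
b = rng.normal(size=n_hidden)                              # randomly assigned biases
H = np.tanh(X @ W + b)                                     # hidden-layer outputs

Z = np.hstack([X, H])                                      # enhanced inputs + nonlinear features
pls = PLSRegression(n_components=10)
pls.fit(Z, y)
y_hat = pls.predict(Z).ravel()
print(f"training RMSE: {np.sqrt(np.mean((y_hat - y) ** 2)):.4f}")
```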

  4. Integrated Intelligent Modeling, Design and Control of Crystal Growth Processes

    National Research Council Canada - National Science Library

    Prasad, V

    2000-01-01

    .... This MURI program took an integrated approach towards modeling, design and control of crystal growth processes and in conjunction with growth and characterization experiments developed much better...

  5. Sensor integration for robotic laser welding processes

    NARCIS (Netherlands)

    Iakovou, D.; Aarts, Ronald G.K.M.; Meijer, J.; Ostendorf, A; Hoult, A.; Lu, Y.

    2005-01-01

    The use of robotic laser welding is increasing among industrial applications, because of its ability to weld objects in three dimensions. Robotic laser welding involves three sub-processes: seam detection and tracking, welding process control, and weld seam inspection. Usually, for each sub-process,

  6. Integrating the autonomous subsystems management process

    Science.gov (United States)

    Ashworth, Barry R.

    1992-01-01

    Ways in which the ranking of the Space Station Module Power Management and Distribution testbed may be achieved and an individual subsystem's internal priorities may be managed within the complete system are examined. The application of these results in the integration and performance leveling of the autonomously managed system is discussed.

  7. Integrating personality structure, personality process, and personality development

    NARCIS (Netherlands)

    Baumert, Anna; Schmitt, Manfred; Perugini, Marco; Johnson, Wendy; Blum, Gabriela; Borkenau, Peter; Costantini, Giulio; Denissen, J.J.A.; Fleeson, William; Grafton, Ben; Jayawickreme, Eranda; Kurzius, Elena; MacLeod, Colin; Miller, Lynn C.; Read, Stephen J.; Robinson, Michael D.; Wood, Dustin; Wrzus, Cornelia

    2017-01-01

    In this target article, we argue that personality processes, personality structure, and personality development have to be understood and investigated in integrated ways in order to provide comprehensive responses to the key questions of personality psychology. The psychological processes and

  8. Improving Robustness of Hydrologic Ensemble Predictions Through Probabilistic Pre- and Post-Processing in Sequential Data Assimilation

    Science.gov (United States)

    Wang, S.; Ancell, B. C.; Huang, G. H.; Baetz, B. W.

    2018-03-01

    Data assimilation using the ensemble Kalman filter (EnKF) has been increasingly recognized as a promising tool for probabilistic hydrologic predictions. However, little effort has been made to conduct the pre- and post-processing of assimilation experiments, posing a significant challenge in achieving the best performance of hydrologic predictions. This paper presents a unified data assimilation framework for improving the robustness of hydrologic ensemble predictions. Statistical pre-processing of assimilation experiments is conducted through the factorial design and analysis to identify the best EnKF settings with maximized performance. After the data assimilation operation, statistical post-processing analysis is also performed through the factorial polynomial chaos expansion to efficiently address uncertainties in hydrologic predictions, as well as to explicitly reveal potential interactions among model parameters and their contributions to the predictive accuracy. In addition, the Gaussian anamorphosis is used to establish a seamless bridge between data assimilation and uncertainty quantification of hydrologic predictions. Both synthetic and real data assimilation experiments are carried out to demonstrate feasibility and applicability of the proposed methodology in the Guadalupe River basin, Texas. Results suggest that statistical pre- and post-processing of data assimilation experiments provide meaningful insights into the dynamic behavior of hydrologic systems and enhance robustness of hydrologic ensemble predictions.
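
    For reference, the sketch below shows a generic stochastic ensemble Kalman filter analysis step of the kind such a framework wraps with pre- and post-processing; the observation operator, error statistics and ensemble here are placeholders, not those of the Guadalupe River study.

```python
import numpy as np

def enkf_update(ensemble, obs, H, obs_err_std, rng):
    """Stochastic EnKF analysis step.

    ensemble : (n_state, n_members) forecast ensemble
    obs      : (n_obs,) observation vector
    H        : (n_obs, n_state) linear observation operator (placeholder)
    """
    n_obs, n_mem = obs.size, ensemble.shape[1]
    R = np.eye(n_obs) * obs_err_std**2
    A = ensemble - ensemble.mean(axis=1, keepdims=True)   # state anomalies
    HA = H @ A                                            # observed anomalies
    P_hh = HA @ HA.T / (n_mem - 1) + R
    P_xh = A @ HA.T / (n_mem - 1)
    K = P_xh @ np.linalg.inv(P_hh)                        # Kalman gain
    # perturbed observations give each member its own innovation
    obs_pert = obs[:, None] + rng.normal(0.0, obs_err_std, size=(n_obs, n_mem))
    return ensemble + K @ (obs_pert - H @ ensemble)

rng = np.random.default_rng(1)
ens = rng.normal(5.0, 1.0, size=(3, 20))                  # 3 state variables, 20 members
H = np.array([[1.0, 0.0, 0.0]])                           # observe the first state only
print(enkf_update(ens, np.array([6.0]), H, 0.5, rng).mean(axis=1))
```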

  9. Process Variations and Probabilistic Integrated Circuit Design

    CERN Document Server

    Haase, Joachim

    2012-01-01

    Uncertainty in key parameters within a chip and between different chips in the deep sub micron era plays a more and more important role. As a result, manufacturing process spreads need to be considered during the design process.  Quantitative methodology is needed to ensure faultless functionality, despite existing process variations within given bounds, during product development.   This book presents the technological, physical, and mathematical fundamentals for a design paradigm shift, from a deterministic process to a probability-orientated design process for microelectronic circuits.  Readers will learn to evaluate the different sources of variations in the design flow in order to establish different design variants, while applying appropriate methods and tools to evaluate and optimize their design.  Trains IC designers to recognize problems caused by parameter variations during manufacturing and to choose the best methods available to mitigate these issues during the design process; Offers both qual...

  10. Integrating interface slicing into software engineering processes

    Science.gov (United States)

    Beck, Jon

    1993-01-01

    Interface slicing is a tool which was developed to facilitate software engineering. As previously presented, it was described in terms of its techniques and mechanisms. The integration of interface slicing into specific software engineering activities is considered by discussing a number of potential applications of interface slicing. The applications discussed specifically address the problems, issues, or concerns raised in a previous project. Because a complete interface slicer is still under development, these applications must be phrased in future tenses. Nonetheless, the interface slicing techniques which were presented can be implemented using current compiler and static analysis technology. Whether implemented as a standalone tool or as a module in an integrated development or reverse engineering environment, they require analysis no more complex than that required for current system development environments. By contrast, conventional slicing is a methodology which, while showing much promise and intuitive appeal, has yet to be fully implemented in a production language environment despite 12 years of development.

  11. Silicon wafers for integrated circuit process

    OpenAIRE

    Leroy , B.

    1986-01-01

    Silicon as a substrate material will continue to dominate the market of integrated circuits for many years. We first review how crystal pulling procedures impact the quality of silicon. We then investigate how thermal treatments affect the behaviour of oxygen and carbon, and how, as a result, the quality of silicon wafers evolves. Gettering techniques are then presented. We conclude by detailing the requirements that wafers must satisfy at the incoming inspection.

  12. Introduction: Integration as a three-way process approach?

    NARCIS (Netherlands)

    Garcés-Mascareñas, B.; Penninx, R.; Garcés-Mascareñas, B.; Penninx, R.

    2016-01-01

    This chapter introduces the topic of this volume, which is the recent departure from viewing integration as a strictly two-way process (between migrants and the receiving society) to acknowledge the potential role that countries of origin might play in support of the integration process. It traces

  13. Robust Model-based Control of Open-loop Unstable Processes

    International Nuclear Information System (INIS)

    Emad, Ali

    1999-01-01

    This paper addresses the development of new formulations for estimating modeling errors or unmeasured disturbances to be used in Model Predictive Control (MPC) algorithms during open-loop prediction. Two different formulations were developed in this paper. One is used in MPC that directly utilizes linear models and the other in MPC that utilizes non-linear models. These estimation techniques were utilized to provide robust performance for MPC algorithms when the plant is open-loop unstable and under the influence of modeling error and/or unmeasured disturbances. For MPC that utilizes a non-linear model, the estimation technique is formulated as a fixed, small-size on-line optimization problem, while for linear MPC, the unmeasured disturbances are estimated via a proposed linear disturbance model. The disturbance model coefficients are identified on-line from historical estimates of plant-model mismatch. The effectiveness of incorporating these proposed estimation techniques into MPC is tested through simulated implementation on a non-linear, unstable, exothermic fluidized bed reactor. Closed-loop simulations proved the capability of the proposed estimation methods to stabilize and, thereby, improve the MPC performance in such cases. (Author)
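
    To illustrate the idea of identifying a linear disturbance model from historical plant-model mismatch, a least-squares sketch is given below; the autoregressive structure, model order and example data are assumptions made for the illustration, not the formulation in the paper.

```python
import numpy as np

def identify_disturbance_model(mismatch_history, order=2):
    """Fit d_k = a1*d_{k-1} + ... + a_order*d_{k-order} by least squares,
    where d is the historical plant-model mismatch (output error)."""
    d = np.asarray(mismatch_history, dtype=float)
    rows = [d[i - order:i][::-1] for i in range(order, len(d))]
    Phi = np.vstack(rows)                       # regressors: most recent mismatch first
    coeffs, *_ = np.linalg.lstsq(Phi, d[order:], rcond=None)
    return coeffs

def predict_disturbance(coeffs, recent_mismatch, horizon):
    """Propagate the identified disturbance model over the MPC prediction horizon."""
    hist = list(recent_mismatch)
    preds = []
    for _ in range(horizon):
        nxt = float(np.dot(coeffs, hist[::-1][:len(coeffs)]))
        preds.append(nxt)
        hist.append(nxt)
    return preds

mismatch = [0.0, 0.1, 0.25, 0.32, 0.41, 0.44, 0.48]       # hypothetical mismatch history
a = identify_disturbance_model(mismatch)
print(predict_disturbance(a, mismatch[-2:], horizon=3))
```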

  14. Systematic identification and robust control design for uncertain time delay processes

    DEFF Research Database (Denmark)

    Huusom, Jakob Kjøbsted; Poulsen, Niels Kjølstad; Jørgensen, Sten Bay

    2011-01-01

    A systematic procedure is proposed to handle the standard process control problem. The considered standard problem involves infrequent step disturbances to processes with large delays and measurement noise. The process is modeled as an ARX model and extended with a suitable noise model in order...... to reject unmeasured step disturbances and unavoidable model errors. This controller is illustrated to perform well for both set point tracking and a disturbance rejection for a SISO process example of a furnace which has a time delay which is significantly longer than the dominating time constant....

  15. EPR design tools. Integrated data processing tools

    International Nuclear Information System (INIS)

    Kern, R.

    1997-01-01

    In all technical areas, planning and design have been supported by electronic data processing for many years. New data processing tools had to be developed for the European Pressurized Water Reactor (EPR). The work to be performed was split between KWU and Framatome and laid down in the Basic Design contract. The entire plant was reduced to a logical data structure; the circuit diagrams and flowsheets of the systems were drafted, the central data pool was established, the outlines of building structures were defined, the layout of plant components was planned, and the electrical systems were documented. Also building construction engineering was supported by data processing. The tasks laid down in the Basic Design were completed as so-called milestones. Additional data processing tools also based on the central data pool are required for the phases following after the Basic Design phase, i.e Basic Design Optimization; Detailed Design; Management; Construction, and Commissioning. (orig.) [de

  16. {sup 14}CO{sub 2} processing using an improved and robust molecular sieve cartridge

    Energy Technology Data Exchange (ETDEWEB)

    Wotte, Anja, E-mail: Anja.Wotte@uni-koeln.de [Institute of Geology and Mineralogy, University of Cologne, Cologne (Germany); Wordell-Dietrich, Patrick [Thünen Institute of Climate-Smart Agriculture, Braunschweig (Germany); Wacker, Lukas [Ion Beam Physics, ETH Zurich, Zurich (Switzerland); Don, Axel [Thünen Institute of Climate-Smart Agriculture, Braunschweig (Germany); Rethemeyer, Janet [Institute of Geology and Mineralogy, University of Cologne, Cologne (Germany)

    2017-06-01

    Radiocarbon ({sup 14}C) analysis on CO{sub 2} can provide valuable information on the carbon cycle as different carbon pools differ in their {sup 14}C signature. While fresh, biogenic carbon shows atmospheric {sup 14}C concentrations, fossil carbon is {sup 14}C free. As shown in previous studies, CO{sub 2} can be collected for {sup 14}C analysis using molecular sieve cartridges (MSC). These devices have previously been made of plastic and glass, which can easily be damaged during transport. We thus constructed a robust MSC suitable for field application under tough conditions or in remote areas, which is entirely made of stainless steel. The new MSC should also be tight over several months to allow long sampling campaigns and transport times, which was proven by a one year storage test. The reliability of the {sup 14}CO{sub 2} results obtained with the MSC was evaluated by detailed tests of different procedures to clean the molecular sieve (zeolite type 13X) and for the adsorption and desorption of CO{sub 2} from the zeolite using a vacuum rig. We show that the {sup 14}CO{sub 2} results are not affected by any contamination of modern or fossil origin, cross contamination from previous samples, and by carbon isotopic fractionation. In addition, we evaluated the direct CO{sub 2} transfer from the MSC into the automatic graphitization equipment AGE with the subsequent {sup 14}C AMS analysis as graphite. This semi-automatic approach can be fully automated in the future, which would allow a high sample throughput. We obtained very promising, low blank values between 0.0018 and 0.0028 F{sup 14}C (equivalent to 50,800 and 47,200 yrs BP), which are within the analytical background and lower than results obtained in previous studies.
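
    The correspondence between the quoted blank values in F{sup 14}C and the equivalent ages in yrs BP follows the conventional radiocarbon age relation, age = -8033 ln(F14C), with 8033 yr the Libby mean life; a two-line check reproduces the figures above.

```python
import math

def f14c_to_age(f14c):
    """Conventional radiocarbon age (yrs BP) from fraction modern carbon."""
    return -8033.0 * math.log(f14c)

for blank in (0.0018, 0.0028):
    print(f"F14C = {blank:.4f}  ->  {f14c_to_age(blank):,.0f} yrs BP")
# prints roughly 50,800 and 47,200 yrs BP, matching the values quoted above
```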

  17. {sup 14}CO{sub 2} processing using an improved and robust molecular sieve cartridge

    International Nuclear Information System (INIS)

    Wotte, Anja; Wordell-Dietrich, Patrick; Wacker, Lukas; Don, Axel; Rethemeyer, Janet

    2017-01-01

    Radiocarbon ("1"4C) analysis on CO_2 can provide valuable information on the carbon cycle as different carbon pools differ in their "1"4C signature. While fresh, biogenic carbon shows atmospheric "1"4C concentrations, fossil carbon is "1"4C free. As shown in previous studies, CO_2 can be collected for "1"4C analysis using molecular sieve cartridges (MSC). These devices have previously been made of plastic and glass, which can easily be damaged during transport. We thus constructed a robust MSC suitable for field application under tough conditions or in remote areas, which is entirely made of stainless steel. The new MSC should also be tight over several months to allow long sampling campaigns and transport times, which was proven by a one year storage test. The reliability of the "1"4CO_2 results obtained with the MSC was evaluated by detailed tests of different procedures to clean the molecular sieve (zeolite type 13X) and for the adsorption and desorption of CO_2 from the zeolite using a vacuum rig. We show that the "1"4CO_2 results are not affected by any contamination of modern or fossil origin, cross contamination from previous samples, and by carbon isotopic fractionation. In addition, we evaluated the direct CO_2 transfer from the MSC into the automatic graphitization equipment AGE with the subsequent "1"4C AMS analysis as graphite. This semi-automatic approach can be fully automated in the future, which would allow a high sample throughput. We obtained very promising, low blank values between 0.0018 and 0.0028 F"1"4C (equivalent to 50,800 and 47,200 yrs BP), which are within the analytical background and lower than results obtained in previous studies.

  18. PROCESS CAPABILITY ESTIMATION FOR NON-NORMALLY DISTRIBUTED DATA USING ROBUST METHODS - A COMPARATIVE STUDY

    Directory of Open Access Journals (Sweden)

    Yerriswamy Wooluru

    2016-06-01

    Full Text Available Process capability indices are very important process quality assessment tools in automotive industries. The common process capability indices (PCIs) Cp, Cpk, Cpm are widely used in practice. The use of these PCIs is based on the assumption that the process is in control and its output is normally distributed. In practice, normality is not always fulfilled. Indices developed based on the normality assumption are very sensitive to non-normal processes. When the distribution of a product quality characteristic is non-normal, Cp and Cpk indices calculated using conventional methods often lead to erroneous interpretation of process capability. In the literature, various methods have been proposed for surrogate process capability indices under non-normality, but few literature sources offer a comprehensive evaluation and comparison of their ability to capture true capability in non-normal situations. In this paper, five methods have been reviewed and capability evaluation is carried out for data pertaining to the resistivity of silicon wafers. The final results revealed that the Burr-based percentile method is better than the Clements method. Modelling of non-normal data and the Box-Cox transformation method using statistical software (Minitab 14) provide reasonably good results, as they are very promising methods for non-normal and moderately skewed data (skewness <= 1.5).
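
    A brief sketch contrasting the conventional (normal-theory) Cp/Cpk with a simple percentile-based surrogate in the spirit of the Clements approach, in which the 6-sigma spread is replaced by the 0.135th-99.865th percentile range; the specification limits and data below are synthetic assumptions.

```python
import numpy as np

def cp_cpk_normal(x, lsl, usl):
    """Conventional indices assuming a normal, in-control process."""
    mu, sigma = x.mean(), x.std(ddof=1)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

def cp_cpk_percentile(x, lsl, usl):
    """Percentile-based surrogate indices for non-normal data: the natural
    process spread is taken as the 0.135th-99.865th percentile range."""
    p_low, p_med, p_high = np.percentile(x, [0.135, 50, 99.865])
    cp = (usl - lsl) / (p_high - p_low)
    cpk = min((usl - p_med) / (p_high - p_med), (p_med - lsl) / (p_med - p_low))
    return cp, cpk

rng = np.random.default_rng(2)
resistivity = rng.lognormal(mean=2.0, sigma=0.15, size=500)   # skewed synthetic data
lsl, usl = 4.5, 12.0
print("normal-theory:", cp_cpk_normal(resistivity, lsl, usl))
print("percentile   :", cp_cpk_percentile(resistivity, lsl, usl))
# A Box-Cox transform (scipy.stats.boxcox) is another route: transform data and
# limits, then apply the conventional indices on the transformed scale.
```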

  19. Anaerobic modeling for improving synergy and robustness of a manure co-digestion process

    DEFF Research Database (Denmark)

    Lima, D. M. F.; Rodrigues, J. A. D.; Boe, Kanokwan

    2016-01-01

    Biogas production is becoming increasingly important in the environmental area because, besides treating wastewaters, it also generates energy. Co-digestion has become more and more powerful since it is possible, with the use of abundant and cheap substrates, to dilute the inhibitory effects...... of various other substrates, making the process of anaerobic digestion more efficient and stable. Biogas process modelling describes the kinetics and stoichiometry of different steps in the anaerobic digestion process. This mathematical modelling provides an understanding of the processes and interactions...... occurring inside the biogas system. The present work investigated the interactions between different simple co-substrates (carbohydrate, lipid and protein) and real co-substrates (corn silage, fodder beet, grass and wheat straw) under co-digestion with manure, in order to verify synergetic effects...

  20. ANAEROBIC MODELING FOR IMPROVING SYNERGY AND ROBUSTNESS OF A MANURE CO-DIGESTION PROCESS

    Directory of Open Access Journals (Sweden)

    D. M. F. Lima

    Full Text Available Biogas production is becoming increasingly important in the environmental area because, besides treating wastewaters, it also generates energy. Co-digestion has become more and more powerful since it is possible, with the use of abundant and cheap substrates, to dilute the inhibitory effects of various other substrates, making the process of anaerobic digestion more efficient and stable. Biogas process modelling describes the kinetics and stoichiometry of different steps in the anaerobic digestion process. This mathematical modelling provides an understanding of the processes and interactions occurring inside the biogas system. The present work investigated the interactions between different simple co-substrates (carbohydrate, lipid and protein) and real co-substrates (corn silage, fodder beet, grass and wheat straw) under co-digestion with manure, in order to verify synergetic effects. Subsequently, some experiments were reproduced, in order to evaluate the synergy obtained in the previous simulation and validate the model.

  1. Integration of air quality-related planning processes : report

    International Nuclear Information System (INIS)

    2004-05-01

    Several communities in British Columbia have conducted air quality, greenhouse gas, or community energy management plans. This report explored the possibility of integrating 3 community-based air quality-related planning processes into a single process and evaluated the use of these 3 processes by local governments and First Nations in identifying and addressing air quality-related objectives, and determined to what extent they could be integrated to achieve planning objectives for air quality, greenhouse gas emissions, and energy supply and conservation. The lessons learned from 9 case studies in British Columbia were presented. The purpose of the case studies was to examine how communities handled emissions and energy related inventory and planning work, as well as their experiences with, or considerations for, an integrated process. The lessons were grouped under several key themes including organization and stakeholder involvement; messaging and focus; leadership/champions; and resources and capacity. The report also outlined a framework for an integrated planning process and provided recommendations regarding how an integrated or complementary process could be performed. A number of next steps were also offered for the provincial government to move the concept of an integrated process forward with the assistance of other partners. These included identifying the resources required to support communities engaging in an integrated process as well as discussing the series of options for provincial support with key stakeholders. refs., tabs., figs

  2. Application of a tablet film coating model to define a process-imposed transition boundary for robust film coating.

    Science.gov (United States)

    van den Ban, Sander; Pitt, Kendal G; Whiteman, Marshall

    2018-02-01

    A scientific understanding of interaction of product, film coat, film coating process, and equipment is important to enable design and operation of industrial scale pharmaceutical film coating processes that are robust and provide the level of control required to consistently deliver quality film coated product. Thermodynamic film coating conditions provided in the tablet film coating process impact film coat formation and subsequent product quality. A thermodynamic film coating model was used to evaluate film coating process performance over a wide range of film coating equipment from pilot to industrial scale (2.5-400 kg). An approximate process-imposed transition boundary, from operating in a dry to a wet environment, was derived, for relative humidity and exhaust temperature, and used to understand the impact of the film coating process on product formulation and process control requirements. This approximate transition boundary may aid in an enhanced understanding of risk to product quality, application of modern Quality by Design (QbD) based product development, technology transfer and scale-up, and support the science-based justification of critical process parameters (CPPs).
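
    A thermodynamic film coating model of this kind is essentially a coupled mass and energy balance over the coating pan. The sketch below shows only the moisture-balance part, estimating the exhaust relative humidity from inlet air conditions and spray rate using the Magnus saturation-pressure formula; the simplified balance and all numerical values are illustrative assumptions, not the published model.

```python
import math

def p_sat(t_celsius):
    """Saturation vapour pressure of water in Pa (Magnus formula)."""
    return 610.94 * math.exp(17.625 * t_celsius / (t_celsius + 243.04))

def exhaust_rh(air_flow_kg_h, spray_rate_g_min, solids_frac,
               inlet_t, inlet_rh, exhaust_t, p_total=101325.0):
    """Exhaust relative humidity from a simple moisture balance:
    water leaving with the air = water entering + spray water (assumed fully evaporated)."""
    pv_in = inlet_rh / 100.0 * p_sat(inlet_t)                 # inlet vapour pressure
    w_in = 0.622 * pv_in / (p_total - pv_in)                  # inlet humidity ratio
    water_sprayed = spray_rate_g_min * 60.0 / 1000.0 * (1.0 - solids_frac)  # kg/h
    w_out = w_in + water_sprayed / air_flow_kg_h
    pv_out = w_out * p_total / (0.622 + w_out)
    return 100.0 * pv_out / p_sat(exhaust_t)

# Hypothetical pilot-scale conditions: 600 kg/h air, 40 g/min spray, 15% solids
print(f"exhaust RH ~ {exhaust_rh(600, 40, 0.15, 60, 5, 45):.1f} %")
```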

  3. Integrating human factors into process hazard analysis

    International Nuclear Information System (INIS)

    Kariuki, S.G.; Loewe, K.

    2007-01-01

    A comprehensive process hazard analysis (PHA) needs to address human factors. This paper describes an approach that systematically identifies human error in process design and the human factors that influence its production and propagation. It is deductive in nature and therefore considers human error as a top event. The combinations of different factors that may lead to this top event are analysed. It is qualitative in nature and is used in combination with other PHA methods. The method has an advantage because it does not look at the operator error as the sole contributor to the human failure within a system but a combination of all underlying factors

  4. Integrated modelling of near field and engineered barrier system processes

    International Nuclear Information System (INIS)

    Lamont, A.; Gansemer, J.

    1994-01-01

    The Yucca Mountain Integrating Model (YMIM), an integrated model of the Engineered Barrier System, has been developed to assist project managers at LLNL in identifying areas where research emphasis should be placed. The model was designed to be highly modular so that a model of an individual process could be easily modified or replaced without interfering with the models of other processes. The modules modelling container failure and the dissolution of nuclides include particularly detailed, temperature-dependent models of their corresponding processes

  5. Value Creation through ICT Integration in Merger & Acquisition Processes

    DEFF Research Database (Denmark)

    Holm Larsen, Michael

    2005-01-01

    As deals are becoming more complex, and as technology, and the people supporting it, are becoming key drivers of merger and acquisition processes, planning of information and communication technologies in the early stages of the integration process is vital to the realization of the benefits of a Merger & Acquisition process. This statement is substantiated through a review of literature from academics as well as practitioners, and case exemplifications from the financial service organization, the Nordea Group. Keywords: ICT Integration, Mergers & Acquisitions, Nordea Group.

  6. Ergonomics Integration Improving Production Process Management in Enterprises of Latvia

    OpenAIRE

    Henrijs Kaļķis

    2013-01-01

    Doctoral thesis ERGONOMICS INTEGRATION IMPROVING PRODUCTION PROCESS MANAGEMENT IN ENTERPRISES OF LATVIA ANNOTATION Ergonomics integration in process management has great significance for organisations' growth of productivity. It is a new approach to entrepreneurship and business strategy, where ergonomic aspects and values are taken into account in ensuring effective process management and the profitability of enterprises. This study is aimed at solution of the problem of e...

  7. The integrity management cycle as a business process

    Energy Technology Data Exchange (ETDEWEB)

    Ackhurst, Trent B.; Peverelli, Romina P. [PIMS - Pipeline Integrity Management Specialists of London Ltd. (United Kingdom).

    2009-07-01

    It is a best-practice Oil and Gas pipeline integrity and reliability technique to apply integrity management cycles. This conforms to the business principles of continuous improvement. This paper examines the integrity management cycle - both its goals and objectives and its subsequent component steps - from a business perspective. Traits that businesses require, to glean maximum benefit from such a cycle, are highlighted. A case study focuses upon an integrity and reliability process developed to apply to pipeline operators' installations. This is compared and contrasted to the pipeline integrity management cycle to underline both cycles' consistency with the principles of continuous improvement. (author)

  8. Preparing Health Care Processes for IT Integration

    DEFF Research Database (Denmark)

    Walley, Paul; Laursen, Martin Lindgård

    2005-01-01

    effectiveness and efficiency of the system. Using data from two countries and involving 200 hospitals, the paper addresses the current state of determinacy of processes and explores the potential route towards standardisation. We hypothesise that management paradigms such as “lean thinking...

  9. Sensegiving and sensemaking in integration processes

    DEFF Research Database (Denmark)

    Søderberg, Anne-Marie

    2003-01-01

    at different hierarchical levels, interviewed at different points of time over a period of six years. The collected narrative interviews are viewed as retrospective interpretations of change processes in the acquired company, made by organizational actors as parts of the plots they are continually constructing...

  10. Highly efficient organic solar Cells based on a robust room-temperature solution-processed copper iodide hole transporter

    KAUST Repository

    Zhao, Kui

    2015-07-30

    Achieving high performance and reliable organic solar cells hinges on the development of stable and energetically suitable hole transporting buffer layers in tune with the electrode and photoactive materials of the solar cell stack. Here we have identified solution-processed copper(I) iodide (CuI) thin films with low-temperature processing conditions as an effective hole–transporting layer (HTL) for a wide range of polymer:fullerene bulk heterojunction (BHJ) systems. The solar cells using CuI HTL show higher power conversion efficiency (PCE) in standard device structure for polymer blends, up to PCE of 8.8%, as compared with poly(3,4-ethylenedioxy-thiophene):poly(styrenesulfonate) (PEDOT:PSS) HTL, for a broad range of polymer:fullerene systems. The CuI layer properties and solar cell device behavior are shown to be remarkably robust and insensitive to a wide range of processing conditions of the HTL, including processing solvent, annealing temperature (room temperature up to 200 °C), and film thickness. CuI is also shown to improve the overall lifetime of solar cells in the standard architecture as compared to PEDOT:PSS. We further demonstrate promising solar cell performance when using CuI as top HTL in an inverted device architecture. The observation of uncommon properties, such as photoconductivity of CuI and templating effects on the BHJ layer formation, are also discussed. This study points to CuI as being a good candidate to replace PEDOT:PSS in solution-processed solar cells thanks to the facile implementation and demonstrated robustness of CuI thin films.

  11. Highly efficient organic solar Cells based on a robust room-temperature solution-processed copper iodide hole transporter

    KAUST Repository

    Zhao, Kui; Ngongang Ndjawa, Guy Olivier; Jagadamma, Lethy Krishnan; El Labban, Abdulrahman; Hu, Hanlin; Wang, Qingxiao; Li, Ruipeng; Abdelsamie, Maged; Beaujuge, Pierre; Amassian, Aram

    2015-01-01

    Achieving high performance and reliable organic solar cells hinges on the development of stable and energetically suitable hole transporting buffer layers in tune with the electrode and photoactive materials of the solar cell stack. Here we have identified solution-processed copper(I) iodide (CuI) thin films with low-temperature processing conditions as an effective hole–transporting layer (HTL) for a wide range of polymer:fullerene bulk heterojunction (BHJ) systems. The solar cells using CuI HTL show higher power conversion efficiency (PCE) in standard device structure for polymer blends, up to PCE of 8.8%, as compared with poly(3,4-ethylenedioxy-thiophene):poly(styrenesulfonate) (PEDOT:PSS) HTL, for a broad range of polymer:fullerene systems. The CuI layer properties and solar cell device behavior are shown to be remarkably robust and insensitive to a wide range of processing conditions of the HTL, including processing solvent, annealing temperature (room temperature up to 200 °C), and film thickness. CuI is also shown to improve the overall lifetime of solar cells in the standard architecture as compared to PEDOT:PSS. We further demonstrate promising solar cell performance when using CuI as top HTL in an inverted device architecture. The observation of uncommon properties, such as photoconductivity of CuI and templating effects on the BHJ layer formation, are also discussed. This study points to CuI as being a good candidate to replace PEDOT:PSS in solution-processed solar cells thanks to the facile implementation and demonstrated robustness of CuI thin films.

  12. E-IMPACT - A ROBUST HAZARD-BASED ENVIRONMENTAL IMPACT ASSESSMENT APPROACH FOR PROCESS INDUSTRIES

    Directory of Open Access Journals (Sweden)

    KHANDOKER A. HOSSAIN

    2008-04-01

    Full Text Available This paper proposes a hazard-based environmental impact assessment approach (E-Impact) for evaluating the environmental impact during process design and retrofit stages. E-Impact replaces the normalisation step of the conventional impact assessment phase. This approach compares the impact scores for different options and assigns a relative score to each option. This eliminates the complexity of the normalisation step in the evaluation phase. The applicability of E-Impact has been illustrated through a case study of solvent selection in an acrylic acid manufacturing plant. E-Impact is used in conjunction with the Aspen-HYSYS process simulator to develop mass and heat balance data.

  13. Integrating Porous Resins In Enzymatic Processes

    DEFF Research Database (Denmark)

    Al-Haque, Naweed

    . Screening resins for moderately hydrophobic multi-component systems is challenging. Often it is found that the capacity of the resin is inversely related with product selectivity. Therefore a tradeoff has to be made between these parameters which can be crucial from an economic point of view. A low resin...... procedure. The screening therefore becomes a multi-objective task that has to be solved simultaneously. Such an approach has been applied in the method formulated in this framework. To overcome these challenges, different process strategies are required to obtain high yields. A number of different...... inhibition, has gained considerable recognition. The resins act as a reservoir for the inhibitory substrate and a sink for the inhibitory product and simultaneously attain the required high substrate loading to make the process economically feasible. In this way the potential benefit of the enzyme can...

  14. Inverted process for graphene integrated circuits fabrication.

    Science.gov (United States)

    Lv, Hongming; Wu, Huaqiang; Liu, Jinbiao; Huang, Can; Li, Junfeng; Yu, Jiahan; Niu, Jiebin; Xu, Qiuxia; Yu, Zhiping; Qian, He

    2014-06-07

    CMOS-compatible 200 mm two-layer-routing technology is employed to fabricate graphene field-effect transistors (GFETs) and monolithic graphene ICs. The process sequence is inverted relative to traditional Si technology. Passive elements are fabricated in the first metal layer and GFETs are formed with buried gate/source/drain in the second metal layer. A gate dielectric of 3.1 nm equivalent oxide thickness (EOT) is employed. 500 nm-gate-length GFETs feature a yield of 80% and RF performance of fT/fmax = 17 GHz/15.2 GHz. A high-performance monolithic graphene frequency multiplier is demonstrated using the proposed process. Functionality was demonstrated up to 8 GHz input and 16 GHz output. The frequency multiplier features a 3 dB bandwidth of 4 GHz and a conversion gain of -26 dB.

  15. Robust Unconventional Interaction Design and Hybrid Tool Environments for Design and Engineering Processes

    NARCIS (Netherlands)

    Wendrich, Robert E.; Kruiper, Ruben

    2017-01-01

    This paper investigates how and whether existing or current design tools, assist and support designers and engineers in the early-phases of ideation and conceptualization stages of design and engineering processes. The research explores how fluidly and/or congruously technology affords cognitive,

  16. A numerical approach to robust in-line control of roll forming processes

    NARCIS (Netherlands)

    Wiebenga, J.H.; Weiss, M.; Rolfe, B.; van den Boogaard, A.H.

    2012-01-01

    The quality of roll formed products is known to be highly sensitive and dependent on the process parameters and thus the unavoidable variations of these parameters during mass production. To maintain a constant high product quality, a new roll former with an adjustable final roll forming stand is

  17. Formalism of continual integrals for cascade processes with particle fusion

    International Nuclear Information System (INIS)

    Gedalin, Eh.V.

    1987-01-01

    The formalism of continual integrals is used to describe cascade processes in which, besides cascade particle reproduction, particle synthesis and coalescence take place. Accounting for cascade particle coalescence means that the development of some cascade branches cannot be independent, and the main equations of the cascade process become functional rather than integral. The method of continual integrals permits the construction, in closed form, of generating functionals for the cascade process and the derivation of rules for their calculation using diagrams. Analytical expressions in the form of continual integrals are obtained for the generating functionals describing cascade development

  18. Monitoring of laser material processing using machine integrated low-coherence interferometry

    Science.gov (United States)

    Kunze, Rouwen; König, Niels; Schmitt, Robert

    2017-06-01

    Laser material processing has become an indispensable tool in modern production. With the availability of high-power pico- and femtosecond laser sources, laser material processing is advancing into applications which demand the highest accuracies, such as laser micro milling or laser drilling. In order to enable narrow tolerance windows, closed-loop monitoring of the geometrical properties of the processed work piece is essential for achieving a robust manufacturing process. Low coherence interferometry (LCI) is a high-precision measuring principle well known from surface metrology. In recent years, we demonstrated successful integrations of LCI into several different laser material processing methods. Within this paper, we give an overview of the different machine integration strategies, which always aim at a complete and ideally telecentric integration of the measurement device into the existing beam path of the processing laser. Thus, highly accurate depth measurements within machine coordinates and subsequent process control and quality assurance are possible. First products using this principle have already found their way to the market, which underlines the potential of this technology for the monitoring of laser material processing.

  19. Systems integration processes for space nuclear electric propulsion systems

    International Nuclear Information System (INIS)

    Olsen, C.S.; Rice, J.W.; Stanley, M.L.

    1991-01-01

    The various components and subsystems that comprise a nuclear electric propulsion system should be developed and integrated so that each functions ideally and so that each is properly integrated with the other components and subsystems in the optimum way. This paper discusses how processes similar to those used in the development and integration of the subsystems that comprise the Multimegawatt Space Nuclear Power System concepts can be and are being efficiently and effectively utilized for these purposes. The processes discussed include the development of functional and operational requirements at the system and subsystem level; the assessment of individual nuclear power supply and thruster concepts and their associated technologies; the conduct of systems integration efforts including the evaluation of the mission benefits for each system; the identification and resolution of concept development, technology development, and systems integration feasibility issues; subsystem, system, and technology development and integration; and ground and flight subsystem and integrated system testing.

  20. METHODOLOGY FRAMEWORK FOR PROCESS INTEGRATION AND SERVICE MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Darko Galinec

    2007-06-01

    Full Text Available The history of information systems development was driven by the automation of business system functions and by mergers and acquisitions - the integration of business subjects into a whole. Modern business requires the integration of business processes through their dynamics and thus enterprise application integration (EAI) as well. In this connection it is necessary to find ways and means of application integration and interaction in a consistent and reliable way. The real-time enterprise (RTE) monitors, captures and analyzes root causes and overt events that are critical to its success the instant those events occur [6]. EAI is determined by business needs and business requirements. It must be based on a business process repository and models, a business integration methodology (BIM) and information flow as well. Decisions concerning technology must be made in the service of successful application integration. In this paper, an EAI methodological framework and the technological concepts for its achievement are introduced.

  1. Robustness of near-infrared calibration models for the prediction of milk constituents during the milking process.

    Science.gov (United States)

    Melfsen, Andreas; Hartung, Eberhard; Haeussermann, Angelika

    2013-02-01

    The robustness of in-line raw milk analysis with near-infrared spectroscopy (NIRS) was tested with respect to the prediction of the raw milk contents fat, protein and lactose. Near-infrared (NIR) spectra of raw milk (n = 3119) were acquired on three different farms during the milking process of 354 milkings over a period of six months. Calibration models were calculated for: a random data set of each farm (fully random internal calibration); first two thirds of the visits per farm (internal calibration); whole datasets of two of the three farms (external calibration), and combinations of external and internal datasets. Validation was done either on the remaining data set per farm (internal validation) or on data of the remaining farms (external validation). Excellent calibration results were obtained when fully randomised internal calibration sets were used for milk analysis. In this case, RPD values of around ten, five and three for the prediction of fat, protein and lactose content, respectively, were achieved. Farm internal calibrations achieved much poorer prediction results especially for the prediction of protein and lactose with RPD values of around two and one respectively. The prediction accuracy improved when validation was done on spectra of an external farm, mainly due to the higher sample variation in external calibration sets in terms of feeding diets and individual cow effects. The results showed that further improvements were achieved when additional farm information was added to the calibration set. One of the main requirements towards a robust calibration model is the ability to predict milk constituents in unknown future milk samples. The robustness and quality of prediction increases with increasing variation of, e.g., feeding and cow individual milk composition in the calibration model.
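
    The RPD figures quoted above are simply the ratio of the reference-value standard deviation to the standard error of prediction; a small helper makes the definition concrete (the numbers below are synthetic, not the milk data).

```python
import numpy as np

def rpd(y_reference, y_predicted):
    """Ratio of performance to deviation: SD of the reference values divided by
    the root-mean-square error of prediction."""
    y_ref = np.asarray(y_reference, dtype=float)
    y_pred = np.asarray(y_predicted, dtype=float)
    sep = np.sqrt(np.mean((y_ref - y_pred) ** 2))
    return np.std(y_ref, ddof=1) / sep

rng = np.random.default_rng(3)
fat_ref = rng.normal(4.2, 0.8, size=100)               # synthetic fat contents (%)
fat_pred = fat_ref + rng.normal(0.0, 0.08, size=100)   # a good calibration
print(f"RPD ~ {rpd(fat_ref, fat_pred):.1f}")           # roughly 10, i.e. excellent
```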

  2. Robust Selection Algorithm (RSA) for Multi-Omic Biomarker Discovery; Integration with Functional Network Analysis to Identify miRNA Regulated Pathways in Multiple Cancers.

    Science.gov (United States)

    Sehgal, Vasudha; Seviour, Elena G; Moss, Tyler J; Mills, Gordon B; Azencott, Robert; Ram, Prahlad T

    2015-01-01

    MicroRNAs (miRNAs) play a crucial role in the maintenance of cellular homeostasis by regulating the expression of their target genes. As such, the dysregulation of miRNA expression has been frequently linked to cancer. With rapidly accumulating molecular data linked to patient outcome, the need for identification of robust multi-omic molecular markers is critical in order to provide clinical impact. While previous bioinformatic tools have been developed to identify potential biomarkers in cancer, these methods do not allow for rapid classification of oncogenes versus tumor suppressors taking into account robust differential expression, cutoffs, p-values and non-normality of the data. Here, we propose a methodology, Robust Selection Algorithm (RSA) that addresses these important problems in big data omics analysis. The robustness of the survival analysis is ensured by identification of optimal cutoff values of omics expression, strengthened by p-value computed through intensive random resampling taking into account any non-normality in the data and integration into multi-omic functional networks. Here we have analyzed pan-cancer miRNA patient data to identify functional pathways involved in cancer progression that are associated with selected miRNA identified by RSA. Our approach demonstrates the way in which existing survival analysis techniques can be integrated with a functional network analysis framework to efficiently identify promising biomarkers and novel therapeutic candidates across diseases.
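
    A much-simplified sketch of the cutoff-optimisation idea: scan candidate expression cutoffs, score the separation of the resulting patient groups, and assess the best cutoff with a resampling p-value. The separation statistic here is a plain difference in mean survival time rather than a full survival-analysis test, so it only illustrates the structure of such a procedure; all names and data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(4)

def group_separation(expr, survival, cutoff):
    """Difference in mean survival between high- and low-expression groups."""
    high = survival[expr > cutoff]
    low = survival[expr <= cutoff]
    if len(high) < 5 or len(low) < 5:               # avoid tiny groups
        return 0.0
    return abs(high.mean() - low.mean())

def best_cutoff_with_pvalue(expr, survival, n_perm=2000):
    cutoffs = np.percentile(expr, np.arange(20, 81, 5))   # candidate cutoffs
    stats = [group_separation(expr, survival, c) for c in cutoffs]
    best = cutoffs[int(np.argmax(stats))]
    observed = max(stats)
    # permutation resampling: could an equally good cutoff arise by chance?
    null = []
    for _ in range(n_perm):
        perm = rng.permutation(survival)
        null.append(max(group_separation(expr, perm, c) for c in cutoffs))
    p = (1 + sum(s >= observed for s in null)) / (1 + n_perm)
    return best, observed, p

expr = rng.normal(size=150)                               # synthetic miRNA expression
survival = 24 + 10 * (expr > 0.5) + rng.normal(0, 5, size=150)
print(best_cutoff_with_pvalue(expr, survival))
```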

  3. Improving Construction Process through Integration and Concurrent Engineering

    Directory of Open Access Journals (Sweden)

    Malik Khalfan

    2012-11-01

    Full Text Available In an increasingly competitive business environment, improved time-to-market, reduced production cost, quality of the product and customer involvement are rapidly becoming the key success factors for any product development process. Consequently, most organisations are moving towards the adoption of latest technology and new management concepts and philosophies such as total quality management and concurrent engineering (CE) to bring improvement in their product development process. This paper discusses the adoption of integrated processes and CE within the construction industry to enable construction organisations to improve their project development process. It also discusses a proposed integrated database model for construction projects, which should enable the construction process to improve, become more effective and more efficient.

  4. Integrated Process Design, Control and Analysis of Intensified Chemical Processes

    DEFF Research Database (Denmark)

    Mansouri, Seyed Soheil

    chemical processes; for example, intensified processes such as reactive distillation. Most importantly, it identifies and eliminates potentially promising design alternatives that may have controllability problems later. To date, a number of methodologies have been proposed and applied on various problems......, that the same principles that apply to a binary non-reactive compound system are valid also for a binary-element or a multi-element system. Therefore, it is advantageous to employ the element based method for multicomponent reaction-separation systems. It is shown that the same design-control principles...

  5. Integrating risk management into the baselining process

    International Nuclear Information System (INIS)

    Jennett, N.; Tonkinson, A.

    1994-01-01

    These processes work together in building the project (comprised of the technical, schedule, and cost baselines) against which performance is measured and changes to the scope, schedule and cost of a project are managed and controlled. Risk analysis is often performed as the final element of the scheduling or estimating processes, a precursor to establishing cost and schedule contingency. However, best business practices dictate that information that may be crucial to the success of a project be analyzed and incorporated into project planning as soon as it is available and usable. The purpose of risk management is not to eliminate risk. Neither is it intended to suggest wholesale re-estimating and re-scheduling of a project. Rather, the intent is to make provisions to reduce and control the schedule and/or cost ramifications of risk by anticipating events and conditions that cannot be reliably planned for and which have the potential to negatively impact accomplishment of the technical objectives and requirements of the project

  6. Psychological needs and the facilitation of integrative processes.

    Science.gov (United States)

    Ryan, R M

    1995-09-01

    The assumption that there are innate integrative or actualizing tendencies underlying personality and social development is reexamined. Rather than viewing such processes as either nonexistent or as automatic, I argue that they are dynamic and dependent upon social-contextual supports pertaining to basic human psychological needs. To develop this viewpoint, I conceptually link the notion of integrative tendencies to specific developmental processes, namely intrinsic motivation; internalization; and emotional integration. These processes are then shown to be facilitated by conditions that fulfill psychological needs for autonomy, competence, and relatedness, and forestalled within contexts that frustrate these needs. Interactions between psychological needs and contextual supports account, in part, for the domain and situational specificity of motivation, experience, and relative integration. The meaning of psychological needs (vs. wants) is directly considered, as are the relations between concepts of integration and autonomy and those of independence, individualism, efficacy, and cognitive models of "multiple selves."

  7. CMOS and BiCMOS process integration and device characterization

    CERN Document Server

    El-Kareh, Badih

    2009-01-01

    Covers both the theoretical and practical aspects of modern silicon devices and the relationship between their electrical properties and processing conditions. This book also covers silicon devices and integrated process technologies. It discusses modern silicon devices, their characteristics, and interactions with process parameters.

  8. Integrating Scientific Array Processing into Standard SQL

    Science.gov (United States)

    Misev, Dimitar; Bachhuber, Johannes; Baumann, Peter

    2014-05-01

    We live in a time that is dominated by data. Data storage is cheap and more applications than ever accrue vast amounts of data. Storing the emerging multidimensional data sets efficiently, however, and allowing them to be queried by their inherent structure, is a challenge many databases have to face today. Despite the fact that multidimensional array data is almost always linked to additional, non-array information, array databases have mostly developed separately from relational systems, resulting in a disparity between the two database categories. The current SQL standard and SQL DBMS supports arrays - and in an extension also multidimensional arrays - but does so in a very rudimentary and inefficient way. This poster demonstrates the practicality of an SQL extension for array processing, implemented in a proof-of-concept multi-faceted system that manages a federation of array and relational database systems, providing transparent, efficient and scalable access to the heterogeneous data in them.

  9. Integrated Dialogue System for Spatial Decision Process

    International Nuclear Information System (INIS)

    Nishiyama, Yumi; Fukui, Hiromichi; Kaneyasu, Iwao; Nagasaka, Toshinari; Usuda, Yuichiro; Sakamoto, Ai; Kusafuka, Minako

    2003-01-01

    The disposal of high-level radioactive waste (HLW) is a difficult challenge for all countries that use nuclear energy. In Japan, an implementing agency for HLW was authorized in 2001 and is now seeking municipalities that will voluntarily apply to be a preliminary investigation area for a final disposal site. Along with this policy progress, the HLW disposal program has been gaining social attention. This leads to a high demand for a systematic process for evaluating the proposed policy and the environmental impact of geological disposal so that policy decisions can adequately address technical, ethical, and social considerations. As a step toward this objective, we have developed a participatory decision support system on the web. Web-based communication is in its infancy but may be a viable support tool for engaging different people. Through the study, we aimed to examine the possibility of a web-based dialogue system for spatial decision processes. One conclusion from the web-based dialogue is that it is possible to create a working environment on the web among those who have different backgrounds and interests. The results also yielded several findings that should be taken into account in further development. One is the need to re-construct the data, model imagery and opinions so that the problem can be judged objectively. We will re-examine the contents in the light of international activities so that participants can understand what the information means in context. Facilitation is also a key element on the web. The facilitator is expected to create an atmosphere in which even those without high-level knowledge can participate and voice their opinions in faceless communication. On this point, auto navigation proves very useful

  10. 49 CFR 1106.4 - The Safety Integration Plan process.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 8 2010-10-01 2010-10-01 false The Safety Integration Plan process. 1106.4 Section 1106.4 Transportation Other Regulations Relating to Transportation (Continued) SURFACE... CONSIDERATION OF SAFETY INTEGRATION PLANS IN CASES INVOLVING RAILROAD CONSOLIDATIONS, MERGERS, AND ACQUISITIONS...

  11. Integrating Islamist Militants into the Political Process : Palestinian ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Integrating Islamist Militants into the Political Process : Palestinian Hamas. The striking victory of Hamas in the elections of January 2006 raises questions about the integration of Islamists into the Palestinian political system. This project, which is part of a larger program of research on the role of political parties in the Middle ...

  12. A Robust Multi-Scale Modeling System for the Study of Cloud and Precipitation Processes

    Science.gov (United States)

    Tao, Wei-Kuo

    2012-01-01

    During the past decade, numerical weather and global non-hydrostatic models have started using more complex microphysical schemes originally developed for high-resolution cloud-resolving models (CRMs) with 1-2 km or less horizontal resolution. These microphysical schemes affect the dynamics through the release of latent heat (buoyancy loading and pressure gradient), the radiation through the cloud coverage (vertical distribution of cloud species), and surface processes through rainfall (both amount and intensity). Recently, several major improvements of ice microphysical processes (or schemes) have been developed for a cloud-resolving model (the Goddard Cumulus Ensemble, GCE, model) and a regional-scale model (the Weather Research and Forecasting, WRF, model). These improvements include an improved 3-ICE (cloud ice, snow and graupel) scheme (Lang et al. 2010); a 4-ICE (cloud ice, snow, graupel and hail) scheme and a spectral bin microphysics scheme; and two different two-moment microphysics schemes. The performance of these schemes has been evaluated using observational data from TRMM and other major field campaigns. In this talk, we present the high-resolution (1 km) GCE and WRF model simulations and compare the simulated model results with observations from recent field campaigns [i.e., midlatitude continental spring season (MC3E, 2010), high-latitude cold season (C3VP, 2007; GCPEx, 2012), and tropical oceanic (TWP-ICE, 2006)].

  13. A sensitivity analysis of process design parameters, commodity prices and robustness on the economics of odour abatement technologies.

    Science.gov (United States)

    Estrada, José M; Kraakman, N J R Bart; Lebrero, Raquel; Muñoz, Raúl

    2012-01-01

    The sensitivity of the economics of the five most commonly applied odour abatement technologies (biofiltration, biotrickling filtration, activated carbon adsorption, chemical scrubbing and a hybrid technology consisting of a biotrickling filter coupled with carbon adsorption) towards design parameters and commodity prices was evaluated. Besides, the influence of the geographical location on the Net Present Value calculated for a 20-year lifespan (NPV20) of each technology and its robustness towards typical process fluctuations and operational upsets were also assessed. This comparative analysis showed that biological techniques present lower operating costs (up to 6 times) and lower sensitivity than their physical/chemical counterparts, with the packing material being the key parameter affecting their operating costs (40-50% of the total operating costs). The use of recycled or partially treated water (e.g. secondary effluent in wastewater treatment plants) offers an opportunity to significantly reduce costs in biological techniques. Physical/chemical technologies present a high sensitivity towards H2S concentration, which is an important drawback due to the fluctuating nature of malodorous emissions. The geographical analysis evidenced high NPV20 variations around the world for all the technologies evaluated, but despite the differences in wage and price levels, biofiltration and biotrickling filtration are always the most cost-efficient alternatives (NPV20). When, in an economic evaluation, robustness is as relevant as the overall costs (NPV20), the hybrid technology would move up next to BTF as the most preferred technologies. Copyright © 2012 Elsevier Inc. All rights reserved.
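
    The NPV20 comparison above amounts to discounting twenty years of operating costs on top of the initial investment; a short helper shows the calculation (the cost figures and discount rate are arbitrary placeholders, not the study's data).

```python
def npv20(capex, annual_opex, discount_rate=0.05, lifespan=20):
    """Net present value of owning a technology over its lifespan.

    Costs are entered as positive numbers; the returned NPV is negative,
    so the alternative closest to zero is the most cost-efficient."""
    return -capex - sum(annual_opex / (1 + discount_rate) ** t
                        for t in range(1, lifespan + 1))

# Placeholder comparison of two odour abatement options
print("biotrickling filter :", round(npv20(capex=250_000, annual_opex=20_000)))
print("chemical scrubber   :", round(npv20(capex=200_000, annual_opex=60_000)))
```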

  14. Integration mockup and process material management system

    Science.gov (United States)

    Verble, Adas James, Jr.

    1992-01-01

    Work to define and develop a full-scale Space Station Freedom (SSF) mockup with the flexibility to evolve into future designs, to validate techniques for maintenance and logistics, and to verify human task allocations and support trade studies is described. This work began in early 1985 and ended in August 1991. The mockups are presently being used at MSFC in Building 4755 as a technology and design testbed, as well as for public display. Micro Craft also began work on the Process Material Management System (PMMS) under this contract. The PMMS simulator was a sealed enclosure for testing to identify liquid, gaseous, and particulate samples and specimens, including urine, waste water, condensate, hazardous gases, surrogate gases, liquids, and solids. The SSF would require many trade studies to validate techniques for maintenance and logistics and verify system task allocations; it was therefore necessary to develop a full-scale mockup which would be representative of the current SSF design with the ease of changing those designs as the SSF design evolved and changed. The tasks defined for Micro Craft were to provide the personnel, services, tools, and materials for the SSF mockup, which would consist of four modules, nodes, interior components, and part-task mockups of MSFC-responsible engineering systems. This included the Environmental Control and Life Support System (ECLSS) testbed. For the initial study, the mockups were low-fidelity, soft mockups of graphics art bottle, and other low-cost materials, which evolved into higher fidelity mockups as the R&D design evolved, by modifying or rebuilding, an important cost-saving factor in the design process. We designed, fabricated, and maintained the full-size mockup shells and support stands. The shells consisted of cylinders, end cones, rings, longerons, docking ports, crew airlocks, and windows. The ECLSS required a heavier cylinder to support the ECLSS systems test program. Details of this activity will be covered. Support stands were

  15. Business process management and IT management: The missing integration

    DEFF Research Database (Denmark)

    Rahimi, Fatemeh; Møller, Charles; Hvam, Lars

    2016-01-01

    The importance of business processes and the centrality of IT to contemporary organizations' performance calls for a specific focus on business process management and IT management. Despite the wide scope of business process management, covering both business and IT domains, and the profound impact of IT on process innovations, the association between business process management and IT management is under-explored. Drawing on a literature analysis of the capabilities of business process and IT governance frameworks and findings from a case study, we propose the need for horizontal integration between the two management functions to enable strategic and operational business-IT alignment. We further argue that the role of IT in an organization influences the direction of integration between the two functions and thus the choice of integration mechanisms. Using case study findings, we propose...

  16. Integration of MGDS design into the licensing process

    International Nuclear Information System (INIS)

    1997-12-01

    This paper presents an overview of how the Mined Geologic Disposal System (MGDS) design for a potential repository is integrated into the licensing process. The integration process employs a two-fold approach: (1) ensure that the MGDS design complies with applicable Nuclear Regulatory Commission (NRC) licensing requirements, and (2) ensure that the MGDS design is appropriately reflected in a license application that is acceptable to the NRC for performing acceptance and compliance reviews

  17. A path-integral approach to inclusive processes

    International Nuclear Information System (INIS)

    Sukumar, C.V.

    1995-01-01

    The cross section for an inclusive scattering process may be expressed in terms of a double path integral. Evaluation of the double path integral by the stationary-phase approximation yields classical equations of motion for the stationary trajectories and a classical cross section for the inclusive process which depends on the polarization of the initial state. Polarization analyzing powers are calculated from this theory and the results are compared with those obtained in an earlier paper. ((orig.))

  18. Integrated Process Design and Control of Reactive Distillation Processes

    DEFF Research Database (Denmark)

    Mansouri, Seyed Soheil; Sales-Cruz, Mauricio; Huusom, Jakob Kjøbsted

    2015-01-01

    The approach is based on the element concept, which is used to translate a system of compounds into elements. The operation of the reactive distillation column at the highest driving force and other candidate points is analyzed through analytical solution as well as rigorous open-loop and closed-loop simulations. By application of this approach, it is shown that designing the reactive distillation process at the maximum driving force results in an optimal design in terms of controllability and operability. It is verified that the reactive distillation design option is less sensitive to disturbances in the feed at the highest driving force...

  19. Extreme temperature robust optical sensor designs and fault-tolerant signal processing

    Science.gov (United States)

    Riza, Nabeel Agha [Oviedo, FL]; Perez, Frank [Tujunga, CA]

    2012-01-17

    Silicon Carbide (SiC) probe designs for extreme temperature and pressure sensing use a single-crystal SiC optical chip encased in a sintered SiC material probe. The SiC chip may be protected for high-temperature-only use or exposed for both temperature and pressure sensing. Hybrid signal processing techniques allow fault-tolerant extreme temperature sensing. Wavelength peak-to-peak (or null-to-null) collective spectrum spread measurement, combined with detection of wavelength peak/null shifts, forms a coarse-fine temperature measurement using broadband spectrum monitoring. The SiC probe frontend acts as a stable-emissivity black-body radiator, and monitoring the shift in its radiation spectrum enables a pyrometer. This application combines all-SiC pyrometry with thick SiC etalon laser interferometry within a free-spectral range to form a coarse-fine temperature measurement sensor. RF notch filtering techniques improve the sensitivity of the temperature measurement where fine spectral shift or spectrum measurements are needed to deduce temperature.

  20. Robust resistive memory devices using solution-processable metal-coordinated azo aromatics

    Science.gov (United States)

    Goswami, Sreetosh; Matula, Adam J.; Rath, Santi P.; Hedström, Svante; Saha, Surajit; Annamalai, Meenakshi; Sengupta, Debabrata; Patra, Abhijeet; Ghosh, Siddhartha; Jani, Hariom; Sarkar, Soumya; Motapothula, Mallikarjuna Rao; Nijhuis, Christian A.; Martin, Jens; Goswami, Sreebrata; Batista, Victor S.; Venkatesan, T.

    2017-12-01

    Non-volatile memories will play a decisive role in the next generation of digital technology. Flash memories are currently the key player in the field, yet they fail to meet the commercial demands of scalability and endurance. Resistive memory devices, and in particular memories based on low-cost, solution-processable and chemically tunable organic materials, are promising alternatives explored by the industry. However, to date, they have lacked the performance and mechanistic understanding required for commercial translation. Here we report a resistive memory device based on a spin-coated active layer of a transition-metal complex, which shows high reproducibility (~350 devices), fast switching, stability over 10^6 s and scalability (down to ~60 nm^2). In situ Raman and ultraviolet-visible spectroscopy alongside spectroelectrochemistry and quantum chemical calculations demonstrate that the redox state of the ligands determines the switching states of the device whereas the counterions control the hysteresis. This insight may accelerate the technological deployment of organic resistive memories.

  1. Diagnosis about Integration process of youth foreign Catalan

    Directory of Open Access Journals (Sweden)

    Marta Sabariego Puig

    2015-09-01

    Full Text Available This paper presents the results of a diagnostic study on the real integration process of young migrants in Catalonia. The study was carried out using a descriptive survey of 3,830 young Catalans from varying cultural backgrounds aged between 14 and 18, with the aim of identifying the key factors influencing the integration of Catalan migrant youth. Also, in order to analyze these key elements in greater depth as factors easing and/or obstructing integration, four discussion groups were held with the same young people. Results reveal achievements and challenges for further study, useful for the design of social and educational policies which may promote the integration process, understood in its structural, social, cognitive-cultural and identity dimensions. Our study confirms the need for a society with pluralistic beliefs, principles and actions, which should be reflected in democratic systems and social and educational policies based on the concept of integration as reciprocity.

  2. Metabolic Control in Mammalian Fed-Batch Cell Cultures for Reduced Lactic Acid Accumulation and Improved Process Robustness

    Directory of Open Access Journals (Sweden)

    Viktor Konakovsky

    2016-01-01

    Full Text Available Biomass and cell-specific metabolic rates usually change dynamically over time, making the “feed according to need” strategy difficult to realize in a commercial fed-batch process. We here demonstrate a novel feeding strategy which is designed to hold a particular metabolic state in a fed-batch process by adaptive feeding in real time. The feed rate is calculated with a transferable biomass model based on capacitance, which changes the nutrient flow stoichiometrically in real time. A limited glucose environment was used to confine the cells in a particular metabolic state. In order to cope with uncertainty, two strategies were tested to change the adaptive feed rate and prevent starvation while in limitation: (i) inline pH and online glucose concentration measurement, or (ii) inline pH alone, which was shown to be sufficient for the problem statement. In this contribution, we achieved metabolic control within a defined target range. The direct benefit was two-fold: the lactic acid profile was improved and pH could be kept stable. Multivariate Data Analysis (MVDA) has shown that pH influenced lactic acid production or consumption in historical data sets. We demonstrate that a low pH (around 6.8) is not required for our strategy, as glucose availability is already limiting the flux. On the contrary, we boosted glycolytic flux in glucose limitation by setting the pH to 7.4. This new approach led to a yield of lactic acid/glucose (Y_L/G) of around zero for the whole process time and high titers in our labs. We hypothesize that a higher carbon flux, resulting from a higher pH, may lead to more cells which produce more product. The relevance of this work lies in feeding mammalian cell cultures safely in limitation with a desired metabolic flux range. This resulted in extremely stable, low glucose levels, very robust pH profiles without acid/base interventions and a metabolic state in which lactic acid was consumed instead of being produced from day 1. With
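
    The "feed according to need" calculation described above can be pictured with the sketch below, which converts a capacitance-derived biomass estimate and a target cell-specific glucose uptake rate into a stoichiometric feed rate; the function, parameter names and numbers are illustrative assumptions, not the authors' model.

    ```python
    # Illustrative sketch: compute an adaptive glucose feed rate from an online
    # biomass estimate (e.g. derived from a capacitance probe) so that a target
    # cell-specific substrate uptake rate q_s is sustained. Values are placeholders.

    def feed_rate_L_per_h(viable_cell_density, reactor_volume_L,
                          q_s_target, feed_glucose_conc_g_per_L):
        """
        viable_cell_density : cells/L (estimated from capacitance)
        q_s_target          : g glucose / (cell * h), desired metabolic state
        Returns the feed rate in L/h that stoichiometrically matches demand.
        """
        glucose_demand_g_per_h = q_s_target * viable_cell_density * reactor_volume_L
        return glucose_demand_g_per_h / feed_glucose_conc_g_per_L

    rate = feed_rate_L_per_h(viable_cell_density=2e12, reactor_volume_L=10.0,
                             q_s_target=2.5e-12, feed_glucose_conc_g_per_L=400.0)
    print(f"feed rate: {rate:.3f} L/h")
    ```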

  3. Advances in robust fractional control

    CERN Document Server

    Padula, Fabrizio

    2015-01-01

    This monograph presents design methodologies for (robust) fractional control systems. It shows the reader how to take advantage of the superior flexibility of fractional control systems compared with integer-order systems in achieving more challenging control requirements. There is a high degree of current interest in fractional systems and fractional control arising from both academia and industry and readers from both milieux are catered to in the text. Different design approaches having in common a trade-off between robustness and performance of the control system are considered explicitly. The text generalizes methodologies, techniques and theoretical results that have been successfully applied in classical (integer) control to the fractional case. The first part of Advances in Robust Fractional Control is the more industrially-oriented. It focuses on the design of fractional controllers for integer processes. In particular, it considers fractional-order proportional-integral-derivative controllers, becau...
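
    For orientation, the fractional-order proportional-integral-derivative controllers discussed in the first part of the monograph are commonly written in the PI^λD^μ form below; this is the generic form used throughout the fractional-control literature, not a formula quoted from the book.

    ```latex
    % Fractional-order PI^{\lambda}D^{\mu} controller; \lambda = \mu = 1 recovers
    % the classical integer-order PID controller.
    C(s) = K_p + \frac{K_i}{s^{\lambda}} + K_d\, s^{\mu}, \qquad \lambda,\ \mu \in (0, 2)
    ```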

  4. Total site integration of light hydrocarbons separation process

    OpenAIRE

    Ulyev, L.; Vasilyev, M.; Maatouk, A.; Duic, Neven; Khusanovc, Alisher

    2016-01-01

    Ukraine is the largest consumer of hydrocarbons per unit of production in Europe (Ukraine policy review, 2006). The most important point is a reduction of energy consumption in the chemical and metallurgical industries, the biggest consumers. This paper deals with the energy savings potential of a light hydrocarbons separation process. The energy consumption of light hydrocarbon separation processes typical of Eastern European countries was analysed. Process Integration (PI) was used to perform a ...

  5. Optimal integration of organic Rankine cycles with industrial processes

    International Nuclear Information System (INIS)

    Hipólito-Valencia, Brígido J.; Rubio-Castro, Eusiel; Ponce-Ortega, José M.; Serna-González, Medardo; Nápoles-Rivera, Fabricio; El-Halwagi, Mahmoud M.

    2013-01-01

    Highlights: • An optimization approach for heat integration is proposed. • A new general superstructure for heat integration is proposed. • Heat process streams are simultaneously integrated with an organic Rankine cycle. • Better results can be obtained with respect to other previously reported methodologies. - Abstract: This paper presents a procedure for simultaneously handling the problem of optimal integration of regenerative organic Rankine cycles (ORCs) with overall processes. ORCs may allow the recovery of an important fraction of the low-temperature process excess heat (i.e., waste heat from industrial processes) in the form of mechanical energy. An integrated stagewise superstructure is proposed for representing the interconnections and interactions between the heat exchanger network (HEN) and the ORC for fixed process stream data. Based on the integrated superstructure, the optimization problem is formulated as a mixed-integer nonlinear programming problem to simultaneously account for the capital and operating costs, including the revenue from the sale of the shaft power produced by the integrated system. The application of this method is illustrated with three example problems. Results show that the proposed procedure provides significantly better results than an earlier developed method that discovers optimal integrated systems using a sequential approach, because it accounts simultaneously for the trade-offs between the capital and operating costs as well as the sale of the produced energy. The proposed method is also an improvement over previously reported methods for solving the synthesis problem of heat exchanger networks without the option of integration with an ORC (i.e., stand-alone heat exchanger networks).
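
    The simultaneous trade-off described above (capital and operating costs against revenue from ORC shaft power) can be pictured with the simple annual-cost sketch below; it is not the paper's MINLP formulation, and all names and coefficients are placeholders chosen only for illustration.

    ```python
    # Illustrative sketch (not the paper's MINLP): total annual cost of an
    # integrated HEN + organic Rankine cycle, trading off annualized capital and
    # utility costs against revenue from shaft power. Figures are placeholders.

    def total_annual_cost(capital_cost, annualization_factor,
                          hot_utility_kW, cold_utility_kW,
                          hot_utility_price, cold_utility_price,
                          orc_shaft_power_kW, power_price, hours_per_year=8000):
        operating = (hot_utility_kW * hot_utility_price +
                     cold_utility_kW * cold_utility_price) * hours_per_year
        revenue = orc_shaft_power_kW * power_price * hours_per_year
        return annualization_factor * capital_cost + operating - revenue

    print(total_annual_cost(capital_cost=2.5e6, annualization_factor=0.2,
                            hot_utility_kW=1200, cold_utility_kW=900,
                            hot_utility_price=0.03, cold_utility_price=0.005,
                            orc_shaft_power_kW=350, power_price=0.08))
    ```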

  6. The mechanism of development of integration processes in the region

    Directory of Open Access Journals (Sweden)

    V. M. Bautin

    2017-01-01

    Full Text Available In the context of the weakening economic development of the region, it is necessary to find new ways to increase the efficiency of interaction between the economic structures of the region. One of these areas is the development of integration processes in the field of cooperation between public and private capital to meet the goals and objectives of the effective functioning of both the participants in the integration interaction and the region as a whole. The factors that influence the emergence and development of integration processes include scarce resources, the need to diversify business, and the desire to improve the economic efficiency of business entities. The development of the integration process is the economic interaction of business entities, followed by their combination to achieve common objectives and obtain a synergistic effect through the solution of a number of resource, organizational and administrative problems. To obtain high economic benefits from the integration interaction of the participants, we have proposed a mechanism for the development of integration processes in the region, based on three levels of interaction between regional authorities, educational institutions and private organizations. It allows the formation of a single integration chain and process management, increasing the effectiveness of their implementation in practice and avoiding the disadvantages associated with the formation of integrated structures. Integration cooperation of regional authorities with organizations from the various spheres of activity of the region and with education (research) organizations is a key component of the new Russian innovation policy because, if done right, it provides broader benefits from investments in research and development, creates favorable conditions for sustainable innovation development and is a strategic factor in the economic growth of the region.

  7. Integrated Main Propulsion System Performance Reconstruction Process/Models

    Science.gov (United States)

    Lopez, Eduardo; Elliott, Katie; Snell, Steven; Evans, Michael

    2013-01-01

    The Integrated Main Propulsion System (MPS) Performance Reconstruction process provides the MPS post-flight data files needed for postflight reporting to the project integration management and key customers to verify flight performance. This process/model was used as the baseline for the currently ongoing Space Launch System (SLS) work. The process utilizes several methodologies, including multiple software programs, to model integrated propulsion system performance through space shuttle ascent. It is used to evaluate integrated propulsion systems, including propellant tanks, feed systems, rocket engine, and pressurization systems performance throughout ascent based on flight pressure and temperature data. The latest revision incorporates new methods based on main engine power balance model updates to model higher mixture ratio operation at lower engine power levels.

  8. Integration of process computer systems to Cofrentes NPP

    International Nuclear Information System (INIS)

    Saettone Justo, A.; Pindado Andres, R.; Buedo Jimenez, J.L.; Jimenez Fernandez-Sesma, A.; Delgado Muelas, J.A.

    1997-01-01

    The existence of three different process computer systems in Cofrentes NPP and the ageing of two of them have led to the need for their integration into a single real time computer system, known as Integrated ERIS-Computer System (SIEC), which covers the functionality of the three systems: Process Computer (PC), Emergency Response Information System (ERIS) and Nuclear Calculation Computer (OCN). The paper describes the integration project developed, which has essentially consisted in the integration of PC, ERIS and OCN databases into a single database, the migration of programs from the old process computer into the new SIEC hardware-software platform and the installation of a communications programme to transmit all necessary data for OCN programs from the SIEC computer, which in the new configuration is responsible for managing the databases of the whole system. (Author)

  9. Software features and applications in process design, integration and operation

    Energy Technology Data Exchange (ETDEWEB)

    Dhole, V. [Aspen Tech Limited, Warrington (United Kingdom)

    1999-02-01

    Process engineering technologies and tools have evolved rapidly over the last twenty years. Process simulation/modeling, advanced process control, on-line optimisation, production planning and supply chain management are some examples of technologies that have rapidly matured from early commercial prototypes and concepts to established tools with a significant impact on the profitability of the process industry today. Process Synthesis or Process Integration (PI), in comparison, is yet to create its impact and still remains largely in the domain of a few expert users. One of the key reasons why PI has not taken off is that PI tools have not become integral components of the standard process engineering environments. Over the last 15 years AspenTech has grown from a small process simulation tool provider to a large multinational company providing a complete suite of process engineering technologies and services covering process design, operation, planning and supply chain management. Throughout this period, AspenTech has acquired experience in rapidly evolving technologies from their early prototype stage to mature products and services. The paper outlines AspenTech's strategy of integrating PI with other more established process design and operational improvement technologies. The paper illustrates the key elements of AspenTech's strategy via examples of software development initiatives and services projects. The paper also outlines AspenTech's future vision of the role of PI in process engineering. (au)

  10. Tech-X Corporation releases simulation code for solving complex problems in plasma physics : VORPAL code provides a robust environment for simulating plasma processes in high-energy physics, IC fabrications and material processing applications

    CERN Multimedia

    2005-01-01

    Tech-X Corporation releases simulation code for solving complex problems in plasma physics : VORPAL code provides a robust environment for simulating plasma processes in high-energy physics, IC fabrications and material processing applications

  11. Sustainable Chemical Process Development through an Integrated Framework

    DEFF Research Database (Denmark)

    Papadakis, Emmanouil; Kumar Tula, Anjan; Anantpinijwatna, Amata

    2016-01-01

    This paper describes the development and application of a general integrated framework based on systematic model-based methods and computer-aided tools, with the objective of achieving more sustainable process designs and improving process understanding. The developed framework can be applied... The case studies involve multiphase reaction systems for the synthesis of active pharmaceutical ingredients.

  12. Integrating Usability Evaluations into the Software Development Process

    DEFF Research Database (Denmark)

    Lizano, Fulvio

    This thesis addresses the integration of usability evaluations into the software development process. The integration here is contextualized in terms of how to include usability evaluation as an activity in the software development lifecycle. Even though usability evaluations are considered relevant and strategic human–computer interaction (HCI) activities in the software development process, there are obstacles that limit the complete, effective and efficient integration of this kind of testing into the software development process. Two main obstacles are the cost of usability evaluations and the software developers' resistance to accepting users' opinions regarding the lack of usability in their software systems. The 'cost obstacle' refers to the constraint of conducting usability evaluations in the software process due to the significant amount of resources required by this type of testing. Some...

  13. Integration of drinking water treatment plant process models and emulated process automation software

    NARCIS (Netherlands)

    Worm, G.I.M.

    2012-01-01

    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system.

  14. Integrating Thermal Tools Into the Mechanical Design Process

    Science.gov (United States)

    Tsuyuki, Glenn T.; Siebes, Georg; Novak, Keith S.; Kinsella, Gary M.

    1999-01-01

    The intent of mechanical design is to deliver a hardware product that meets or exceeds customer expectations, while reducing cycle time and cost. To this end, an integrated mechanical design process enables the idea of parallel development (concurrent engineering). This represents a shift from the traditional mechanical design process. With such a concurrent process, there are significant issues that have to be identified and addressed before re-engineering the mechanical design process to facilitate concurrent engineering. These issues also assist in the integration and re-engineering of the thermal design sub-process since it resides within the entire mechanical design process. With these issues in mind, a thermal design sub-process can be re-defined in a manner that has a higher probability of acceptance, thus enabling an integrated mechanical design process. However, the actual implementation is not always problem-free. Experience in applying the thermal design sub-process to actual situations provides the evidence for improvement, but more importantly, for judging the viability and feasibility of the sub-process.

  15. A comprehensive, holistic people integration process for mergers and acquisitions

    Directory of Open Access Journals (Sweden)

    Rina P. Steynberg

    2011-03-01

    Research purpose: To develop and validate a comprehensive, holistic model for the people integration process during mergers and acquisitions. Motivation for the study: The literature on a comprehensive, holistic people integration process for mergers and acquisitions is sparse and fragmented. Research design, approach and method: A qualitative approach was adopted, consisting of a three-step process which solicited the views of seasoned M&A practitioners; these views were compared against the available literature. Finally, practitioners were asked to critique the final model from a practice perspective. The utility of the final model was assessed against two mergers and acquisitions case studies. Main findings: A comprehensive, holistic people integration process model for mergers and acquisitions was developed and validated. However, this model will only significantly enhance mergers and acquisitions value realisation if it is applied from the appropriate vantage point. Practical/managerial implications: The proposed approach will increase the probability of a successful M&A people-wise and of M&A value realisation. Contribution/value add: Theoretically, the development and validation of an M&A people process integration model; practically, guidelines for successful people integration; organisationally, significantly enhancing the chances of M&A success; and community-wise, the reduction of the negative effects of M&A failure on communities.

  16. Virtual Collaborative Simulation Environment for Integrated Product and Process Development

    Science.gov (United States)

    Gulli, Michael A.

    1997-01-01

    Deneb Robotics is a leader in the development of commercially available, leading-edge three-dimensional simulation software tools for virtual prototyping, simulation-based design, manufacturing process simulation, and factory floor simulation and training applications. Deneb has developed and commercially released a preliminary Virtual Collaborative Engineering (VCE) capability for Integrated Product and Process Development (IPPD). This capability allows distributed, real-time visualization and evaluation of design concepts, manufacturing processes, and total factory and enterprises in one seamless simulation environment.

  17. Human Processing of Knowledge from Texts: Acquisition, Integration, and Reasoning

    Science.gov (United States)

    1979-06-01

    comprehension. Norwood, N.J.: Ablex, 1977. Craik, F.I.M., and Lockhart, R.S. Levels of processing: A framework for memory research. Journal of Verbal Learning and Verbal Behavior. ... Table 5.9 presents summary data regarding the performance levels and memory and search processes of individual subjects. The first row in Table 5.9 ... R-2256-ARPA, June 1979. ARPA Order No. 189-1, 9020 Cybernetics Technology. Human Processing of Knowledge from Texts: Acquisition, Integration, and Reasoning.

  18. BATCH PROCESS INTEGRATION OF APPLYING TECHNOLOGY OF ACID CARMINIC PINCH

    OpenAIRE

    Erazo E., Raymundo; Cárdenas R., Jorge L.; Woolcott H., Juan C.

    2014-01-01

    This work was developed in order to apply Pinch technology to the heat integration of the batch process for carminic acid. The method used consisted of the application of the concept of the overall process bottleneck (OPB) together with the time average model (TAM) and the time slice model (TSM). The drying operation was identified as the rate-limiting step of the process, identifying it as the OPB of the plant capacity. The extraction yield was 95% w/p carminic acid, with an energy saving of approximately 60% of the...

  19. Test processing integrated system (S.I.D.E.X.)

    International Nuclear Information System (INIS)

    Sabas, M.; Oules, H.; Badel, D.

    1969-01-01

    The Test Processing Integrated System is mostly composed of a CAE 9080 (equivalent to an S.D.S. 9300) computer equipped with a 100,000 samples/sec acquisition system. The system is designed for high-speed data acquisition and data processing in environmental tests, as well as for the calculation of structural models. Such a digital approach to data processing has many advantages compared to the conventional methods based on analog instruments. (author) [fr

  20. Brain activity related to integrative processes in visual object recognition

    DEFF Research Database (Denmark)

    Gerlach, Christian; Aaside, C T; Humphreys, G W

    2002-01-01

    We report evidence from a PET activation study that the inferior occipital gyri (likely to include area V2) and the posterior parts of the fusiform and inferior temporal gyri are involved in the integration of visual elements into perceptual wholes (single objects). Of these areas, the fusiform a... that perceptual and memorial processes can be dissociated on both functional and anatomical grounds. No evidence was obtained for the involvement of the parietal lobes in the integration of single objects.

  1. Computer-integrated electric-arc melting process control system

    OpenAIRE

    Дёмин, Дмитрий Александрович

    2014-01-01

    Developing common principles for completing melting process automation systems with hardware, and creating on their basis rational variants of computer-integrated electric-arc melting control systems, is a relevant task, since it allows a comprehensive approach to the issue of modernizing the melting sections of workshops. This approach allows the computer-integrated electric-arc furnace control system to be formed as part of a queuing system “electric-arc furnace - foundry conveyor” and considered, when taking ...

  2. Integrating discount usability in scrum development process in Ethiopia

    DEFF Research Database (Denmark)

    Teka, Degif; Dittrich, Y.; Kifle, Mesfin

    2017-01-01

    be adapted and integrated into the Scrum-agile development, with special emphasis on the Ethiopian context. The research aims at adapting software engineering and ICT development methods to the specific situation and integrating user-centered design (UCD) and lightweight usability methods into agile... end users and developers. Culturally adapted user pair testing and heuristic evaluation supported usability testing and helped developers get early feedback. An integrated approach of discount usability with the Scrum process has been developed and evaluated first with the involved...

  3. A CMOS low power, process/temperature variation tolerant RSSI with an integrated AGC loop

    International Nuclear Information System (INIS)

    Lei Qianqian; Lin Min; Shi Yin

    2013-01-01

    A low-voltage, low-power CMOS limiter and received signal strength indicator (RSSI) with an integrated automatic gain control (AGC) loop for a short-distance receiver are implemented in SMIC 0.13 μm CMOS technology. The RSSI has a dynamic range of more than 60 dB and the RSSI linearity error is within ±0.5 dB for an input power from −65 to −8 dBm. The RSSI output voltage is from 0.15 to 1 V and the slope of the curve is 14.17 mV/dB, while consuming 1.5 mA (I and Q paths) from a 1.2 V supply. Auto LNA gain mode selection with a combined RSSI function is also presented. Furthermore, with the compensation circuit, the proposed RSSI shows good temperature independence and good robustness against process variations. (semiconductor integrated circuits)
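
    As an illustration of the reported transfer characteristic, the sketch below maps input power to indicator voltage using the published slope of 14.17 mV/dB; the anchoring point (0.15 V at -65 dBm) is an assumption made only for this example.

    ```python
    # Illustrative sketch: linear RSSI transfer curve using the slope reported in
    # the abstract (14.17 mV/dB). The anchoring point (0.15 V at -65 dBm) is an
    # assumption made for the example, not a value stated as such in the paper.

    SLOPE_V_PER_DB = 0.01417
    V_MIN, P_MIN_DBM = 0.15, -65.0

    def rssi_output_voltage(p_in_dbm):
        """Return the indicator output voltage for an input power in dBm."""
        return V_MIN + SLOPE_V_PER_DB * (p_in_dbm - P_MIN_DBM)

    for p in (-65, -40, -8):
        print(p, round(rssi_output_voltage(p), 3))   # roughly 0.15 V ... 1 V
    ```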

  4. THE INTEGRATION PROCESS MERCOSUR IN 2007 BY MODEL OF GLOBAL DIMENSION OF REGIONAL INTEGRATION

    Directory of Open Access Journals (Sweden)

    André Bechlin

    2013-04-01

    Full Text Available This paper aimed to analyze the advance of the regional integration process in MERCOSUR (Southern Common Market), using a model developed by Professor Mario Ruiz Estrada, of the College of Economy and Administration of the University of Kuala Lumpur in Malaysia, the GDRI (Global Dimension of Regional Integration) Model, whose distinguishing characteristic is the use of variables for analysis that are not specifically of economic origin but are derived from the evolution of trade processes. When the external performance of the economies that compose Mercosur is inferred and compared, and the impacts of the advance of the regional and commercial integration process are evaluated, the inequalities that exist within the bloc become evident. However, a common evolution is observed towards intensification of integration between the economies, mainly after the opening process experienced by the continent and the advance of integration within Mercosur from the 1990s. The analyzed data show that, in general, these economies are integrating into the world market and, in parallel, deepening the degree of integration among the members of the bloc.

  5. Integrating digital topology in image-processing libraries.

    Science.gov (United States)

    Lamy, Julien

    2007-01-01

    This paper describes a method to integrate digital topology information in image-processing libraries. This additional information allows a library user to write algorithms respecting topological constraints, for example, a seed fill or a skeletonization algorithm. As digital topology is absent from most image-processing libraries, such constraints cannot currently be fulfilled. We describe and give code samples for all the structures necessary for this integration, and show a use case in the form of a homotopic thinning filter inside ITK. The obtained filter can be up to a hundred times as fast as ITK's thinning filter and works for any image dimension. This paper mainly deals with integration within ITK, but can be adapted with only minor modifications to other image-processing libraries.
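
    The kind of topological constraint the paper argues for can be illustrated with the small sketch below: a seed fill whose result depends on the chosen 2D connectivity. This is plain illustrative Python under assumed conventions, not code from the paper or from ITK.

    ```python
    # Illustrative sketch: a seed fill that respects a chosen digital-topology
    # connectivity (4- or 8-connectivity in 2D). Foreground pixels are non-zero.

    def seed_fill(image, seed, connectivity=4):
        """Return the set of foreground pixels connected to `seed`."""
        offsets = [(-1, 0), (1, 0), (0, -1), (0, 1)]
        if connectivity == 8:
            offsets += [(-1, -1), (-1, 1), (1, -1), (1, 1)]
        rows, cols = len(image), len(image[0])
        stack, filled = [seed], set()
        while stack:
            r, c = stack.pop()
            if (r, c) in filled or not (0 <= r < rows and 0 <= c < cols):
                continue
            if image[r][c] == 0:
                continue
            filled.add((r, c))
            stack.extend((r + dr, c + dc) for dr, dc in offsets)
        return filled

    img = [[0, 1, 0],
           [1, 1, 0],
           [0, 0, 1]]
    print(len(seed_fill(img, (0, 1), connectivity=4)))  # 3 pixels
    print(len(seed_fill(img, (0, 1), connectivity=8)))  # 4 pixels (diagonal joins)
    ```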

  6. Simplification of Process Integration Studies in Intermediate Size Industries

    DEFF Research Database (Denmark)

    Dalsgård, Henrik; Petersen, P. M.; Qvale, Einar Bjørn

    2002-01-01

    It can be argued that the largest potential for energy savings based on process integration is in the intermediate size industry. But this is also the industrial scale in which it is most difficult to make the introduction of energy saving measures economically interesting. The reasons... Four steps that may be used... associated with a given process integration study in an intermediate size industry. This is based on the observation that the systems that eventually result from a process integration project, and that are economically and operationally most interesting, are also quite simple. ...and therefore lead to non-optimal economic solutions, which may be right. But the objective of the optimisation is not to reach the best economic solution, but to relatively quickly develop the design of a simple and operationally friendly network without losing too much energy saving potential. (C) 2002...

  7. Integration of e-learning outcomes into work processes

    Directory of Open Access Journals (Sweden)

    Kerstin Grundén

    2011-07-01

    Full Text Available Three case studies of in-house developed e-learning education in public organizations with different pedagogical approaches are used as a starting point for discussion regarding the implementation challenges of e-learning at work. The aim of this article is to contribute to the understanding of the mechanisms for integrating e-learning outcomes into work processes in large, public organizations. The case studies were analyzed from a socio-cultural perspective using the MOA-model as a frame of reference. Although the pedagogical approaches for all of the cases seemed to be relevant and most of the learners showed overall positive attitudes towards the courses, there were problems with the integration of the e-learning outcomes into work processes. There were deficiencies in the adaptation of the course contents to the local educational needs. There was also a lack of adjustment of the local work organization and work routines to facilitate the integration of the e-learning outcomes into the work processes. A lack of local management engagement affected the learners' motivation negatively. Group discussions in local work groups facilitated the integration of the e-learning outcomes. Many of the difficulties of integrating e-learning outcomes into work processes in big organizations are related to the problems of adjusting centrally developed e-learning courses to local needs and a lack of co-operation between the developers (often IT professionals) and the Human Resources Department of the organizations.

  8. Process modeling for the Integrated Nonthermal Treatment System (INTS) study

    Energy Technology Data Exchange (ETDEWEB)

    Brown, B.W.

    1997-04-01

    This report describes the process modeling done in support of the Integrated Nonthermal Treatment System (INTS) study. This study was performed to supplement the Integrated Thermal Treatment System (ITTS) study and comprises five conceptual treatment systems that treat DOE contact-handled mixed low-level wastes (MLLW) at temperatures of less than 350°F. ASPEN PLUS, a chemical process simulator, was used to model the systems. Nonthermal treatment systems were developed as part of the INTS study and include sufficient processing steps to treat the entire inventory of MLLW. The final result of the modeling is a process flowsheet with a detailed mass and energy balance. In contrast to the ITTS study, which modeled only the main treatment system, the INTS study modeled each of the various processing steps with ASPEN PLUS, release 9.1-1. Trace constituents, such as radionuclides and minor pollutant species, were not included in the calculations.

  9. Collaboration process for integrated social and health care strategy implementation.

    Science.gov (United States)

    Korpela, Jukka; Elfvengren, Kalle; Kaarna, Tanja; Tepponen, Merja; Tuominen, Markku

    2012-01-01

    To present a collaboration process for creating a roadmap for the implementation of a strategy for integrated health and social care. The developed collaboration process includes multiple phases and uses electronic group decision support system technology (GDSS). A case study done in the South Karelia District of Social and Health Services in Finland during 2010-2011. An expert panel of 13 participants was used in the planning process of the strategy implementation. The participants were interviewed and observed during the case study. As a practical result, a roadmap for integrated health and social care strategy implementation has been developed. The strategic roadmap includes detailed plans of several projects which are needed for successful integration strategy implementation. As an academic result, a collaboration process to create such a roadmap has been developed. The collaboration process and technology seem to suit the planning process well. The participants of the meetings were satisfied with the collaboration process and the GDSS technology. The strategic roadmap was accepted by the participants, which indicates satisfaction with the developed process.

  10. Collaboration process for integrated social and health care strategy implementation

    Directory of Open Access Journals (Sweden)

    Jukka Korpela

    2012-05-01

    Full Text Available Objective: To present a collaboration process for creating a roadmap for the implementation of a strategy for integrated health and social care. The developed collaboration process includes multiple phases and uses electronic group decision support system technology (GDSS). Method: A case study done in the South Karelia District of Social and Health Services in Finland during 2010-2011. An expert panel of 13 participants was used in the planning process of the strategy implementation. The participants were interviewed and observed during the case study. Results: As a practical result, a roadmap for integrated health and social care strategy implementation has been developed. The strategic roadmap includes detailed plans of several projects which are needed for successful integration strategy implementation. As an academic result, a collaboration process to create such a roadmap has been developed. Conclusions: The collaboration process and technology seem to suit the planning process well. The participants of the meetings were satisfied with the collaboration process and the GDSS technology. The strategic roadmap was accepted by the participants, which indicates satisfaction with the developed process.

  11. Design Process for Integrated Concepts with Responsive Building Elements

    DEFF Research Database (Denmark)

    Aa, Van der A.; Heiselberg, Per

    2008-01-01

    An integrated building concept is a prerequisite for achieving an energy-efficient building with good and healthy indoor air quality and comfort. A design process is needed that defines the targets and boundary conditions in the very first stage of the design and guarantees them until the building is finished and used. The hard question, however, is how to make the right choice of the combination of individual measures from building components and building services elements. Within the framework of IEA-ECBCS Annex 44, research has been conducted on the design process for integrated building concepts...

  12. Advanced multiresponse process optimisation an intelligent and integrated approach

    CERN Document Server

    Šibalija, Tatjana V

    2016-01-01

    This book presents an intelligent, integrated, problem-independent method for multiresponse process optimization. In contrast to traditional approaches, the idea of this method is to provide a unique model for the optimization of various processes, without imposition of assumptions relating to the type of process, the type and number of process parameters and responses, or interdependences among them. The presented method for experimental design of processes with multiple correlated responses is composed of three modules: an expert system that selects the experimental plan based on the orthogonal arrays; the factor effects approach, which performs processing of experimental data based on Taguchi’s quality loss function and multivariate statistical methods; and process modeling and optimization based on artificial neural networks and metaheuristic optimization algorithms. The implementation is demonstrated using four case studies relating to high-tech industries and advanced, non-conventional processes.
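
    For reference, the Taguchi quality loss function used in the second module described above is usually written in the nominal-the-best form below, where y is the measured response, m the target value and k a cost coefficient; this is the generic textbook form, not an equation reproduced from the book.

    ```latex
    % Nominal-the-best quality loss for a single response y with target m;
    % k is a cost coefficient estimated from the tolerance and the cost of exceeding it.
    L(y) = k\,(y - m)^2
    ```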

  13. Pilot-scale investigation of the robustness and efficiency of a copper-based treated wood wastes recycling process

    Energy Technology Data Exchange (ETDEWEB)

    Coudert, Lucie [INRS-ETE (Canada)]; Blais, Jean-François, E-mail: blaisjf@ete.inrs.ca [INRS-ETE (Canada)]; Mercier, Guy [INRS-ETE (Canada)]; Cooper, Paul [University of Toronto (Canada)]; Gastonguay, Louis [IREQ (Canada)]; Morris, Paul [FPInnovations (Canada)]; Janin, Amélie; Reynier, Nicolas [INRS-ETE (Canada)]

    2013-10-15

    Highlights: • A leaching process was studied for metals removal from CCA-treated wood wastes. • This decontamination process was studied at pilot scale (130-L reactor). • Removals up to 98% of As, 88% of Cr, and 96% of Cu were obtained from wood wastes. • The produced leachates can be treated by chemical precipitation. -- Abstract: The disposal of metal-bearing treated wood wastes is becoming an environmental challenge. An efficient recycling process based on sulfuric acid leaching has been developed to remove metals from copper-based treated wood chips (0 < x < 12 mm). The present study explored the performance and the robustness of this technology in removing metals from copper-based treated wood wastes at a pilot plant scale (130-L reactor tank). After 3× 2 h leaching steps followed by 3× 7 min rinsing steps, up to 97.5% of As, 87.9% of Cr, and 96.1% of Cu were removed from CCA-treated wood wastes with different initial metal loadings (>7.3 kg m⁻³), and more than 94.5% of Cu was removed from ACQ-, CA- and MCQ-treated wood. The treatment of effluents by precipitation-coagulation was highly efficient, allowing removal of more than 93% of the As, Cr, and Cu contained in the effluent. The economic analysis included operating costs, indirect costs and revenues related to remediated wood sales. It concluded that CCA-treated wood waste remediation can lead to a benefit of 53.7 US$ t⁻¹ or a cost of 35.5 US$ t⁻¹, and that ACQ-, CA- and MCQ-treated wood waste recycling led to benefits ranging from 9.3 to 21.2 US$ t⁻¹.

  14. Elements for successful sensor-based process control {Integrated Metrology}

    International Nuclear Information System (INIS)

    Butler, Stephanie Watts

    1998-01-01

    Current productivity needs have stimulated development of alternative metrology, control, and equipment maintenance methods. Specifically, sensor applications provide the opportunity to increase productivity, tighten control, reduce scrap, and improve maintenance schedules and procedures. Past experience indicates a complete integrated solution must be provided for sensor-based control to be used successfully in production. In this paper, Integrated Metrology is proposed as the term for an integrated solution that will result in a successful application of sensors for process control. This paper defines and explores the perceived four elements of successful sensor applications: business needs, integration, components, and form. Based upon analysis of existing successful commercially available controllers, the necessary business factors have been determined to be strong, measurable industry-wide business needs whose solution is profitable and feasible. This paper examines why the key aspect of integration is the decision making process. A detailed discussion is provided of the components of most importance to sensor based control: decision-making methods, the 3R's of sensors, and connectivity. A metric for one of the R's (resolution) is proposed to allow focus on this important aspect of measurement. A form for these integrated components which synergistically partitions various aspects of control at the equipment and MES levels to efficiently achieve desired benefits is recommended

  15. Elements for successful sensor-based process control {Integrated Metrology}

    Science.gov (United States)

    Butler, Stephanie Watts

    1998-11-01

    Current productivity needs have stimulated development of alternative metrology, control, and equipment maintenance methods. Specifically, sensor applications provide the opportunity to increase productivity, tighten control, reduce scrap, and improve maintenance schedules and procedures. Past experience indicates a complete integrated solution must be provided for sensor-based control to be used successfully in production. In this paper, Integrated Metrology is proposed as the term for an integrated solution that will result in a successful application of sensors for process control. This paper defines and explores the perceived four elements of successful sensor applications: business needs, integration, components, and form. Based upon analysis of existing successful commercially available controllers, the necessary business factors have been determined to be strong, measurable industry-wide business needs whose solution is profitable and feasible. This paper examines why the key aspect of integration is the decision making process. A detailed discussion is provided of the components of most importance to sensor based control: decision-making methods, the 3R's of sensors, and connectivity. A metric for one of the R's (resolution) is proposed to allow focus on this important aspect of measurement. A form for these integrated components which synergistically partitions various aspects of control at the equipment and MES levels to efficiently achieve desired benefits is recommended.

  16. A unified approach to the design of advanced proportional-integral-derivative controllers for time-delay processes

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Moonyong [Yeungnam University, Gyeongsan (Korea, Republic of)]; Vu, Truong Nguyen Luan [University of Technical Education of Ho Chi Minh City, Ho Chi Minh (Viet Nam)]

    2013-03-15

    A unified approach for the design of proportional-integral-derivative (PID) controllers cascaded with first-order lead-lag filters is proposed for various time-delay processes. The proposed controller’s tuning rules are directly derived using the Padé approximation on the basis of internal model control (IMC) for enhanced stability against disturbances. A two-degrees-of-freedom (2DOF) control scheme is employed to cope with both regulatory and servo problems. Simulation is conducted for a broad range of stable, integrating, and unstable processes with time delays. Each simulated controller is tuned to have the same degree of robustness in terms of maximum sensitivity (Ms). The results demonstrate that the proposed controller provides superior disturbance rejection and set-point tracking when compared with recently published PID-type controllers. Controllers’ robustness is investigated through the simultaneous introduction of perturbation uncertainties to all process parameters to obtain worst-case process-model mismatch. The process-model mismatch simulation results demonstrate that the proposed method consistently affords superior robustness.
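
    The robustness measure used above to equalize the compared controllers, the maximum sensitivity Ms, can be evaluated numerically as sketched below for a first-order-plus-dead-time process under an ideal PID controller; the process and tuning values are placeholders, not parameters taken from the paper.

    ```python
    # Illustrative sketch: maximum sensitivity Ms = max |1/(1 + G(jw) C(jw))|
    # evaluated on a frequency grid for a FOPDT process and an ideal PID controller.
    import numpy as np

    def max_sensitivity(K, tau, theta, Kc, tau_i, tau_d,
                        w=np.logspace(-3, 3, 4000)):
        s = 1j * w
        G = K * np.exp(-theta * s) / (tau * s + 1)     # FOPDT process model
        C = Kc * (1 + 1 / (tau_i * s) + tau_d * s)     # ideal PID controller
        S = 1 / (1 + G * C)                            # sensitivity function
        return np.max(np.abs(S))

    # Placeholder process (K=1, tau=10, theta=2) and tuning, for illustration only.
    print(max_sensitivity(K=1.0, tau=10.0, theta=2.0, Kc=3.0, tau_i=10.0, tau_d=1.0))
    ```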

  17. Improved Anomaly Detection using Integrated Supervised and Unsupervised Processing

    Science.gov (United States)

    Hunt, B.; Sheppard, D. G.; Wetterer, C. J.

    There are two broad technologies of signal processing applicable to space object feature identification using non-resolved imagery: supervised processing analyzes a large set of data for common characteristics that can then be used to identify, transform, and extract information from new data taken of the same given class (e.g. support vector machine); unsupervised processing utilizes detailed physics-based models that generate comparison data that can then be used to estimate parameters presumed to be governed by the same models (e.g. estimation filters). Both processes have been used in non-resolved space object identification and yield similar results, yet arrived at using vastly different processes. The goal of integrating the results of the two is to achieve an even greater performance by building on the process diversity. Specifically, both supervised processing and unsupervised processing will jointly operate on the analysis of brightness (radiometric flux intensity) measurements reflected by space objects and observed by a ground station to determine whether a particular day conforms to a nominal operating mode (as determined from a training set) or exhibits anomalous behavior where a particular parameter (e.g. attitude, solar panel articulation angle) has changed in some way. It is demonstrated in a variety of different scenarios that the integrated process achieves a greater performance than each of the separate processes alone.
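
    One simple way to picture the fusion of the two processing chains is the weighted score below, which combines a supervised decision value with a normalized model-based residual; the weighting scheme, names and numbers are illustrative assumptions, not the authors' algorithm.

    ```python
    # Illustrative sketch: fusing a supervised classifier score (e.g. an SVM
    # decision value trained on nominal brightness curves) with an unsupervised
    # residual (e.g. from an estimation filter driven by a physics-based model)
    # into a single anomaly score. Weights and thresholds are placeholders.

    def fused_anomaly_score(svm_decision_value, filter_residual,
                            residual_sigma=1.0, w_supervised=0.5):
        """Larger values indicate stronger evidence of anomalous behaviour."""
        supervised_evidence = -svm_decision_value            # negative margin = anomalous
        unsupervised_evidence = abs(filter_residual) / residual_sigma
        return (w_supervised * supervised_evidence +
                (1 - w_supervised) * unsupervised_evidence)

    print(fused_anomaly_score(svm_decision_value=-0.3, filter_residual=2.4))
    ```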

  18. The Role of CAD in Enterprise Integration Process

    Directory of Open Access Journals (Sweden)

    M. Ota

    2004-01-01

    Full Text Available This article deals with the problem of the mutual influence between software systems used in the enterprise environment and enterprise integration processes. The position of CAD data and CAx systems in the integrated environment of manufacturing enterprises is clarified. As a consequence, the key role of CAx systems used in those companies is emphasized. It is noted that the integration of CAD data is nowadays only on a secondary level, via primarily integrated PDM systems. This limitation is a reason why we are developing a unified communication model focused on product-oriented data. Our approach is based on Internet technologies, so we believe that it is sufficiently independent. The proposed system of communication is based on a simple request-reply dialogue. The structure of this model is open and extensible, but we assume supervision supported by an Internet portal.

  19. Performance and Ageing Robustness of Graphite/NMC Pouch Prototypes Manufactured through Eco-Friendly Materials and Processes.

    Science.gov (United States)

    Loeffler, Nicholas; Kim, Guk-T; Passerini, Stefano; Gutierrez, Cesar; Cendoya, Iosu; De Meatza, Iratxe; Alessandrini, Fabrizio; Appetecchi, Giovanni B

    2017-09-22

    Graphite/lithium nickel-manganese-cobalt oxide (NMC) stacked pouch cells with a nominal capacity of 15-18 Ah were designed, developed, and manufactured for automotive applications within the framework of the European Project GREENLION. A natural, water-soluble material was used as the main electrode binder, thus allowing the employment of H2O as the only processing solvent. The electrode formulations were developed, optimized, and upscaled for cell manufacturing. Prolonged cycling and ageing tests revealed excellent capacity retention and robustness toward degradation phenomena. For instance, above 99% of the initial capacity is retained upon 500 full charge/discharge cycles, corresponding to a fading of 0.004% per cycle, and about 80% of the initial capacity is delivered after 8 months of ageing at 45 °C. The stacked soft-packaged cells have shown very reproducible characteristics and performance, reflecting the quality of the design and manufacturing. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Business Process Management Integration Solution in Financial Sector

    Directory of Open Access Journals (Sweden)

    2009-01-01

    Full Text Available It is vital for financial services companies to ensure the rapid implementation of new processes to meet speed-to-market, service quality and compliance requirements. This has to be done against a background of increased complexity. An integrated approach to business processes allows products, processes, systems, data and the applications that underpin them to evolve quickly. Whether it's providing a loan, setting up an insurance policy, or executing an investment instruction, optimizing the sale-to-fulfillment process will always win new business, cement customer loyalty, and reduce costs. Lack of integration across lending, payments and trading, on the other hand, simply presents more efficient competitors with a huge profit opportunity.

  1. A manufacturable process integration approach for graphene devices

    Science.gov (United States)

    Vaziri, Sam; Lupina, Grzegorz; Paussa, Alan; Smith, Anderson D.; Henkel, Christoph; Lippert, Gunther; Dabrowski, Jarek; Mehr, Wolfgang; Östling, Mikael; Lemme, Max C.

    2013-06-01

    In this work, we propose an integration approach for double gate graphene field effect transistors. The approach includes a number of process steps that are key for future integration of graphene in microelectronics: bottom gates with ultra-thin (2 nm) high-quality thermally grown SiO2 dielectrics, shallow trench isolation between devices and atomic layer deposited Al2O3 top gate dielectrics. The complete process flow is demonstrated with fully functional GFET transistors and can be extended to wafer scale processing. We assess, through simulation, the effects of the quantum capacitance and band bending in the silicon substrate on the effective electric fields in the top and bottom gate oxide. The proposed process technology is suitable for other graphene-based devices such as graphene-based hot electron transistors and photodetectors.
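
    As background to the quantum-capacitance effect mentioned above, the effective gate capacitance of a graphene channel is commonly treated as the series combination below (all quantities per unit area); this is the standard relation offered only as context, not the specific model used in the paper's simulations.

    ```latex
    % Series combination of oxide capacitance and graphene quantum capacitance
    % (per unit area); the charge induced for a gate overdrive \Delta V sets the
    % field in the oxide.
    \frac{1}{C_{\mathrm{eff}}} = \frac{1}{C_{\mathrm{ox}}} + \frac{1}{C_{q}},
    \qquad E_{\mathrm{ox}} = \frac{C_{\mathrm{eff}}\,\Delta V}{\varepsilon_{\mathrm{ox}}}
    ```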

  2. Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes

    Science.gov (United States)

    Pai, Shantaram S.; Nagpal, Vinod K.

    2007-01-01

    An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.

  3. Space Medicine in the Human System Integration Process

    Science.gov (United States)

    Scheuring, Richard A.

    2010-01-01

    This slide presentation reviews the importance of integration of space medicine in the human system of lunar exploration. There is a review of historical precedence in reference to lunar surface operations. The integration process is reviewed in a chart which shows the steps from research to requirements development, requirements integration, design, verification, operations and using the lessons learned, giving more information and items for research. These steps are reviewed in view of specific space medical issues. Some of the testing of the operations are undertaken in an environment that is an analog to the exploration environment. Some of these analog environments are reviewed, and there is some discussion of the benefits of use of an analog environment in testing the processes that are derived.

  4. A Multiobjective Robust Scheduling Optimization Mode for Multienergy Hybrid System Integrated by Wind Power, Solar Photovoltaic Power, and Pumped Storage Power

    Directory of Open Access Journals (Sweden)

    Lihui Zhang

    2017-01-01

    Full Text Available Wind power plant (WPP), photovoltaic generators (PV), cell-gas turbine (CGT), and pumped storage power station (PHSP) are integrated into a multienergy hybrid system (MEHS). Firstly, this paper presents the MEHS structure and constructs a scheduling model with the objective functions of maximum economic benefit and minimum power output fluctuation. Secondly, in order to relieve the influence of WPP and PV uncertainty on the system, robust stochastic theory is introduced to describe the uncertainty and to propose a multiobjective stochastic scheduling optimization mode by transforming constraint conditions with uncertain variables. Finally, a 9.6 MW WPP, a 6.5 MW PV, three CGT units, and an upper reservoir with 10 MW·h equivalent capacity are chosen as the simulation system. The results show that the MEHS can achieve the best operation result by exploiting the multienergy hybrid generation characteristic. PHSP can shave peaks and fill valleys of the load curve by optimizing its pumping and generating behavior based on the load supply-demand status and the available power of WPP and PV. Robust coefficients can relieve the uncertainty of WPP and PV and provide flexible scheduling decision tools for decision-makers with different risk attitudes, which could maximize economic benefits and minimize operation risks at the same time.
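
    A minimal sketch of the robust-coefficient idea (hypothetical numbers and a drastically simplified dispatch rule, not the authors' multiobjective model): the scheduler counts on only a gamma-discounted share of the wind/PV forecast, and the CGT units must cover the remainder of the load.

      # Minimal hourly dispatch sketch with a robust coefficient gamma in [0, 1]:
      # gamma = 0 trusts the renewable forecast fully, gamma = 1 discards the
      # uncertain part entirely. All figures are purely illustrative.

      load      = [8.0, 9.5, 11.0, 10.0]   # MW, hourly demand
      wind_fc   = [4.0, 3.0,  5.0,  2.0]   # MW, wind forecast
      pv_fc     = [0.0, 2.0,  3.0,  1.0]   # MW, PV forecast
      uncertain = 0.3                       # forecast error band (30% of forecast)

      def dispatch(gamma):
          """Return the conventional (CGT) generation needed each hour when only
          the robust share of renewable output is counted on."""
          plan = []
          for d, w, s in zip(load, wind_fc, pv_fc):
              renewable = (w + s) * (1.0 - gamma * uncertain)
              plan.append(max(0.0, d - renewable))
          return plan

      for gamma in (0.0, 0.5, 1.0):
          plan = dispatch(gamma)
          print("gamma=%.1f  CGT schedule (MW): %s  total=%.1f"
                % (gamma, ", ".join("%.1f" % p for p in plan), sum(plan)))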

  5. Integrating Sustainability into the Real Estate Valuation Process: A ...

    African Journals Online (AJOL)

    This paper sought the perception of Nigerian real estate valuers on sustainable development and how sustainability can be integrated into the real estate valuation process in Nigeria. One hundred and sixty Estate Surveyors and Valuers were asked, among others, to rate the significance of a range of sustainability features ...

  6. ESG Integration and the Investment Management Process : Fundamental Investing Reinvented

    NARCIS (Netherlands)

    van Duuren, Emiel; Plantinga, Auke; Scholtens, Bert

    2016-01-01

    We investigate how conventional asset managers account for environmental, social and governance factors (ESG) in their investment process. We do so on the basis of an international survey among fund managers. We find that many conventional managers integrate responsible investing in their investment

  7. Surface light scattering: integrated technology and signal processing

    DEFF Research Database (Denmark)

    Lading, L.; Dam-Hansen, C.; Rasmussen, E.

    1997-01-01

    systems representing increasing levels of integration are considered. It is demonstrated that efficient signal and data processing can be achieved by evaluation of the statistics of the derivative of the instantaneous phase of the detector signal. (C) 1997 Optical Society of America....
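
    A minimal sketch of the phase-derivative statistics mentioned here, assuming a sampled detector signal and using the analytic signal from a Hilbert transform; the synthetic signal and sampling rate are illustrative assumptions.

      import numpy as np
      from scipy.signal import hilbert

      fs = 10_000.0                          # sampling rate, Hz (assumed)
      t = np.arange(0, 0.1, 1.0 / fs)
      f_true = 800.0                         # synthetic "Doppler" frequency of the detector signal
      signal = np.cos(2 * np.pi * f_true * t) + 0.2 * np.random.randn(t.size)

      analytic = hilbert(signal)                       # analytic signal
      phase = np.unwrap(np.angle(analytic))            # instantaneous phase
      inst_freq = np.diff(phase) * fs / (2 * np.pi)    # derivative -> instantaneous frequency

      # Statistics of the phase derivative estimate the mean frequency and its spread.
      print("mean instantaneous frequency: %.1f Hz" % inst_freq.mean())
      print("std  of phase derivative    : %.1f Hz" % inst_freq.std())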

  8. A toroidal inductor integrated in a standard CMOS process

    DEFF Research Database (Denmark)

    Vandi, Luca; Andreani, Pietro; Temporiti, Enrico

    2007-01-01

    This paper presents a toroidal inductor integrated in a standard 0.13 um CMOS process. Finite-elements preliminary simulations are provided to prove the validity of the concept. In order to extract fundamental parameters by means of direct calculations, two different and well-known approaches...

  9. Prediction of thermo-mechanical integrity of wafer backend processes

    NARCIS (Netherlands)

    Gonda, V.; Toonder, den J.M.J.; Beijer, J.G.J.; Zhang, G.Q.; Hoofman, R.J.O.M.; Ernst, L.J.; Ernst, L.J.

    2003-01-01

    More than 65% of IC failures are related to thermal and mechanical problems. For wafer backend processes, thermo-mechanical failure is one of the major bottlenecks. The ongoing technological trends like miniaturization, introduction of new materials, and function/product integration will increase

  10. Aging Effect on Audiovisual Integrative Processing in Spatial Discrimination Task

    Directory of Open Access Journals (Sweden)

    Zhi Zou

    2017-11-01

    Full Text Available Multisensory integration is an essential process that people employ daily, from conversing in social gatherings to navigating the nearby environment. The aim of this study was to investigate the impact of aging on the modulation of multisensory integrative processes using event-related potentials (ERPs), and the validity of the study was improved by including “noise” in the contrast conditions. Older and younger participants were involved in perceiving visual and/or auditory stimuli that contained spatial information. The participants responded by indicating the spatial direction (far vs. near and left vs. right) conveyed in the stimuli using different wrist movements. Electroencephalograms (EEGs) were captured in each task trial, along with the accuracy and reaction time of the participants’ motor responses. Older participants showed a greater extent of behavioral improvement in the multisensory (as opposed to unisensory) condition compared to their younger counterparts. Older participants were found to have a fronto-centrally distributed super-additive P2, which was not the case for the younger participants. The P2 amplitude difference between the multisensory condition and the sum of the unisensory conditions was found to correlate significantly with performance on spatial discrimination. The results indicated that the age-related effect modulated the integrative process in the perceptual and feedback stages, particularly the evaluation of auditory stimuli. Audiovisual (AV) integration may also serve a functional role during spatial-discrimination processes to compensate for the compromised attention function caused by aging.

  11. Testing the Box-Cox Parameter for an Integrated Process

    NARCIS (Netherlands)

    J. Huang (Jian); M. Kobayashi (Masahito); M.J. McAleer (Michael)

    2011-01-01

    This paper analyses the constant elasticity of volatility (CEV) model suggested by Chan et al. (1992). The CEV model without mean reversion is shown to be the inverse Box-Cox transformation of integrated processes asymptotically. It is demonstrated that the maximum likelihood estimator
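
    For reference, a small sketch of the Box-Cox transformation and its inverse referred to in the abstract (standard textbook definitions, not the authors' estimator):

      import numpy as np

      def boxcox(y, lam):
          """Box-Cox transform: (y**lam - 1)/lam for lam != 0, log(y) for lam == 0."""
          y = np.asarray(y, dtype=float)
          return np.log(y) if lam == 0 else (y**lam - 1.0) / lam

      def inv_boxcox(x, lam):
          """Inverse Box-Cox transform."""
          x = np.asarray(x, dtype=float)
          return np.exp(x) if lam == 0 else (lam * x + 1.0) ** (1.0 / lam)

      y = np.array([0.5, 1.0, 2.0, 4.0])
      for lam in (0.0, 0.5, 1.0):
          z = boxcox(y, lam)
          assert np.allclose(inv_boxcox(z, lam), y)   # round-trip check
          print("lambda=%.1f ->" % lam, np.round(z, 3))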

  12. Integrating the Medical Home into the EHDI Process

    Science.gov (United States)

    Munoz, Karen F.; Nelson, Lauri; Bradham, Tamala S.; Hoffman, Jeff; Houston, K. Todd

    2011-01-01

    State coordinators of early hearing detection and intervention (EHDI) programs completed a strengths, weaknesses, opportunities, and threats, or SWOT, analysis that examined 12 areas within state EHDI programs. Related to how the medical home is integrated into the EHDI process, 273 items were listed by 48 coordinators, and themes were identified…

  13. Enhanced IMC design of load disturbance rejection for integrating and unstable processes with slow dynamics.

    Science.gov (United States)

    Liu, Tao; Gao, Furong

    2011-04-01

    In view of the deficiencies in existing internal model control (IMC)-based methods for load disturbance rejection for integrating and unstable processes with slow dynamics, a modified IMC-based controller design is proposed to deal with step- or ramp-type load disturbance that is often encountered in engineering practices. By classifying the ways through which such load disturbance enters into the process, analytical controller formulae are correspondingly developed, based on a two-degree-of-freedom (2DOF) control structure that allows for separate optimization of load disturbance rejection from setpoint tracking. An obvious merit is that there is only a single adjustable parameter in the proposed controller, which in essence corresponds to the time constant of the closed-loop transfer function for load disturbance rejection, and can be monotonically tuned to meet a good trade-off between disturbance rejection performance and closed-loop robust stability. At the same time, robust tuning constraints are given to accommodate process uncertainties in practice. Illustrative examples from the recent literature are used to show effectiveness and merits of the proposed method for different cases of load disturbance. Copyright © 2010. Published by Elsevier Ltd.
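
    To give a flavour of the single-tuning-parameter idea (this is a generic Chien-Fruehauf-type IMC-PI rule for an integrating process with deadtime, G(s) = K·exp(-theta·s)/s, not the authors' 2DOF formulae; the process numbers are assumed):

      def imc_pi_integrating(K, theta, lam):
          """IMC-based PI settings for an integrating process G(s) = K*exp(-theta*s)/s;
          lam is the single closed-loop tuning parameter traded off against robustness."""
          Kc = (2.0 * lam + theta) / (K * (lam + theta) ** 2)
          tau_I = 2.0 * lam + theta
          return Kc, tau_I

      # Illustrative process: K = 0.2 (1/min per % output), theta = 5 min deadtime.
      for lam in (2.0, 5.0, 10.0):          # smaller lam = faster but less robust
          Kc, tau_I = imc_pi_integrating(0.2, 5.0, lam)
          print("lam=%5.1f  Kc=%6.3f  tau_I=%6.1f min" % (lam, Kc, tau_I))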

  14. An integrated computer aided system for integrated design of chemical processes

    DEFF Research Database (Denmark)

    Gani, Rafiqul; Hytoft, Glen; Jaksland, Cecilia

    1997-01-01

    In this paper, an Integrated Computer Aided System (ICAS), which is particularly suitable for solving problems related to integrated design of chemical processes; is presented. ICAS features include a model generator (generation of problem specific models including model simplification and model ...... form the basis for the toolboxes. The available features of ICAS are highlighted through a case study involving the separation of binary azeotropic mixtures. (C) 1997 Elsevier Science Ltd....

  15. Process and overview of diagnostics integration in ITER ports

    International Nuclear Information System (INIS)

    Drevon, J.M.; Walsh, M.; Andrew, P.; Barnsley, R.; Bertalot, L.; Bock, M. de; Bora, D.; Bouhamou, R.; Direz, M.F.; Encheva, A.; Fang, T.; Feder, R.; Giacomin, T.; Hellermann, M. von; Jakhar, S.; Johnson, D.; Kaschuk, Y.; Kusama, Y.; Lee, H.G.; Levesy, B.

    2013-01-01

    Highlights: ► An overview of the Port Integration hardware for tenant system hosting inside ITER diagnostics ports is given. ► The main challenges for diagnostic port integration engineering are presented. ► The actions taken for a common modular approach and a coordinated design are detailed. -- Abstract: ITER will have a set of 45 diagnostics to ensure controlled operation. Many of them are integrated in the ITER ports. This paper addresses the integration process of the diagnostic systems and the approach taken to enable coordinated progress. An overview of the Port Integration hardware introduces the various structures needed for hosting tenant systems inside ITER diagnostics ports. The responsibilities of the different parties involved (ITER Organization and the Domestic Agencies) are outlined. The main challenges for diagnostic port integration engineering are summarized. The plan for a common approach to design and manufacture of the supporting structures, in particular the Port Plug is detailed. A coordinated design including common components and a common approach for neutronic analyses is proposed. One particular port, the equatorial port 11, is used to illustrate the approach

  16. Moral Judgment as Information Processing: An Integrative Review

    Directory of Open Access Journals (Sweden)

    Steve eGuglielmo

    2015-10-01

    Full Text Available This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two fundamental questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework, critically evaluates them on empirical and theoretical grounds, outlines a general integrative model grounded in information processing, and offers conceptual and methodological suggestions for future research. The information processing perspective provides a useful theoretical framework for organizing extant and future work in the rapidly growing field of moral judgment.

  17. Behavioural design: A process for integrating behaviour change and design

    DEFF Research Database (Denmark)

    Cash, Philip; Hartlev, Charlotte Gram; Durazo, Christine Boysen

    2017-01-01

    Nudge, persuasion, and the influencing of human behaviour through design are increasingly important topics in design research and in the wider public consciousness. However, current theoretical approaches to behaviour change have yet to be operationalized in design process support....... Specifically, there are few empirically grounded processes supporting designers in realising behaviour change projects. In response to this, 20 design projects from a case company are analysed in order to distil a core process for behavioural design. Results show a number of process stages and activities...... associated with project success, pointing to a new perspective on the traditional design process, and allowing designers to integrate key insights from behaviour change theory. Using this foundation we propose the Behavioural Design process....

  18. Integrated and Modular Design of an Optimized Process Architecture

    Directory of Open Access Journals (Sweden)

    Colin Raßfeld

    2013-07-01

    Full Text Available Global economic integration has increased the complexity of business activities, so organizations are forced to become more efficient each day. Process organization is a very useful way of aligning organizational systems towards business processes. However, an organization must do more than just focus its attention and efforts on processes. The layout design also has a significant impact on system performance. We contribute to this field by developing a tailored process-oriented organizational structure and a new layout design for the quality assurance of a leading German automotive manufacturer. The target concept we developed was evaluated by process owners and an IT-based process simulation. Our results provide solid empirical back-up in which the performance and effects are assessed from a qualitative and quantitative perspective.

  19. Ontological Analysis of Integrated Process Models: testing hypotheses

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    2001-11-01

    Full Text Available Integrated process modeling is achieving prominence in helping to document and manage business administration and IT processes in organizations. The ARIS framework is a popular example of a framework for integrated process modeling, not least because it underlies the 800 or more reference models embedded in the world's most popular ERP package, SAP R/3. This paper demonstrates the usefulness of the Bunge-Wand-Weber (BWW) representation model for evaluating modeling grammars such as those constituting ARIS. It reports some initial insights gained from pilot testing Green and Rosemann's (2000) evaluative propositions. Even when considering all five views of ARIS, modelers have problems representing business rules, the scope and boundary of systems, and decomposing models. However, even though it is completely ontologically redundant, users still find the function view useful in modeling.

  20. An Experimentation Platform for On-Chip Integration of Analog Neural Networks: A Pathway to Trusted and Robust Analog/RF ICs.

    Science.gov (United States)

    Maliuk, Dzmitry; Makris, Yiorgos

    2015-08-01

    We discuss the design of an experimentation platform intended for prototyping low-cost analog neural networks for on-chip integration with analog/RF circuits. The objective of such integration is to support various tasks, such as self-test, self-tuning, and trust/aging monitoring, which require classification of analog measurements obtained from on-chip sensors. Particular emphasis is given to cost-efficient implementation reflected in: 1) low energy and area budgets of circuits dedicated to neural networks; 2) robust learning in presence of analog inaccuracies; and 3) long-term retention of learned functionality. Our chip consists of a reconfigurable array of synapses and neurons operating below threshold and featuring sub-μW power consumption. The synapse circuits employ dual-mode weight storage: 1) a dynamic mode, for fast bidirectional weight updates during training and 2) a nonvolatile mode, for permanent storage of learned functionality. We discuss a robust learning strategy, and we evaluate the system performance on several benchmark problems, such as the XOR2-6 and two-spirals classification tasks.

  1. Integral management of the hazardous merchandise logistics process

    International Nuclear Information System (INIS)

    Moran, M.

    2003-01-01

    Because of the outsourcing of transportation operations and complementary services by producers and carriers, there has been a growing demand for global services that integrate the entire value chain of external logistics, which is understood to be the process including storage, transportation (modal or bi-multimodal) and end delivery. This circumstance has forced transportation companies to undertake a process of internal transformation: from providing mere transportation services to becoming logistic operators. Express Truck, S. A. (hereinafter ETSA) could not overlook this market need. Following is a description of ETSA's process of evolution in this line of business. (Author)

  2. Spatial memory and integration processes in congenital blindness.

    Science.gov (United States)

    Vecchi, Tomaso; Tinti, Carla; Cornoldi, Cesare

    2004-12-22

    The paper tests the hypothesis that difficulties met by the blind in spatial processing are due to the simultaneous treatment of independent spatial representations. Results showed that lack of vision does not impede the ability to process and transform mental images; however, blind people are significantly poorer in the recall of more than a single spatial pattern at a time than in the recall of the corresponding material integrated into a single pattern. It is concluded that the simultaneous maintenance of different spatial information is affected by congenital blindness, while cognitive processes that may involve sequential manipulation are not.

  3. Biorefineries to integrate fuel, energy and chemical production processes

    Directory of Open Access Journals (Sweden)

    Enrica Bargiacchi

    2007-12-01

    Full Text Available The world of renewable energies is evolving rapidly and arousing political and public interest, especially as an opportunity to boost environmental sustainability through the mitigation of greenhouse gas emissions. This work examines the possibilities related to the development of biorefineries, in which biomass conversion processes for producing biofuels, electricity and biochemicals are integrated. Particular attention is given to the production processes of biodiesel, bioethanol and biogas, for which the present world situation, problems and perspectives are outlined. Potential areas for agronomic and biotech research are also discussed. Producing biomass for biorefinery processing will eventually lead to maximized yields in non-food agriculture.

  4. Future CO2 removal from pulp mills - Process integration consequences

    International Nuclear Information System (INIS)

    Hektor, Erik; Berntsson, Thore

    2007-01-01

    Earlier work has shown that capturing the CO2 from flue gases in the recovery boiler at a pulp mill can be a cost-effective way of reducing mill CO2 emissions. However, the CO2 capture cost is very dependent on the fuel price. In this paper, the potential for reducing the need for external fuel and thereby the possibility to reduce the cost for capturing the CO2 are investigated. The reduction is achieved by using thermal process integration. In alternative 1, the mill processes are integrated and a steam surplus made available for CO2 capture, but still there is a need for external fuel. In alternative 2, the integration is taken one step further, the reboiler is fed with MP steam, and the heat of absorption from the absorption unit is used for generation of LP steam needed at the mill. The avoidance costs are in both cases lower than before the process integration. The avoidance cost in alternative 1 varies between 25.4 and 30.7 EUR/tonne CO2 depending on the energy market parameters. For alternative 2, the cost varies between 22.5 and 27.2 EUR/tonne CO2. With tough CO2 reduction targets and correspondingly high CO2 emission costs, the annual earnings can be substantial, 18.6 MEUR with alternative 1 and 21.2 MEUR with alternative 2.

  5. Attention Modulates the Neural Processes Underlying Multisensory Integration of Emotion

    Directory of Open Access Journals (Sweden)

    Hao Tam Ho

    2011-10-01

    Full Text Available Integrating emotional information from multiple sensory modalities is generally assumed to be a pre-attentive process (de Gelder et al., 1999). This assumption, however, presupposes that the integrative process occurs independent of attention. Using event-related potentials (ERPs), the present study investigated whether the neural processes underlying the integration of dynamic facial expression and emotional prosody are indeed unaffected by attentional manipulations. To this end, participants were presented with congruent and incongruent face-voice combinations (e.g., an angry face combined with a neutral voice) and performed different two-choice tasks in four consecutive blocks. Three of the tasks directed the participants' attention to emotion expressions in the face, the voice or both. The fourth task required participants to attend to the synchronicity between voice and lip movements. The results show divergent modulations of early ERP components by the different attentional manipulations. For example, when attention was directed to the face (or the voice), incongruent stimuli elicited a reduced N1 as compared to congruent stimuli. This effect was absent when attention was diverted away from the emotionality in both face and voice, suggesting that the detection of emotional incongruence already requires attention. Based on these findings, we question whether multisensory integration of emotion does indeed occur pre-attentively.

  6. CIPSS [computer-integrated process and safeguards system]: The integration of computer-integrated manufacturing and robotics with safeguards, security, and process operations

    International Nuclear Information System (INIS)

    Leonard, R.S.; Evans, J.C.

    1987-01-01

    This poster session describes the computer-integrated process and safeguards system (CIPSS). The CIPSS combines systems developed for factory automation and automated mechanical functions (robots) with varying degrees of intelligence (expert systems) to create an integrated system that would satisfy current and emerging security and safeguards requirements. Specifically, CIPSS is an extension of the automated physical security functions concept. The CIPSS also incorporates the concepts of computer-integrated manufacturing (CIM) with integrated safeguards concepts, and draws upon the Defense Advanced Research Projects Agency's (DARPA's) strategic computing program

  7. Integrated system of production information processing for surface mines

    Energy Technology Data Exchange (ETDEWEB)

    Li, K.; Wang, S.; Zeng, Z.; Wei, J.; Ren, Z. [China University of Mining and Technology, Xuzhou (China). Dept of Mining Engineering

    2000-09-01

    Based on the concepts of geological statistics, mathematical programming, conditional simulation and systems engineering, and on the features and duties of each main department in surface mine production, an integrated system for surface mine production information was systematically studied and developed using data warehousing, CAD, object-oriented and system integration technology, which systematizes and automates information management, data processing, optimization computing and plotting. In this paper, its overall objective, system design, structure and functions, and some key techniques are described. 2 refs., 3 figs.

  8. Integrated treatment process of hazardous and mixed wastes

    International Nuclear Information System (INIS)

    Shibuya, M.; Suzuki, K.; Fujimura, Y.; Nakashima, T.; Moriya, Y.

    1993-01-01

    An integrated waste treatment system was studied based on technologies developed for the treatment of liquid radioactive, organic, and aqueous wastes containing hazardous materials and soils contaminated with heavy metals. The system consists of submerged incineration, metal ion fixing and stabilization, and soil washing treatments. Introduction of this system allows for the simultaneous processing of toxic waste and contaminated soils. Hazardous organic wastes can be decomposed into harmless gases, and aqueous wastes can be converted into a dischargeable effluent. The contaminated soil is backfilled after the removal of toxic materials. Experimental data show that the integrated system is practical for complicated toxic wastes

  9. Defending the scientific integrity of conservation-policy processes.

    Science.gov (United States)

    Carroll, Carlos; Hartl, Brett; Goldman, Gretchen T; Rohlf, Daniel J; Treves, Adrian; Kerr, Jeremy T; Ritchie, Euan G; Kingsford, Richard T; Gibbs, Katherine E; Maron, Martine; Watson, James E M

    2017-10-01

    Government agencies faced with politically controversial decisions often discount or ignore scientific information, whether from agency staff or nongovernmental scientists. Recent developments in scientific integrity (the ability to perform, use, communicate, and publish science free from censorship or political interference) in Canada, Australia, and the United States demonstrate a similar trajectory. A perceived increase in scientific-integrity abuses provokes concerted pressure by the scientific community, leading to efforts to improve scientific-integrity protections under a new administration. However, protections are often inconsistently applied and are at risk of reversal under administrations publicly hostile to evidence-based policy. We compared recent challenges to scientific integrity to determine what aspects of scientific input into conservation policy are most at risk of political distortion and what can be done to strengthen safeguards against such abuses. To ensure the integrity of outbound communications from government scientists to the public, we suggest governments strengthen scientific integrity policies, include scientists' right to speak freely in collective-bargaining agreements, guarantee public access to scientific information, and strengthen agency culture supporting scientific integrity. To ensure the transparency and integrity with which information from nongovernmental scientists (e.g., submitted comments or formal policy reviews) informs the policy process, we suggest governments broaden the scope of independent reviews, ensure greater diversity of expert input and transparency regarding conflicts of interest, require a substantive response to input from agencies, and engage proactively with scientific societies. For their part, scientists and scientific societies have a responsibility to engage with the public to affirm that science is a crucial resource for developing evidence-based policy and regulations in the public interest.

  10. A robust network of double-strand break repair pathways governs genome integrity during C. elegans development.

    NARCIS (Netherlands)

    Pontier, D.B.; Tijsterman, M.

    2009-01-01

    To preserve genomic integrity, various mechanisms have evolved to repair DNA double-strand breaks (DSBs). Depending on cell type or cell cycle phase, DSBs can be repaired error-free, by homologous recombination, or with concomitant loss of sequence information, via nonhomologous end-joining (NHEJ)

  11. Sustaining high energy efficiency in existing processes with advanced process integration technology

    International Nuclear Information System (INIS)

    Zhang, Nan; Smith, Robin; Bulatov, Igor; Klemeš, Jiří Jaromír

    2013-01-01

    Highlights: ► Process integration with better modelling and more advanced solution methods. ► Operational changes for better environmental performance through optimisation. ► Identification of process integration technology for operational optimisation. ► Systematic implementation procedure of process integration technology. ► A case study with crude oil distillation to demonstrate the operational flexibility. -- Abstract: To reduce emissions in the process industry, much emphasis has been put on making step changes in emission reduction, by developing new process technology and making renewable energy more affordable. However, the energy saving potential of existing systems cannot be simply ignored. In recent years, there have been significant advances in process integration technology with better modelling techniques and more advanced solution methods. These methods have been applied to the new design and retrofit studies in the process industry. Here attempts are made to apply these technologies to improve the environmental performance of existing facilities with operational changes. An industrial project was carried out to demonstrate the importance and effectiveness of exploiting the operational flexibility for energy conservation. By applying advanced optimisation technique to integrate the operation of distillation and heat recovery in a crude oil distillation unit, the energy consumption was reduced by 8% without capital expenditure. It shows that with correctly identified technology and the proper execution procedure, significant energy savings and emission reduction can be achieved very quickly without major capital expenditure. This allows the industry to improve its economic and environment performance at the same time.

  12. The nurses' work process in different countries: an integrative review

    Directory of Open Access Journals (Sweden)

    Juliana Alves Leite Leal

    Full Text Available ABSTRACT Objective: To analyze the characteristics of nurses' work process in different countries. Method: We have used the integrative review method and selected 84 publications (articles, theses and dissertations) in national and foreign thesis banks and databases. We analyzed the evidence based on dialectical materialism. Results: The rejection of managerial tasks hides the singularity of nurses' work, due to the failure to understand the inseparable nature of managerial and healthcare tasks, given that it is what provides the expertise to coordinate the nursing work process and guide the healthcare work processes. The social and technical division is present in the work process in all countries studied, albeit in different ways. The nurse's position in the healthcare work process is subordinated to that of the physician. Conclusion: The characteristics are similar. The rejection of the dual nature of the work by nurses themselves due to alienation results in the non-recognition of their own work.

  13. Integration of disabled people in an automated work process

    Science.gov (United States)

    Jalba, C. K.; Muminovic, A.; Epple, S.; Barz, C.; Nasui, V.

    2017-05-01

    Automation processes are entering more and more areas of life and production. People with disabilities, in particular, can hardly keep pace with this change. In sheltered workshops in Germany, people with physical and mental disabilities receive dedicated help to be integrated into work processes. This work shows that cooperation between disabled people and industrial robots, by means of industrial image processing, can successfully result in the production of highly complex products. It describes how high-pressure hydraulic pumps are assembled by people with disabilities in cooperation with industrial robots in a sheltered workshop. After the assembly process, the pumps are checked for leaks at very high pressures in a completely automated process.

  14. Exploring continuous and integrated strategies for the up- and downstream processing of human mesenchymal stem cells.

    Science.gov (United States)

    Cunha, Bárbara; Aguiar, Tiago; Silva, Marta M; Silva, Ricardo J S; Sousa, Marcos F Q; Pineda, Earl; Peixoto, Cristina; Carrondo, Manuel J T; Serra, Margarida; Alves, Paula M

    2015-11-10

    The integration of up- and downstream unit operations can result in the elimination of hold steps, thus decreasing the footprint, and ultimately can create robust closed-system operations. This type of design is desirable for the bioprocessing of human mesenchymal stem cells (hMSC), where high numbers of pure cells, at low volumes, need to be delivered for therapy applications. This study reports a proof of concept of the integration of a continuous perfusion culture in bioreactors with a tangential flow filtration (TFF) system for the concentration and washing of hMSC. Moreover, we have also explored a continuous alternative for concentrating hMSC. Results show that expanding cells in a continuous perfusion operation mode provided a higher expansion ratio and led to a shift in the cells' metabolism. TFF, operated either continuously or discontinuously, allowed cells to be concentrated with high cell recovery (>80%) and viability (>95%); furthermore, continuous TFF permitted longer operation at higher cell concentrations. Continuous diafiltration led to higher protein clearance (98%) with lower cell death when compared to discontinuous diafiltration. Overall, the integrated process allowed for a shorter process time, recovering 70% of viable hMSC (>95%), with no changes in terms of morphology, immunophenotype, proliferation capacity and multipotent differentiation potential. Copyright © 2015 Elsevier B.V. All rights reserved.
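
    The concentration and washing arithmetic behind such TFF steps can be sketched with standard mass-balance relations, assuming a fully retained cell population and a freely permeating protein (illustrative values, not the authors' process data):

      import math

      def concentration_factor(v_feed, v_retentate):
          """Volumetric concentration factor of a fully retained species (e.g. cells)."""
          return v_feed / v_retentate

      def protein_remaining_continuous(diavolumes, sieving=1.0):
          """Fraction of a freely permeating protein left after continuous
          (constant-volume) diafiltration with N diavolumes: exp(-sieving * N)."""
          return math.exp(-sieving * diavolumes)

      cf = concentration_factor(v_feed=2.0, v_retentate=0.2)        # 10x concentration
      for n in (1, 2, 3, 4, 5):
          cleared = 1.0 - protein_remaining_continuous(n)
          print("N=%d diavolumes -> %.1f%% protein clearance" % (n, 100 * cleared))
      print("concentration factor: %.0fx" % cf)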

  15. Low-loss, robust fusion splicing of silica to chalcogenide fiber for integrated mid-infrared laser technology development.

    Science.gov (United States)

    Thapa, Rajesh; Gattass, Rafael R; Nguyen, Vinh; Chin, Geoff; Gibson, Dan; Kim, Woohong; Shaw, L Brandon; Sanghera, Jasbinder S

    2015-11-01

    We demonstrate a low-loss, repeatable, and robust splice between single-mode silica fiber and single-mode chalcogenide (CHG) fiber. These splices are particularly difficult to create because of the significant difference in the two fibers' glass transition temperatures (∼1000°C) as well as the large difference in the coefficients of thermal expansion between the fibers (∼20×10(-6)/°C). With 90% light coupled through the silica-CHG fiber splice, predominantly in the fundamental circular-symmetric mode, into the core of the CHG fiber and with 0.5 dB of splice loss measured around the wavelength of 2.5 μm, after correcting only for the Fresnel loss, the silica-CHG splice offers excellent beam quality and coupling efficiency. The tensile strength of the splice is greater than 12 kpsi, and the laser damage threshold is greater than 2 W (CW) and was limited by the available laser pump power. We also utilized this splicing technique to demonstrate 2 to 4.5 μm ultrabroadband supercontinuum generation in a monolithic all-fiber system comprising a CHG fiber and a high peak power 2 μm pulsed Raman-shifted thulium fiber laser. This is a major development toward compact form factor commercial applications of soft-glass mid-IR fibers.

  16. Intelligent Integration between Human Simulated Intelligence and Expert Control Technology for the Combustion Process of Gas Heating Furnace

    Directory of Open Access Journals (Sweden)

    Yucheng Liu

    2014-01-01

    Full Text Available Because the control quality of the gas heating furnace combustion process is poor, this paper explores a strongly robust control algorithm to improve it. The paper analyzes the control difficulties of the complex combustion process of the gas heating furnace, summarizes the cybernetic characteristics of the complex combustion process, investigates control strategies for uncertain complex processes, discusses the control model of the complex process, presents an intelligent integration of human-simulated intelligence and expert control technology, and constructs the control algorithm for controlling the combustion process of the gas heating furnace. The simulation results show that the proposed control algorithm not only delivers better dynamic and steady-state quality of the combustion process but also yields a clear energy-saving effect, and that the control strategy is feasible and effective.

  17. Mess management in microbial ecology: Rhetorical processes of disciplinary integration

    Science.gov (United States)

    McCracken, Christopher W.

    As interdisciplinary work becomes more common in the sciences, research into the rhetorical processes mediating disciplinary integration becomes more vital. This dissertation, which takes as its subject the integration of microbiology and ecology, combines a postplural approach to rhetoric of science research with Victor Turner's "social drama" analysis and a third-generation activity theory methodological framework to identify conceptual and practical conflicts in interdisciplinary work and describe how, through visual and verbal communication, scientists negotiate these conflicts. First, to understand the conflicting disciplinary principles that might impede integration, the author conducts a Turnerian analysis of a disciplinary conflict that took place in the 1960s and 70s, during which American ecologists and biologists debated whether they should participate in the International Biological Program (IBP). Participation in the IBP ultimately contributed to the emergence of ecology as a discipline distinct from biology, and Turnerian social drama analysis of the debate surrounding participation lays bare the conflicting principles separating biology and ecology. Second, to answer the question of how these conflicting principles are negotiated in practice, the author reports on a yearlong qualitative study of scientists working in a microbial ecology laboratory. Focusing specifically on two case studies from this fieldwork that illustrate the key concept of textually mediated disciplinary integration, the author's analysis demonstrates how scientific objects emerge in differently situated practices, and how these objects manage to cohere despite their multiplicity through textually mediated rhetorical processes of calibration and alignment.

  18. Social Group Dynamics and Patterns of Latin American Integration Processes

    Directory of Open Access Journals (Sweden)

    Sébastien Dubé

    2017-04-01

    Full Text Available This article proposes to incorporate social psychology elements with mainstream political science and international relations theories to help understand the contradictions related to the integration processes in Latin America. Through a theoretical analysis, it contributes to the challenge proposed by Dabène (2009 to explain the “resilience” of the Latin American regional integration process in spite of its “instability and crises.” Our main proposition calls for considering Latin America as a community and its regional organizations as “social groups.” In conclusion, three phenomena from the field of social psychology and particularly social group dynamics shed light on these contradictory patterns: the value of the group and the emotional bond, groupthink, and cognitive dissonance.

  19. Accelerating customer integration into innovation processes using Pico-Jobs

    OpenAIRE

    Blohm, Ivo; Fähling, Jens; Leimeister, Jan Marco; Krcmar, Helmut; Fischer, Jan

    2014-01-01

    Crowdsourcing marketplaces emerged on the internet and enable the integration of customers into various tasks along the innovation process. Marketplaces such as Amazon's Mechanical Turk install a member base for third parties, where they can offer small, highly structured paid tasks which can hardly be solved automatically with ICT, which we call Pico Jobs. In this paper a new method for systematically utilizing the creative potential of the users of these marketplaces for new product developmen...

  20. Materials and Process Design for High-Temperature Carburizing: Integrating Processing and Performance

    Energy Technology Data Exchange (ETDEWEB)

    D. Apelian

    2007-07-23

    The objective of the project is to develop an integrated process for fast, high-temperature carburizing. The new process results in an order of magnitude reduction in cycle time compared to conventional carburizing and represents significant energy savings in addition to a corresponding reduction of scrap associated with distortion free carburizing steels.

  1. Ethanol fermentation integrated with PDMS composite membrane: An effective process.

    Science.gov (United States)

    Fu, Chaohui; Cai, Di; Hu, Song; Miao, Qi; Wang, Yong; Qin, Peiyong; Wang, Zheng; Tan, Tianwei

    2016-01-01

    The polydimethylsiloxane (PDMS) membrane, prepared in the water phase, was investigated for separating ethanol from a model ethanol/water mixture and in a fermentation-pervaporation integrated process. Results showed that the PDMS membrane could effectively separate ethanol from the model solution. When integrated with batch ethanol fermentation, the ethanol productivity was enhanced compared with the conventional process. Fed-batch and continuous ethanol fermentation with pervaporation were also performed and studied. Total fluxes of 396.2-663.7 g/(m2 h) and 332.4-548.1 g/(m2 h), with separation factors of 8.6-11.7 and 8-11.6, were generated in the fed-batch and continuous fermentation-with-pervaporation scenarios, respectively. At the same time, high-titre ethanol production of ∼417.2 g/L and ∼446.3 g/L was also achieved on the permeate side of the membrane in the two scenarios, respectively. The integrated process was environmentally friendly and energy-saving, and has a promising perspective for long-term operation. Copyright © 2015 Elsevier Ltd. All rights reserved.
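
    The separation factor quoted for pervaporation runs like these is conventionally defined from feed and permeate ethanol weight fractions; a small sketch with assumed numbers (not the authors' data):

      def separation_factor(x_feed, y_permeate):
          """Pervaporation separation factor for ethanol/water:
          alpha = (y/(1-y)) / (x/(1-x)), with ethanol weight fractions."""
          return (y_permeate / (1.0 - y_permeate)) / (x_feed / (1.0 - x_feed))

      def permeate_fraction(x_feed, alpha):
          """Permeate ethanol weight fraction implied by a given separation factor."""
          return alpha * x_feed / (1.0 + (alpha - 1.0) * x_feed)

      x = 0.05                                    # 5 wt% ethanol in the broth (assumed)
      for alpha in (8.0, 10.0, 12.0):
          y = permeate_fraction(x, alpha)
          print("alpha=%4.1f -> permeate ethanol: %.1f wt%%" % (alpha, 100 * y))
      print("back-check:", round(separation_factor(x, permeate_fraction(x, 10.0)), 2))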

  2. Impact of informal institutions on the development integration processes

    Directory of Open Access Journals (Sweden)

    Sidorova Alexandra, M.

    2015-06-01

    Full Text Available The paper deals with the impact of informal institutions on the definition of the vector of integration processes and on the development of integration processes in the countries of the Customs Union and Ukraine. The degree of scientific development of the phenomenon in different economic schools is determined in this article. Economic mentality is a basic informal institution that determines the degree of effectiveness of integration processes. This paper examines the nature, characteristics and effects of economic mentality on the economic activities of people. The ethnometric method makes it possible to quantify the economic mentality, which enables deeper understanding and analysis of the formation and functioning of political and economic systems, especially business and management, and of the establishment of contacts with other cultures. The modern Belarusian economic mentality was measured using Hofstede's international methodology and compared with the economic mentality of Russia, Ukraine and Kazakhstan. Cluster analysis was used to determine the congruence of the economic mentalities of the Customs Union countries and Ukraine. The economic mentality of these countries was also compared with that of other countries in order to identify the main types of economic culture.

  3. Collaboration in Global Software Engineering Based on Process Description Integration

    Science.gov (United States)

    Klein, Harald; Rausch, Andreas; Fischer, Edward

    Globalization is one of the big trends in software development. Development projects need a variety of different resources with appropriate expert knowledge to be successful. More and more of these resources are nowadays obtained from specialized organizations and countries all over the world, varying in development approaches, processes, and culture. As seen with early outsourcing attempts, collaboration may fail due to these differences. Hence, the major challenge in global software engineering is to streamline collaborating organizations towards a successful conjoint development. Based on typical collaboration scenarios, this paper presents a structured approach to integrate processes in a comprehensible way.

  4. Integrated Design Process in Problem-Based Learning

    DEFF Research Database (Denmark)

    Knudstrup, Mary-Ann

    2004-01-01

    This article reports and reflects on the learning achievements and the educational experiences in connection with the first years of the curriculum in Architecture at Aalborg University's Civil Engineer Education in Architecture & Design. In the article I will focus on the learning activity and ...... the students need in order to concentrate, mobilize creativity and find the personal design language which is a precondition for making good architecture....... and the method that are developed during the semester when working with an Integrated Design Process combining architecture, design, functional aspects, energy consumption, indoor environment, technology, and construction. I will emphasize the importance of working with different tools in the design process, e...

  5. A secure and robust password-based remote user authentication scheme using smart cards for the integrated EPR information system.

    Science.gov (United States)

    Das, Ashok Kumar

    2015-03-01

    An integrated EPR (Electronic Patient Record) information system covering all patients provides medical institutions and academia with most of the patients' information in detail, allowing them to make corrective and clinical decisions in order to maintain and analyze patients' health. In such a system, illegal access must be restricted and theft of information during transmission over the insecure Internet must be prevented. Lee et al. proposed an efficient password-based remote user authentication scheme using smart cards for the integrated EPR information system. Their scheme is very efficient due to the usage of one-way hash functions and bitwise exclusive-or (XOR) operations. However, in this paper, we show that though their scheme is very efficient, it has three security weaknesses: (1) it has design flaws in the password change phase, (2) it fails to protect against privileged insider attacks and (3) it lacks formal security verification. We also find that another recently proposed scheme, Wen's scheme, has the same security drawbacks as Lee et al.'s scheme. In order to remedy these security weaknesses found in Lee et al.'s scheme and Wen's scheme, we propose a secure and efficient password-based remote user authentication scheme using smart cards for the integrated EPR information system. We show that our scheme is also efficient compared to Lee et al.'s scheme and Wen's scheme, as our scheme only uses one-way hash functions and bitwise exclusive-or (XOR) operations. Through the security analysis, we show that our scheme is secure against possible known attacks. Furthermore, we simulate our scheme for formal security verification using the widely accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool and show that our scheme is secure against passive and active attacks.
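
    As a toy illustration of the low-cost primitives such schemes rely on (a one-way hash plus bitwise XOR masking), and emphatically not the protocol proposed in the paper, a single masking step might look like this; all names and values are hypothetical:

      import hashlib
      import os

      def h(*parts: bytes) -> bytes:
          """One-way hash used for masking (SHA-256 here, purely for illustration)."""
          return hashlib.sha256(b"".join(parts)).digest()

      def xor(a: bytes, b: bytes) -> bytes:
          return bytes(x ^ y for x, y in zip(a, b))

      # Hypothetical values: a user identity, a password, and a server secret.
      identity, password, server_key = b"user42", b"correct horse", os.urandom(32)

      # The server could store only a masked verifier instead of the raw password hash:
      verifier = xor(h(identity, password), h(server_key, identity))

      # During login, recomputing both hashes lets the server check the verifier
      # without the password ever being stored in the clear.
      assert xor(verifier, h(server_key, identity)) == h(identity, password)
      print("verifier check passed")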

  6. Data-Driven Robust RVFLNs Modeling of a Blast Furnace Iron-Making Process Using Cauchy Distribution Weighted M-Estimation

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Ping; Lv, Youbin; Wang, Hong; Chai, Tianyou

    2017-09-01

    Optimal operation of a practical blast furnace (BF) ironmaking process depends largely on a good measurement of molten iron quality (MIQ) indices. However, measuring the MIQ online is not feasible using the available techniques. In this paper, a novel data-driven robust modeling is proposed for online estimation of MIQ using improved random vector functional-link networks (RVFLNs). Since the output weights of traditional RVFLNs are obtained by the least squares approach, a robustness problem may occur when the training dataset is contaminated with outliers. This affects the modeling accuracy of RVFLNs. To solve this problem, a Cauchy distribution weighted M-estimation based robust RVFLNs is proposed. Since the weights of different outlier data are properly determined by the Cauchy distribution, their corresponding contribution to modeling can be properly distinguished. Thus robust and better modeling results can be achieved. Moreover, given that the BF is a complex nonlinear system with numerous coupling variables, data-driven canonical correlation analysis is employed to identify the most influential components from the multitudinous factors that affect the MIQ indices, in order to reduce the model dimension. Finally, experiments using industrial data and comparative studies have demonstrated that the obtained model produces better modeling and estimation accuracy and stronger robustness than other modeling methods.
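
    A minimal sketch of the idea, assuming a plain RVFLN (random hidden weights, sigmoid features plus a direct link) whose output weights are re-fitted by iteratively reweighted least squares with the standard Cauchy M-estimator weight w(r) = 1/(1 + (r/c)^2); this is a generic reconstruction, not the authors' exact formulation.

      import numpy as np

      rng = np.random.default_rng(0)

      def rvfln_features(X, n_hidden=50):
          """Random-vector functional-link features: random input weights/biases,
          sigmoid activation, plus the direct input link."""
          W = rng.normal(size=(X.shape[1], n_hidden))
          b = rng.normal(size=n_hidden)
          H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
          return np.hstack([X, H])

      def robust_output_weights(H, y, c=2.385, n_iter=20, ridge=1e-6):
          """Cauchy-weighted M-estimation of the output weights via IRLS."""
          beta = np.linalg.lstsq(H, y, rcond=None)[0]          # ordinary LS start
          for _ in range(n_iter):
              r = y - H @ beta
              scale = np.median(np.abs(r)) / 0.6745 + 1e-12    # robust scale (MAD)
              w = 1.0 / (1.0 + (r / (c * scale)) ** 2)         # Cauchy weights
              Hw = H * w[:, None]
              beta = np.linalg.solve(H.T @ Hw + ridge * np.eye(H.shape[1]), Hw.T @ y)
          return beta

      # Synthetic data with a few gross outliers standing in for bad quality samples.
      X = rng.uniform(-1, 1, size=(200, 3))
      y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + 0.05 * rng.normal(size=200)
      y[::25] += 5.0                                           # contaminate with outliers

      H = rvfln_features(X)
      beta = robust_output_weights(H, y)
      print("training RMSE (robust fit): %.3f" % np.sqrt(np.mean((y - H @ beta) ** 2)))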

  7. Integrating chemical engineering fundamentals in the capstone process design project

    DEFF Research Database (Denmark)

    von Solms, Nicolas; Woodley, John; Johnsson, Jan Erik

    2010-01-01

    Reaction Engineering. In order to incorporate reactor design into process design in a meaningful way, the teachers of the respective courses need to collaborate (Standard 9 – Enhancement of Faculty CDIO skills). The students also see that different components of the chemical engineering curriculum relate......All B.Eng. courses offered at the Technical University of Denmark (DTU) must now follow CDIO standards. The final “capstone” course in the B.Eng. education is Process Design, which for many years has been typical of chemical engineering curricula worldwide. The course at DTU typically has about 30...... of the CDIO standards – especially standard 3 – Integrated Curriculum - means that the course projects must draw on competences provided in other subjects which the students are taking in parallel with Process Design – specifically Process Control and Reaction Engineering. In each semester of the B...

  8. Adaptive Moving Object Tracking Integrating Neural Networks And Intelligent Processing

    Science.gov (United States)

    Lee, James S. J.; Nguyen, Dziem D.; Lin, C.

    1989-03-01

    A real-time adaptive scheme is introduced to detect and track moving objects under noisy, dynamic conditions including moving sensors. This approach integrates the adaptiveness and incremental learning characteristics of neural networks with intelligent reasoning and process control. Spatiotemporal filtering is used to detect and analyze motion, exploiting the speed and accuracy of multiresolution processing. A neural network algorithm constitutes the basic computational structure for classification. A recognition and learning controller guides the on-line training of the network, and invokes pattern recognition to determine processing parameters dynamically and to verify detection results. A tracking controller acts as the central control unit, so that tracking goals direct the over-all system. Performance is benchmarked against the Widrow-Hoff algorithm, for target detection scenarios presented in diverse FLIR image sequences. Efficient algorithm design ensures that this recognition and control scheme, implemented in software and commercially available image processing hardware, meets the real-time requirements of tracking applications.
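
    The Widrow-Hoff (LMS) rule used as the benchmark here is a one-line weight update, w <- w + eta*(d - w.x)*x; a minimal sketch on synthetic two-class data (not the paper's FLIR imagery):

      import numpy as np

      rng = np.random.default_rng(1)

      def lms_train(X, d, eta=0.05, epochs=10):
          """Widrow-Hoff / least-mean-squares training of a linear unit."""
          w = np.zeros(X.shape[1])
          for _ in range(epochs):
              for x, target in zip(X, d):
                  err = target - w @ x           # instantaneous error
                  w += eta * err * x             # LMS weight update
          return w

      # Synthetic two-class "target vs. clutter" feature vectors.
      X = np.vstack([rng.normal(+1.0, 0.5, size=(100, 2)),
                     rng.normal(-1.0, 0.5, size=(100, 2))])
      X = np.hstack([X, np.ones((200, 1))])      # bias term
      d = np.array([1.0] * 100 + [-1.0] * 100)

      w = lms_train(X, d)
      accuracy = np.mean(np.sign(X @ w) == d)
      print("weights:", np.round(w, 3), " training accuracy: %.2f" % accuracy)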

  9. Integrating Process Management with Archival Management Systems: Lessons Learned

    Directory of Open Access Journals (Sweden)

    J. Gordon Daines, III

    2009-03-01

    Full Text Available The Integrated Digital Special Collections (INDI) system is a prototype of a database-driven Web application designed to automate and manage archival workflow for large institutions and consortia. This article discusses how the INDI project enabled the successful implementation of a process to manage large technology projects in the Harold B. Lee Library at Brigham Young University. It highlights how the scope of these technology projects is set and how the major deliverables for each project are defined. The article also discusses how the INDI system followed the process and still failed to be completed. It examines why the process itself is successful and why the INDI project failed. It further underscores the importance of process management in archival management systems.

  10. Detection of a novel, integrative aging process suggests complex physiological integration.

    Science.gov (United States)

    Cohen, Alan A; Milot, Emmanuel; Li, Qing; Bergeron, Patrick; Poirier, Roxane; Dusseault-Bélanger, Francis; Fülöp, Tamàs; Leroux, Maxime; Legault, Véronique; Metter, E Jeffrey; Fried, Linda P; Ferrucci, Luigi

    2015-01-01

    Many studies of aging examine biomarkers one at a time, but complex systems theory and network theory suggest that interpretations of individual markers may be context-dependent. Here, we attempted to detect underlying processes governing the levels of many biomarkers simultaneously by applying principal components analysis to 43 common clinical biomarkers measured longitudinally in 3694 humans from three longitudinal cohort studies on two continents (Women's Health and Aging I & II, InCHIANTI, and the Baltimore Longitudinal Study on Aging). The first axis was associated with anemia, inflammation, and low levels of calcium and albumin. The axis structure was precisely reproduced in all three populations and in all demographic sub-populations (by sex, race, etc.); we call the process represented by the axis "integrated albunemia." Integrated albunemia increases and accelerates with age in all populations, and predicts mortality and frailty--but not chronic disease--even after controlling for age. This suggests a role in the aging process, though causality is not yet clear. Integrated albunemia behaves more stably across populations than its component biomarkers, and thus appears to represent a higher-order physiological process emerging from the structure of underlying regulatory networks. If this is correct, detection of this process has substantial implications for physiological organization more generally.
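
    A minimal sketch of the analysis style described (z-scoring the biomarkers and extracting the first principal component as a candidate integrated axis), run on synthetic data rather than the cohort measurements:

      import numpy as np

      rng = np.random.default_rng(2)

      # Synthetic stand-in for n subjects x p clinical biomarkers.
      n, p = 500, 43
      latent = rng.normal(size=n)                        # hidden "integrated" process
      loadings = rng.normal(size=p)
      X = np.outer(latent, loadings) + rng.normal(scale=2.0, size=(n, p))

      # Standardize each biomarker, then PCA via SVD of the z-scored matrix.
      Z = (X - X.mean(axis=0)) / X.std(axis=0)
      U, S, Vt = np.linalg.svd(Z, full_matrices=False)
      explained = S**2 / np.sum(S**2)
      pc1_scores = Z @ Vt[0]                             # subject scores on the first axis

      print("variance explained by PC1: %.1f%%" % (100 * explained[0]))
      print("|corr(PC1, latent)| = %.2f" % abs(np.corrcoef(pc1_scores, latent)[0, 1]))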

  11. Detection of a novel, integrative aging process suggests complex physiological integration.

    Directory of Open Access Journals (Sweden)

    Alan A Cohen

    Full Text Available Many studies of aging examine biomarkers one at a time, but complex systems theory and network theory suggest that interpretations of individual markers may be context-dependent. Here, we attempted to detect underlying processes governing the levels of many biomarkers simultaneously by applying principal components analysis to 43 common clinical biomarkers measured longitudinally in 3694 humans from three longitudinal cohort studies on two continents (Women's Health and Aging I & II, InCHIANTI, and the Baltimore Longitudinal Study on Aging). The first axis was associated with anemia, inflammation, and low levels of calcium and albumin. The axis structure was precisely reproduced in all three populations and in all demographic sub-populations (by sex, race, etc.); we call the process represented by the axis "integrated albunemia." Integrated albunemia increases and accelerates with age in all populations, and predicts mortality and frailty--but not chronic disease--even after controlling for age. This suggests a role in the aging process, though causality is not yet clear. Integrated albunemia behaves more stably across populations than its component biomarkers, and thus appears to represent a higher-order physiological process emerging from the structure of underlying regulatory networks. If this is correct, detection of this process has substantial implications for physiological organization more generally.

  12. Compound risk judgment in tasks with both idiosyncratic and systematic risk: The "Robust Beauty" of additive probability integration.

    Science.gov (United States)

    Sundh, Joakim; Juslin, Peter

    2018-02-01

    In this study, we explore how people integrate risks of assets in a simulated financial market into a judgment of the conjunctive risk that all assets decrease in value, both when assets are independent and when there is a systematic risk present affecting all assets. Simulations indicate that while mental calculation according to naïve application of probability theory is best when the assets are independent, additive or exemplar-based algorithms perform better when systematic risk is high. Considering that people tend to intuitively approach compound probability tasks using additive heuristics, we expected the participants to find it easiest to master tasks with high systematic risk - the most complex tasks from the standpoint of probability theory - while they should shift to probability theory or exemplar memory with independence between the assets. The results from 3 experiments confirm that participants shift between strategies depending on the task, starting off with the default of additive integration. In contrast to results in similar multiple cue judgment tasks, there is little evidence for use of exemplar memory. The additive heuristics also appear to be surprisingly context-sensitive, with limited generalization across formally very similar tasks. Copyright © 2017 Elsevier B.V. All rights reserved.
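
    The contrast between normative multiplicative integration and an additive heuristic for the conjunctive risk can be made concrete with a tiny sketch (illustrative numbers only; the additive form is a generic weighted average, not the authors' fitted model):

      import numpy as np

      def conjunctive_independent(p):
          """Normative probability that all assets decrease, assuming independence."""
          return float(np.prod(p))

      def conjunctive_additive(p, intercept=0.0, weight=None):
          """Additive heuristic: a (weighted) average of the component risks, which
          tends to track the truth better when a systematic factor drives all assets."""
          p = np.asarray(p, dtype=float)
          w = np.full(p.size, 1.0 / p.size) if weight is None else np.asarray(weight)
          return intercept + float(w @ p)

      risks = [0.3, 0.4, 0.5]                 # per-asset probabilities of a loss
      print("independent assets, P(all fall): %.3f" % conjunctive_independent(risks))
      print("additive judgment              : %.3f" % conjunctive_additive(risks))
      # With strong systematic risk the assets co-move, so P(all fall) approaches
      # the smallest component risk (0.3 here) rather than the 0.06 product.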

  13. Robustness analysis of a green chemistry-based model for the classification of silver nanoparticles synthesis processes

    Science.gov (United States)

    This paper proposes a robustness analysis based on Multiple Criteria Decision Aiding (MCDA). The ensuing model was used to assess the implementation of green chemistry principles in the synthesis of silver nanoparticles. Its recommendations were also compared to an earlier develo...

  14. Managing processes and information technology in mergers - the integration of finance processes and systems

    OpenAIRE

    Pedain, Christoph

    2003-01-01

    Many companies use mergers to achieve their growth goals or target technology position. To realise synergies that justify the merger transaction, an integration of the merged companies is often necessary. Such integration takes place across company business areas (such as finance or sales) and across the layers of management consideration, which are strategy, human resources, organisation, processes, and information technology. In merger integration techniques, there is a significant gap ...

  15. Robust Scientists

    DEFF Research Database (Denmark)

    Gorm Hansen, Birgitte

    ... knowledge", Danish research policy seems to have helped develop politically and economically "robust scientists". Scientific robustness is acquired by way of three strategies: 1) tasting and discriminating between resources so as to avoid funding that erodes academic profiles and pushes scientists away from their core interests, 2) developing a self-supply of industry interests by becoming entrepreneurs and thus creating their own compliant industry partner and 3) balancing resources within a larger collective of researchers, thus countering changes in the influx of funding caused by shifts in political...

  16. Process modeling for the Integrated Thermal Treatment System (ITTS) study

    Energy Technology Data Exchange (ETDEWEB)

    Liebelt, K.H.; Brown, B.W.; Quapp, W.J.

    1995-09-01

    This report describes the process modeling done in support of the integrated thermal treatment system (ITTS) study, Phases 1 and 2. ITTS consists of an integrated systems engineering approach for uniform comparison of widely varying thermal treatment technologies proposed for treatment of the contact-handled mixed low-level wastes (MLLW) currently stored in the U.S. Department of Energy complex. In the overall study, 19 systems were evaluated. Preconceptual designs were developed that included all of the various subsystems necessary for a complete installation, from waste receiving through to primary and secondary stabilization and disposal of the processed wastes. Each system included the necessary auxiliary treatment subsystems so that all of the waste categories in the complex were fully processed. The objective of the modeling task was to perform mass and energy balances of the major material components in each system. Modeling of trace materials, such as pollutants and radioactive isotopes, were beyond the present scope. The modeling of the main and secondary thermal treatment, air pollution control, and metal melting subsystems was done using the ASPEN PLUS process simulation code, Version 9.1-3. These results were combined with calculations for the remainder of the subsystems to achieve the final results, which included offgas volumes, and mass and volume waste reduction ratios.

  17. Process modeling for the Integrated Thermal Treatment System (ITTS) study

    International Nuclear Information System (INIS)

    Liebelt, K.H.; Brown, B.W.; Quapp, W.J.

    1995-09-01

    This report describes the process modeling done in support of the integrated thermal treatment system (ITTS) study, Phases 1 and 2. ITTS consists of an integrated systems engineering approach for uniform comparison of widely varying thermal treatment technologies proposed for treatment of the contact-handled mixed low-level wastes (MLLW) currently stored in the U.S. Department of Energy complex. In the overall study, 19 systems were evaluated. Preconceptual designs were developed that included all of the various subsystems necessary for a complete installation, from waste receiving through to primary and secondary stabilization and disposal of the processed wastes. Each system included the necessary auxiliary treatment subsystems so that all of the waste categories in the complex were fully processed. The objective of the modeling task was to perform mass and energy balances of the major material components in each system. Modeling of trace materials, such as pollutants and radioactive isotopes, were beyond the present scope. The modeling of the main and secondary thermal treatment, air pollution control, and metal melting subsystems was done using the ASPEN PLUS process simulation code, Version 9.1-3. These results were combined with calculations for the remainder of the subsystems to achieve the final results, which included offgas volumes, and mass and volume waste reduction ratios
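
    At its simplest, the mass-balance bookkeeping described above amounts to checking that the streams entering each subsystem equal those leaving it and then deriving summary figures such as mass and volume waste reduction ratios. The toy example below illustrates that arithmetic for a single hypothetical thermal treatment subsystem; all stream names, flows and densities are invented and do not come from the ITTS models.

```python
# Toy mass balance for one hypothetical thermal treatment subsystem.
# All figures are illustrative; the ITTS work used ASPEN PLUS for the
# actual subsystem models.
feed = {"waste_solids_kg_h": 1000.0, "combustion_air_kg_h": 4000.0}
out = {"offgas_kg_h": 4650.0, "slag_kg_h": 340.0, "flyash_kg_h": 10.0}

mass_in = sum(feed.values())
mass_out = sum(out.values())
closure = 100.0 * (mass_in - mass_out) / mass_in
print(f"mass balance closure error: {closure:.2f}%")

# Waste reduction ratios compare the incoming waste with the stabilized
# final waste form (here assumed to be the slag plus fly ash).
waste_in_kg_h = feed["waste_solids_kg_h"]
final_form_kg_h = out["slag_kg_h"] + out["flyash_kg_h"]
mass_reduction_ratio = waste_in_kg_h / final_form_kg_h

waste_in_density, final_form_density = 600.0, 2500.0   # kg/m3, assumed
volume_reduction_ratio = (waste_in_kg_h / waste_in_density) / (
    final_form_kg_h / final_form_density
)
print(f"mass reduction ratio:   {mass_reduction_ratio:.1f}")
print(f"volume reduction ratio: {volume_reduction_ratio:.1f}")
```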

  18. Integrated safeguards and security for a highly automated process

    International Nuclear Information System (INIS)

    Zack, N.R.; Hunteman, W.J.; Jaeger, C.D.

    1993-01-01

    Before the cancellation of the New Production Reactor Programs for the production of tritium, the reactors and associated processing were being designed to contain some of the most highly automated and remote systems conceived for a Department of Energy facility. Integrating safety, security, materials control and accountability (MC and A), and process systems at the proposed facilities would enhance the overall information and protection-in-depth available. Remote, automated fuel handling and assembly/disassembly techniques would deny access to the nuclear materials while upholding ALARA principles but would also require the full integration of all data/information systems. Such systems would greatly enhance MC and A as well as facilitate materials tracking. Physical protection systems would be connected with materials control features to cross check activities and help detect and resolve anomalies. This paper will discuss the results of a study of the safeguards and security benefits achieved from a highly automated and integrated remote nuclear facility and the impacts that such systems have on safeguards and computer and information security

  19. The process flow and structure of an integrated stroke strategy

    Directory of Open Access Journals (Sweden)

    Emma F. van Bussel

    2013-06-01

    Full Text Available Introduction: In the Canadian province of Alberta, access and quality of stroke care were suboptimal, especially in remote areas. The government introduced the Alberta Provincial Stroke Strategy (APSS) in 2005, an integrated strategy to improve access to stroke care, quality and efficiency which utilizes telehealth. Research question: What is the process flow and the structure of the care pathways of the APSS? Methodology: Information for this article was obtained using documentation, archival APSS records, interviews with experts, direct observation and participant observation. Results: The process flow is described. The APSS integrated evidence-based practice, multidisciplinary communication, and telestroke services. It includes regular quality evaluation and improvement. Conclusion: Access, efficiency and quality of care improved since the start of the APSS across many domains, through improvement of expertise and equipment in small hospitals, accessible consultation of stroke specialists using telestroke, enhancing preventive care, enhancing multidisciplinary collaboration, introducing uniform best practice protocols and bypass-protocols for the emergency medical services. Discussion: The APSS overcame substantial obstacles to decrease discrepancies and to deliver integrated higher quality care. Telestroke has proven itself to be safe and feasible. The APSS works efficiently, which is in line with other projects worldwide, and is, based on limited results, cost effective. Further research on cost-effectiveness is necessary.

  20. Integrated approaches to the application of advanced modeling technology in process development and optimization

    Energy Technology Data Exchange (ETDEWEB)

    Allgor, R.J.; Feehery, W.F.; Tolsma, J.E. [Massachusetts Institute of Technology, Cambridge, MA (United States)] [and others]

    1995-12-31

    The batch process development problem serves as a good candidate to guide the development of process modeling environments. It demonstrates that very robust numerical techniques are required within an environment that can collect, organize, and maintain the data and models required to address the batch process development problem. This paper focuses on improving the robustness and efficiency of the numerical algorithms required in such a modeling environment through the development of hybrid numerical and symbolic strategies.

  1. Jenkins-CI, an Open-Source Continuous Integration System, as a Scientific Data and Image-Processing Platform

    Science.gov (United States)

    Moutsatsos, Ioannis K.; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J.; Jenkins, Jeremy L.; Holway, Nicholas; Tallarico, John; Parker, Christian N.

    2016-01-01

    High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an “off-the-shelf,” open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community. PMID:27899692

  2. Jenkins-CI, an Open-Source Continuous Integration System, as a Scientific Data and Image-Processing Platform.

    Science.gov (United States)

    Moutsatsos, Ioannis K; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J; Jenkins, Jeremy L; Holway, Nicholas; Tallarico, John; Parker, Christian N

    2017-03-01

    High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an "off-the-shelf," open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community.
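
    In practice, a Jenkins job in such a pipeline typically just shells out to CellProfiler in headless mode for each batch of images. The wrapper below is a schematic Python build step illustrating that pattern; the paths are placeholders and the command-line flags (-c, -r, -p, -i, -o) should be checked against the installed CellProfiler version rather than read as a description of the platform in the paper.

```python
# Schematic build-step script a Jenkins job might call to run a
# CellProfiler pipeline headless on one plate of images. Paths are
# placeholders; check the flags against your CellProfiler version
# (recent releases accept -c -r -p/-i/-o for headless batch runs).
import subprocess
import sys
from pathlib import Path

PIPELINE = Path("pipelines/cell_counting.cppipe")    # placeholder pipeline
IMAGES = Path(sys.argv[1])                            # plate folder passed as a job parameter
RESULTS = Path("results") / IMAGES.name
RESULTS.mkdir(parents=True, exist_ok=True)

cmd = [
    "cellprofiler", "-c", "-r",          # headless, run the pipeline on startup
    "-p", str(PIPELINE),
    "-i", str(IMAGES),
    "-o", str(RESULTS),
]
completed = subprocess.run(cmd, capture_output=True, text=True)

# Surface the CellProfiler log in the Jenkins console and fail the build
# step if the run did not finish cleanly.
print(completed.stdout)
print(completed.stderr, file=sys.stderr)
sys.exit(completed.returncode)
```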

  3. Integrating protein engineering with process design for biocatalysis

    DEFF Research Database (Denmark)

    Woodley, John M.

    2017-01-01

    Biocatalysis uses enzymes for chemical synthesis and production, offering selective, safe and sustainable catalysis. While today the majority of applications are in the pharmaceutical sector, new opportunities are arising every day in other industry sectors, where production costs become a more important driver. In the early applications of the technology, it was necessary to design processes to match the properties of the biocatalyst. With the advent of protein engineering, organic chemists started to develop and improve enzymes to suit their needs. Likewise in industry, although not widespread, a new paradigm was already implemented several years ago to engineer enzymes to suit process needs. Today, a new era is entered, where the effectiveness with which such integrated protein and process engineering is achieved becomes critical to implementation. In this paper, the development of a tool...

  4. Process integrated modelling for steelmaking Life Cycle Inventory analysis

    International Nuclear Information System (INIS)

    Iosif, Ana-Maria; Hanrot, Francois; Ablitzer, Denis

    2008-01-01

    During recent years, strict environmental regulations have been implemented by governments for the steelmaking industry in order to reduce their environmental impact. In the frame of the ULCOS project, we have developed a new methodological framework which combines the process integrated modelling approach with Life Cycle Assessment (LCA) method in order to carry out the Life Cycle Inventory of steelmaking. In the current paper, this new concept has been applied to the sinter plant which is the most polluting steelmaking process. It has been shown that this approach is a powerful tool to make the collection of data easier, to save time and to provide reliable information concerning the environmental diagnostic of the steelmaking processes

  5. Barriers and Challenges in the Integrated Design Process Approcach

    DEFF Research Database (Denmark)

    Knudstrup, Mary-Ann

    2006-01-01

    ABSTRACT: In the future, it will be a huge challenge to make sustainable building design by using a more holistic and innovative approach in order to be able to decrease or reduce the use of energy for heating and cooling in new building projects. This is seen in the perspective of the Kyoto agreement for reducing the global heating. This paper will briefly present the method of the Integrated Design Process, IDP [1]. It describes the background and means for developing a new method for designing integrated architecture in an interdisciplinary approach between architecture and engineering. It also describes the barriers and the challenges that must be overcome when trying to cross the borders between the two fields of engineering and architecture to design sustainable architecture.

  6. Analog integrated circuits design for processing physiological signals.

    Science.gov (United States)

    Li, Yan; Poon, Carmen C Y; Zhang, Yuan-Ting

    2010-01-01

    Analog integrated circuits (ICs) designed for processing physiological signals are important building blocks of wearable and implantable medical devices used for health monitoring or restoring lost body functions. Due to the nature of physiological signals and the corresponding application scenarios, the ICs designed for these applications should have low power consumption, low cutoff frequency, and low input-referred noise. In this paper, techniques for designing the analog front-end circuits with these three characteristics will be reviewed, including subthreshold circuits, bulk-driven MOSFETs, floating gate MOSFETs, and log-domain circuits to reduce power consumption; methods for designing fully integrated low cutoff frequency circuits; as well as chopper stabilization (CHS) and other techniques that can be used to achieve a high signal-to-noise performance. Novel applications using these techniques will also be discussed.

  7. Moral judgment as information processing: an integrative review.

    Science.gov (United States)

    Guglielmo, Steve

    2015-01-01

    How do humans make moral judgments about others' behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment.

  8. Moral judgment as information processing: an integrative review

    Science.gov (United States)

    Guglielmo, Steve

    2015-01-01

    How do humans make moral judgments about others’ behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment. PMID:26579022

  9. [The dual process model of addiction. Towards an integrated model?].

    Science.gov (United States)

    Vandermeeren, R; Hebbrecht, M

    2012-01-01

    Neurobiology and cognitive psychology have provided us with a dual process model of addiction. According to this model, behavior is considered to be the dynamic result of a combination of automatic and controlling processes. In cases of addiction the balance between these two processes is severely disturbed. Automated processes will continue to produce impulses that ensure the continuance of addictive behavior. Weak, reflective or controlling processes are both the reason for and the result of the inability to forgo addiction. To identify features that are common to current neurocognitive insights into addiction and psychodynamic views on addiction. The picture that emerges from research is not clear. There is some evidence that attentional bias has a causal effect on addiction. There is no evidence that automatic associations have a causal effect, but there is some evidence that automatic action-tendencies do have a causal effect. Current neurocognitive views on the dual process model of addiction can be integrated with an evidence-based approach to addiction and with psychodynamic views on addiction.

  10. Integrable Floquet dynamics, generalized exclusion processes and "fused" matrix ansatz

    Science.gov (United States)

    Vanicat, Matthieu

    2018-04-01

    We present a general method for constructing integrable stochastic processes, with two-step discrete time Floquet dynamics, from the transfer matrix formalism. The models can be interpreted as a discrete time parallel update. The method can be applied for both periodic and open boundary conditions. We also show how the stationary distribution can be built as a matrix product state. As an illustration we construct parallel discrete time dynamics associated with the R-matrix of the SSEP and of the ASEP, and provide the associated stationary distributions in a matrix product form. We use this general framework to introduce new integrable generalized exclusion processes, where a fixed number of particles is allowed on each lattice site in opposition to the (single particle) exclusion process models. They are constructed using the fusion procedure of R-matrices (and K-matrices for open boundary conditions) for the SSEP and ASEP. We develop a new method, that we named "fused" matrix ansatz, to build explicitly the stationary distribution in a matrix product form. We use this algebraic structure to compute physical observables such as the correlation functions and the mean particle current.
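
    The two-step discrete-time dynamics referred to above can be pictured, for the symmetric exclusion process, as alternating parallel sweeps over even and odd bonds. The sketch below is a plain Monte Carlo simulation of such a parallel-update SSEP on a periodic ring; it illustrates the dynamics only and has none of the transfer-matrix or matrix-product machinery of the paper.

```python
# Monte Carlo sketch of a symmetric exclusion process with two-step
# parallel (Floquet-style) dynamics on a periodic ring: in step A all
# even bonds attempt an exchange simultaneously, in step B all odd bonds.
import numpy as np

rng = np.random.default_rng(0)
L, p_swap, steps = 20, 0.5, 10_000       # even L so each sweep uses disjoint bonds
sites = np.zeros(L, dtype=int)
sites[: L // 2] = 1                      # half filling
rng.shuffle(sites)

def half_step(config, parity):
    """Attempt a symmetric exchange on every bond (i, i+1) with i of the given parity."""
    new = config.copy()
    for i in range(parity, L, 2):
        j = (i + 1) % L
        # exclusion is automatic: sites hold at most one particle
        if new[i] != new[j] and rng.random() < p_swap:
            new[i], new[j] = new[j], new[i]
    return new

density_profile = np.zeros(L)
for _ in range(steps):
    sites = half_step(sites, parity=0)   # Floquet step A: even bonds
    sites = half_step(sites, parity=1)   # Floquet step B: odd bonds
    density_profile += sites

print("time-averaged density per site:", np.round(density_profile / steps, 2))
```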

  11. Modular Energy-Efficient and Robust Paradigms for a Disaster-Recovery Process over Wireless Sensor Networks.

    Science.gov (United States)

    Razaque, Abdul; Elleithy, Khaled

    2015-07-06

    Robust paradigms are a necessity, particularly for emerging wireless sensor network (WSN) applications. The lack of robust and efficient paradigms causes a reduction in the provision of quality of service (QoS) and additional energy consumption. In this paper, we introduce modular energy-efficient and robust paradigms that involve two archetypes: (1) the operational medium access control (O-MAC) hybrid protocol and (2) the pheromone termite (PT) model. The O-MAC protocol controls overhearing and congestion and increases the throughput, reduces the latency and extends the network lifetime. O-MAC uses an optimized data frame format that reduces the channel access time and provides faster data delivery over the medium. Furthermore, O-MAC uses a novel randomization function that avoids channel collisions. The PT model provides robust routing for single and multiple links and includes two new significant features: (1) determining the packet generation rate to avoid congestion and (2) pheromone sensitivity to determine the link capacity prior to sending the packets on each link. The state-of-the-art research in this work is based on improving both the QoS and energy efficiency. To determine the strength of O-MAC with the PT model, we have generated and simulated a disaster recovery scenario using a network simulator (ns-3.10) that monitors the activities of disaster recovery staff, hospital staff and disaster victims brought into the hospital. Moreover, the proposed paradigm can be used for general purpose applications. Finally, the QoS metrics of the O-MAC and PT paradigms are evaluated and compared with other known hybrid protocols involving the MAC and routing features. The simulation results indicate that O-MAC with PT produced better outcomes.

  12. Modular Energy-Efficient and Robust Paradigms for a Disaster-Recovery Process over Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Abdul Razaque

    2015-07-01

    Full Text Available Robust paradigms are a necessity, particularly for emerging wireless sensor network (WSN) applications. The lack of robust and efficient paradigms causes a reduction in the provision of quality of service (QoS) and additional energy consumption. In this paper, we introduce modular energy-efficient and robust paradigms that involve two archetypes: (1) the operational medium access control (O-MAC) hybrid protocol and (2) the pheromone termite (PT) model. The O-MAC protocol controls overhearing and congestion and increases the throughput, reduces the latency and extends the network lifetime. O-MAC uses an optimized data frame format that reduces the channel access time and provides faster data delivery over the medium. Furthermore, O-MAC uses a novel randomization function that avoids channel collisions. The PT model provides robust routing for single and multiple links and includes two new significant features: (1) determining the packet generation rate to avoid congestion and (2) pheromone sensitivity to determine the link capacity prior to sending the packets on each link. The state-of-the-art research in this work is based on improving both the QoS and energy efficiency. To determine the strength of O-MAC with the PT model, we have generated and simulated a disaster recovery scenario using a network simulator (ns-3.10) that monitors the activities of disaster recovery staff, hospital staff and disaster victims brought into the hospital. Moreover, the proposed paradigm can be used for general purpose applications. Finally, the QoS metrics of the O-MAC and PT paradigms are evaluated and compared with other known hybrid protocols involving the MAC and routing features. The simulation results indicate that O-MAC with PT produced better outcomes.

  13. Westinghouse integrated cementation facility. Smart process automation minimizing secondary waste

    International Nuclear Information System (INIS)

    Fehrmann, H.; Jacobs, T.; Aign, J.

    2015-01-01

    The Westinghouse Cementation Facility described in this paper is an example of a typical standardized turnkey project in the area of waste management. The facility is able to handle NPP waste such as evaporator concentrates, spent resins and filter cartridges. The facility scope covers all equipment required for a fully integrated system, including all required auxiliary equipment for the hydraulic, pneumatic and electric control systems. The control system is based on current PLC technology and the process is highly automated. The equipment is designed to be remotely operated, under radiation exposure conditions. Four cementation facilities have been built for new CPR-1000 nuclear power stations in China.

  14. Integration of e-learning outcomes into work processes

    OpenAIRE

    Kerstin Grundén

    2011-01-01

    Three case studies of in-house developed e-learning education in public organizations with different pedagogical approaches are used as a starting point for discussion regarding the implementation challenges of e-learning at work. The aim of this article is to contribute to the understanding of integrating mechanisms of e-learning outcomes into work processes in large, public organizations. The case studies were analyzed from a socio-cultural perspective using the MOA-model as a frame of refe...

  15. Integration of social networks in the teaching and learning process

    Directory of Open Access Journals (Sweden)

    Cynthia Dedós Reyes

    2015-09-01

    Full Text Available In this research we explored the integration of social media in the process of learning and teaching, in a private higher education institution, in Puerto Rico. Attention was given to the perspectives of teachers and students. The participants —9 part-time teachers and 118 students— were selected based on availability. The results showed that teachers and students alike use the social network YouTube for academic purposes, and use Facebook, Twitter, and blogs for social purposes and entertainment. Results also revealed that there is no significant contrast between the perspectives of teachers and students digital immigrants.

  16. Biodiesel production process from microalgae oil by waste heat recovery and process integration.

    Science.gov (United States)

    Song, Chunfeng; Chen, Guanyi; Ji, Na; Liu, Qingling; Kansha, Yasuki; Tsutsumi, Atsushi

    2015-10-01

    In this work, the optimization of the microalgae oil (MO) based biodiesel production process is carried out by waste heat recovery and process integration. The exergy analysis of each heat exchanger presented an efficient heat coupling between hot and cold streams, thus minimizing the total exergy destruction. Simulation results showed that the unit production cost of the optimized process is 0.592 $/L biodiesel, and approximately 0.172 $/L of that cost can be avoided by heat integration. Although the capital cost of the optimized biodiesel production process increased 32.5% and 23.5% compared to the reference cases, the operational cost can be reduced by approximately 22.5% and 41.6%. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Business Intelligence Applied to the ALMA Software Integration Process

    Science.gov (United States)

    Zambrano, M.; Recabarren, C.; González, V.; Hoffstadt, A.; Soto, R.; Shen, T.-C.

    2012-09-01

    Software quality assurance and planning of an astronomy project is a complex task, especially if it is a distributed collaborative project such as ALMA, where the development centers are spread across the globe. When you execute a software project there is much valuable information about the process itself that you might be able to collect. One of the ways you can receive this input is via an issue tracking system that will gather the problem reports relative to software bugs captured during the testing of the software, during the integration of the different components or, even worse, problems that occurred during production time. Usually, there is little time spent on analyzing them, but with some multidimensional processing you can extract valuable information from them and it might help you with long-term planning and resource allocation. We present an analysis of the information collected at ALMA from a collection of key unbiased indicators. We describe here the extraction, transformation and load process and how the data was processed. The main goal is to assess a software process and get insights from this information.
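
    The multidimensional processing of problem reports mentioned above often reduces, in the simplest case, to pivoting an issue-tracker export along a few dimensions. The fragment below shows that pattern with pandas; the file and column names (key, component, phase, created, resolution_days) are invented placeholders, not the ALMA schema.

```python
# Sketch of the kind of multidimensional roll-up described above: problem
# reports exported from an issue tracker are aggregated by software
# component and by the phase in which the defect was found.
import pandas as pd

issues = pd.read_csv("issue_export.csv", parse_dates=["created"])

# Defect counts per component and phase (testing, integration, production)
counts = issues.pivot_table(
    index="component", columns="phase", values="key", aggfunc="count", fill_value=0
)
print(counts)

# Median time-to-resolution per component, a simple unbiased indicator
# that can feed long-term planning and resource allocation
print(issues.groupby("component")["resolution_days"].median().sort_values())

# Monthly arrival rate of production issues, for trend monitoring
prod = issues[issues["phase"] == "production"]
print(prod["created"].dt.to_period("M").value_counts().sort_index())
```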

  18. Integrative device and process of oxidization, degassing, acidity adjustment of 1BP from APOR process

    Energy Technology Data Exchange (ETDEWEB)

    Zuo, Chen; Zheng, Weifang, E-mail: wfazh@ciae.ac.cn; Yan, Taihong; He, Hui; Li, Gaoliang; Chang, Shangwen; Li, Chuanbo; Yuan, Zhongwei

    2016-02-15

    Graphical abstract: Previous (left) and present (right) device of oxidation, degassing, acidity adjustment of 1BP. - Highlights: • We designed an integrative device and process. • The utilization efficiency of N₂O₄ is increased significantly. • Our work results in considerable simplification of the device. • Process parameters are determined by experiments. - Abstract: Device and process of oxidization, degassing, acidity adjustment of 1BP (The Pu production feed from U/Pu separation section) from APOR process (Advanced Purex Process based on Organic Reductants) were improved through rational design and experiments. The device was simplified and the process parameters, such as feed position and flow ratio, were determined by experiments. Based on this new device and process, the reductants N,N-dimethylhydroxylamine (DMHAN) and methylhydrazine (MMH) in 1BP solution could be oxidized with much less N₂O₄ consumption.

  19. New ways of integrating material knowledge into the design process

    DEFF Research Database (Denmark)

    Højris, Anders; Nielsen, Louise Møller

    2013-01-01

    ... in order to help clients to find the right material among hundreds of samples. Furthermore a number of material libraries have also been developed into online databases, which provide detailed information about new materials and make the information accessible from almost everywhere. The access to material libraries and thereby access to information on new material possibilities has also changed the way designers integrate knowledge about materials into the design process. This means that the traditional design process model, where the selection of materials takes place after the design of form and function – based on technical performance, no longer applies. Accordingly the approach in this paper is to view information and knowledge about materials through the perspective of organizational memory and technology brokering. This paper is built upon two cases from the German based design studio: designaffairs...

  20. Integrated project scheduling and staff assignment with controllable processing times.

    Science.gov (United States)

    Fernandez-Viagas, Victor; Framinan, Jose M

    2014-01-01

    This paper addresses a decision problem related to simultaneously scheduling the tasks in a project and assigning the staff to these tasks, taking into account that a task can be performed only by employees with certain skills, and that the length of each task depends on the number of employees assigned. This type of problem usually appears in service companies, where task scheduling and staff assignment are closely related. An integer programming model for the problem is proposed, together with some extensions to cope with different situations. Additionally, the advantages of the controllable processing times approach are compared with the fixed processing times approach. Due to the complexity of the integrated model, a simple GRASP algorithm is implemented in order to obtain good, approximate solutions in short computation times.

  1. Integrated Project Scheduling and Staff Assignment with Controllable Processing Times

    Directory of Open Access Journals (Sweden)

    Victor Fernandez-Viagas

    2014-01-01

    Full Text Available This paper addresses a decision problem related to simultaneously scheduling the tasks in a project and assigning the staff to these tasks, taking into account that a task can be performed only by employees with certain skills, and that the length of each task depends on the number of employees assigned. This type of problem usually appears in service companies, where task scheduling and staff assignment are closely related. An integer programming model for the problem is proposed, together with some extensions to cope with different situations. Additionally, the advantages of the controllable processing times approach are compared with the fixed processing times approach. Due to the complexity of the integrated model, a simple GRASP algorithm is implemented in order to obtain good, approximate solutions in short computation times.
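
    The GRASP heuristic mentioned in both records builds a solution greedily but samples each choice from a restricted candidate list, then keeps the best of many such constructions. The toy sketch below applies that idea to a stripped-down version of the problem (task durations shrink with the number of qualified employees assigned); it omits precedence constraints and the local-search phase of a full GRASP and is not the algorithm from the paper.

```python
# Toy GRASP-style construction for assigning staff to tasks when the
# processing time of a task shrinks with the number of qualified
# employees assigned. Data and the candidate-evaluation rule are
# illustrative placeholders.
import random

tasks = {"T1": {"skill": "java", "work": 12.0},
         "T2": {"skill": "sql", "work": 8.0},
         "T3": {"skill": "java", "work": 6.0}}
employees = {"A": {"java"}, "B": {"java", "sql"}, "C": {"sql"}}
ALPHA, ITERATIONS = 0.3, 200           # greediness of the restricted candidate list

def construct(rng):
    """Greedy randomized construction: pick a staffing level for each task
    from a restricted candidate list of the shortest resulting durations."""
    assignment, makespan = {}, 0.0
    for task, spec in tasks.items():
        qualified = [e for e, skills in employees.items() if spec["skill"] in skills]
        # More staff means a shorter (controllable) task duration
        candidates = [(spec["work"] / k, k) for k in range(1, len(qualified) + 1)]
        best = min(d for d, _ in candidates)
        worst = max(d for d, _ in candidates)
        rcl = [c for c in candidates if c[0] <= best + ALPHA * (worst - best)]
        duration, k = rng.choice(rcl)
        assignment[task] = rng.sample(qualified, k)
        makespan += duration           # crude serial schedule, for illustration only
    return makespan, assignment

rng = random.Random(42)
best_solution = min((construct(rng) for _ in range(ITERATIONS)), key=lambda s: s[0])
print("best makespan found:", round(best_solution[0], 2), best_solution[1])
```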

  2. Robust estimates of environmental effects on population vital rates: an integrated capture–recapture model of seasonal brook trout growth, survival and movement in a stream network

    Science.gov (United States)

    Letcher, Benjamin H.; Schueller, Paul; Bassar, Ronald D.; Nislow, Keith H.; Coombs, Jason A.; Sakrejda, Krzysztof; Morrissey, Michael; Sigourney, Douglas B.; Whiteley, Andrew R.; O'Donnell, Matthew J.; Dubreuil, Todd L.

    2015-01-01

    Modelling the effects of environmental change on populations is a key challenge for ecologists, particularly as the pace of change increases. Currently, modelling efforts are limited by difficulties in establishing robust relationships between environmental drivers and population responses. We developed an integrated capture–recapture state-space model to estimate the effects of two key environmental drivers (stream flow and temperature) on demographic rates (body growth, movement and survival) using a long-term (11 years), high-resolution (individually tagged, sampled seasonally) data set of brook trout (Salvelinus fontinalis) from four sites in a stream network. Our integrated model provides an effective context within which to estimate environmental driver effects because it takes full advantage of data by estimating (latent) state values for missing observations, because it propagates uncertainty among model components and because it accounts for the major demographic rates and interactions that contribute to annual survival. We found that stream flow and temperature had strong effects on brook trout demography. Some effects, such as reduction in survival associated with low stream flow and high temperature during the summer season, were consistent across sites and age classes, suggesting that they may serve as robust indicators of vulnerability to environmental change. Other survival effects varied across ages, sites and seasons, indicating that flow and temperature may not be the primary drivers of survival in those cases. Flow and temperature also affected body growth rates; these responses were consistent across sites but differed dramatically between age classes and seasons. Finally, we found that tributary and mainstem sites responded differently to variation in flow and temperature. Annual survival (combination of survival and body growth across seasons) was insensitive to body growth and was most sensitive to flow (positive) and temperature (negative
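
    As a rough illustration of the survival component only, the sketch below fits a seasonal logistic regression of survival on standardized flow and temperature. It assumes perfect detection and a hypothetical flat data file, so it is a drastic simplification of the integrated capture–recapture state-space model, which additionally estimates latent states, detection, growth and movement jointly.

```python
# Drastically simplified stand-in for the survival component of the
# integrated model: logistic regression of seasonal survival on stream
# flow and temperature, assuming perfect detection (which the real
# capture-recapture model does not). File and column names are hypothetical:
# survived (0/1), flow_z, temp_z (standardized), season, site.
import pandas as pd
import statsmodels.formula.api as smf

fish = pd.read_csv("trout_seasons.csv")      # one row per fish-season

model = smf.logit(
    "survived ~ flow_z * season + temp_z * season + C(site)", data=fish
).fit()
print(model.summary())

# The signs and sizes of the flow and temperature coefficients (and their
# season interactions) play the role of the environmental-driver effects
# estimated in the full state-space model.
```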

  3. Nurse practitioner integration: Qualitative experiences of the change management process.

    Science.gov (United States)

    Lowe, Grainne; Plummer, Virginia; Boyd, Leanne

    2018-04-30

    The aim of this qualitative research was to explore perceptions of organisational change related to the integration of nurse practitioners from key nursing stakeholders. The ongoing delivery of effective and efficient patient services is reliant upon the development and sustainability of nurse practitioner roles. Examination of the factors contributing to the underutilization of nurse practitioner roles is crucial to inform future management policies. A change management theory is used to reveal the complexity involved. Qualitative interviews were undertaken using a purposive sampling strategy of key stakeholders. Thematic analysis was undertaken and key themes were correlated to the theoretical framework. The results confirm the benefits of nurse practitioner roles, but suggest organisational structures and embedded professional cultures present barriers to full role optimization. Complicated policy processes are creating barriers to the integration of nurse practitioner roles. The findings increase understanding of the links between strategic planning, human resource management, professional and organisational cultures, governance and politics in change management. Effective leadership drives the change process through the ability to align key components necessary for success. Sustainability of nurse practitioners relies on recognition of their full potential in the health care team. The results of this study highlight the importance of management and leadership in the promotion of advanced nursing skills and experience to better meet patient outcomes. The findings reinforce the potential of nurse practitioners to deliver patient centred, timely and efficient health care. © 2018 John Wiley & Sons Ltd.

  4. Biodiesel production from Jatropha curcas: Integrated process optimization

    International Nuclear Information System (INIS)

    Huerga, Ignacio R.; Zanuttini, María Soledad; Gross, Martín S.; Querini, Carlos A.

    2014-01-01

    Highlights: • The oil obtained from Jatropha curcas fruits has high variability in its properties. • A process for biodiesel production has been developed for small scale projects. • Oil neutralization with the glycerine phase has important advantages. • The glycerine phase and the meal are adequate to produce biogas. - Abstract: Energy obtained from renewable sources has increased its participation in the energy matrix worldwide, and it is expected to maintain this tendency. Both in large and small scales, there have been numerous developments and research with the aim of generating fuels and energy using different raw materials such as alternative crops, algae and lignocellulosic residues. In this work, a Jatropha curcas plantation from the North West of Argentina was studied, with the objective of developing integrated processes for low and medium size farms. In these cases, glycerine purification and meal detoxification processes represent a very high cost, and usually are not included in the project. Consequently, alternative uses for these products are proposed. This study includes the evaluation of the Jatropha curcas crop during two years, evaluating the yields and oil properties. The solids left after the oil extraction were evaluated as solid fuels, the glycerine and the meal were used to generate biogas, and the oil was used to produce biodiesel. The oil pretreatment was carried out with the glycerine obtained in the biodiesel production process, thus neutralizing the free fatty acids and decreasing the phosphorus and water content.

  5. Pyrometallurgical processing of Integral Fast Reactor metal fuels

    International Nuclear Information System (INIS)

    Battles, J.E.; Miller, W.E.; Gay, E.C.

    1991-01-01

    The pyrometallurgical process for recycling spent metal fuels from the Integral Fast Reactor is now in an advanced state of development. This process involves electrorefining spent fuel with a cadmium anode, solid and liquid cathodes, and a molten salt electrolyte (LiCl-KCl) at 500 degrees C. The initial process feasibility and flowsheet verification studies have been conducted in a laboratory-scale electrorefiner. Based on these studies, a dual cathode approach has been adopted, where uranium is recovered on a solid cathode mandrel and uranium-plutonium is recovered in a liquid cadmium cathode. Consolidation and purification (salt and cadmium removal) of uranium and uranium-plutonium products from the electrorefiner have been successful. The process is being developed with the aid of an engineering-scale electrorefiner, which has been successfully operated for more than three years. In this electrorefiner, uranium has been electrotransported from the cadmium anode to a solid cathode in 10 kg quantities. Also, anodic dissolution of 10 kg batches of chopped, simulated fuel (U--10% Zr) has been demonstrated. Development of the liquid cadmium cathode for recovering uranium-plutonium is under way

  6. Modern integrated environmental monitoring and processing systems for nuclear facilities

    International Nuclear Information System (INIS)

    Oprea, I.

    2000-01-01

    The continuous activity to survey and monitor releases and the current radiation levels in the vicinity of a nuclear object is essential for personnel and environmental protection. Considering the vast amount of information and data needed to keep an updated overview of a situation both during the daily surveillance work and during accident situations, the need for an efficient monitoring and processing system is evident. The rapid development in computer technology and in telecommunications, and the evolution of fast and accurate computer codes enabling on-line calculations, improve the quality of decision-making in complex situations and assure a high efficiency. The monitoring and processing systems are used both for environmental protection and for controlling nuclear power plant emergency and post-accident situations. Such a system can offer information to the radiation management systems in order to assess the consequences of nuclear accidents and to establish a basis for the right decisions in civil defense. The integrated environmental monitoring systems have as their main task to record, collect, process and transmit radiation levels and weather data, incorporating a number of stationary or mobile radiation monitoring equipment, weather parameter measuring stations, an information processing center and the communication network, all running under a real-time operating system. They provide automatic data collection on-line and off-line, remote diagnostics, and advanced presentation techniques, including a graphically oriented executive support, which has the ability to respond to an emergency by geographical representation of the hazard zones on the map. The systems are based on local intelligent measuring and transmission units, simultaneous processing and data presentation using a real-time operating system for personal computers and a geographical information system (GIS). All information can be managed directly from the map by multilevel data retrieving and

  7. Transient flow analysis of integrated valve opening process

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Xinming; Qin, Benke; Bo, Hanliang, E-mail: bohl@tsinghua.edu.cn; Xu, Xingxing

    2017-03-15

    Highlights: • The control rod hydraulic driving system (CRHDS) is a new type of built-in control rod drive technology and the integrated valve (IV) is the key control component. • The transient flow experiment induced by the IV is conducted and the test results are analyzed to get its working mechanism. • The theoretical model of the IV opening process is established and applied to get the changing rule of the transient flow characteristic parameters. - Abstract: The control rod hydraulic driving system (CRHDS) is a new type of built-in control rod drive technology and the IV is the key control component. The working principle of the integrated valve (IV) is analyzed and the IV hydraulic experiment is conducted. There is a transient flow phenomenon in the valve opening process. The theoretical model of the IV opening process is established by the loop system control equations and boundary conditions. The valve opening boundary condition equation is established based on the IV three dimensional flow field analysis results and the dynamic analysis of the valve core movement. The model calculation results are in good agreement with the experimental results. On this basis, the model is used to analyze the transient flow under high temperature conditions. The peak pressure head is consistent with the one under room temperature and the pressure fluctuation period is longer than the one under room temperature. Furthermore, the changing rule of pressure transients with the fluid and loop structure parameters is analyzed. The peak pressure increases with the flow rate and the peak pressure decreases with the increase of the valve opening time. The pressure fluctuation period increases with the loop pipe length and the fluctuation amplitude remains largely unchanged under different equilibrium pressure conditions. The research results lay the basis for the vibration reduction analysis of the CRHDS.

  8. Processes subject to integrated pollution control. Petroleum processes: oil refining and associated processes

    International Nuclear Information System (INIS)

    1995-01-01

    This document, part of a series offering guidance on pollution control regulations issued by Her Majesty's Inspectorate of Pollution (HMIP), focuses on petroleum processes such as oil refining and other associated processes. The various industrial processes used, their associated pollution release routes into the environment and techniques for controlling these releases are all discussed. Environmental quality standards are related to national and international agreements on pollution control and abatement. HMIP's work on air, water and land pollution monitoring is also reported. (UK)

  9. The integrated model of innovative processes management in foreign countries

    Directory of Open Access Journals (Sweden)

    M. T. Kurametova

    2017-01-01

    Full Text Available The formation of an innovative economy must correspond to the promising areas of development of scientific, technical and social progress. To ensure sustainable innovative development of the national economy, it is necessary not only to develop our own tools and mechanisms that are characteristic of the domestic management model, but also to make rational use of foreign experience in this field. Analysis of international experience in the use of various tools, mechanisms and management structures for the creation of high-tech and knowledge-based enterprises showed that the integrated nature of innovative development and modernization of the economy is the soundest methodological approach for a phased, systemic transition to new technological structures; when developing tools and mechanisms for innovative development of the economy, one should take into account the actual state of the material and technical base, the existing industrial structure of production and the real possibilities of using different types of resources. The greatest innovation activity is shown by those countries in which the national integrated system effectively provides favorable conditions for the development and introduction of innovations in various spheres of life. International experience in the use of forms of governance can be considered as a mobile system of relations with the real sector of the economy. The article presents the experience of foreign countries and examples of adapting, for Kazakhstan, integrated models of managing innovative processes to create high-tech enterprises whose innovative products can be competitive in the world market. The author highlights the role of JSC "Kazakhtelecom", which has the status of a national operator and provides a wide range of public services, including long-distance and international telecommunications over general telecommunication networks.

  10. An Intercompany Perspective on Biopharmaceutical Drug Product Robustness Studies.

    Science.gov (United States)

    Morar-Mitrica, Sorina; Adams, Monica L; Crotts, George; Wurth, Christine; Ihnat, Peter M; Tabish, Tanvir; Antochshuk, Valentyn; DiLuzio, Willow; Dix, Daniel B; Fernandez, Jason E; Gupta, Kapil; Fleming, Michael S; He, Bing; Kranz, James K; Liu, Dingjiang; Narasimhan, Chakravarthy; Routhier, Eric; Taylor, Katherine D; Truong, Nobel; Stokes, Elaine S E

    2018-02-01

    The Biophorum Development Group (BPDG) is an industry-wide consortium enabling networking and sharing of best practices for the development of biopharmaceuticals. To gain a better understanding of current industry approaches for establishing biopharmaceutical drug product (DP) robustness, the BPDG-Formulation Point Share group conducted an intercompany collaboration exercise, which included a bench-marking survey and extensive group discussions around the scope, design, and execution of robustness studies. The results of this industry collaboration revealed several key common themes: (1) overall DP robustness is defined by both the formulation and the manufacturing process robustness; (2) robustness integrates the principles of quality by design (QbD); (3) DP robustness is an important factor in setting critical quality attribute control strategies and commercial specifications; (4) most companies employ robustness studies, along with prior knowledge, risk assessments, and statistics, to develop the DP design space; (5) studies are tailored to commercial development needs and the practices of each company. Three case studies further illustrate how a robustness study design for a biopharmaceutical DP balances experimental complexity, statistical power, scientific understanding, and risk assessment to provide the desired product and process knowledge. The BPDG-Formulation Point Share discusses identified industry challenges with regard to biopharmaceutical DP robustness and presents some recommendations for best practices. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  11. Critical factors in the implementation process of integrated management systems

    Directory of Open Access Journals (Sweden)

    Ademir Antonio Ferreira

    2015-09-01

    Full Text Available This study is the result of research whose purpose was to study the implementation process of integrated management systems, called ERP (Enterprise Resource Planning), in the business environment. This study, more specifically, tried to identify the variables in this process and that, somehow, made it easy or caused some type of difficulty implementing the system. Based on the mixed method approach (Creswell, 2003), the study was performed by means of the content analysis of technical and scientific publications about this theme and by means of field research for data collection from primary sources. The content analysis was based on the per mile procedure by Bardin (1977), making it possible to identify critical factors that may be found in the implementation of ERP system projects. Primary data was collected from structured interviews with the managers in charge of the implementation of the system, in each of the 12 companies in different sectors of the economy and based in Brazil. Based on this information, it was possible to test the factors extracted from the content analysis and then develop a list of factors that may effectively influence the implementation process of the system. In order to recognize the possible relations between the selected factors, the Spearman (rsp) correlation coefficient was applied and the multiple regression analysis was performed by means of the stepwise procedure. The purpose of the regression analysis was to determine the relation of the “Assessment of the Implementation” dependent variable with other dependent variables in the selected categories. The results of these analyses showed that the support of the top management, the communication process for the clear evidence of this support, the technical support of the ERP program provider together with the project team expertise, training and qualification processes of the team in the system operation are significantly correlated and relevant factors for a
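
    The statistical treatment described (rank correlations among candidate factors followed by a stepwise regression on the implementation-assessment score) follows a standard pattern, sketched below with placeholder column names; the forward-selection rule on adjusted R-squared is a generic stand-in for the stepwise procedure used in the study.

```python
# Sketch of the analysis pattern described above: Spearman correlations
# among candidate factors, then a simple forward stepwise regression on
# the implementation-assessment score. File and column names are placeholders.
import pandas as pd
from scipy.stats import spearmanr
import statsmodels.api as sm

data = pd.read_csv("erp_survey.csv")
factors = ["top_mgmt_support", "communication", "vendor_support",
           "team_expertise", "training"]

# Spearman rank correlation between each factor and the assessment score
for f in factors:
    rho, p = spearmanr(data[f], data["implementation_assessment"])
    print(f"{f}: rho={rho:.2f}, p={p:.3f}")

# Forward stepwise selection on adjusted R^2 (a simple stand-in for the
# stepwise procedure reported in the study)
selected, remaining = [], factors.copy()
while remaining:
    scores = {}
    for f in remaining:
        X = sm.add_constant(data[selected + [f]])
        scores[f] = sm.OLS(data["implementation_assessment"], X).fit().rsquared_adj
    best = max(scores, key=scores.get)
    current = (sm.OLS(data["implementation_assessment"],
                      sm.add_constant(data[selected])).fit().rsquared_adj
               if selected else 0.0)
    if scores[best] <= current:
        break
    selected.append(best)
    remaining.remove(best)
print("selected factors:", selected)
```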

  12. Process integration and pinch analysis in sugarcane industry

    Energy Technology Data Exchange (ETDEWEB)

    Prado, Adelk de Carvalho; Pinheiro, Ricardo Brant [UFMG, Departamento de Engenharia Nuclear, Programa de Pos-Graduacao em Ciencias e Tecnicas Nucleares, Belo Horizonte, MG (Brazil)], E-mail: rbp@nuclear.ufmg.br

    2010-07-01

    Process integration techniques were applied, particularly through the Pinch Analysis method, to sugarcane industry. Research was performed upon harvest data from an agroindustrial complex which processes sugarcane plant in excess of 3.5 million metric tons per year, producing motor fuel grade ethanol, standard quality sugar, and delivering excess electric power to the grid. Pinch Analysis was used in assessing internal heat recovery as well as external utility demand targets, while keeping the lowest but economically achievable targets for entropy increase. Efficiency on the use of energy was evaluated for the plant as it was found (the base case) as well as for five selected process and/or plant design modifications, always with guidance of the method. The first alternative design (case 2) was proposed to evaluate equipment mean idle time in the base case, to support subsequent comparisons. Cases 3 and 4 were used to estimate the upper limits of combined heat and power generation while raw material supply of the base case is kept; both the cases did not prove worth implementing. Cases 5 and 6 were devised to deal with the bottleneck of the plant, namely boiler capacity, in order to allow for some production increment. Inexpensive, minor modifications considered in case 5 were found unable to produce reasonable outcome gain. Nevertheless, proper changes in cane juice evaporation section (case 6) could allow sugar and ethanol combined production to rise up to 9.1% relative to the base case, without dropping cogenerated power. (author)
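
    The utility-targeting step of Pinch Analysis can be illustrated with the classic problem-table (heat cascade) algorithm: shift stream temperatures by half the minimum approach, cascade the heat surplus or deficit through the temperature intervals, and read off the minimum hot and cold utility targets and the pinch. The stream data below are invented for illustration and are not the sugarcane plant data.

```python
# Problem-table (heat cascade) sketch of Pinch Analysis utility targeting.
# Stream data are invented; CP is the heat capacity flow rate in kW/K.
DT_MIN = 10.0
streams = [            # (kind, supply T in degC, target T in degC, CP in kW/K)
    ("hot",  180.0,  60.0, 3.0),
    ("hot",  150.0,  30.0, 1.5),
    ("cold",  20.0, 135.0, 2.0),
    ("cold",  80.0, 140.0, 6.0),
]

def shifted(kind, t):
    # Hot streams are shifted down and cold streams up by DT_MIN / 2
    return t - DT_MIN / 2 if kind == "hot" else t + DT_MIN / 2

temps = set()
for kind, ts, tt, cp in streams:
    temps.update((shifted(kind, ts), shifted(kind, tt)))
bounds = sorted(temps, reverse=True)

# Cascade the net heat surplus/deficit through the shifted-temperature intervals
cascade, running = [0.0], 0.0
for hi, lo in zip(bounds, bounds[1:]):
    net = 0.0
    for kind, ts, tt, cp in streams:
        top = max(shifted(kind, ts), shifted(kind, tt))
        bot = min(shifted(kind, ts), shifted(kind, tt))
        if top >= hi and bot <= lo:          # stream is present in this interval
            net += cp * (hi - lo) if kind == "hot" else -cp * (hi - lo)
    running += net
    cascade.append(running)

hot_utility = max(0.0, -min(cascade))        # shift the cascade to be non-negative
cold_utility = cascade[-1] + hot_utility
pinch_shifted_T = bounds[cascade.index(min(cascade))]
print(f"minimum hot utility:  {hot_utility:.1f} kW")
print(f"minimum cold utility: {cold_utility:.1f} kW")
print(f"pinch (shifted T):    {pinch_shifted_T:.1f} degC")
```

    As a general rule of Pinch Analysis, any design that transfers heat across the identified pinch requires more than these minimum utilities, which is the kind of check such targeting calculations support.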

  13. The IBA Easy-E-Beam™ Integrated Processing System

    Science.gov (United States)

    Cleland, Marshall R.; Galloway, Richard A.; Lisanti, Thomas F.

    2011-06-01

    IBA Industrial Inc., (formerly known as Radiation Dynamics, Inc.) has been making high-energy and medium-energy, direct-current proton and electron accelerators for research and industrial applications for many years. Some industrial applications of high-power electron accelerators are the crosslinking of polymeric materials and products, such as the insulation on electrical wires, multi-conductor cable jackets, heat-shrinkable plastic tubing and film, plastic pipe, foam and pellets, the partial curing of rubber sheet for automobile tire components, and the sterilization of disposable medical devices. The curing (polymerization and crosslinking) of carbon and glass fiber-reinforced composite plastic parts, the preservation of foods and the treatment of waste materials are attractive possibilities for future applications. With electron energies above 1.0 MeV, the radiation protection for operating personnel is usually provided by surrounding the accelerator facility with thick concrete walls. With lower energies, steel and lead panels can be used, which are substantially thinner and more compact than the equivalent concrete walls. IBA has developed a series of electron processing systems called Easy-e-Beam™ for the medium energy range from 300 keV to 1000 keV. These systems include the shielding as an integral part of a complete radiation processing facility. The basic concepts of the electron accelerator, the product processing equipment, the programmable control system, the configuration of the radiation shielding and some performance characteristics are described in this paper.

  14. The IBA Easy-E-Beam Integrated Processing System

    International Nuclear Information System (INIS)

    Cleland, Marshall R.; Galloway, Richard A.; Lisanti, Thomas F.

    2011-01-01

    IBA Industrial Inc., (formerly known as Radiation Dynamics, Inc.) has been making high-energy and medium-energy, direct-current proton and electron accelerators for research and industrial applications for many years. Some industrial applications of high-power electron accelerators are the crosslinking of polymeric materials and products, such as the insulation on electrical wires, multi-conductor cable jackets, heat-shrinkable plastic tubing and film, plastic pipe, foam and pellets, the partial curing of rubber sheet for automobile tire components, and the sterilization of disposable medical devices. The curing (polymerization and crosslinking) of carbon and glass fiber-reinforced composite plastic parts, the preservation of foods and the treatment of waste materials are attractive possibilities for future applications. With electron energies above 1.0 MeV, the radiation protection for operating personnel is usually provided by surrounding the accelerator facility with thick concrete walls. With lower energies, steel and lead panels can be used, which are substantially thinner and more compact than the equivalent concrete walls. IBA has developed a series of electron processing systems called Easy-e-Beam for the medium energy range from 300 keV to 1000 keV. These systems include the shielding as an integral part of a complete radiation processing facility. The basic concepts of the electron accelerator, the product processing equipment, the programmable control system, the configuration of the radiation shielding and some performance characteristics are described in this paper.

  15. The importance of sensory integration processes for action cascading

    Science.gov (United States)

    Gohil, Krutika; Stock, Ann-Kathrin; Beste, Christian

    2015-01-01

    Dual tasking or action cascading is essential in everyday life and often investigated using tasks presenting stimuli in different sensory modalities. Findings obtained with multimodal tasks are often broadly generalized, but until today, it has remained unclear whether multimodal integration affects performance in action cascading or the underlying neurophysiology. To bridge this gap, we asked healthy young adults to complete a stop-change paradigm which presented different stimuli in either one or two modalities while recording behavioral and neurophysiological data. Bimodal stimulus presentation prolonged response times and affected bottom-up and top-down guided attentional processes as reflected by the P1 and N1, respectively. However, the most important effect was the modulation of response selection processes reflected by the P3 suggesting that a potentially different way of forming task goals operates during action cascading in bimodal vs. unimodal tasks. When two modalities are involved, separate task goals need to be formed while a conjoint task goal may be generated when all stimuli are presented in the same modality. On a systems level, these processes seem to be related to the modulation of activity in fronto-polar regions (BA10) as well as Broca's area (BA44). PMID:25820681

  16. Integration Processes on Civil Service Reform in the Eurasian Space

    Directory of Open Access Journals (Sweden)

    George A. Borshevskiy

    2016-01-01

    Full Text Available The article studies the process of reforming the institute of the civil service in the countries of the Eurasian space (e.g. Russia, Belarus and Kazakhstan). The integration of national systems of public administration and, in particular, of the civil service, is an important factor contributing to the implementation of the centripetal tendencies in the post-Soviet space. The research methodology is based on a combination of comparative legal analysis, the historical retrospective method, normalization and scaling, and structural-functional and system analysis. A comparison of the legal models of public service was made in the research. The author puts forward the hypothesis that there is a relationship between quantitative changes (for example, the number of civil service employees) and the dynamics of macroeconomic indicators (e.g. the number of people employed in the economy). In this regard, common trends were observed. Quantitative changes in national civil service systems were considered on the basis of statistical surveys. The study of the socio-demographic characteristics of the public service (gender, age, profession) allowed conclusions to be formulated about the general and specific trends in the reform of the civil service of the analyzed countries. A number of values were calculated by the author for the first time. The work is intended to become the basis for broad international research on the development of the civil service, which is the central mechanism for implementing integration in the post-Soviet space.

  17. Integrated preventive maintenance and production decisions for imperfect processes

    International Nuclear Information System (INIS)

    Nourelfath, Mustapha; Nahas, Nabil; Ben-Daya, Mohamed

    2016-01-01

    This paper integrates production, maintenance, and quality for an imperfect process in a multi-period multi-product capacitated lot-sizing context. The production system is modeled as an imperfect machine, whose status is considered to be either in-control or out-of-control. When the machine is out of control, it produces a fraction of nonconforming items. During each period, this machine is inspected and imperfect preventive maintenance activities are simultaneously performed to reduce its age in proportion to the preventive maintenance level. The objective is to minimize the total cost while satisfying the demand for all products. Our optimization model allows for a joint selection of the optimal values of the production plan and the maintenance policy, while taking into account quality-related costs. A solution algorithm is developed and illustrative numerical examples are presented. It is found that an increase in the PM level leads to reductions in quality control costs. Furthermore, if the cost of performing PM is so high that it is not compensated for by reductions in the quality-related costs, then performing PM is not justifiable. Finally, using non-periodic preventive maintenance with the possibility of different preventive maintenance levels may result in an improvement of the total cost. - Highlights: • We integrate production, maintenance, and quality. • We evaluate all the expected costs. • Our model allows for a joint selection of the optimal values. • A solution algorithm is developed. • Increasing the PM level will decrease quality control costs.

  18. UNCERTAINTY IN THE PROCESS INTEGRATION FOR THE BIOREFINERIES DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Meilyn González Cortés

    2015-07-01

    Full Text Available This paper presents how design approaches with a high level of flexibility can reduce the additional costs of strategies that apply overdesign factors to account for uncertain parameters that impact the economic feasibility of a project. The elements with associated uncertainties that are important in configuring process integration under a biorefinery scheme are: the raw materials, the raw material conversion technologies, and the variety of products that can be obtained. The analysis shows that the raw materials and products with potential in a biorefinery scheme are subject to external uncertainties such as availability, demand and market prices. The impact of those external uncertainties on the biorefinery can be assessed, and for the product prices minimum and maximum limits can be identified as intervals which should be considered in the economic evaluation of the project and in the sensitivity analysis of varied conditions.

  19. Clinical process in an integrative psychotherapy for self-wounds.

    Science.gov (United States)

    Wolfe, Barry E

    2013-09-01

    In this article, I will briefly describe the clinical process of an integrative psychotherapy for the healing of self-wounds, including its intended interventions and the variability of their application and outcome. Four specific strategies will be considered, including (a) the role of empathy throughout the course of therapy; (b) exposure therapy as a paradigmatic treatment for feared thoughts, behavior, and emotions; (c) focusing and other experiential interventions for eliciting self-wounds; and (d) modification and healing of self-wounds with an individualized array of psychodynamic, experiential, and cognitive-behavioral strategies. In addition, I will briefly consider the impact of transference and countertransference on the trajectory of therapy. (c) 2013 APA, all rights reserved

  20. An Integrated Numerical Model of the Spray Forming Process

    DEFF Research Database (Denmark)

    Pryds, Nini; Hattel, Jesper; Pedersen, Trine Bjerre

    2002-01-01

    In this paper, an integrated approach for modelling the entire spray forming process is presented. The basis for the analysis is a recently developed model which extends previous studies and includes the interaction between an array of droplets and the enveloping gas. The formulation of the deposition model is accomplished using a 2D cylindrical heat flow model. This model is now coupled with an atomization model via a log-normal droplet size distribution. The coupling between the atomization and the deposition is accomplished by ensuring that the total droplet size distribution of the spray is in fact the summation of 'local' droplet size distributions along the r-axis. A key parameter, which determines the yield and the shape of the deposit material, is the sticking efficiency. The sticking phenomenon is therefore incorporated into the deposition model. (C) 2002 Acta Materialia Inc. Published...
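
    As a rough illustration of the atomization-deposition coupling described above, the sketch below sums hypothetical local log-normal droplet size distributions at a few radial positions into a total spray distribution. The medians, spreads and weights are invented and do not correspond to the paper's model.

```python
import numpy as np

# Illustrative only: local log-normal droplet-size distributions at a few radial
# positions are summed into a total spray distribution. The medians, spreads and
# weights are invented, not the paper's atomization data.
d = np.linspace(1.0, 200.0, 400)              # droplet diameter grid [um]
dx = d[1] - d[0]

def lognormal_pdf(d, d50, sigma):
    """Number-based log-normal PDF with median d50 and log-standard deviation sigma."""
    return np.exp(-(np.log(d / d50) ** 2) / (2 * sigma ** 2)) / (d * sigma * np.sqrt(2 * np.pi))

# (radial position r [m], local median diameter [um], log-std, relative mass-flux weight)
stations = [(0.00, 60.0, 0.45, 1.0),
            (0.05, 45.0, 0.50, 0.7),
            (0.10, 30.0, 0.55, 0.4)]

total = np.zeros_like(d)
for r, d50, sigma, w in stations:
    total += w * lognormal_pdf(d, d50, sigma)
total /= total.sum() * dx                     # normalize the summed distribution

mean_d = (d * total).sum() * dx
print(f"mean droplet diameter of the summed spray distribution ~ {mean_d:.1f} um")
```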

  1. PERSPECTIVES ON INTEROPERABILITY INTEGRATION WITHIN NATO DEFENSE PLANNING PROCESS

    Directory of Open Access Journals (Sweden)

    Florian CIOCAN

    2011-01-01

    Full Text Available Interoperability is not a new area of effort at the NATO level. In fact, interoperability, and more specifically standardization, has been a key element of the Alliance's approach to fielding forces for decades. But as the security and operational environment has been in continuous change, the need to face new threats and the current involvement in challenging operations in Afghanistan and elsewhere, alongside the necessity to interoperate at lower and lower levels of command with an increasing number of nations, including non-NATO ISAF partners, NGOs, and other organizations, have made the task even more challenging. In this respect, Interoperability Integration within the NATO Defense Planning Process will facilitate the timely identification, development and delivery of required forces and capabilities that are interoperable and adequately prepared, equipped, trained and supported to undertake the Alliance's full spectrum of missions.

  2. Quantifying chaotic dynamics from integrate-and-fire processes

    Energy Technology Data Exchange (ETDEWEB)

    Pavlov, A. N. [Department of Physics, Saratov State University, Astrakhanskaya Str. 83, 410012 Saratov (Russian Federation); Saratov State Technical University, Politehnicheskaya Str. 77, 410054 Saratov (Russian Federation); Pavlova, O. N. [Department of Physics, Saratov State University, Astrakhanskaya Str. 83, 410012 Saratov (Russian Federation); Mohammad, Y. K. [Department of Physics, Saratov State University, Astrakhanskaya Str. 83, 410012 Saratov (Russian Federation); Tikrit University Salahudin, Tikrit Qadisiyah, University Str. P.O. Box 42, Tikrit (Iraq); Kurths, J. [Potsdam Institute for Climate Impact Research, Telegraphenberg A 31, 14473 Potsdam (Germany); Institute of Physics, Humboldt University Berlin, 12489 Berlin (Germany)

    2015-01-15

    Characterizing chaotic dynamics from integrate-and-fire (IF) interspike intervals (ISIs) is relatively easily performed at high firing rates. When the firing rate is low, a correct estimation of the Lyapunov exponents (LEs) describing the dynamical features of complex oscillations reflected in the IF ISI sequences becomes more complicated. In this work we discuss the peculiarities and limitations of quantifying chaotic dynamics from IF point processes. We consider the main factors leading to underestimated LEs and demonstrate a way of improving the numerical determination of LEs from IF ISI sequences. We show that estimation of the two largest LEs can be performed using around 400 mean periods of chaotic oscillations in the regime of phase-coherent chaos. Application to real data is discussed.
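
    For readers unfamiliar with integrate-and-fire encoding, the sketch below turns the x-component of a chaotic Rössler system into an ISI sequence; the Rössler parameters, offset and firing threshold are arbitrary choices, and the Lyapunov-exponent estimation discussed in the record is not reproduced here.

```python
import numpy as np

# Sketch: integrate-and-fire (IF) encoding of a chaotic signal into interspike
# intervals (ISIs). Parameters, threshold and step size are illustrative choices.
def rossler(state, a=0.2, b=0.2, c=5.7):
    x, y, z = state
    return np.array([-y - z, x + a * y, b + z * (x - c)])

dt, steps = 0.01, 200_000
state = np.array([1.0, 1.0, 1.0])
theta = 50.0                    # integrate-and-fire threshold (arbitrary units)
integral, last_fire_t, isis = 0.0, 0.0, []

for i in range(steps):
    # 4th-order Runge-Kutta step of the Rössler system
    k1 = rossler(state)
    k2 = rossler(state + 0.5 * dt * k1)
    k3 = rossler(state + 0.5 * dt * k2)
    k4 = rossler(state + dt * k3)
    state = state + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6
    # accumulate a positive-shifted version of x(t); fire when the threshold is reached
    integral += (state[0] + 12.0) * dt
    if integral >= theta:
        t = (i + 1) * dt
        isis.append(t - last_fire_t)
        last_fire_t = t
        integral = 0.0          # reset after firing

isis = np.array(isis)
print(f"{isis.size} interspike intervals, mean ISI = {isis.mean():.2f}, CV = {isis.std()/isis.mean():.2f}")
```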

  3. Second stage gasifier in staged gasification and integrated process

    Science.gov (United States)

    Liu, Guohai; Vimalchand, Pannalal; Peng, Wan Wang

    2015-10-06

    A second stage gasification unit in a staged gasification integrated process flow scheme and operating methods are disclosed to gasify a wide range of low reactivity fuels. The inclusion of a second stage gasification unit operating at high temperatures, closer to ash fusion temperatures in the bed, provides sufficient flexibility in unit configurations, operating conditions and methods to achieve an overall carbon conversion of over 95% for low reactivity materials such as bituminous and anthracite coals, petroleum residues and coke. The second stage gasification unit includes a stationary fluidized bed gasifier operating with a sufficiently turbulent bed of predefined inert bed material with lean char carbon content. The second stage gasifier fluidized bed is operated at relatively high temperatures of up to 1400 °C. A steam and oxidant mixture can be injected to further increase the freeboard region operating temperature to approximately 50 to 100 °C above the bed temperature.

  4. A more robust model of the biodiesel reaction, allowing identification of process conditions for significantly enhanced rate and water tolerance.

    Science.gov (United States)

    Eze, Valentine C; Phan, Anh N; Harvey, Adam P

    2014-03-01

    A more robust kinetic model of base-catalysed transesterification than the conventional reaction scheme has been developed. All the relevant reactions in the base-catalysed transesterification of rapeseed oil (RSO) to fatty acid methyl ester (FAME) were investigated experimentally, and validated numerically in a model implemented using MATLAB. It was found that including the saponification of RSO and FAME side reactions and hydroxide-methoxide equilibrium data explained various effects that are not captured by simpler conventional models. Both the experiment and modelling showed that the "biodiesel reaction" can reach the desired level of conversion (>95%) in less than 2 min. Given the right set of conditions, the transesterification can reach over 95% conversion, before the saponification losses become significant. This means that the reaction must be performed in a reactor exhibiting good mixing and good control of residence time, and the reaction mixture must be quenched rapidly as it leaves the reactor. Copyright © 2014 Elsevier Ltd. All rights reserved.
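
    A reduced sketch of the kind of kinetic model referred to above is shown below: a reversible three-step transesterification scheme integrated with SciPy. The rate constants and initial concentrations are invented for illustration; the published model additionally includes the saponification side reactions and the hydroxide-methoxide equilibrium.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Reduced sketch of a stepwise transesterification kinetic model:
# TG + MeOH <-> DG + FAME, DG + MeOH <-> MG + FAME, MG + MeOH <-> GL + FAME.
# Rate constants below are invented for illustration only.
k = dict(k1=0.05, k1r=0.11, k2=0.22, k2r=0.12, k3=0.24, k3r=0.002)  # L/(mol*min)

def rhs(t, y):
    TG, DG, MG, GL, FAME, MeOH = y
    r1 = k["k1"] * TG * MeOH - k["k1r"] * DG * FAME
    r2 = k["k2"] * DG * MeOH - k["k2r"] * MG * FAME
    r3 = k["k3"] * MG * MeOH - k["k3r"] * GL * FAME
    return [-r1, r1 - r2, r2 - r3, r3, r1 + r2 + r3, -(r1 + r2 + r3)]

y0 = [0.9, 0.0, 0.0, 0.0, 0.0, 5.4]   # mol/L, roughly a 6:1 methanol:oil molar ratio
sol = solve_ivp(rhs, (0.0, 10.0), y0, dense_output=True, rtol=1e-8)

t2 = 2.0                               # minutes
TG2, FAME2 = sol.sol(t2)[0], sol.sol(t2)[4]
conversion = 1.0 - TG2 / y0[0]
print(f"TG conversion after {t2:.0f} min ~ {100*conversion:.1f}%, FAME ~ {FAME2:.2f} mol/L")
```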

  5. Robustness testing in pharmaceutical freeze-drying: inter-relation of process conditions and product quality attributes studied for a vaccine formulation.

    Science.gov (United States)

    Schneid, Stefan C; Stärtzel, Peter M; Lettner, Patrick; Gieseler, Henning

    2011-01-01

    The recent US Food and Drug Administration (FDA) legislation has introduced the evaluation of the Design Space of critical process parameters in manufacturing processes. In freeze-drying, a "formulation" is expected to be robust when minor deviations of the product temperature do not negatively affect the final product quality attributes. The aim was to evaluate "formulation" robustness by investigating the effect of elevated product temperature on product quality using a bacterial vaccine solution. The vaccine solution was characterized by freeze-dry microscopy to determine the critical formulation temperature. A conservative cycle was developed using the SMART™ mode of a Lyostar II freeze dryer. Product temperature was elevated to imitate intermediate and aggressive cycle conditions. The final product was analyzed using X-ray powder diffraction (XRPD), scanning electron microscopy (SEM), Karl Fischer, and modulated differential scanning calorimetry (MDSC), and the live cell count (LCC) during accelerated stability testing. The cakes processed at intermediate and aggressive conditions displayed larger pores with microcollapse of walls and stronger loss in LCC than the conservatively processed product, especially during stability testing. For all process conditions, a loss of the majority of cells was observed during storage. For freeze-drying of live bacterial vaccine solutions, the product temperature profile during primary drying appeared to be inter-related to product quality attributes.

  6. Integrated HLW Conceptual Process Flowsheet(s) for the Crystalline Silicotitanate Process SRDF-98-04

    International Nuclear Information System (INIS)

    Jacobs, R.A.

    1998-01-01

    The Strategic Research and Development Fund (SRDF) provided funds to develop integrated conceptual flowsheets and material balances for a CST process as a potential replacement for, or second generation to, the ITP process. This task directly supports another SRDF task: Glass Form for HLW Sludge with CST, SRDF-98-01, by M. K. Andrews which seeks to further develop sludge/CST glasses that could be used if the ITP process were replaced by CST ion exchange. The objective of the proposal was to provide flowsheet support for development and evaluation of a High Level Waste Division process to replace ITP. The flowsheets would provide a conceptual integrated material balance showing the impact on the HLW division. The evaluation would incorporate information to be developed by Andrews and Harbour on CST/DWPF glass formulations and provide the bases for evaluating the economic impact of the proposed replacement process. Coincident with this study, the Salt Disposition Team began its evaluation of alternatives for disposition of the HLW salts in the SRS waste tanks. During that time, the CST IX process was selected as one of four alternatives (of eighteen Phase II alternatives) for further evaluation during Phase III

  7. Application of the Hydroecological Integrity Assessment Process for Missouri Streams

    Science.gov (United States)

    Kennen, Jonathan G.; Henriksen, James A.; Heasley, John; Cade, Brian S.; Terrell, James W.

    2009-01-01

    Natural flow regime concepts and theories have established the justification for maintaining or restoring the range of natural hydrologic variability so that physiochemical processes, native biodiversity, and the evolutionary potential of aquatic and riparian assemblages can be sustained. A synthesis of recent research advances in hydroecology, coupled with stream classification using hydroecologically relevant indices, has produced the Hydroecological Integrity Assessment Process (HIP). HIP consists of (1) a regional classification of streams into hydrologic stream types based on flow data from long-term gaging-station records for relatively unmodified streams, (2) an identification of stream-type specific indices that address 11 subcomponents of the flow regime, (3) an ability to establish environmental flow standards, (4) an evaluation of hydrologic alteration, and (5) a capacity to conduct alternative analyses. The process starts with the identification of a hydrologic baseline (reference condition) for selected locations, uses flow data from a stream-gage network, and proceeds to classify streams into hydrologic stream types. Concurrently, the analysis identifies a set of non-redundant and ecologically relevant hydrologic indices for 11 subcomponents of flow for each stream type. Furthermore, regional hydrologic models for synthesizing flow conditions across a region and the development of flow-ecology response relations for each stream type can be added to further enhance the process. The application of HIP to Missouri streams identified five stream types ((1) intermittent, (2) perennial runoff-flashy, (3) perennial runoff-moderate baseflow, (4) perennial groundwater-stable, and (5) perennial groundwater-super stable). Two Missouri-specific computer software programs were developed: (1) a Missouri Hydrologic Assessment Tool (MOHAT) which is used to establish a hydrologic baseline, provide options for setting environmental flow standards, and compare past and
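
    To make the notion of hydrologic indices concrete, the sketch below computes a few simple flow-regime statistics from a synthetic daily discharge record; the series and the chosen indices are illustrative and are not the HIP or MOHAT index set.

```python
import numpy as np

# Toy sketch of flow-regime indices of the kind used to classify stream types.
# The synthetic flow series and the particular indices are illustrative only.
rng = np.random.default_rng(0)
days = 365 * 10
seasonal = 10 + 8 * np.sin(2 * np.pi * np.arange(days) / 365.25)
flow = np.maximum(seasonal + rng.gamma(2.0, 2.0, days) - 4, 0.05)   # m^3/s

mean_flow = flow.mean()
cv_daily = flow.std() / mean_flow                     # overall flow variability
q90 = np.percentile(flow, 10)                         # low-flow index (exceeded ~90% of the time)
high_pulses = (flow > 3 * np.median(flow)).sum() / 10 # high-flow pulse days per year
near_dry = (flow <= 0.05).sum() / 10                  # near-dry days per year (intermittency)

print(f"mean={mean_flow:.1f} m3/s, CV={cv_daily:.2f}, Q90={q90:.1f} m3/s, "
      f"high-pulse days/yr={high_pulses:.1f}, near-dry days/yr={near_dry:.1f}")
```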

  8. Integrated Process Design and Control of Multi-element Reactive Distillation Processes

    DEFF Research Database (Denmark)

    Mansouri, Seyed Soheil; Sales-Cruz, Mauricio; Huusom, Jakob Kjøbsted

    2016-01-01

    In this work, integrated process design and control of reactive distillation processes involving multi-elements is presented. The reactive distillation column is designed using methods and tools which are similar in concept to non-reactive distillation design methods, such as the driving force approach. The methods employed in this work are based on the equivalent element concept. This concept facilitates the representation of a multi-element reactive system as equivalent binary light and heavy key elements. First, the reactive distillation column is designed at the maximum driving force where through steady...

  9. Process for integrating surface drainage constraints on mine planning

    Energy Technology Data Exchange (ETDEWEB)

    Sawatsky, L.F; Ade, F.L.; McDonald, D.M.; Pullman, B.J. [Golder Associates Ltd., Calgary, AB (Canada)

    2009-07-01

    Surface drainage for mine closures must be considered during all phases of mine planning and design in order to minimize environmental impacts and reduce costs. This paper discussed methods of integrating mine drainage criteria and associated mine planning constraints into the mine planning process. Drainage constraints included stream diversions; fish compensation channels; collection receptacles for the re-use of process water; separation of closed circuit water from fresh water; and the provision of storage ponds. The geomorphic approach replicated the ability of natural channels to respond to local and regional changes in hydrology as well as channel disturbances from extreme flood events, sedimentation, debris, ice jams, and beaver activity. The approach was designed to enable a sustainable system and provide conveyance capacity for extreme floods without spillage to adjacent watersheds. Channel dimensions, bank and bed materials, sediment loads, bed material supplies and the hydrologic conditions of the analogue stream were considered. Hydrologic analyses were conducted to determine design flood flow. Channel routes, valley slopes, sinuosity, width, and depth were established. It was concluded that by incorporating the geomorphic technique, mine operators and designers can construct self-sustaining drainage systems that require little or no maintenance in the long-term. 7 refs.

  10. Integrating Process Mining and Cognitive Analysis to Study EHR Workflow.

    Science.gov (United States)

    Furniss, Stephanie K; Burton, Matthew M; Grando, Adela; Larson, David W; Kaufman, David R

    2016-01-01

    There are numerous methods to study workflow. However, few produce the kinds of in-depth analyses needed to understand EHR-mediated workflow. Here we investigated variations in clinicians' EHR workflow by integrating quantitative analysis of patterns of users' EHR-interactions with in-depth qualitative analysis of user performance. We characterized 6 clinicians' patterns of information-gathering using a sequential process-mining approach. The analysis revealed 519 different screen transition patterns performed across 1569 patient cases. No one pattern was followed for more than 10% of patient cases, the 15 most frequent patterns accounted for over half of patient cases (53%), and 27% of cases exhibited unique patterns. By triangulating quantitative and qualitative analyses, we found that participants' EHR-interactive behavior was associated with their routine processes, patient case complexity, and EHR default settings. The proposed approach has significant potential to inform resource allocation for observation and training. In-depth observations helped us to explain variation across users.
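
    A toy version of the per-case screen-transition counting described above might look like the following; the event-log records and field names are hypothetical, not the study's EHR audit data.

```python
from collections import Counter, defaultdict

# Toy sketch of counting per-case screen-transition patterns from an EHR audit log.
events = [  # (case_id, timestamp, screen) -- hypothetical log entries
    ("case1", 1, "Summary"), ("case1", 2, "Labs"), ("case1", 3, "Notes"),
    ("case2", 1, "Summary"), ("case2", 2, "Labs"), ("case2", 3, "Notes"),
    ("case3", 1, "Summary"), ("case3", 2, "Meds"), ("case3", 3, "Labs"),
]

# Group events by case and order them by timestamp
traces = defaultdict(list)
for case_id, ts, screen in sorted(events, key=lambda e: (e[0], e[1])):
    traces[case_id].append(screen)

# Each case's ordered sequence of screens is one "pattern"
pattern_counts = Counter(tuple(seq) for seq in traces.values())

total = sum(pattern_counts.values())
for pattern, n in pattern_counts.most_common():
    print(" -> ".join(pattern), f"{n}/{total} cases")
unique = sum(1 for n in pattern_counts.values() if n == 1)
print(f"{unique} of {total} cases follow a unique pattern")
```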

  11. Integrative techniques related to positive processes in psychotherapy.

    Science.gov (United States)

    Cromer, Thomas D

    2013-09-01

    This review compiles and evaluates a number of therapist interventions that have been found to significantly contribute to positive psychotherapy processes (i.e., increased alliance, patient engagement/satisfaction, and symptomatic improvement). Four forms of intervention are presented: Affect-focused, Supportive, Exploratory, and Patient-Therapist Interaction. The intention of this review is to link specific interventions to applied practice so that integrative clinicians can potentially use these techniques to improve their clinical work. To this end, there is the inclusion of theory and empirical studies from a range of orientations including Emotionally Focused, Psychodynamic, Client-Centered, Cognitive-Behavioral, Interpersonal, Eclectic, and Motivational Interviewing. Each of the four sections will include the theoretical basis and proposed mechanism of change for the intervention, research that supports its positive impact on psychotherapy processes, and conclude with examples demonstrating its use in actual practice. Clinical implications and considerations regarding the use of these interventions will also be presented. 2013 APA, all rights reserved

  12. Social networks in nursing work processes: an integrative literature review

    Directory of Open Access Journals (Sweden)

    Ana Cláudia Mesquita

    Full Text Available Abstract OBJECTIVE To identify and analyze the available evidence in the literature on the use of social networks in nursing work processes. METHOD An integrative review of the literature conducted in PubMed, CINAHL, EMBASE and LILACS databases in January 2016, using the descriptors social media, social networking, nursing, enfermagem, redes sociais, mídias sociais, and the keyword nursing practice, without year restriction. RESULTS The sample consisted of 27 international articles which were published between 2011 and 2016. The social networks used were Facebook (66.5%), Twitter (30%) and WhatsApp (3.5%). In 70.5% of the studies, social networks were used for research purposes, in 18.5% they were used as a tool aimed to assist students in academic activities, and in 11% for executing interventions via the internet. CONCLUSION Nurses have used social networks in their work processes such as Facebook, Twitter and WhatsApp to research, teach and watch. The articles show several benefits in using such tools in the nursing profession; however, ethical considerations regarding the use of social networks deserve further discussion.

  13. Development of post-decontamination process after integrated EPN treatment

    International Nuclear Information System (INIS)

    Park, J. H.; Chung, U. S.; Lee, K. W.; Hwang, D. S.; Kim, T. J.; Hong, S. B.; Hwang, S. T.

    2010-02-01

    It is desirable to recycle or reuse the waste generated from nuclear facilities for the better economy of the nuclear industry. Metal makes up a large portion of the waste and its reuse is of worldwide interest. It is well known that the measurement of radioactivity levels lower than the release criteria is not easy. It becomes more difficult when the contamination is not homogeneous and the shape of the metal waste is complex. In order to obtain more accurate measurements of low radioactivity, a melting process was proposed for homogenizing the residual contaminants and for avoiding missed hot spots of contamination. An induction furnace system was selected after evaluation of the candidate processes, and a furnace of 15 kg/batch scale with cooling water supply, an inverter as the high frequency electricity source and an off-gas treatment system was installed and operated. The homogeneity of the ingot was verified by experiments with metal tracers and chemical tracers. The volume radioactivity limit of each nuclide was calculated. From these calculation results and a comparison with the limit values defined in other countries and recommended by the IAEA, criteria of volume radioactivity for the release of metal waste were proposed. In order to determine the target radioactivity in the integrated EPN decontamination, a correlation between the residual radioactivity and the public radiation dose was proposed.

  14. Case Study: Accelerating Process Improvement by Integrating the TSP and CMMI

    National Research Council Canada - National Science Library

    Wall, Daniel S; McHale, James; Pomeroy-Huff, Marsha

    2005-01-01

    .... This case study describes the process improvement efforts of both NAVAIR divisions and how they integrated the two SEI technologies to accelerate process improvement within their organizations...

  15. Neural processing of gendered information is more robustly associated with mothers' gendered communication with children than mothers' implicit and explicit gender stereotypes.

    Science.gov (United States)

    Endendijk, Joyce J; Spencer, Hannah; Bos, Peter A; Derks, Belle

    2018-04-26

    Processes like gender socialization (the ways in which parents convey information to their children about how girls and boys should behave) often happen unconsciously and might therefore be studied best with neuroscientific measures. We examined whether neural processing of gender-stereotype-congruent and incongruent information is more robustly related to mothers' gendered socialization of their child than mothers' implicit and explicit gender stereotypes. To this end, we examined event-related potentials (ERPs) of mothers (N = 35) completing an implicit gender-stereotype task and mothers' gender stereotypes in relation to observed gendered communication with their child (2-6 years old) in a naturalistic picture-book-reading setting. Increased N2 activity (previously related to attentional processes) to gender stimuli in the implicit gender-stereotype task was associated with mothers' positive evaluation of similar gendered behaviors and activities in the picture book they read with their child. Increased P300 activity (previously related to attention to unexpected events) to incongruent trials in the gender-stereotype task was associated with a more positive evaluation of congruent versus incongruent pictures. Compared to mothers' gender stereotypes, neural processing of gendered information was more robustly related to how mothers talk to their children about boys' and girls' stereotype-congruent and incongruent behavior, and masculine and feminine activities.

  16. Robustness in econometrics

    CERN Document Server

    Sriboonchitta, Songsak; Huynh, Van-Nam

    2017-01-01

    This book presents recent research on robustness in econometrics. Robust data processing techniques – i.e., techniques that yield results minimally affected by outliers – and their applications to real-life economic and financial situations are the main focus of this book. The book also discusses applications of more traditional statistical techniques to econometric problems. Econometrics is a branch of economics that uses mathematical (especially statistical) methods to analyze economic systems, to forecast economic and financial dynamics, and to develop strategies for achieving desirable economic performance. In day-by-day data, we often encounter outliers that do not reflect the long-term economic trends, e.g., unexpected and abrupt fluctuations. As such, it is important to develop robust data processing techniques that can accommodate these fluctuations.

  17. Chemical modification of protein a chromatography ligands with polyethylene glycol. II: Effects on resin robustness and process selectivity.

    Science.gov (United States)

    Weinberg, Justin; Zhang, Shaojie; Kirkby, Allison; Shachar, Enosh; Carta, Giorgio; Przybycien, Todd

    2018-04-20

    We have proposed chemical modification of Protein A (ProA) chromatography ligands with polyethylene glycol (PEGylation) as a strategy to increase the resin selectivity and robustness by providing the ligand with a steric repulsion barrier against non-specific binding. Here, we report on robustness and selectivity benefits for Repligen CaptivA PriMAB resin with ligands modified with 5.2 kDa and 21.5 kDa PEG chains, respectively. PEGylation of ProA ligands allowed the resin to retain a higher percentage of static binding capacity relative to the unmodified resin upon digestion with chymotrypsin, a representative serine protease. The level of protection against digestion was independent of the PEG molecular weight or modification extent for the PEGylation chemistry used. Additionally, PEGylation of the ligands was found to decrease the level of non-specific binding of fluorescently labeled bovine serum albumin (BSA) aggregates to the surface of the resin particles as visualized via confocal laser scanning microscopy (CLSM). The level of aggregate binding decreased as the PEG molecular weight increased, but increasing the extent of modification with 5.2 kDa PEG chains had no effect. Further examination of resin particles via CLSM confirmed that the PEG chains on the modified ligands were capable of blocking the "hitchhiking" association of BSA, a mock contaminant, to an adsorbed mAb that is prone to BSA binding. Ligands modified with 21.5 kDa PEG chains were effective at blocking the association, while ligands modified with 5.2 kDa PEG chains were not. Finally, ligands with 21.5 kDa PEG chains increased the selectivity of the resin against host cell proteins (HCPs) produced by Chinese Hamster Ovary (CHO) cells by up to 37% during purification of a monoclonal antibody (mAb) from harvested cell culture fluid (HCCF) using a standard ProA chromatography protocol. The combined work suggests that PEGylating ProA chromatography media is a viable pathway for

  18. Energy landscape reveals that the budding yeast cell cycle is a robust and adaptive multi-stage process.

    Directory of Open Access Journals (Sweden)

    Cheng Lv

    2015-03-01

    Full Text Available Quantitatively understanding the robustness, adaptivity and efficiency of cell cycle dynamics under the influence of noise is a fundamental but difficult question to answer for most eukaryotic organisms. Using a simplified budding yeast cell cycle model perturbed by intrinsic noise, we systematically explore these issues from an energy landscape point of view by constructing an energy landscape for the considered system based on large deviation theory. Analysis shows that the cell cycle trajectory is sharply confined by the ambient energy barrier, and the landscape along this trajectory exhibits a generally flat shape. We explain the evolution of the system on this flat path by incorporating its non-gradient nature. Furthermore, we illustrate how this global landscape changes in response to external signals, observing a nice transformation of the landscapes as the excitable system approaches a limit cycle system when nutrients are sufficient, as well as the formation of additional energy wells when the DNA replication checkpoint is activated. By taking into account the finite volume effect, we find additional pits along the flat cycle path in the landscape associated with the checkpoint mechanism of the cell cycle. The difference between the landscapes induced by intrinsic and extrinsic noise is also discussed. In our opinion, this meticulous structure of the energy landscape for our simplified model is of general interest to other cell cycle dynamics, and the proposed methods can be applied to study similar biological systems.

  19. Robust Design Impact Metrics: Measuring the effect of implementing and using Robust Design

    DEFF Research Database (Denmark)

    Ebro, Martin; Olesen, Jesper; Howard, Thomas J.

    2014-01-01

    Measuring the performance of an organisation’s product development process can be challenging due to the limited use of metrics in R&D. An organisation considering whether to use Robust Design as an integrated part of their development process may find it difficult to define whether it is relevant, and afterwards measure the effect of having implemented it. This publication identifies and evaluates Robust Design-related metrics and finds that 2 metrics are especially useful: 1) Relative amount of R&D Resources spent after Design Verification and 2) Number of ‘change notes’ after Design Verification. The metrics have been applied in a case company to test the assumptions made during the evaluation. It is concluded that the metrics are useful and relevant, but further work is necessary to make a proper overview and categorisation of different types of robustness related metrics.

  20. CONSIDERATIONS FOR THE MASS AND ENERGY INTEGRATION IN THE SUGAR PROCESS PRODUCTION AND DERIVATIVE PROCESS

    Directory of Open Access Journals (Sweden)

    Dennis Abel Clavelo Sierra

    2015-04-01

    Full Text Available Today's society needs, more than ever, industries that create new forms and methods in which the saving of energy and materials is a fundamental aspect. For this reason, the present investigation presents an outline of the considerations for integrating the sugar process and other derived products in a biorefinery scheme, with the objective of achieving efficient processes with an appropriate use of material resources and an efficient use of energy, with minimum operating and investment costs. In the outline taken as the basis for the study, it is considered that the integrated complex has sugarcane as its basic input, and the variation of product prices in the market is also taken into account. The article outlines the precise steps for the development of a methodology that allows the processes involved in the biorefinery scheme to be analyzed and, in this way, the common material and energy resources that the processes exchange to be identified. A heuristic diagram is presented that guides the strategy to follow.

  1. Development of integrated control system for smart factory in the injection molding process

    Science.gov (United States)

    Chung, M. J.; Kim, C. Y.

    2018-03-01

    In this study, we proposed an integrated control system for the automation of the injection molding process required for the construction of a smart factory. The injection molding process consists of heating, tool close, injection, cooling, tool open, and take-out. A take-out robot controller, an image processing module, and a process data acquisition interface module were developed and assembled into the integrated control system. By adopting the integrated control system, the injection molding process can be simplified and the cost of constructing a smart factory can be kept low.

  2. An integrated knowledge-based and optimization tool for the sustainable selection of wastewater treatment process concepts

    DEFF Research Database (Denmark)

    Castillo, A.; Cheali, Peam; Gómez, V.

    2016-01-01

    The increasing demand on wastewater treatment plants (WWTPs) has involved an interest in improving the alternative treatment selection process. In this study, an integrated framework including an intelligent knowledge-based system and superstructure-based optimization has been developed and applied to a real case study. Hence, a multi-criteria analysis together with mathematical models is applied to generate a ranked short-list of feasible treatments for three different scenarios. Finally, the uncertainty analysis performed allows for increasing the quality and robustness of the decisions considering... Benefit and synergy is achieved when both tools are integrated because expert knowledge and expertise are considered together with mathematical models to select the most appropriate treatment alternative...

  3. Systematic Integrated Process Design and Control of Binary Element Reactive Distillation Processes

    DEFF Research Database (Denmark)

    Mansouri, Seyed Soheil; Sales-Cruz, Mauricio; Huusom, Jakob Kjøbsted

    2016-01-01

    In this work, integrated process design and control of reactive distillation processes is considered through a computer-aided framework. First, a set of simple design methods for reactive distillation columns that are similar in concept to non-reactive distillation design methods are extended to design-control of reactive distillation columns. These methods are based on the element concept where the reacting system of compounds is represented as elements. When only two elements are needed to represent the reacting system of more than two compounds, a binary element system is identified. It is shown that the same design-control principles that apply to a non-reacting binary system of compounds are also valid for a reactive binary system of elements for distillation columns. Application of this framework shows that designing the reactive distillation process at the maximum driving force...

  4. Model-Based Integrated Process Design and Controller Design of Chemical Processes

    DEFF Research Database (Denmark)

    Abd Hamid, Mohd Kamaruddin Bin

    This thesis describes the development and application of a new systematic model-based methodology for performing integrated process design and controller design (IPDC) of chemical processes. The new methodology is simple to apply, easy to visualize and efficient to solve. Here, the IPDC problem that is typically formulated as a mathematical programming (optimization with constraints) problem is solved by the so-called reverse approach by decomposing it into four sequential hierarchical sub-problems: (i) pre-analysis, (ii) design analysis, (iii) controller design analysis, and (iv) final selection. In the final selection, the alternatives are ordered according to the defined performance criteria (objective function). The final selected design is then verified through rigorous simulation. In the pre-analysis sub-problem, the concepts of attainable region and driving force are used to locate the optimal process-controller design solution...

  5. A Multi-Objective Optimization Method to integrate Heat Pumps in Industrial Processes

    OpenAIRE

    Becker, Helen; Spinato, Giulia; Maréchal, François

    2011-01-01

    The aim of process integration methods is to increase the efficiency of industrial processes by using pinch analysis combined with process design methods. In this context, appropriate integrated utilities offer promising opportunities to reduce energy consumption, operating costs and pollutant emissions. Energy integration methods are able to integrate any type of predefined utility, but so far there is no systematic approach to generating potential utility models based on their technology limit...

  6. Integrable dissipative exclusion process: Correlation functions and physical properties

    Science.gov (United States)

    Crampe, N.; Ragoucy, E.; Rittenberg, V.; Vanicat, M.

    2016-09-01

    We study a one-parameter generalization of the symmetric simple exclusion process on a one-dimensional lattice. In addition to the usual dynamics (where particles can hop with equal rates to the left or to the right with an exclusion constraint), annihilation and creation of pairs can occur. The system is driven out of equilibrium by two reservoirs at the boundaries. In this setting the model is still integrable: it is related to the open XXZ spin chain through a gauge transformation. This allows us to compute the full spectrum of the Markov matrix using Bethe equations. We also show that the stationary state can be expressed in a matrix product form, permitting the computation of the multipoint correlation functions as well as the mean value of the lattice and the creation-annihilation currents. Finally, the variance of the lattice current is computed for a finite-size system. In the thermodynamic limit, it matches the value obtained from the associated macroscopic fluctuation theory.
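
    A crude Monte Carlo sketch of this kind of dissipative exclusion process is given below: symmetric hopping with exclusion, pair creation and annihilation at a common rate, and particle exchange with boundary reservoirs. The rates, lattice size and update scheme are arbitrary choices; the Bethe-ansatz and matrix-product results of the record are not reproduced.

```python
import random

# Monte Carlo sketch of a dissipative symmetric exclusion process: particles hop
# left/right with rate 1 under exclusion, neighbouring pairs are annihilated or
# created with rate lam, and the two boundary sites exchange particles with
# reservoirs. All rates and sizes are arbitrary illustrative choices.
L, lam = 50, 0.1                 # lattice size, pair creation/annihilation rate
alpha, gamma = 0.8, 0.2          # left boundary: injection / extraction probabilities
delta, beta = 0.2, 0.8           # right boundary: injection / extraction probabilities
tau = [0] * L                    # site occupations
hops_right = 0                   # net particle flow across the middle bond
sweeps, warmup = 50_000, 10_000

for sweep in range(sweeps):
    for _ in range(L + 1):
        i = random.randrange(L + 1)          # bond i between sites i-1 and i
        r = random.random()
        if i == 0:                            # left boundary acts on site 0
            if tau[0] == 0 and r < alpha:
                tau[0] = 1
            elif tau[0] == 1 and r < gamma:
                tau[0] = 0
        elif i == L:                          # right boundary acts on site L-1
            if tau[L - 1] == 0 and r < delta:
                tau[L - 1] = 1
            elif tau[L - 1] == 1 and r < beta:
                tau[L - 1] = 0
        else:                                 # bulk bond (i-1, i)
            a, b = tau[i - 1], tau[i]
            if a == 1 and b == 0:             # hop right
                tau[i - 1], tau[i] = 0, 1
                if i == L // 2 and sweep >= warmup:
                    hops_right += 1
            elif a == 0 and b == 1:           # hop left
                tau[i - 1], tau[i] = 1, 0
                if i == L // 2 and sweep >= warmup:
                    hops_right -= 1
            elif a == b and r < lam:          # pair annihilation / creation
                tau[i - 1] = tau[i] = 1 - a

measured_sweeps = sweeps - warmup
print(f"final configuration density = {sum(tau)/L:.2f}, "
      f"average current across the middle bond ~ {hops_right / measured_sweeps:.4f} per sweep")
```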

  7. Dynamics robustness of cascading systems.

    Directory of Open Access Journals (Sweden)

    Jonathan T Young

    2017-03-01

    Full Text Available A most important property of biochemical systems is robustness. Static robustness, e.g., homeostasis, is the insensitivity of a state against perturbations, whereas dynamics robustness, e.g., homeorhesis, is the insensitivity of a dynamic process. In contrast to the extensively studied static robustness, dynamics robustness, i.e., how a system creates an invariant temporal profile against perturbations, is little explored despite transient dynamics being crucial for cellular fates and being reported to be robust experimentally. For example, the duration of a stimulus elicits different phenotypic responses, and signaling networks process and encode temporal information. Hence, robustness in time courses will be necessary for functional biochemical networks. Based on dynamical systems theory, we uncovered a general mechanism to achieve dynamics robustness. Using a three-stage linear signaling cascade as an example, we found that the temporal profiles and response duration post-stimulus are robust to perturbations against certain parameters. Then analyzing the linearized model, we elucidated the criteria of when signaling cascades will display dynamics robustness. We found that changes in the upstream modules are masked in the cascade, and that the response duration is mainly controlled by the rate-limiting module and organization of the cascade's kinetics. Specifically, we found two necessary conditions for dynamics robustness in signaling cascades: (1) Constraint on the rate-limiting process: The phosphatase activity in the perturbed module is not the slowest. (2) Constraints on the initial conditions: The kinase activity needs to be fast enough such that each module is saturated even with fast phosphatase activity and upstream changes are attenuated. We discussed the relevance of such robustness to several biological examples and the validity of the above conditions therein. Given the applicability of dynamics robustness to a variety of systems, it
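
    The flavor of the result can be illustrated with a minimal three-stage linear cascade driven by a transient stimulus, as in the sketch below; all rate constants are invented, and the perturbation scales an upstream module's kinetics while keeping its steady-state gain fixed.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Minimal sketch of a three-stage linear signaling cascade x1 -> x2 -> x3 driven
# by a square-pulse stimulus. The downstream response duration is compared when an
# upstream rate is perturbed. All rate constants are invented for illustration.
def cascade(t, x, k1, k2, k3, d1, d2, d3, t_stim=5.0):
    s = 1.0 if t < t_stim else 0.0           # transient stimulus
    x1, x2, x3 = x
    return [k1 * s - d1 * x1,
            k2 * x1 - d2 * x2,
            k3 * x2 - d3 * x3]

def response_duration(k1, d1, thresh=0.05):
    """Time the last-stage output x3 stays above a threshold."""
    sol = solve_ivp(cascade, (0, 60), [0, 0, 0],
                    args=(k1, 1.0, 1.0, d1, 1.0, 0.2), max_step=0.05)
    above = sol.t[sol.y[2] > thresh]
    return above[-1] - above[0] if above.size else 0.0

base = response_duration(k1=1.0, d1=1.0)
perturbed = response_duration(k1=2.0, d1=2.0)   # faster upstream module, same gain
print(f"response duration: base {base:.1f}, upstream perturbed {perturbed:.1f}")
```

    With the slowest (rate-limiting) step placed at the last stage, the two durations come out nearly identical, echoing the masking of upstream changes described in the record.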

  8. Robust lyapunov controller for uncertain systems

    KAUST Repository

    Laleg-Kirati, Taous-Meriem; Elmetennani, Shahrazed

    2017-01-01

    Various examples of systems and methods are provided for Lyapunov control for uncertain systems. In one example, a system includes a process plant and a robust Lyapunov controller configured to control an input of the process plant. The robust

  9. Extreme robustness of scaling in sample space reducing processes explains Zipf’s law in diffusion on directed networks

    International Nuclear Information System (INIS)

    Corominas-Murtra, Bernat; Hanel, Rudolf; Thurner, Stefan

    2016-01-01

    It has been shown recently that a specific class of path-dependent stochastic processes, which reduce their sample space as they unfold, lead to exact scaling laws in frequency and rank distributions. Such sample space reducing processes offer an alternative new mechanism to understand the emergence of scaling in countless processes. The corresponding power law exponents were shown to be related to noise levels in the process. Here we show that the emergence of scaling is not limited to the simplest SSRPs, but holds for a huge domain of stochastic processes that are characterised by non-uniform prior distributions. We demonstrate mathematically that in the absence of noise the scaling exponents converge to −1 (Zipf’s law) for almost all prior distributions. As a consequence it becomes possible to fully understand targeted diffusion on weighted directed networks and its associated scaling laws in node visit distributions. The presence of cycles can be properly interpreted as playing the same role as noise in SSRPs and, accordingly, determine the scaling exponents. The result that Zipf’s law emerges as a generic feature of diffusion on networks, regardless of its details, and that the exponent of visiting times is related to the amount of cycles in a network could be relevant for a series of applications in traffic-, transport- and supply chain management. (paper)
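
    The simplest noiseless sample space reducing process with uniform priors can be simulated in a few lines, as in the sketch below; the state-space size and number of runs are arbitrary, and the estimated rank exponent should come out close to -1 (Zipf's law).

```python
import math
import random
from collections import Counter

# Minimal sketch of a sample space reducing process (SSRP) with uniform priors:
# start at state N, repeatedly jump to a uniformly chosen strictly lower state,
# and restart at N after reaching state 1. Visit frequencies approach Zipf's law.
N, runs = 1000, 20000
visits = Counter()
for _ in range(runs):
    state = N
    while state > 1:
        state = random.randint(1, state - 1)   # the sample space shrinks each step
        visits[state] += 1

ranked = [count for _, count in visits.most_common()]
# crude slope estimate of log(frequency) vs log(rank) between ranks 1 and 100
slope = (math.log(ranked[99]) - math.log(ranked[0])) / (math.log(100) - math.log(1))
print(f"approximate rank exponent: {slope:.2f} (Zipf's law corresponds to -1)")
```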

  10. An analysis of the perceived difficulties arising during the process of integrating management systems

    Directory of Open Access Journals (Sweden)

    Jesus Abad

    2016-09-01

    Originality/value: Most research emphasises the benefits of integrated management systems. By analysing the difficulties that arise during the integration process, this study contributes to filling a gap in the literature on the problems associated with processes of organisational change, in our case the integration of management systems.

  11. On stochastic integration for volatility modulated Brownian-driven Volterra processes via white noise analysis

    DEFF Research Database (Denmark)

    E. Barndorff-Nielsen, Ole; Benth, Fred Espen; Szozda, Benedykt

    This paper generalizes the integration theory for volatility modulated Brownian-driven Volterra processes onto the space G* of Potthoff-Timpel distributions. Sufficient conditions for integrability of generalized processes are given, regularity results and properties of the integral are discussed...

  12. On stochastic integration for volatility modulated Brownian-driven Volterra processes via white noise analysis

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole E.; Benth, Fred Espen; Szozda, Benedykt

    This paper generalizes the integration theory for volatility modulated Brownian-driven Volterra processes onto the space G∗ of Potthoff--Timpel distributions. Sufficient conditions for integrability of generalized processes are given, regularity results and properties of the integral are discusse...

  13. Science Integrating Learning Objectives: A Cooperative Learning Group Process

    Science.gov (United States)

    Spindler, Matt

    2015-01-01

    The integration of agricultural and science curricular content that capitalizes on natural and inherent connections represents a challenge for secondary agricultural educators. The purpose of this case study was to create information about the employment of Cooperative Learning Groups (CLG) to enhance the science integrating learning objectives…

  14. Integrated management of information inside maintenance processes. From the building registry to BIM systems

    Directory of Open Access Journals (Sweden)

    Cinzia Talamo

    2014-10-01

    Full Text Available The paper presents objectives, methods and results of two research projects dealing with the improvement of integrated information management within maintenance processes. Focusing on information needs regarding the last phases of the building process, the two projects draft approaches characterizing a path of progressive improvement of strategies for integration: from a building registry, unique for the whole construction process, to an integrated management of the building process with the support of BIM systems.

  15. Social networks in nursing work processes: an integrative literature review.

    Science.gov (United States)

    Mesquita, Ana Cláudia; Zamarioli, Cristina Mara; Fulquini, Francine Lima; Carvalho, Emilia Campos de; Angerami, Emilia Luigia Saporiti

    2017-03-20

    To identify and analyze the available evidence in the literature on the use of social networks in nursing work processes. An integrative review of the literature conducted in PubMed, CINAHL, EMBASE and LILACS databases in January 2016, using the descriptors social media, social networking, nursing, enfermagem, redes sociais, mídias sociais, and the keyword nursing practice, without year restriction. The sample consisted of 27 international articles which were published between 2011 and 2016. The social networks used were Facebook (66.5%), Twitter (30%) and WhatsApp (3.5%). In 70.5% of the studies, social networks were used for research purposes, in 18.5% they were used as a tool aimed to assist students in academic activities, and in 11% for executing interventions via the internet. Nurses have used social networks in their work processes such as Facebook, Twitter and WhatsApp to research, teach and watch. The articles show several benefits in using such tools in the nursing profession; however, ethical considerations regarding the use of social networks deserve further discussion.

  16. Multi-functional integration of pore P25@C@MoS₂ core-double shell nanostructures as robust ternary anodes with enhanced lithium storage properties

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Biao [School of Materials Science and Engineering and Tianjin Key Laboratory of Composite and Functional Materials, Tianjin University, Tianjin 300350 (China); Zhao, Naiqin [School of Materials Science and Engineering and Tianjin Key Laboratory of Composite and Functional Materials, Tianjin University, Tianjin 300350 (China); Collaborative Innovation Centre of Chemical Science and Engineering, Tianjin 300072 (China); Wei, Chaopeng; Zhou, Jingwen; He, Fang; Shi, Chunsheng; He, Chunnian [School of Materials Science and Engineering and Tianjin Key Laboratory of Composite and Functional Materials, Tianjin University, Tianjin 300350 (China); Liu, Enzuo, E-mail: ezliu@tju.edu.cn [School of Materials Science and Engineering and Tianjin Key Laboratory of Composite and Functional Materials, Tianjin University, Tianjin 300350 (China); Collaborative Innovation Centre of Chemical Science and Engineering, Tianjin 300072 (China)

    2017-04-15

    Highlights: • A P25@carbon supported MoS₂ composite was prepared by a one-step process. • The distribution and interaction of C, MoS₂ and TiO₂ are systematically examined. • The desirable features of the three components are complementarily integrated. • The smart ternary electrode exhibits excellent cycling stability and rate capability. - Abstract: Ternary anodes have attracted more and more attention due to the characteristic advantages resulting from the effective integration of three different materials on the lithium storage mechanism with functional interface interactions. However, clarifying the distribution and interaction of carbon, MoS₂ and TiO₂ in the MoS₂/C/TiO₂ composite, which is helpful for understanding the formation and lithium storage mechanism of ternary anodes, is a well-known challenge. Herein, a novel porous core-double shell nanostructure of a P25@carbon network supporting few-layer MoS₂ nanosheets (P25@C@FL-MoS₂) is successfully synthesized by a one-pot hydrothermal approach. The distribution and interaction of the carbon, MoS₂ and TiO₂ in the obtained P25@C@FL-MoS₂ hybrid are systematically characterized by transmission electron microscopy, Raman spectra and X-ray photoelectron spectroscopy analysis, among other techniques. It is found that the carbon serves as a binder, which supports the few-layer MoS₂ shell and coats the P25 core via Ti−O−C bonds at the same time. Such multi-functional integration with a smart structure and strong interfacial contact generates favorable structural stability and an interfacial pseudocapacitance-like storage mechanism. As a consequence, superior cycling and rate capacity of the multi-functional integrated ternary P25@C@FL-MoS₂ anode are achieved.

  17. Sliding mode control of dissolved oxygen in an integrated nitrogen removal process in a sequencing batch reactor (SBR).

    Science.gov (United States)

    Muñoz, C; Young, H; Antileo, C; Bornhardt, C

    2009-01-01

    This paper presents a sliding mode controller (SMC) for dissolved oxygen (DO) in an integrated nitrogen removal process carried out in a suspended-biomass sequencing batch reactor (SBR). The SMC performance was compared against an auto-tuning PI controller with parameters adjusted at the beginning of the batch cycle. A method for cancelling the slow DO sensor dynamics was implemented using a first-order model of the sensor. Tests in a lab-scale reactor showed that the SMC offers better disturbance rejection than the auto-tuning PI controller, while also providing reasonable performance over a wide range of operation. Thus, SMC becomes an effective and robust nonlinear tool for DO control in this process; it is also computationally simple, allowing its implementation in devices such as industrial programmable logic controllers (PLCs).
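
    The record above describes sliding mode control of DO combined with compensation of slow sensor dynamics via a first-order sensor model. As a rough, illustrative sketch only (not the authors' implementation), the code below simulates a simplified DO balance, reconstructs the DO estimate from the lagged sensor signal by inverting the first-order model, and applies a boundary-layer sliding mode law; every numerical value (aeration gain, sensor time constant, oxygen uptake rate) is an invented placeholder.

```python
import numpy as np

# Illustrative sketch only: a simplified DO loop under boundary-layer SMC
# with first-order sensor-lag compensation. All parameter values are invented.
DT, T_END = 0.5, 1200.0          # time step and horizon, s
DO_SAT = 9.0                     # oxygen saturation, mg/L
OUR = 0.01                       # oxygen uptake rate (disturbance), mg/(L*s)
KLA_PER_U = 0.004                # aeration gain: kLa = KLA_PER_U * u, 1/s
TAU_SENSOR = 30.0                # slow DO probe time constant, s
DO_REF = 4.0                     # set point, mg/L
K_SMC, PHI = 0.3, 0.5            # switching gain and boundary-layer width

def sat(z):                      # smoothed sign() to limit chattering
    return np.clip(z, -1.0, 1.0)

do_true, y_meas, u = 2.0, 2.0, 0.5
y_prev = y_meas
for _ in range(int(T_END / DT)):
    # plant: dDO/dt = kLa(u)*(DO_sat - DO) - OUR
    do_true += DT * (KLA_PER_U * u * (DO_SAT - do_true) - OUR)
    # sensor: first-order lag  dy/dt = (DO - y)/tau
    y_meas += DT * (do_true - y_meas) / TAU_SENSOR
    # sensor-lag compensation: DO_hat = y + tau*dy/dt (discrete derivative)
    do_hat = y_meas + TAU_SENSOR * (y_meas - y_prev) / DT
    y_prev = y_meas
    # sliding variable and control: equivalent control + boundary-layer switching
    s = DO_REF - do_hat
    u_eq = OUR / (KLA_PER_U * (DO_SAT - DO_REF))
    u = float(np.clip(u_eq + K_SMC * sat(s / PHI), 0.0, 1.0))

print(f"final DO = {do_true:.2f} mg/L (set point {DO_REF} mg/L)")
```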

  18. Towards an Evaluation Framework for Business Process Integration and Management

    NARCIS (Netherlands)

    Mutschler, B.B.; Reichert, M.U.; Bumiller, J.

    2005-01-01

    Process-awareness in enterprise computing is a must in order to adequately support business processes. Particularly the interoperability of the (process-oriented) business information systems and the management of a company’s process map are difficult to handle. Process-oriented approaches (like

  19. Enabling a robust scalable manufacturing process for therapeutic exosomes through oncogenic immortalization of human ESC-derived MSCs

    Directory of Open Access Journals (Sweden)

    Choo Andre

    2011-04-01

    Background: Exosomes, or secreted bi-lipid vesicles, from human ESC-derived mesenchymal stem cells (hESC-MSCs) have been shown to reduce myocardial ischemia/reperfusion injury in animal models. However, as hESC-MSCs are not infinitely expansible, large-scale production of these exosomes would require replenishment of hESC-MSCs through derivation from hESCs and incur recurring costs for testing and validation of each new batch. Our aim was therefore to investigate if MYC immortalization of hESC-MSCs would circumvent this constraint without compromising the production of therapeutically efficacious exosomes. Methods: The hESC-MSCs were transfected by a lentivirus carrying a MYC gene. The transformed cells were analyzed for MYC transgene integration, transcript and protein levels, surface markers, rate of cell cycling, telomerase activity, karyotype, genome-wide gene expression and differentiation potential. The exosomes were isolated by HPLC fractionation and tested in a mouse model of myocardial ischemia/reperfusion injury, and infarct sizes were further assessed by using Evans' blue dye injection and TTC staining. Results: MYC-transformed MSCs largely resembled the parental hESC-MSCs, with the major differences being reduced plastic adherence, faster growth, failure to senesce, increased MYC protein expression, and loss of in vitro adipogenic potential that technically rendered the transformed cells as non-MSCs. Unexpectedly, exosomes from MYC-transformed MSCs were able to reduce relative infarct size in a mouse model of myocardial ischemia/reperfusion injury, indicating that the capacity for producing therapeutic exosomes was preserved. Conclusion: Our results demonstrated that MYC transformation is a practical strategy for ensuring an infinite supply of cells for the production of exosomes in the milligram range as either therapeutic agents or delivery vehicles. In addition, the increased proliferative rate by MYC transformation reduces the time

  20. Robust control technique for nuclear power plants

    International Nuclear Information System (INIS)

    Murphy, G.V.; Bailey, J.M.

    1989-03-01

    This report summarizes the linear quadratic Gaussian (LQG) design technique with loop transfer recovery (LQG/LTR) for the design of control systems. The concepts of return ratio, return difference, inverse return difference, and singular values are summarized. The LQG/LTR design technique allows the synthesis of a robust control system. To illustrate the LQG/LTR technique, a linearized model of a simple process has been chosen. The process has three state variables, one input, and one output. Three control system design methods are compared: LQG, LQG/LTR, and a proportional plus integral (PI) controller. 7 refs., 20 figs., 6 tabs.
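
    For readers unfamiliar with the steps summarized above, the sketch below shows the generic LQG/LTR recipe: design an LQR target loop, then a Kalman filter whose process-noise covariance includes a fictitious term q^2 B B^T so that, as q grows, the LQG loop transfer at the plant input approaches the LQR loop. The third-order single-input single-output matrices are arbitrary placeholders, not the plant model used in the report.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Placeholder 3-state, 1-input, 1-output plant (not the plant from the report).
A = np.array([[-1.0, 1.0, 0.0],
              [0.0, -2.0, 1.0],
              [0.0, 0.0, -3.0]])
B = np.array([[0.0], [0.0], [1.0]])
C = np.array([[1.0, 0.0, 0.0]])

# Step 1: LQR target loop at the plant input.
Q, R = np.eye(3), np.array([[1.0]])
P = solve_continuous_are(A, B, Q, R)
K_lqr = np.linalg.solve(R, B.T @ P)          # state-feedback gain

# Step 2: Kalman filter with fictitious process noise q^2 * B B^T.
# Increasing q drives the LQG loop transfer (at the input) toward the LQR loop.
W0, V = 1e-3 * np.eye(3), np.array([[1.0]])
for q in (1.0, 10.0, 100.0):
    W = W0 + q**2 * (B @ B.T)
    S = solve_continuous_are(A.T, C.T, W, V)
    L = S @ C.T @ np.linalg.inv(V)           # observer (Kalman) gain
    print(f"q = {q:6.1f}  ->  Kalman gain L = {L.ravel()}")
print("LQR gain K =", K_lqr.ravel())
```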

  1. An integrated architecture for shallow and deep processing

    OpenAIRE

    Crysmann, Berthold; Frank, Anette; Kiefer, Bernd; Müller, Stefan; Neumann, Günter; Piskorski, Jakub; Schäfer, Ulrich; Siegel, Melanie; Uszkoreit, Hans; Xu, Feiyu; Becker, Markus; Krieger, Hans-Ulrich

    2011-01-01

    We present an architecture for the integration of shallow and deep NLP components which is aimed at flexible combination of different language technologies for a range of practical current and future applications. In particular, we describe the integration of a high-level HPSG parsing system with different high-performance shallow components, ranging from named entity recognition to chunk parsing and shallow clause recognition. The NLP components enrich a representation of natural language te...

  2. Optimization of Tape Winding Process Parameters to Enhance the Performance of Solid Rocket Nozzle Throat Back Up Liners using Taguchi's Robust Design Methodology

    Science.gov (United States)

    Nath, Nayani Kishore

    2017-08-01

    Throat back-up liners are used to protect the nozzle structural members from the severe thermal environment in solid rocket nozzles. The throat back-up liners are made with E-glass phenolic prepregs by a tape winding process. The objective of this work is to demonstrate the optimization of the tape winding process parameters to achieve better insulative resistance using Taguchi's robust design methodology. In this method, four control factors (machine speed, roller pressure, tape tension, and tape temperature) were investigated for the tape winding process. The presented work was also intended to study the cogency and acceptability of Taguchi's methodology in the manufacturing of throat back-up liners. The quality characteristic identified was the back-wall temperature. Experiments were carried out using an L9 (3^4) orthogonal array with three levels of the four control factors. The test results were analyzed using the smaller-the-better criterion for the signal-to-noise ratio in order to optimize the process. The experimental results were analyzed, confirmed, and successfully used to achieve the minimum back-wall temperature of the throat back-up liners. The enhancement in the performance of the throat back-up liners was observed by carrying out oxy-acetylene tests. The influence of the back-wall temperature on the performance of the throat back-up liners was verified by a ground firing test.
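
    As context for the analysis described above, the snippet below computes the smaller-the-better signal-to-noise ratio, S/N = -10 log10((1/n) Σ y_i^2), for each run of a standard L9 (3^4) orthogonal array and the main-effect averages per factor level. The back-wall temperatures are made-up numbers, since the paper's raw data are not reproduced here.

```python
import numpy as np

# Standard L9(3^4) orthogonal array: 9 runs x 4 factors, 3 levels (0, 1, 2).
L9 = np.array([[0,0,0,0],[0,1,1,1],[0,2,2,2],
               [1,0,1,2],[1,1,2,0],[1,2,0,1],
               [2,0,2,1],[2,1,0,2],[2,2,1,0]])
factors = ["machine speed", "roller pressure", "tape tension", "tape temperature"]

# Hypothetical back-wall temperatures (two repetitions per run); not the paper's data.
y = np.array([[182, 185], [176, 174], [169, 171],
              [178, 181], [164, 166], [173, 170],
              [168, 165], [175, 177], [160, 162]], dtype=float)

# Smaller-the-better criterion: S/N = -10*log10(mean(y_i^2)) per run.
sn = -10.0 * np.log10((y**2).mean(axis=1))

# Main effects: average S/N at each level of each factor (larger S/N is better).
for j, name in enumerate(factors):
    means = [sn[L9[:, j] == lvl].mean() for lvl in range(3)]
    best = int(np.argmax(means))
    print(f"{name:17s} level means = {np.round(means, 2)}  -> pick level {best}")
```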

  3. The determinants of response time in a repeated constant-sum game: A robust Bayesian hierarchical dual-process model.

    Science.gov (United States)

    Spiliopoulos, Leonidas

    2018-03-01

    The investigation of response time and behavior has a long tradition in cognitive psychology, particularly for non-strategic decision-making. Recently, experimental economists have also studied response time in strategic interactions, but with an emphasis on either one-shot games or repeated social dilemmas. I investigate the determinants of response time in a repeated (pure-conflict) game, admitting a unique mixed-strategy Nash equilibrium, with fixed partner matching. Response times depend upon the interaction of two decision models embedded in a dual-process framework (Achtziger and Alós-Ferrer, 2014; Alós-Ferrer, 2016). The first decision model is the commonly used win-stay/lose-shift heuristic and the second the pattern-detecting reinforcement learning model in Spiliopoulos (2013b). The former is less complex and can be executed more quickly than the latter. As predicted, conflict between these two models (i.e., each one recommending a different course of action) led to longer response times than cases without conflict. The dual-process framework makes other qualitative response time predictions arising from the interaction between the existence (or not) of conflict and which of the two decision models the chosen action is consistent with; these were broadly verified by the data. Other determinants of RT were hypothesized on the basis of existing theory and tested empirically. Response times were strongly dependent on the actions chosen by both players in the previous rounds and the resulting outcomes. Specifically, response time was shortest after a win in the previous round where the maximum possible payoff was obtained; response time after losses was significantly longer. Strongly auto-correlated behavior (regardless of its sign) was also associated with longer response times. I conclude that, similar to other tasks, there is a strong coupling in repeated games between behavior and RT, which can be exploited to further our understanding of decision making.
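
    The first of the two decision models named above, the win-stay/lose-shift heuristic, is simple enough to state in a few lines. The sketch below is a generic version for a two-action repeated game (a win keeps the previous action, a loss switches), offered purely as a reading aid; it is not the paper's estimated model, and the threshold and payoffs are invented.

```python
import random

def win_stay_lose_shift(prev_action, prev_payoff, win_threshold, actions=(0, 1)):
    """Generic WSLS heuristic for a two-action repeated game.

    Repeat the previous action after a 'win' (payoff >= threshold),
    switch to the other action after a 'loss'.
    """
    if prev_action is None:                      # first round: choose at random
        return random.choice(actions)
    if prev_payoff >= win_threshold:             # win -> stay
        return prev_action
    return actions[1] if prev_action == actions[0] else actions[0]   # loss -> shift

# Tiny usage example with made-up payoffs.
history = [(0, 1.0), (0, 0.0), (1, 1.0)]         # (action, payoff) per round
action, payoff = history[-1]
print("next action:", win_stay_lose_shift(action, payoff, win_threshold=1.0))
```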

  4. Trends And Economic Assessment Of Integration Processes At The Metal Market

    Directory of Open Access Journals (Sweden)

    Olga Aleksandrovna Romanova

    2015-03-01

    The article discusses the integration process from the perspective of three dimensions: the increase in the number and the appearance of new relationships; the strength, character and stability of the emerging links; and the dynamics and corresponding form of the process. The article identifies trends in the development of integration processes in metallurgy and justifies the identification of five stages in the development of Russian metal trading. We propose a step-by-step way to implement the integration process and develop a methodical approach to assessing the economic feasibility of integration between steel producers and steel traders, comprising three consecutive implementation stages based, respectively, on the principles of reflexive control, an entropy approach, and the traditional assessment of mergers and acquisitions. An algorithm for the practical realization of the authors' approach is developed, which makes it possible to identify the optimal trajectory of the integration process as a series of horizontal and vertical integration steps.

  5. The Integration of Extrarational and Rational Learning Processes: Moving Towards the Whole Learner.

    Science.gov (United States)

    Puk, Tom

    1996-01-01

    Discusses the dichotomy between rational and nonrational learning processes, arguing for an integration of both. Reviews information processing theory and related learning strategies. Presents a model instructional strategy that fully integrates rational and nonrational processes. Describes implications for teaching and learning of the learning…

  6. Integration Processes of Delay Differential Equation Based on Modified Laguerre Functions

    Directory of Open Access Journals (Sweden)

    Yeguo Sun

    2012-01-01

    We propose long-time convergent numerical integration processes for delay differential equations. We first construct an integration process based on modified Laguerre functions and then establish its global convergence in a certain weighted Sobolev space. The proposed numerical integration processes can also be used for systems of delay differential equations. We also develop a technique for the refinement of modified Laguerre-Radau interpolations. Finally, numerical results demonstrate the spectral accuracy of the proposed method and coincide well with the analysis.

  7. Solution processable mixed-solvent exfoliated MoS2 nanosheets for efficient and robust organic light-emitting diodes

    Science.gov (United States)

    Liu, Chia-Wei; Wang, Chia; Liao, Chia-Wei; Golder, Jan; Tsai, Ming-Chih; Young, Hong-Tsu; Chen, Chin-Ti; Wu, Chih-I.

    2018-04-01

    We demonstrate the use of solution-processed molybdenum trioxide (MoO3) nanoparticle-decorated molybdenum disulfide (MoS2) nanosheets (MoS2/MoO3) as a hole injection layer (HIL) in organic light-emitting diodes (OLEDs). The device performance is shown to be significantly improved by the introduction of such a MoS2/MoO3 HIL without any post-ultraviolet-ozone treatment, and to exceed the performance of devices fabricated using conventional poly(3,4-ethylenedioxythiophene)-poly(styrenesulfonate) (PEDOT:PSS) and MoO3 nanoparticle HILs. The MoS2/MoO3 nanosheets form a compact film, as smooth as PEDOT:PSS films and smoother than MoO3 nanoparticle films, when simply spin-coated on indium tin oxide substrates. The improvement in device efficiency can be attributed to the smooth surface of the nanostructured MoS2/MoO3 HIL and the excellent conductivity of the two-dimensional (2D) layered material (MoS2), which facilitate carrier transport in the device and reduce the sheet resistance. Moreover, the long-term stability of OLED devices that use such MoS2/MoO3 layers is shown to be improved dramatically compared with hygroscopic and acidic PEDOT:PSS-based devices.

  8. Solution processable mixed-solvent exfoliated MoS2 nanosheets for efficient and robust organic light-emitting diodes

    Directory of Open Access Journals (Sweden)

    Chia-Wei Liu

    2018-04-01

    We demonstrate the use of solution-processed molybdenum trioxide (MoO3) nanoparticle-decorated molybdenum disulfide (MoS2) nanosheets (MoS2/MoO3) as a hole injection layer (HIL) in organic light-emitting diodes (OLEDs). The device performance is shown to be significantly improved by the introduction of such a MoS2/MoO3 HIL without any post-ultraviolet-ozone treatment, and to exceed the performance of devices fabricated using conventional poly(3,4-ethylenedioxythiophene)-poly(styrenesulfonate) (PEDOT:PSS) and MoO3 nanoparticle HILs. The MoS2/MoO3 nanosheets form a compact film, as smooth as PEDOT:PSS films and smoother than MoO3 nanoparticle films, when simply spin-coated on indium tin oxide substrates. The improvement in device efficiency can be attributed to the smooth surface of the nanostructured MoS2/MoO3 HIL and the excellent conductivity of the two-dimensional (2D) layered material (MoS2), which facilitate carrier transport in the device and reduce the sheet resistance. Moreover, the long-term stability of OLED devices that use such MoS2/MoO3 layers is shown to be improved dramatically compared with hygroscopic and acidic PEDOT:PSS-based devices.

  9. Analysis of Fleet Readiness Center Southwest Concept Integration: New-Employee Orientation and Communication Processes

    National Research Council Canada - National Science Library

    Clemmons, Francini R; Falconieri, Holly M

    2007-01-01

    Fleet Readiness Center Southwest has embraced integration of personnel and processes from Aircraft Intermediate Maintenance Departments and Naval Aviation Depots supporting Naval Aviation Maintenance...

  10. SYSTEMS FOR IMPROVEMENT OF BUSINESS INTEGRATED MANAGEMENT PROCESSES IN PORTS

    Directory of Open Access Journals (Sweden)

    Pavle Popovic

    2017-03-01

    Based on the test results obtained so far, and drawing on modern scientific methods and approaches, the objective has been defined as developing a new management system integration model that adopts the practical features of the real systems most frequently encountered in maritime practice in the field of integrated management systems (IMS), in terms of the safety and security of vessels and ports. The subject of the research is maritime services, particularly port services, with internal and external advantages defined through the adoption of integrated management systems. The research will be conducted through theoretical and applied assessments of case study analyses, using the example of the Port of Kotor H.Co.

  11. Design optimization for cost and quality: The robust design approach

    Science.gov (United States)

    Unal, Resit

    1990-01-01

    Designing reliable, low cost, and operable space systems has become the key to future space operations. Designing high quality space systems at low cost is an economic and technological challenge to the designer. A systematic and efficient way to meet this challenge is a new method of design optimization for performance, quality, and cost, called Robust Design. Robust Design is an approach for design optimization. It consists of: making system performance insensitive to material and subsystem variation, thus allowing the use of less costly materials and components; making designs less sensitive to the variations in the operating environment, thus improving reliability and reducing operating costs; and using a new structured development process so that engineering time is used most productively. The objective in Robust Design is to select the best combination of controllable design parameters so that the system is most robust to uncontrollable noise factors. The robust design methodology uses a mathematical tool called an orthogonal array, from design of experiments theory, to study a large number of decision variables with a significantly small number of experiments. Robust design also uses a statistical measure of performance, called a signal-to-noise ratio, from electrical control theory, to evaluate the level of performance and the effect of noise factors. The purpose is to investigate the Robust Design methodology for improving quality and cost, demonstrate its application by the use of an example, and suggest its use as an integral part of space system design process.

  12. Process performance and product quality in an integrated continuous antibody production process.

    Science.gov (United States)

    Karst, Daniel J; Steinebach, Fabian; Soos, Miroslav; Morbidelli, Massimo

    2017-02-01

    Continuous manufacturing is currently being seriously considered in the biopharmaceutical industry as the possible new paradigm for producing therapeutic proteins, due to production cost and product quality related benefits. In this study, a monoclonal antibody producing CHO cell line was cultured in perfusion mode and connected to a continuous affinity capture step. The reliable and stable integration of the two systems was enabled by suitable control loops, regulating the continuous volumetric flow and adapting the operating conditions of the capture process. For the latter, an at-line HPLC measurement of the harvest concentration subsequent to the bioreactor was combined with a mechanistic model of the capture chromatographic unit. Thereby, optimal buffer consumption and productivity throughout the process was realized while always maintaining a yield above the target value of 99%. Stable operation was achieved at three consecutive viable cell density set points (20, 60, and 40 × 10^6 cells/mL), together with consistent product quality in terms of aggregates, fragments, charge isoforms, and N-linked glycosylation. In addition, different values for these product quality attributes such as N-linked glycosylation, charge variants, and aggregate content were measured at the different steady states. As expected, the amount of released DNA and HCP was significantly reduced by the capture step for all considered upstream operating conditions. This study is exemplary for the potential of enhancing product quality control and modulation by integrated continuous manufacturing. Biotechnol. Bioeng. 2017;114: 298-307. © 2016 Wiley Periodicals, Inc.
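
    The control idea described above (an at-line HPLC titer measurement feeding a model of the capture column) can be illustrated with a deliberately simplified calculation: choose the loading time of the capture column so that the loaded mass stays a safe margin below its dynamic binding capacity at the measured harvest titer. The numbers and the linear-capacity assumption below are illustrative only and are not taken from the paper.

```python
# Illustrative only: loading-time update for a continuous capture step,
# driven by an at-line titer measurement. Values and model are assumptions.
COLUMN_VOLUME_L = 0.05        # capture column volume (L)
DBC_G_PER_L = 35.0            # assumed dynamic binding capacity (g per L of resin)
SAFETY_FACTOR = 0.8           # load only to 80 % of DBC to keep yield above target
HARVEST_FLOW_L_PER_H = 0.6    # perfusion harvest flow routed to the column

def loading_time_h(titer_g_per_l: float) -> float:
    """Hours the column can be loaded before switching, given the measured titer."""
    loadable_mass_g = DBC_G_PER_L * COLUMN_VOLUME_L * SAFETY_FACTOR
    mass_rate_g_per_h = titer_g_per_l * HARVEST_FLOW_L_PER_H
    return loadable_mass_g / mass_rate_g_per_h

# Example: the at-line HPLC reports a titer of 0.8 g/L.
print(f"switch columns after {loading_time_h(0.8):.2f} h of loading")
```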

  13. The June 2014 eruption at Piton de la Fournaise: Robust methods developed for monitoring challenging eruptive processes

    Science.gov (United States)

    Villeneuve, N.; Ferrazzini, V.; Di Muro, A.; Peltier, A.; Beauducel, F.; Roult, G. C.; Lecocq, T.; Brenguier, F.; Vlastelic, I.; Gurioli, L.; Guyard, S.; Catry, T.; Froger, J. L.; Coppola, D.; Harris, A. J. L.; Favalli, M.; Aiuppa, A.; Liuzzo, M.; Giudice, G.; Boissier, P.; Brunet, C.; Catherine, P.; Fontaine, F. J.; Henriette, L.; Lauret, F.; Riviere, A.; Kowalski, P.

    2014-12-01

    After almost 3.5 years of quiescence, Piton de la Fournaise (PdF) produced a small summit eruption on 20 June 2014 at 21:35 (GMT). The eruption lasted 20 hours and was preceded by: i) onset of deep eccentric seismicity (15-20 km bsl; 9 km NW of the volcano summit) in March and April 2014; ii) enhanced CO2 soil flux along the NW rift zone; iii) increase in the number and energy of shallow (shallow location, was not characteristic of an eruptive crisis. However, at 20:06 on 20/06 their character changed. This was 74 minutes before the onset of tremor. Deformations then began at 20:20. Since 2007, PdF has emitted small magma volumes (processing of seismic data, borehole tiltmeters and permanent monitoring of summit gas emissions, plus CO2 soil flux, were used to track precursory activity. JERK, based on an analysis of the acceleration slope of a broad-band seismometer data, allowed advanced notice of the new eruption by 50 minutes. MSNoise, based on seismic velocity determination, showed a significant decrease 7 days before the eruption. These signals were coupled with change in summit fumarole composition. Remote sensing allowed the following syn-eruptive observations: - INSAR confirmed measurements made by the OVPF geodetic network, showing that deformation was localized around the eruptive fissures; - A SPOT5 image acquired at 05:41 on 21/06 allowed definition of the flow field area (194 500 m2); - A MODIS image acquired at 06:35 on 21/06 gave a lava discharge rate of 6.9±2.8 m3 s-1, giving an erupted volume of 0.3 and 0.4 Mm3. - This rate was used with the DOWNFLOW and FLOWGO models, calibrated with the textural data from Piton's 2010 lava, to run lava flow projections; showing that the event was volume limited. Preliminary sample analyses suggest that the olivine rich lavas have a differentiated character (melt MgO: 5.8 - 6.2 wt.%); proof of chamber residence. However, some aphyric tephra are more primitive (MgO: 8.2 wt.%). This suggests eruption due to

  14. Managing uncertainty in fishing and processing of an integrated ...

    African Journals Online (AJOL)

    trawler scheduling, catch quota, processing and labour allocation, and inventory control. .... of trawlers, a number of processing plants and market requirements. ...... Norwegian University of Science and Technology, Trondheim, Norway.

  15. Deficits in Thematic Integration Processes in Broca's and Wernicke's Aphasia

    Science.gov (United States)

    Nakano, Hiroko; Blumstein, Sheila E.

    2004-01-01

    This study investigated how normal subjects and Broca's and Wernicke's aphasics integrate thematic information incrementally using syntax, lexical-semantics, and pragmatics in a simple active declarative sentence. Three priming experiments were conducted using an auditory lexical decision task in which subjects made a lexical decision on a…

  16. Mechanical integrity of PFHE in LNG liquefaction process

    NARCIS (Netherlands)

    Ligterink, N.E.; Hageraats-Ponomareva, S.V.; Velthuis, J.F.M.

    2012-01-01

    The mechanical integrity of heat exchangers is the result of an interplay of many physical phenomena. A full, albeit simple, model, that covers mixture phase thermodynamics, heat transfer, flow dynamics, and the thermal and pressure stresses has been created. Using this simulation model it is now

  17. Empathy: An Integral Model in the Counseling Process

    Science.gov (United States)

    Clark, Arthur J.

    2010-01-01

    Expanding on a framework introduced by Carl Rogers, an integral model of empathy in counseling uses empathic understanding through 3 ways of knowing: Subjective empathy enables a counselor to momentarily experience what it is like to be a client, interpersonal empathy relates to understanding a client's phenomenological experiencing, and objective…

  18. Numerical evaluation of path-integral solutions to Fokker-Planck equations. II. Restricted stochastic processes

    International Nuclear Information System (INIS)

    Wehner, M.F.

    1983-01-01

    A path-integral solution is derived for processes described by nonlinear Fokker-Planck equations together with externally imposed boundary conditions. This path-integral solution is written in the form of a path sum for small time steps and contains, in addition to the conventional volume integral, a surface integral which incorporates the boundary conditions. A previously developed numerical method, based on a histogram representation of the probability distribution, is extended to a trapezoidal representation. This improved numerical approach is combined with the present path-integral formalism for restricted processes and is shown to give accurate results. 35 refs., 5 figs.
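
    To make the construction above concrete, the sketch below propagates a one-dimensional probability density with the standard short-time Gaussian propagator of a Fokker-Planck equation, evaluating the path sum step by step with trapezoidal quadrature on a bounded grid and renormalizing as a crude stand-in for a no-flux boundary. The drift, diffusion constant and grid are arbitrary choices for illustration, not the restricted processes treated in the paper.

```python
import numpy as np

# Short-time propagator path sum for dp/dt = -d/dx[A(x) p] + D d2p/dx2 on a
# bounded grid, evaluated with trapezoidal quadrature. Illustrative values only.
D, DT, N_STEPS = 0.1, 0.01, 200
x = np.linspace(-3.0, 3.0, 301)
dx = x[1] - x[0]
w = np.full_like(x, dx); w[0] = w[-1] = dx / 2.0   # trapezoidal weights

def drift(x):
    return -x                                       # A(x): linear restoring drift

def integral(f):
    return float(np.sum(w * f))                     # trapezoidal quadrature

# Gaussian short-time kernel K[i, j] ~ P(x_i at t+dt | x_j at t).
mean = x + drift(x) * DT
K = np.exp(-(x[:, None] - mean[None, :])**2 / (4.0 * D * DT))
K /= np.sqrt(4.0 * np.pi * D * DT)

# Initial condition: narrow Gaussian, normalized on the grid.
p = np.exp(-(x - 1.5)**2 / 0.02)
p /= integral(p)

for _ in range(N_STEPS):
    p = K @ (w * p)          # one short-time step of the path sum
    p /= integral(p)         # renormalize: crude stand-in for a no-flux boundary

m1 = integral(x * p)
print(f"mean = {m1:.3f}, variance = {integral(x**2 * p) - m1**2:.3f}")
```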

  19. INTEGRATION OF COST MODELS AND PROCESS SIMULATION TOOLS FOR OPTIMUM COMPOSITE MANUFACTURING PROCESS

    Energy Technology Data Exchange (ETDEWEB)

    Pack, Seongchan [General Motors; Wilson, Daniel [General Motors; Aitharaju, Venkat [General Motors; Kia, Hamid [General Motors; Yu, Hang [ESI Group; Doroudian, Mark [ESI Group

    2017-09-05

    Manufacturing cost of resin transfer molded composite parts is significantly influenced by the cycle time, which is strongly related to the time for both filling and curing of the resin in the mold. The time for filling can be optimized by various injection strategies, and by suitably reducing the length of the resin flow distance during the injection. The curing time can be reduced by the usage of faster curing resins, but it requires a high pressure injection equipment, which is capital intensive. Predictive manufacturing simulation tools that are being developed recently for composite materials are able to provide various scenarios of processing conditions virtually well in advance of manufacturing the parts. In the present study, we integrate the cost models with process simulation tools to study the influence of various parameters such as injection strategies, injection pressure, compression control to minimize high pressure injection, resin curing rate, and demold time on the manufacturing cost as affected by the annual part volume. A representative automotive component was selected for the study and the results are presented in this paper

  20. Integrating Effectiveness, Transparency and Fairness into a Value Elicitation Process

    International Nuclear Information System (INIS)

    Fortier, Michael; Sheng, Grant

    2001-01-01

    As part of the evaluation of Canada's proposed nuclear fuel waste disposal concept, the Federal Environmental Assessment and Review Panel (FEARP) undertook an extensive, nation-wide public hearing process. The hearing process itself was contentious and has been criticized on numerous grounds. It is our contention that the fundamental weakness of the FEARP process was that it was designed as an information-based forum, as opposed to a value-based forum.' Our observations and analyses of these hearings indicate that the FEARP envisioned a different purpose and a different outcome of this process than the public in general. As a result, public acceptability for the Concept or even the assessment process itself was not garnered due to a failure in the process to identify, address and incorporate values. To address this, we proposed a seven-step value elicitation process specifically designed to assess public acceptability of the disposal concept. An unfortunate consequence of the flawed public consultation process employed by the FEARP is that it is unclear exactly what it is the public finds unacceptable. Both from discussions and observations, it is difficult to ascertain whether the unacceptability lies with the Concept itself and/or the process by which the Concept was to be assessed. As a result, there is uncertainty as to what questions should be asked and how should the 'unacceptability' be addressed. In other words, does Canada need a new concept? Does Canada need to develop a mechanism for assessing the public acceptability of the Concept? Or both? The inability of the current process to answer such fundamental questions demonstrates the importance of developing an effective public acceptability and consultation process. We submit that, to create an acceptable Public Participation mechanism, it is necessary to found the construction of such a mechanism on the principles of effectiveness, transparency and fairness. Moreover, we believe that the larger decision

  1. Integrating Effectiveness, Transparency and Fairness into a Value Elicitation Process

    Energy Technology Data Exchange (ETDEWEB)

    Fortier, Michael; Sheng, Grant [York Univ., Toronto, ON (Canada). Faculty of Environmental Studies; Collins, Alison [York Centre for Applied Sustainability, Toronto, ON (Canada)

    2001-07-01

    As part of the evaluation of Canada's proposed nuclear fuel waste disposal concept, the Federal Environmental Assessment and Review Panel (FEARP) undertook an extensive, nation-wide public hearing process. The hearing process itself was contentious and has been criticized on numerous grounds. It is our contention that the fundamental weakness of the FEARP process was that it was designed as an information-based forum, as opposed to a value-based forum.' Our observations and analyses of these hearings indicate that the FEARP envisioned a different purpose and a different outcome of this process than the public in general. As a result, public acceptability for the Concept or even the assessment process itself was not garnered due to a failure in the process to identify, address and incorporate values. To address this, we proposed a seven-step value elicitation process specifically designed to assess public acceptability of the disposal concept. An unfortunate consequence of the flawed public consultation process employed by the FEARP is that it is unclear exactly what it is the public finds unacceptable. Both from discussions and observations, it is difficult to ascertain whether the unacceptability lies with the Concept itself and/or the process by which the Concept was to be assessed. As a result, there is uncertainty as to what questions should be asked and how should the 'unacceptability' be addressed. In other words, does Canada need a new concept? Does Canada need to develop a mechanism for assessing the public acceptability of the Concept? Or both? The inability of the current process to answer such fundamental questions demonstrates the importance of developing an effective public acceptability and consultation process. We submit that, to create an acceptable Public Participation mechanism, it is necessary to found the construction of such a mechanism on the principles of effectiveness, transparency and fairness. Moreover, we believe that

  2. Supporting BPMN choreography with system integration artefacts for enterprise process collaboration

    NARCIS (Netherlands)

    Nie, H.; Lu, X.; Duan, H.

    2014-01-01

    Business Process Model and Notation (BPMN) choreography modelling depicts externally visible message exchanges between collaborating processes of enterprise information systems. Implementation of choreography relies on designing system integration solutions to realise message exchanges between

  3. Knowledge Integrated Business Process Management for Third Party Logistics Companies

    OpenAIRE

    Zhang, Hongyan

    2013-01-01

    The growing importance of logistics as well as the increasing dynamic complexity of markets, technologies, and customer needs has brought great challenges to logistics. In order to focus on their core competency in such a competitive environment, more and more companies have outsourced a part or the entirety of the logistics process to third party logistics (3PL) service providers. 3PL has played a crucial role in managing logistics processes within supply chain management. Logistics processe...

  4. Integrative Dynamic Reconfiguration in a Parallel Stream Processing Engine

    DEFF Research Database (Denmark)

    Madsen, Kasper Grud Skat; Zhou, Yongluan; Cao, Jianneng

    2017-01-01

    Load balancing, operator instance collocations and horizontal scaling are critical issues in Parallel Stream Processing Engines to achieve low data processing latency, optimized cluster utilization and minimized communication cost respectively. In previous work, these issues are typically tackled...... solution called ALBIC, which supports general jobs. We implement the proposed techniques on top of Apache Storm, an open-source Parallel Stream Processing Engine. The extensive experimental results over both synthetic and real datasets show that our techniques clearly outperform existing approaches.

  5. "COUPLED PROCESSES" AS DYNAMIC CAPABILITIES IN SYSTEMS INTEGRATION

    OpenAIRE

    Chagas Jr, Milton de Freitas; Leite, Dinah Eluze Sales; Jesus, Gabriel Torres de

    2017-01-01

    ABSTRACT The dynamics of innovation in complex systems industries is becoming an independent research stream. Apart from conventional uncertainties related to commerce and technology, complex-system industries must cope with systemic uncertainty. This paper's objective is to analyze evolving technological paths from one product generation to the next through two case studies in the Brazilian aerospace industry, considering systems integration as an empirical instantiation of dynamic capabilit...

  6. Enhancing Student Learning of Enterprise Integration and Business Process Orientation through an ERP Business Simulation Game

    Science.gov (United States)

    Seethamraju, Ravi

    2011-01-01

    The sophistication of the integrated world of work and increased recognition of business processes as critical corporate assets require graduates to develop "process orientation" and an "integrated view" of business. Responding to these dynamic changes in business organizations, business schools are also continuing to modify…

  7. Career-Oriented Performance Tasks in Chemistry: Effects on Students Integrated Science Process Skills

    OpenAIRE

    Allen A. Espinosa; Sheryl Lyn C. Monterola; Amelia E. Punzalan

    2013-01-01

    The study was conducted to assess the effectiveness of Career-Oriented Performance Task (COPT) approach against the traditional teaching approach (TTA) in enhancing students’ integrated science process skills. Specifically, it sought to find out if students exposed to COPT have higher integrated science process skills than those students exposed to the traditional teaching approach (TTA). Career-Oriented Performance Task (COPT) approach aims to integrate career-oriented examples and inquiry-b...

  8. Integrating system safety into the basic systems engineering process

    Science.gov (United States)

    Griswold, J. W.

    1971-01-01

    The basic elements of a systems engineering process are given along with a detailed description of what the safety system requires from the systems engineering process. Also discussed is the safety that the system provides to other subfunctions of systems engineering.

  9. Business process compliance management : an integrated proactive approach

    NARCIS (Netherlands)

    Elgammal, A.; Sebahi, S.; Turetken, O.; Hacid, M.-S.; Papazoglou, M.; van den Heuvel, W.; Soliman, K.S.

    2014-01-01

    Today’s enterprises demand a high degree of compliance of business processes to meet regulations, such as Sarbanes-Oxley and Basel I-III. To ensure continuous guaranteed compliance, compliance management should be considered during all phases of the business process lifecycle; from the analysis and

  10. Integrating ICT applications for farm business collaboration processes using FIspace

    NARCIS (Netherlands)

    Kruize, J.W.; Wolfert, J.; Goense, D.; Scholten, H.; Beulens, A.J.M.; Veenstra, T.

    2014-01-01

    Agri-Food Supply Chain Networks are required to increase production and to be transparent while reducing environmental impact. This challenges farm enterprises to innovate their production processes. These processes need to be supported by advanced ICT components that are developed by multiple

  11. Manufacturing Squares: An Integrative Statistical Process Control Exercise

    Science.gov (United States)

    Coy, Steven P.

    2016-01-01

    In the exercise, students in a junior-level operations management class are asked to manufacture a simple product. Given product specifications, they must design a production process, create roles and design jobs for each team member, and develop a statistical process control plan that efficiently and effectively controls quality during…
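
    A typical statistical process control plan of the kind the exercise calls for boils down to X-bar and R control charts. The snippet below computes their limits from subgroup data using the standard constants for subgroups of five (A2 = 0.577, D3 = 0, D4 = 2.114); the measurements are invented, standing in for the students' square dimensions.

```python
import numpy as np

# X-bar / R chart limits for subgroups of size 5 (standard SPC constants).
A2, D3, D4 = 0.577, 0.0, 2.114

# Invented measurements: 8 subgroups x 5 squares, side length in mm.
data = np.array([
    [50.1, 49.8, 50.0, 50.2, 49.9],
    [49.7, 50.3, 50.1, 49.9, 50.0],
    [50.2, 50.0, 49.8, 50.1, 50.1],
    [49.9, 49.9, 50.2, 50.0, 49.8],
    [50.0, 50.1, 49.7, 50.3, 50.0],
    [50.2, 49.8, 50.0, 49.9, 50.1],
    [49.8, 50.0, 50.1, 50.2, 49.9],
    [50.1, 49.9, 50.0, 49.8, 50.2],
])

xbar = data.mean(axis=1)                 # subgroup means
r = data.max(axis=1) - data.min(axis=1)  # subgroup ranges
xbarbar, rbar = xbar.mean(), r.mean()

print(f"X-bar chart: CL={xbarbar:.3f}  UCL={xbarbar + A2*rbar:.3f}  LCL={xbarbar - A2*rbar:.3f}")
print(f"R chart:     CL={rbar:.3f}  UCL={D4*rbar:.3f}  LCL={D3*rbar:.3f}")
out_of_control = np.where((xbar > xbarbar + A2*rbar) | (xbar < xbarbar - A2*rbar))[0]
print("subgroups out of control:", out_of_control)
```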

  12. IDEAS, for integral logistics in centralized wood processing

    NARCIS (Netherlands)

    Reinders, M.P.

    1989-01-01

    A decision support system (DSS) is developed to improve the quality of decision making in wood processing companies. Wood processing companies are businesses that import (buy, harvest) raw materials, stems and logs, and export (deliver) products, assortments and boards, after a multi-step

  13. Effect of different machining processes on the tool surface integrity and fatigue life

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Chuan Liang [College of Mechanical and Electrical Engineering, Nanchang University, Nanchang (China); Zhang, Xianglin [School of Materials Science and Engineering, Huazhong University of Science and Technology, Wuhan (China)

    2016-08-15

    Ultra-precision grinding, wire-cut electro-discharge machining and lapping are often used to machine tools in the fine blanking industry, and the surface integrity produced by these machining processes is of great concern in the research field. To study the effect of machined surface integrity on fine blanking tool life, the surface integrity of different tool materials under different processing conditions and its influence on fatigue life were thoroughly analyzed in the present study. The results show that the surface integrity of different materials differed considerably under the same processing condition. For the same tool material, the surface integrity under varying processing conditions also differed considerably and deeply influenced the fatigue life.

  14. Integrating artificial and human intelligence into tablet production process.

    Science.gov (United States)

    Gams, Matjaž; Horvat, Matej; Ožek, Matej; Luštrek, Mitja; Gradišek, Anton

    2014-12-01

    We developed a new machine learning-based method to facilitate the manufacturing of pharmaceutical products, such as tablets, in accordance with the Process Analytical Technology (PAT) and Quality by Design (QbD) initiatives. Our approach combines data available from prior production runs with machine learning algorithms that are assisted by a human operator with expert knowledge of the production process. The process parameters encompass those that relate to the attributes of the precursor raw materials and those that relate to the manufacturing process itself. During manufacturing, our method allows the production operator to inspect the impact of various settings of the process parameters within their proven acceptable range, with the purpose of choosing the most promising values in advance of the actual batch manufacture. The interaction between the human operator and the artificial intelligence system provides improved performance and quality. We successfully implemented the method on data provided by a pharmaceutical company for a particular product, a tablet, under development. We tested the accuracy of the method in comparison with some other machine learning approaches. The method is especially suitable for analyzing manufacturing processes characterized by a limited amount of data.
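
    The workflow described above, using a model built from prior batches to let an operator screen candidate settings inside the proven acceptable range before a batch is run, can be mimicked with any off-the-shelf regressor. The sketch below uses a random forest on made-up historical data and ranks a grid of candidate settings by predicted quality; it is only an illustration of the idea, not the system reported in the paper, and the parameter names and ranges are invented.

```python
import itertools
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Made-up historical batches: [granulation water %, compression force kN, API d50 um]
X_hist = rng.uniform([4.0, 8.0, 20.0], [8.0, 16.0, 60.0], size=(80, 3))
# Made-up quality attribute (e.g. dissolution at 30 min, %) with noise.
y_hist = 85 - 1.5*(X_hist[:, 0] - 6)**2 - 0.4*(X_hist[:, 1] - 12)**2 \
         - 0.05*(X_hist[:, 2] - 35) + rng.normal(0, 1.0, 80)

model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_hist, y_hist)

# Candidate settings restricted to the proven acceptable range of each parameter.
water = np.linspace(5.0, 7.0, 5)
force = np.linspace(10.0, 14.0, 5)
d50 = np.linspace(25.0, 45.0, 5)
candidates = np.array(list(itertools.product(water, force, d50)))

pred = model.predict(candidates)
best = candidates[np.argmax(pred)]
print(f"suggested settings: water={best[0]:.1f}%  force={best[1]:.1f} kN  "
      f"d50={best[2]:.0f} um  (predicted quality {pred.max():.1f})")
```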

  15. An Open Computing Infrastructure that Facilitates Integrated Product and Process Development from a Decision-Based Perspective

    Science.gov (United States)

    Hale, Mark A.

    1996-01-01

    Computer applications for design have evolved rapidly over the past several decades, and significant payoffs are being achieved by organizations through reductions in design cycle times. These applications are overwhelmed by the requirements imposed during complex, open engineering systems design. Organizations are faced with a number of different methodologies, numerous legacy disciplinary tools, and a very large amount of data. Yet they are also faced with few interdisciplinary tools for design collaboration or methods for achieving the revolutionary product designs required to maintain a competitive advantage in the future. These organizations are looking for a software infrastructure that integrates current corporate design practices with newer simulation and solution techniques. Such an infrastructure must be robust to changes in both corporate needs and enabling technologies. In addition, this infrastructure must be user-friendly, modular and scalable. This need is the motivation for the research described in this dissertation. The research is focused on the development of an open computing infrastructure that facilitates product and process design. In addition, this research explicitly deals with human interactions during design through a model that focuses on the role of a designer as that of decision-maker. The research perspective here is taken from that of design as a discipline with a focus on Decision-Based Design, Theory of Languages, Information Science, and Integration Technology. Given this background, a Model of IPPD is developed and implemented along the lines of a traditional experimental procedure: with the steps of establishing context, formalizing a theory, building an apparatus, conducting an experiment, reviewing results, and providing recommendations. Based on this Model, Design Processes and Specification can be explored in a structured and implementable architecture. An architecture for exploring design called DREAMS (Developing Robust

  16. Robustness in Railway Operations (RobustRailS)

    DEFF Research Database (Denmark)

    Jensen, Jens Parbo; Nielsen, Otto Anker

    This study considers the problem of enhancing railway timetable robustness without adding slack time and thereby increasing travel time. The approach integrates a transit assignment model to assess how passengers adapt their behaviour whenever operations are changed. First, the approach considers...

  17. Integrated Modeling of Process, Structures and Performance in Cast Parts

    DEFF Research Database (Denmark)

    Kotas, Petr

    This thesis deals with numerical simulations of gravity sand casting processes for the production of large steel parts. The entire manufacturing process is numerically modeled and evaluated, taking into consideration mould filling, solidification, solid state cooling and the subsequent stress build...... and to defects occurrence. In other words, it is desired to eliminate all of the potential casting defects and at the same time to maximize the casting yield. The numerical optimization algorithm then takes these objectives and searches for a set of the investigated process, design or material parameters, e.g. chill design, riser design, gating system design, etc., which would satisfy these objectives the most. The first step in the numerical casting process simulation is to analyze mould filling where the emphasis is put on the gating system design. There are still a lot of foundry specialists who ignore

  18. Dosimetry as an integral part of radiation processing

    International Nuclear Information System (INIS)

    Zagorski, Z.P.

    1999-01-01

    Different connections between high-dose dosimetry and radiation processing are discussed. Radiation processing cannot be performed without proper dosimetry. Accurate high-dose and high-dose-rate dosimetry has several aspects: first of all the preservation of product quality, then the fulfillment of legal requirements and, last but not least, the safety of processing. Further, seldom-discussed topics are addressed, such as dosimetric problems occurring with double-sided EB irradiation, discussed in connection with the deposition of electric charge during electron beam irradiation. Although dosimetry for basic research and for medical purposes is treated here only briefly, some conclusions reached in these fields are taken into account in dosimetry for radiation processing. High-dose dosimetry has become a separate field, with many papers published every year, but applied dosimetric projects are usually initiated by the needs of a particular application. (author)

  19. Synthesis and Design of Integrated Process and Water Networks

    DEFF Research Database (Denmark)

    Handani, Zainatul B.; Quaglia, Alberto; Gani, Rafiqul

    2015-01-01

    This work presents the development of a systematic framework for the simultaneous synthesis and design of process and water networks using a superstructure-based optimization approach. In this framework, a new superstructure combining both networks is developed by attempting to consider all possible options with respect to the topology of the process and water networks, leading to a Mixed Integer Nonlinear Programming (MINLP) problem. A solution strategy for the multi-network problem accounts explicitly for the interactions between the networks by selecting suitable technologies in order to transform raw materials into products and produce clean water to be reused in the process at the early stage of design. Since the connection between the process network and the wastewater treatment network is not straightforward, a new converter interval is introduced in order to convert

  20. Integration of distributed computing into the drug discovery process.

    Science.gov (United States)

    von Korff, Modest; Rufener, Christian; Stritt, Manuel; Freyss, Joel; Bär, Roman; Sander, Thomas

    2011-02-01

    Grid computing offers an opportunity to gain massive computing power at low costs. We give a short introduction into the drug discovery process and exemplify the use of grid computing for image processing, docking and 3D pharmacophore descriptor calculations. The principle of a grid and its architecture are briefly explained. More emphasis is laid on the issues related to a company-wide grid installation and embedding the grid into the research process. The future of grid computing in drug discovery is discussed in the expert opinion section. Most needed, besides reliable algorithms to predict compound properties, is embedding the grid seamlessly into the discovery process. User friendly access to powerful algorithms without any restrictions, that is, by a limited number of licenses, has to be the goal of grid computing in drug discovery.
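
    While the record above concerns a company-wide grid, the same embarrassingly parallel pattern (split a compound library into chunks, score each chunk on a different worker, merge the results) can be sketched with nothing more than the standard library. The scoring function below is a dummy placeholder for a docking or 3D-descriptor calculation, not code from the paper, and a process pool on one machine stands in for the grid.

```python
from concurrent.futures import ProcessPoolExecutor

def score_compound(smiles: str):
    """Dummy stand-in for an expensive docking / 3D-descriptor calculation."""
    score = sum(ord(c) for c in smiles) % 100 / 10.0   # placeholder 'score'
    return smiles, score

def score_library(smiles_list, workers=4):
    """Farm the per-compound work out to a pool of processes (a mini 'grid')."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(score_compound, smiles_list, chunksize=16))

if __name__ == "__main__":
    library = ["CCO", "c1ccccc1", "CC(=O)Oc1ccccc1C(=O)O", "CN1C=NC2=C1C(=O)N(C)C(=O)N2C"]
    results = score_library(library, workers=2)
    for smi, s in sorted(results.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{s:5.1f}  {smi}")
```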