WorldWideScience

Sample records for applying model production

  1. Shrinking core models applied to the sodium silicate production process

    Directory of Open Access Journals (Sweden)

    Stanković Mirjana S.

    2007-01-01

    Full Text Available The sodium silicate production process, with the molar ratio SiO2/Na2O = 2, for detergent zeolite 4A production, is based on quartz sand dissolving in NaOH aqueous solution, with a specific molality. It is a complex process performed at high temperature and pressure. It is of vital importance to develop adequate mathematical models, which are able to predict the dynamical response of the process parameters. A few kinetic models were developed within this study, which were adjusted and later compared to experimental results. It was assumed that SiO2 particles are smooth spheres, with uniform diameter. This diameter decreases during dissolving. The influence of particle diameter, working temperature and hydroxide ion molality on the dissolution kinetics was investigated. It was concluded that the developed models are sufficiently correct, in the engineering sense, and can be used for the dynamical prediction of process parameters.
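
    To make the shrinking-sphere idea above concrete, a minimal numerical sketch is given below. It assumes surface-reaction control, so the particle radius shrinks at a rate proportional to the hydroxide molality through an Arrhenius-type constant; all parameter values (k0, Ea, r0, temperature, molality) are illustrative placeholders, not values fitted in the study.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters only -- not fitted values from the study.
r0   = 100e-6      # initial SiO2 particle radius, m
m_oh = 5.0         # hydroxide ion molality, mol/kg
k0   = 2.0e-3      # pre-exponential constant, m*kg/(mol*s)  (placeholder)
Ea   = 60e3        # activation energy, J/mol                (placeholder)
R    = 8.314       # gas constant, J/(mol*K)
T    = 500.0       # working temperature, K

def k(temp):
    """Arrhenius-type rate constant for the surface reaction."""
    return k0 * np.exp(-Ea / (R * temp))

def shrink(t, r):
    """Surface-reaction control: the radius shrinks at dr/dt = -k(T)*m_OH."""
    return [-k(T) * m_oh]

t_end = r0 / (k(T) * m_oh)                     # time at which the particle vanishes
sol = solve_ivp(shrink, (0.0, t_end), [r0], dense_output=True)

t = np.linspace(0.0, t_end, 200)
r = np.clip(sol.sol(t)[0], 0.0, None)
X = 1.0 - (r / r0) ** 3                        # dissolved fraction of a particle
print(f"Half of the particle volume dissolves after ~{t[np.searchsorted(X, 0.5)]/3600:.1f} h")
```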

  2. A system simulation model applied to the production schedule of a fish processing facility

    Directory of Open Access Journals (Sweden)

    Carla Roberta Pereira

    2012-11-01

    Full Text Available Simulation seeks to bring reality into a controlled environment, where its behavior can be studied under several conditions without physical risks and/or high costs. System simulation thus becomes a useful and powerful technique in emerging markets, such as the tilapia farming sector, which needs to expand its business. The main purpose of this study was the development of a simulation model to assist decision making in the production scheduling of a fish processing facility. The research methods applied were the case study and modeling/simulation, including the SimuCAD methodology and the development phases of a simulation model. The model works with several alternative scenarios, testing different working shifts, types of flows and production capacities, besides variations in ending inventory and sales. The result of this research was a useful and differentiated simulation model to assist decision making in the production scheduling of the fish processing facility studied.

  3. Mercury's geochronology revised by applying Model Production Functions to Mariner 10 data: geological implications

    CERN Document Server

    Massironi, M; Marchi, S; Martellato, M; Mottola, M; Wagner, R J

    2009-01-01

    Model Production Function (MPF) chronology uses dynamic models of the Main Belt Asteroids (MBAs) and Near Earth Objects (NEOs) to derive the impactor flux to a target body. This is converted into the crater size-frequency distribution for a specific planetary surface, and calibrated using the radiometric ages of different regions of the Moon's surface. This new approach has been applied to the crater counts on Mariner 10 images of the highlands and of several large impact basins on Mercury. MPF estimates for the plains show younger ages than those of previous chronologies. Assuming a variable uppermost layering of the Hermean crust, the age of the Caloris interior plains may be as young as 3.59 Ga, in agreement with MESSENGER results that imply that long-term volcanism overcame contractional tectonics. The MPF chronology also suggests a variable projectile flux through time, consistent with the MBAs for ancient periods and gradually becoming comparable to the NEOs.
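
    The MPF calibration itself is not reproduced here, but the generic step it shares with other chronologies, inverting a calibrated crater-density-versus-age curve to obtain a model age, can be sketched as follows. The chronology function below is a lunar-like Neukum-style stand-in, not the MPF, and the observed crater density is a hypothetical placeholder.

```python
import numpy as np
from scipy.optimize import brentq

# Illustrative lunar-like chronology (Neukum-style form); the MPF calibration differs.
a, b, c = 5.44e-14, 6.93, 8.38e-4          # N(>= 1 km) per km^2, age T in Ga

def n1(T_ga):
    """Cumulative density of craters >= 1 km in diameter as a function of surface age."""
    return a * (np.exp(b * T_ga) - 1.0) + c * T_ga

# Hypothetical measured crater density for a plains unit (placeholder value).
n_obs = 3.0e-3                              # craters >= 1 km per km^2

# The chronology function is monotonic, so the model age follows from a root find.
age = brentq(lambda T: n1(T) - n_obs, 0.0, 4.5)
print(f"Model age ~ {age:.2f} Ga")
```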

  4. Assessing switchability for biosimilar products: modelling approaches applied to children's growth.

    Science.gov (United States)

    Belleli, Rossella; Fisch, Roland; Renard, Didier; Woehling, Heike; Gsteiger, Sandro

    2015-01-01

    The present paper describes two statistical modelling approaches that have been developed to demonstrate switchability from the original recombinant human growth hormone (rhGH) formulation (Genotropin®) to a biosimilar product (Omnitrope®) in children suffering from growth hormone deficiency. Demonstrating switchability between rhGH products is challenging because the process of growth varies with the age of the child and across children. The first modelling approach aims at predicting individual height measured at several time-points after switching to the biosimilar. The second modelling approach provides an estimate of the deviation from the overall growth rate after switching to the biosimilar, which can be regarded as an estimate of switchability. The results after applying these approaches to data from a randomized clinical trial are presented. The accuracy and precision of the predictions made using the first approach and the small deviation from switchability estimated with the second approach provide sufficient evidence to conclude that switching from Genotropin® to Omnitrope® has a very small effect on growth, which is neither statistically significant nor clinically relevant.

  5. Modeling environmental and human health risks of veterinary medicinal products applied in pond aquaculture.

    Science.gov (United States)

    Rico, Andreu; Geng, Yue; Focks, Andreas; Van den Brink, Paul J

    2013-04-01

    A model called ERA-AQUA was developed to assess the risks posed by the use of veterinary medicinal products (VMPs) applied in aquaculture ponds for the targeted produce, surrounding aquatic ecosystems, consumers, and trade of the aquaculture produce. The model calculates risks by following a risk quotient approach, calculating predicted exposure concentrations (exposure assessment) and predicted no-effect concentrations (effect assessment) for the endpoint under study. The exposure assessment is performed by combining information on the environmental characteristics of the aquaculture pond, characteristics of the cultured species, aquaculture management practices, and physicochemical properties of the compound under study. The model predicts concentrations of VMPs in the pond water, pond sediment, cultured species, and watercourse receiving pond effluent discharges by mass balance equations. The effect assessment is performed by combining (eco)toxicological information and food safety threshold concentrations for the studied compound. In the present study, the scientific background, strengths, and limitations of the ERA-AQUA model are presented together with a sensitivity analysis and an example showing its potential applications.
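
    The risk-quotient step of the approach described above can be illustrated in a few lines. The endpoint names and the PEC/PNEC values below are invented placeholders; ERA-AQUA derives its exposure concentrations from mass-balance equations for pond water, sediment, cultured species and the receiving watercourse.

```python
# Illustrative risk-quotient calculation (RQ = PEC / PNEC). The names and numbers
# are placeholders; ERA-AQUA derives PECs from mass-balance equations instead.
endpoints = {
    # endpoint: (predicted exposure concentration, predicted no-effect concentration)
    "pond water (ug/L)":       (12.0, 4.0),
    "receiving stream (ug/L)": (0.8, 4.0),
    "cultured fish (ug/kg)":   (50.0, 200.0),
}

for name, (pec, pnec) in endpoints.items():
    rq = pec / pnec
    verdict = "potential risk" if rq >= 1.0 else "acceptable"
    print(f"{name:25s} RQ = {rq:5.2f} -> {verdict}")
```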

  6. Agents Modeling Experience Applied To Control Of Semi-Continuous Production Process

    Directory of Open Access Journals (Sweden)

    Gabriel Rojek

    2014-01-01

    Full Text Available The lack of proper analytical models of some production processes prevents us from obtaining proper parameter values simply by computing optimal values. Possible solutions to control problems in such areas of industrial processing can be found using methods from the domain of artificial intelligence: neural networks, fuzzy logic, expert systems, or evolutionary algorithms. The solution to such a control problem presented in this work is an alternative approach that combines control of the industrial process with learning based on production results. The main assumptions of the proposed methodology take into consideration the decision processes of a human operator using his experience. The model of gathering and using human experience is designed with the contribution of agent technology. The presented solution of the control problem coincides with the case-based reasoning (CBR) methodology.

  7. Applying Neural Network to Dynamic Modeling of Biosurfactant Production Using Soybean Oil Refinery Wastes

    Directory of Open Access Journals (Sweden)

    Shokoufe Tayyebi

    2013-01-01

    Full Text Available Biosurfactants are surface-active compounds produced by various microorganisms. Production of biosurfactants via fermentation of immiscible wastes has the dual benefit of creating economic opportunities for manufacturers while improving environmental health. A predictor system is recommended when such processes are to be scaled up. Hence, four neural networks were developed for dynamic modeling of the biosurfactant production kinetics in the presence of soybean oil or refinery wastes including acid oil, deodorizer distillate and soap stock. Each proposed feed-forward neural network consists of three layers which are not fully connected. The input and output data for training and validation of the neural network models were gathered from batch fermentation experiments. The proposed neural network models were evaluated by three statistical criteria (R2, RMSE and SE). The regression analysis showed high correlation coefficients, greater than 0.971, demonstrating that the neural network is an excellent estimator for predicting biosurfactant production kinetic data in a two-phase liquid-liquid batch fermentation system. In addition, sensitivity analysis indicates that residual oil has the most significant effect (49%) on the biosurfactant in the process.
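
    A minimal sketch of the kind of three-layer feed-forward model and the R2/RMSE/SE evaluation mentioned above, using scikit-learn and synthetic fermentation-like data in place of the batch experiments; the networks in the study are not fully connected, and their exact architecture is not reproduced here.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)

# Synthetic stand-in for batch fermentation data: inputs are time (h) and residual
# oil (g/L); the output is biosurfactant concentration (g/L).
t   = rng.uniform(0, 96, 200)
oil = 40.0 * np.exp(-0.03 * t) + rng.normal(0, 1.0, t.size)
y   = 8.0 / (1.0 + np.exp(-(t - 48.0) / 10.0)) + rng.normal(0, 0.3, t.size)
X   = np.column_stack([t, oil])

# One hidden layer gives three layers in total (input, hidden, output).
model = MLPRegressor(hidden_layer_sizes=(8,), solver="lbfgs", max_iter=5000, random_state=0)
model.fit(X[:150], y[:150])

pred = model.predict(X[150:])
r2   = r2_score(y[150:], pred)
rmse = mean_squared_error(y[150:], pred) ** 0.5
se   = np.std(y[150:] - pred, ddof=1) / np.sqrt(pred.size)   # standard error of the residuals
print(f"R2 = {r2:.3f}, RMSE = {rmse:.3f}, SE = {se:.3f}")
```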

  8. Optimization of grapevine yield by applying mathematical models to obtain quality wine products

    Science.gov (United States)

    Alina, Dobrei; Alin, Dobrei; Eleonora, Nistor; Teodor, Cristea; Marius, Boldea; Florin, Sala

    2016-06-01

    The relationship between crop load and grape yield and quality is a dynamic process, specific to wine cultivars and to fresh-consumption varieties. Modeling these relations is important for the improvement of technological works. This study evaluated the interrelationship of crop load (B - buds number) and several production parameters (Y - yield; S - sugar; A - acidity; GaI - glucoacidimetric index; AP - alcoholic potential; F - flavorings; WA - wine alcohol; SR - sugar residue, in the Muscat Ottonel wine cultivar; and Y - yield; S - sugar; A - acidity; GaI - glucoacidimetric index; CP - commercial production; BS - berries size in the Victoria table grape cultivar). In both varieties, correlations were identified between the independent variable (B - buds number, as a result of pruning and training practices) and the quality parameters analyzed (r = -0.699 for the B vs Y relationship; r = 0.961 for B vs S; r = -0.959 for B vs AP; r = 0.743 for Y vs S, p < 0.01, in the Muscat Ottonel cultivar; respectively r = -0.907 for B vs Y; r = -0.975 for B vs CP; r = -0.971 for B vs BS; r = 0.990 for CP vs BS in the Victoria cultivar). Through regression analysis, models were obtained that describe the variation of the production and quality parameters in relation to the independent variable (B - buds number), with statistically significant results.
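
    A short sketch of the correlation and regression analysis described above, using invented numbers in place of the trial data: it computes a Pearson coefficient between bud number and yield and fits a simple quadratic regression of the kind used to describe yield variation with crop load.

```python
import numpy as np
from scipy import stats

# Invented stand-in for the trial data: bud number per vine (B) and yield (Y, kg/vine).
B = np.array([12, 16, 20, 24, 28, 32, 36, 40], dtype=float)
Y = np.array([3.1, 3.6, 4.0, 4.1, 3.9, 3.6, 3.2, 2.8])

r, p = stats.pearsonr(B, Y)
print(f"Pearson r = {r:.3f} (p = {p:.3f})")

# A quadratic regression captures the rise-and-fall of yield with crop load.
c2, c1, c0 = np.polyfit(B, Y, deg=2)
print(f"Y ~ {c2:.4f}*B^2 + {c1:.3f}*B + {c0:.2f}")
```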

  9. Applying complex models to poultry production in the future--economics and biology.

    Science.gov (United States)

    Talpaz, H; Cohen, M; Fancher, B; Halley, J

    2013-09-01

    The ability to determine the optimal broiler feed nutrient density that maximizes margin over feeding cost (MOFC) has obvious economic value. To determine optimal feed nutrient density, one must consider ingredient prices, meat values, the product mix being marketed, and the projected biological performance. A series of 8 feeding trials was conducted to estimate biological responses to changes in ME and amino acid (AA) density. Eight different genotypes of sex-separate reared broilers were fed diets varying in ME (2,723-3,386 kcal of ME/kg) and AA (0.89-1.65% digestible lysine, with all essential AAs indexed to lysine) levels. Broilers were processed to determine carcass component yield at many different BW (1.09-4.70 kg). The trial data generated were used in a model constructed to discover the dietary levels of ME and AA that maximize MOFC on a per-broiler or per-broiler annualized basis (bird × number of cycles/year). The model was designed to estimate the effects of dietary nutrient concentration on broiler live weight, feed conversion, mortality, and carcass component yield. Estimated coefficients from the step-wise regression process are subsequently used to predict the optimal ME and AA concentrations that maximize MOFC. The effects of changing feed or meat prices across a wide spectrum on optimal ME and AA levels can be evaluated via parametric analysis. The model can rapidly compare both biological and economic implications of changing from current practice to the simulated optimal solution. The model can be exploited to enhance decision making under volatile market conditions.
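
    A compact sketch of the margin-over-feeding-cost optimization described above. The quadratic response surfaces and prices are invented placeholders (the study estimated its coefficients by step-wise regression on the trial data); the point is only to show how optimal ME and digestible lysine levels can be located within the tested ranges.

```python
import numpy as np
from scipy.optimize import minimize

# Placeholder quadratic response surfaces (NOT the trial estimates): live body
# weight (kg) and feed conversion ratio as functions of ME (kcal/kg) and dLys (%).
def body_weight(me, lys):
    return (2.0 + 4.0e-4 * (me - 2723) + 0.9 * (lys - 0.89)
            - 8.0e-8 * (me - 2723) ** 2 - 0.55 * (lys - 0.89) ** 2)

def fcr(me, lys):
    return (1.95 - 2.5e-4 * (me - 2723) - 0.30 * (lys - 0.89)
            + 5.0e-8 * (me - 2723) ** 2 + 0.25 * (lys - 0.89) ** 2)

meat_price = 1.40                                  # $/kg live weight (placeholder)

def feed_price(me, lys):                           # $/kg feed rises with nutrient density
    return 0.28 + 6.0e-5 * (me - 2723) + 0.10 * (lys - 0.89)

def neg_mofc(x):
    """Negative margin over feeding cost per broiler."""
    me, lys = x
    bw = body_weight(me, lys)
    feed_used = fcr(me, lys) * bw                  # kg feed consumed per bird
    return -(meat_price * bw - feed_price(me, lys) * feed_used)

res = minimize(neg_mofc, x0=[3000.0, 1.10], method="L-BFGS-B",
               bounds=[(2723, 3386), (0.89, 1.65)])
me_opt, lys_opt = res.x
print(f"Optimal ME = {me_opt:.0f} kcal/kg, dLys = {lys_opt:.2f}%, MOFC = {-res.fun:.2f} $/bird")
```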

  10. A multi-layered approach to product architecture modeling: Applied to technology prototypes

    DEFF Research Database (Denmark)

    Ravn, Poul Martin; Guðlaugsson, Tómas Vignir; Mortensen, Niels Henrik

    2016-01-01

    Companies that wish to include novel technology in the product portfolio may need to test and evaluate the technology with the use of prototypes to learn its benefits. Without clear knowledge of the benefits of the technology to the products in the portfolio, in the form of increased performance...

  11. The Toyota product development system applied to a design management workshop model

    DEFF Research Database (Denmark)

    Thyssen, Mikael Hygum; Emmitt, Stephen; Bonke, Sten

    2008-01-01

    Within a lean framework the goal is to enhance productivity by maximizing client value and minimizing waste, known as muda. In the construction industry, focus has mainly been on minimizing waste within the construction site production process. However, research has shown that a great amount of the waste experienced during site assembly can be traced back to the early design phase. In addition, minimizing waste does not guarantee overall project success if client values are not fully understood. Indeed, it is possible to effectively produce a product that the client does not value. This paper reports the early findings of a research project which aims to develop a workshop method for lean design management in construction through a deeper understanding of the Toyota product development system (TPDS) and value theory in general. Results from a case study will be presented and a theoretical...

  12. Applied impulsive mathematical models

    CERN Document Server

    Stamova, Ivanka

    2016-01-01

    Using the theory of impulsive differential equations, this book focuses on mathematical models which reflect current research in biology, population dynamics, neural networks and economics. The authors provide the basic background from the fundamental theory and give a systematic exposition of recent results related to the qualitative analysis of impulsive mathematical models. Consisting of six chapters, the book presents many applicable techniques, making them available in a single source easily accessible to researchers interested in mathematical models and their applications. Serving as a valuable reference, this text is addressed to a wide audience of professionals, including mathematicians, applied researchers and practitioners.

  13. A Hybrid MCDM Model for New Product Development: Applied on the Taiwanese LiFePO4 Industry

    Directory of Open Access Journals (Sweden)

    Wen-Chin Chen

    2015-01-01

    Full Text Available In recent years, as problems of atmospheric pollution have pushed countries to accentuate green-related policies on sustainable energy, the lithium-iron phosphate (LiFePO4) battery has appealed to the world. However, more and more firms are investing in LiFePO4 battery production, which has launched fierce competition. Successful new product development (NPD) processes have been considered the key for LiFePO4 battery firms to increase their competitive advantage. Firms must make correct decisions faster due to the rapid development of technology and decreasing product life cycles. This study proposes a hybrid multiple criteria decision making (MCDM) model based on a literature review and consultation with experts, interpretive structural modeling (ISM), and the fuzzy analytic network process (FANP) for evaluating various strategies for NPD. First of all, a review of the literature and meetings with the experts are used to screen factors and select the criteria. Then, an ISM is used to determine the feedback and interdependency of those factors in a network. Finally, fuzzy theory is applied to resolve the linguistic hedges and an ANP is adopted to obtain the weights of all the factors. A case study is undertaken to validate the model in a Taiwanese company that provides professional packing and design for lithium-iron phosphate batteries.

  14. Performance Comparison Between Logistic and Generalized Surplus-Production Models Applied to the Sillago sihama Fishery in Pakistan

    Institute of Scientific and Technical Information of China (English)

    Sher Khan Panhwar; LIU Qun; Shabir Ali Amir; Muhsan Ali Kalhoro

    2012-01-01

    The catch and effort data of the Sillago sihama fishery in Pakistani waters were used to investigate the performance of two closely related stock assessment models: the logistic and generalized surplus-production models. Compared with the generalized production model, the logistic model produced more reasonable estimates for parameters such as maximum sustainable yield. The Akaike Information Criterion values were estimated at 4.265 and -51.152, respectively, by the logistic and generalized models. Simulation analyses of the S. sihama fishery showed that the estimated and observed abundance indices for the logistic model were closer than those for the generalized production model. Standardized residuals were distributed more closely for the logistic model, but exhibited a slightly increasing trend for the generalized model. Statistical outliers were seen in 1989 and 1993 for the logistic model, and in 1981 and 1999 for the generalized model. Simulated results revealed that the logistic estimates were close to the true value for low CVs (coefficients of variation) but widely dispersed for high CVs. In contrast, the generalized model estimates were loose at all CV levels. The estimated production model curve parameter φ was not reasonable at any of the tested levels of white noise. With the increase in white noise, R2 for the catch per unit effort decreased. Therefore, we conclude that the logistic model performs more reasonably than the generalized production model.
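
    A minimal sketch of fitting the logistic (Schaefer) surplus-production model to catch-and-effort data and computing MSY and an AIC value; the series below are synthetic stand-ins for the S. sihama data, and the generalized (shape-parameter) model compared in the study is not implemented here.

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic catch and CPUE series standing in for the S. sihama data.
catch = np.array([120, 150, 180, 210, 230, 240, 235, 220, 200, 190], dtype=float)
cpue  = np.array([1.00, 0.96, 0.90, 0.82, 0.74, 0.66, 0.60, 0.57, 0.55, 0.54])

def predict_cpue(params):
    """Schaefer dynamics B[t+1] = B[t] + r*B[t]*(1 - B[t]/K) - C[t], with CPUE = q*B."""
    r, K, q = np.maximum(params, 1e-8)          # keep parameters positive
    B = np.empty(catch.size)
    B[0] = K                                    # assume the stock starts at carrying capacity
    for t in range(catch.size - 1):
        B[t + 1] = max(B[t] + r * B[t] * (1.0 - B[t] / K) - catch[t], 1e-6)
    return q * B

def sse(params):
    """Lognormal observation-error sum of squares."""
    return np.sum((np.log(cpue) - np.log(predict_cpue(params))) ** 2)

res = minimize(sse, x0=[0.4, 3000.0, 3e-4], method="Nelder-Mead")
r, K, q = res.x
msy = r * K / 4.0                               # MSY of the logistic model
n, k = cpue.size, 3
aic = n * np.log(res.fun / n) + 2 * k           # least-squares approximation of AIC
print(f"r = {r:.3f}, K = {K:.0f}, q = {q:.2e}, MSY = {msy:.0f}, AIC = {aic:.2f}")
```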

  15. Applied Bayesian modelling

    CERN Document Server

    Congdon, Peter

    2014-01-01

    This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBU

  16. Applying the Context, Input, Process, Product Evaluation Model for Evaluation, Research, and Redesign of an Online Master’s Program

    Directory of Open Access Journals (Sweden)

    Hatice Sancar Tokmak

    2013-07-01

    Full Text Available This study aimed to evaluate and redesign an online master’s degree program consisting of 12 courses from the informatics field using a context, input, process, product (CIPP evaluation model. Research conducted during the redesign of the online program followed a mixed methodology in which data was collected through a CIPP survey, focus-group interview, and open-ended questionnaire. An initial CIPP survey sent to students, which had a response rate of approximately 60%, indicated that the Fuzzy Logic course did not fully meet the needs of students. Based on these findings, the program managers decided to improve this course, and a focus group was organized with the students of the Fuzzy Logic course in order to obtain more information to help in redesigning the course. Accordingly, the course was redesigned to include more examples and visuals, including videos; student-instructor interaction was increased through face-to-face meetings; and extra meetings were arranged before exams so that additional examples could be presented for problem-solving to satisfy students about assessment procedures. Lastly, the modifications to the Fuzzy Logic course were implemented, and the students in the course were sent an open-ended form asking them what they thought about the modifications. The results indicated that most students were pleased with the new version of the course.

  17. Terahertz spectroscopy applied to food model systems

    DEFF Research Database (Denmark)

    Møller, Uffe

    Water plays a crucial role in the quality of food. Apart from the natural water content of a food product, the state of that water is very important. Water can be found integrated into the biological material or it can be added during production of the product. Currently it is difficult to differentiate between these types of water in subsequent quality controls. This thesis describes terahertz time-domain spectroscopy applied on aqueous food model systems, with particular focus on ethanol-water mixtures and confined water pools in inverse micelles.

  18. Risk assessment of topically applied products

    DEFF Research Database (Denmark)

    Søborg, Tue; Basse, Line Hollesen; Halling-Sørensen, Bent

    2007-01-01

    The human risk of harmful substances in semisolid topical dosage forms applied topically to normal skin and broken skin, respectively, was assessed. Bisphenol A diglycidyl ether (BADGE) and three derivatives of BADGE previously quantified in aqueous cream and the UV filters 3-BC and 4-MBC were used...... as model compounds. Tolerable daily intake (TDI) values have been established for BADGE and derivatives. Endocrine disruption was chosen as endpoint for 3-BC and 4-MBC. Skin permeation of the model compounds was investigated in vitro using pig skin membranes. Tape stripping was applied to simulate broken...... parameters for estimating the risk. The immediate human risk of BADGE and derivatives in topical dosage forms was found to be low. However, local treatment of broken skin may lead to higher exposure of BADGE and derivatives compared to application to normal skin. 3-BC permeated skin at higher flux than 4-MBC...

  19. Exergy analysis applied to biodiesel production

    Energy Technology Data Exchange (ETDEWEB)

    Talens, Laura; Villalba, Gara [SosteniPra UAB-IRTA. Environmental Science and Technology Institute (ICTA), Edifici Cn, Universitat Autonoma de Barcelona (UAB), 08193 Bellaterra, Cerdanyola del Valles, Barcelona (Spain); Gabarrell, Xavier [SosteniPra UAB-IRTA. Environmental Science and Technology Institute ICTA, Edifici Cn, Universitat Autonoma de Barcelona (UAB), 08193 Bellaterra (Cerdanyola del Valles), Barcelona (Spain); Department of Chemical Engineering, Universitat Autonoma de Barcelona (UAB), 08193 Bellaterra Cerdanyola del Valles, Barcelona (Spain)

    2007-08-15

    In our aim to decrease the consumption of materials and energy and promote the use of renewable resources such as biofuels, the need arises to measure material and energy fluxes. This paper suggests the use of Exergy Flow Analysis (ExFA) as an environmental assessment tool to account for wastes and emissions, determine the exergetic efficiency, and compare substitutes and other types of energy sources: all useful in defining environmental and economic policies for resource use. In order to illustrate how ExFA is used, it is applied to the process of biodiesel production. The results show that the production process has a low exergy loss (492 MJ). The exergy loss is reduced by using potassium hydroxide and sulphuric acid as process catalysts and it can be further minimised by improving the quality of the used cooking oil. (author)

  20. Production models

    DEFF Research Database (Denmark)

    Svensson, Carsten

    2002-01-01

    The Project is co-financed with Nilpeter A/S and investigates the industrialization of build to order production. Project content: - Enterprise engineering - Specification processes - Mass Customization/ Build To Order - Knowledge/information management - Configuration - Supply Chain Management...

  1. DATA CRYSTALLIZATION APPLIED FOR DESIGNING NEW PRODUCTS

    Institute of Scientific and Technical Information of China (English)

    Kenichi HORIE; Yoshiharu MAENO; Yukio OHSAWA

    2007-01-01

    It is only the observable part of the real world that can be stored in data. For such incomplete and ill-structured data, data crystallization aims at presenting the hidden structure among events, including unobservable events. This is realized by inserting dummy items, corresponding to the potential existence of unobservable events, into the given data. These dummy items and their relations with observable events are visualized by applying KeyGraph to the data with dummy items, like the crystallization of snow, where dust is involved in the formation of crystals of water molecules. For tuning the granularity level of the structure to be visualized, the tool of data crystallization is integrated with the human process of understanding significant scenarios in the real world. This basic method is expected to be applicable to various real-world domains where previous methods of chance discovery lead humans to successful decision making. In this paper, we apply data crystallization with human-interactive annealing (DCHA) to the design of products in a real company. The results show its effect on industrial decision making.

  2. Assessing exposure to transformation products of soil-applied organic contaminants in surface water: comparison of model predictions and field data.

    Science.gov (United States)

    Kern, Susanne; Singer, Heinz; Hollender, Juliane; Schwarzenbach, René P; Fenner, Kathrin

    2011-04-01

    Transformation products (TPs) of chemicals released to soil, for example, pesticides, are regularly detected in surface and groundwater with some TPs even dominating observed pesticide levels. Given the large number of TPs potentially formed in the environment, straightforward prioritization methods based on available data and simple, evaluative models are required to identify TPs with a high aquatic exposure potential. While different such methods exist, none of them has so far been systematically evaluated against field data. Using a dynamic multimedia, multispecies model for TP prioritization, we compared the predicted relative surface water exposure potential of pesticides and their TPs with experimental data for 16 pesticides and 46 TPs measured in a small river draining a Swiss agricultural catchment. Twenty TPs were determined quantitatively using solid-phase extraction liquid chromatography mass spectrometry (SPE-LC-MS/MS), whereas the remaining 26 TPs could only be detected qualitatively because of the lack of analytical reference standards. Accordingly, the two sets of TPs were used for quantitative and qualitative model evaluation, respectively. Quantitative comparison of predicted with measured surface water exposure ratios for 20 pairs of TPs and parent pesticides indicated agreement within a factor of 10, except for chloridazon-desphenyl and chloridazon-methyl-desphenyl. The latter two TPs were found to be present in elevated concentrations during baseflow conditions and in groundwater samples across Switzerland, pointing toward high concentrations in exfiltrating groundwater. A simple leaching relationship was shown to qualitatively agree with the observed baseflow concentrations and to thus be useful in identifying TPs for which the simple prioritization model might underestimate actual surface water concentrations. Application of the model to the 26 qualitatively analyzed TPs showed that most of those TPs categorized as exhibiting a high aquatic

  3. From Product Models to Product State Models

    DEFF Research Database (Denmark)

    Larsen, Michael Holm

    1999-01-01

    A well-known technology designed to handle product data is Product Models. Product Models are, in their current form, not able to handle all types of product state information. Hence, the concept of a Product State Model (PSM) is proposed. The PSM, and in particular how to model a PSM, is the research object of this project. In the presentation, benefits and challenges of the PSM will be presented as a basis for the discussion.

  4. Product personality: from analysing to applying

    NARCIS (Netherlands)

    Pourtalebi Hendehkhaleh, S.; Pouralvar, K.

    2012-01-01

    Nowadays products are expected to undertake their functions properly and the competition for satisfying consumer is in the field of product attachments and emotional characteristics. Products have a symbolic meaning in addition to their utilitarian benefits. This symbolic meaning that refers to phys

  5. Toxicity Tests Applied to the Biocidal Products

    OpenAIRE

    Karabay Yavaşoğlu, N.Ülkü

    2015-01-01

    Biocides are defined as chemical substances used to suppress, destroy, deter, render harmless, or exert a controlling effect on any harmful organism to human or animal health, or that cause damage to natural or manufactured materials. Biocidal products (BPs) containing biocides are disinfectants, products related to human and veterinary hygiene, products used for pests such as insects, rodents etc., repellents and industrial chemicals like anti-fouling paints for ship and material preservativ...

  6. Applying the Context, Input, Process, Product Evaluation Model for Evaluation, Research, and Redesign of an Online Master's Program

    Science.gov (United States)

    Sancar Tokmak, Hatice; Meltem Baturay, H.; Fadde, Peter

    2013-01-01

    This study aimed to evaluate and redesign an online master's degree program consisting of 12 courses from the informatics field using a context, input, process, product (CIPP) evaluation model. Research conducted during the redesign of the online program followed a mixed methodology in which data was collected through a CIPP survey,…

  7. Extended exergy accounting applied to biodiesel production

    Energy Technology Data Exchange (ETDEWEB)

    Talens Peiro, L. [SosteniPrA (UAB-IRTA), Institute of Environmental Science and Technology (ICTA), Edifici Q-ETSE, Room QC 3101, Universitat Autonoma de Barcelona (UAB), E-08193 Bellaterra (Cerdanyola del Valles), Barcelona (Spain); Villalba Mendez, G.; Gabarrell i Durany, X. [SosteniPrA (UAB-IRTA), Institute of Environmental Science and Technology (ICTA), Edifici Q-ETSE, Room QC 3101, Universitat Autonoma de Barcelona (UAB), E-08193 Bellaterra (Cerdanyola del Valles), Barcelona (Spain); Department of Chemical Engineering, Edifici Q, Universitat Autonoma de Barcelona (UAB), E- 08193, Bellaterra (Cerdanyola del Valles), Barcelona (Spain); Sciubba, E. [Department of Mechanical and Aeronautical Engineering, University of Roma 1 ' ' La Sapienza' ' , Via Eudossiana 18, I-00184 Roma (Italy)

    2010-07-15

    When evaluating the production of renewable energies such as biofuels, it is necessary to include in the assessment the resource inputs, capital, labor investment and environmental remediation costs. Extended Exergy Accounting (EEA) is a system analysis method that calculates, on the basis of detailed mass and exergy balances, the total amount of primary exergy resources necessary to obtain a product or service. The conceptual novelty of EEA is that it also includes externalities (capital, labor and environmental impact) measured in homogeneous units (Joules). As an illustration of EEA, we assess and compare the production of 1 ton of biodiesel from used cooking oil (UCOME) and rapeseed crops (RME). The extended exergy contents of UCOME and RME are 51.90 GJ and 77.05 GJ, respectively. The production of UCOME uses 25.15 GJ less resources (materials and energy) and requires lower total investments and environmental remediation costs than that of RME. On the other hand, UCOME requires 35% more work-hours. In summary, the extended exergy content of RME is about 1.5 times that of UCOME (77.05/51.90 ≈ 1.48). Thus, we can conclude that biodiesel production from UCO is less resource-use intensive than production from RME. (author)

  8. Self-energy production applied to buildings

    Energy Technology Data Exchange (ETDEWEB)

    Carlo, Fabricio Ramos del; Balestieri, Jose Antonio Perrella [Sao Paulo State University Julio de Mesquita Filho (UNESP), Guaratingueta, SP (Brazil)], E-mail: perrella@feg.unesp.br; Holanda, Marcelo Rodrigues de [Sao Paulo Univ. (EEL/USP), Lorena, SP (Brazil). Engineering School], E-mail: marcelo@debas.eel.usp.br

    2010-07-01

    The decentralization of energy production, in order to obtain better environmental conditions, reduce greenhouse gas emissions and lower the cost of the electricity and thermal energy consumed in residential buildings, has been proposed in the literature. This paper sets out to demonstrate the prospects of a micro-cogeneration system for residential application. In this study, we consider the technologies involved and their possible inputs, which are arranged in a superstructure to be studied. As a first step we obtain the cost of the products generated by the configuration, which consists basically of two sources of power generation, and through optimization calculations we obtain the best configuration, taking into consideration the selection among four fuels, two generating units (Fuel Cell and Internal Combustion Engine) and three levels of energy production for each one. An economic analysis is also presented to evaluate the opportunity of selling the generated energy, considering the fluctuations of the residential building's consumption needs. (author)

  9. Applying Product Configuration Systems in Engineering Companies

    DEFF Research Database (Denmark)

    Ladeby, Klaes Rohde

    further. “[In order] to be strategic, a capability must be honed to a user need (so that there are customers), unique (so that the products/services produced can be priced without too much regard to competition), and difficult to replicate (so that profits will not be competed away). (Teece & Pisano, 1994...

  10. Applied groundwater modeling, 2nd Edition

    Science.gov (United States)

    Anderson, Mary P.; Woessner, William W.; Hunt, Randall J.

    2015-01-01

    This second edition is extensively revised throughout with expanded discussion of modeling fundamentals and coverage of advances in model calibration and uncertainty analysis that are revolutionizing the science of groundwater modeling. The text is intended for undergraduate and graduate level courses in applied groundwater modeling and as a comprehensive reference for environmental consultants and scientists/engineers in industry and governmental agencies.

  11. Educational software design: applying models of learning

    Directory of Open Access Journals (Sweden)

    Stephen Richards

    1996-12-01

    Full Text Available The model of learning adopted within this paper is the 'spreading ripples' (SR) model proposed by Race (1994). This model was chosen for two important reasons. First, it makes use of accessible ideas and language, and is therefore simple. Second, Race suggests that the model can be used in the design of educational and training programmes (and can thereby be applied to the design of computer-based learning materials).

  12. Applying Modeling Tools to Ground System Procedures

    Science.gov (United States)

    Di Pasquale, Peter

    2012-01-01

    As part of a long-term effort to revitalize the Ground Systems (GS) Engineering Section practices, Systems Modeling Language (SysML) and Business Process Model and Notation (BPMN) have been used to model existing GS products and the procedures GS engineers use to produce them.

  13. Geostatistical methods applied to field model residuals

    DEFF Research Database (Denmark)

    Maule, Fox; Mosegaard, K.; Olsen, Nils

    consists of measurement errors and unmodelled signal), and is typically assumed to be uncorrelated and Gaussian distributed. We have applied geostatistical methods to analyse the residuals of the Oersted(09d/04) field model [http://www.dsri.dk/Oersted/Field_models/IGRF_2005_candidates/], which is based...

  14. Applying the WEAP Model to Water Resource

    DEFF Research Database (Denmark)

    Gao, Jingjing; Christensen, Per; Li, Wei

    Water resources assessment is a tool to provide decision makers with an appropriate basis to make informed judgments regarding the objectives and targets to be addressed during the Strategic Environmental Assessment (SEA) process. The study shows how water resources assessment can be applied in SEA...... in assessing the effects on water resources using a case study on a Coal Industry Development Plan in an arid region in North Western China. In the case the WEAP model (Water Evaluation And Planning System) were used to simulate various scenarios using a diversity of technological instruments like irrigation...... efficiency, treatment and reuse of water. The WEAP model was applied to the Ordos catchment where it was used for the first time in China. The changes in water resource utilization in Ordos basin were assessed with the model. It was found that the WEAP model is a useful tool for water resource assessment...

  15. Applying incentive sensitization models to behavioral addiction

    DEFF Research Database (Denmark)

    Rømer Thomsen, Kristine; Fjorback, Lone; Møller, Arne;

    2014-01-01

    The incentive sensitization theory is a promising model for understanding the mechanisms underlying drug addiction, and has received support in animal and human studies. So far the theory has not been applied to the case of behavioral addictions like Gambling Disorder, despite sharing clinical...

  16. Applied probability models with optimization applications

    CERN Document Server

    Ross, Sheldon M

    1992-01-01

    Concise advanced-level introduction to stochastic processes that frequently arise in applied probability. Largely self-contained text covers Poisson process, renewal theory, Markov chains, inventory theory, Brownian motion and continuous time optimization models, much more. Problems and references at chapter ends. ""Excellent introduction."" - Journal of the American Statistical Association. Bibliography. 1970 edition.

  17. Variable table applied in parametric product model design using Solid Edge

    Institute of Scientific and Technical Information of China (English)

    王劲; 肖冰; 由艳平

    2012-01-01

    The common method for parametric design of three-dimensional product models in Solid Edge is to set up a variable table. This article introduces the types of variables in the variable table and how to set variable attributes. The symbols used to define a variable's range, and the arithmetic operators and mathematical functions used to express simple functional relationships between variables, are illustrated. The syntax format of a VBA procedure is given; by loading a VBA procedure through the variable table, more complex relationships between variables can be defined. The application of the VBA programming method is explained.

  18. Information Model for Product Modeling

    Institute of Scientific and Technical Information of China (English)

    焦国方; 刘慎权

    1992-01-01

    The key problems in product modeling for integrated CAD/CAM systems are the information structures and representations of products. They are taking increasingly important roles in engineering applications. Based on an investigation of engineering product information and from the viewpoint of the industrial process, this paper proposes information models and gives definitions of the framework of product information. The integration and consistency of product information are then discussed by introducing the entity and its instance. In summary, the information structures described in this paper have many advantages and properties helpful in engineering design.

  19. Applied Integer Programming Modeling and Solution

    CERN Document Server

    Chen, Der-San; Dang, Yu

    2011-01-01

    An accessible treatment of the modeling and solution of integer programming problems, featuring modern applications and software In order to fully comprehend the algorithms associated with integer programming, it is important to understand not only how algorithms work, but also why they work. Applied Integer Programming features a unique emphasis on this point, focusing on problem modeling and solution using commercial software. Taking an application-oriented approach, this book addresses the art and science of mathematical modeling related to the mixed integer programming (MIP) framework and

  20. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistic, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independent of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  1. Applied Mathematics, Modelling and Computational Science

    CERN Document Server

    Kotsireas, Ilias; Makarov, Roman; Melnik, Roderick; Shodiev, Hasan

    2015-01-01

    The Applied Mathematics, Modelling, and Computational Science (AMMCS) conference aims to promote interdisciplinary research and collaboration. The contributions in this volume cover the latest research in mathematical and computational sciences, modeling, and simulation as well as their applications in natural and social sciences, engineering and technology, industry, and finance. The 2013 conference, the second in a series of AMMCS meetings, was held August 26–30 and organized in cooperation with AIMS and SIAM, with support from the Fields Institute in Toronto, and Wilfrid Laurier University. There were many young scientists at AMMCS-2013, both as presenters and as organizers. This proceedings contains refereed papers contributed by the participants of the AMMCS-2013 after the conference. This volume is suitable for researchers and graduate students, mathematicians and engineers, industrialists, and anyone who would like to delve into the interdisciplinary research of applied and computational mathematics ...

  2. Product Platform Modeling

    DEFF Research Database (Denmark)

    Pedersen, Rasmus

    on the notion that reuse and encapsulation of platform elements are fundamental characteristics of a product platform. Reuse covers the desire to reuse and share certain assets across a family of products and/or across generations of products. Product design solutions and principles are often regarded...... as important assets in a product platform, yet activities, working patterns, processes and knowledge can also be reused in a platform approach. Encapsulation is seen as a process in which the different elements of a platform are grouped into well defined and self-contained units which are decoupled from each......This PhD thesis has the title Product Platform Modelling. The thesis is about product platforms and visual product platform modelling. Product platforms have gained an increasing attention in industry and academia in the past decade. The reasons are many, yet the increasing globalisation...

  3. Applying axiomatic design methodology in developing modified liberation products

    Directory of Open Access Journals (Sweden)

    Bibiana Margarita Vallejo Díaz

    2010-04-01

    Full Text Available Some conceptual elements of the axiomatic design method were applied to a specific case study regarding the development of a modified liberation compressed product (CLM-UN), for use in the agricultural sector as a pH-regulating agent in soil. The study was oriented towards defining the functional requirements, design parameters and process variables for manufacturing the product. Independence and information were evaluated, supporting axiomatic design as an alternative for integral product and process design (as a rational and systemic exercise), facilitating the production of products having the quality which future users expect from them.

  4. Model for integrated management of quality, labor risks prevention, environment and ethical aspects, applied to R&D&I and production processes in an organization

    Science.gov (United States)

    González, M. R.; Torres, F.; Yoldi, V.; Arcega, F.; Plaza, I.

    2012-04-01

    An integrated management model for an organization is proposed. This model is based on the continuous improvement Plan-Do-Check-Act cycle, and it intends to integrate environmental, risk prevention and ethical aspects, as well as research, development and innovation (R&D&I) project management, into the general quality management structure proposed by ISO 9001:2008. It aims to fulfill the standards ISO 9001, ISO 14001, OHSAS 18001, SGE 21 and 166002.

  5. Applied Regression Modeling A Business Approach

    CERN Document Server

    Pardoe, Iain

    2012-01-01

    An applied and concise treatment of statistical regression techniques for business students and professionals who have little or no background in calculusRegression analysis is an invaluable statistical methodology in business settings and is vital to model the relationship between a response variable and one or more predictor variables, as well as the prediction of a response value given values of the predictors. In view of the inherent uncertainty of business processes, such as the volatility of consumer spending and the presence of market uncertainty, business professionals use regression a

  6. Markov Model Applied to Gene Evolution

    Institute of Scientific and Technical Information of China (English)

    季星来; 孙之荣

    2001-01-01

    The study of nucleotide substitution is very important both to our understanding of gene evolution and to reliable estimation of phylogenetic relationships. In this paper, nucleotide substitution is assumed to be random and the Markov model is applied to the study of the evolution of genes. A non-linear optimization approach is then proposed for estimating substitution in real sequences. This substitution is called the "Nucleotide State Transfer Matrix". One of the most important conclusions from this work is that gene sequence evolution conforms to the Markov process. Also, some theoretical evidence for random evolution is given from an energy analysis of DNA replication.
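
    A small sketch of estimating a nucleotide transition ("state transfer") matrix by counting substitutions between two aligned toy sequences; the paper estimates this matrix by non-linear optimization on real sequences, so the direct counting below is only a simplified illustration of the Markov assumption.

```python
import numpy as np

bases = "ACGT"
idx = {b: i for i, b in enumerate(bases)}

# Toy aligned ancestral/descendant sequences (placeholders, not real genes).
anc = "ACGTACGTACGGATCCGTAACGTTACGA"
des = "ACGTACGAACGGATTCGTAACGCTACGA"

counts = np.zeros((4, 4))
for a, d in zip(anc, des):
    counts[idx[a], idx[d]] += 1

# Row-normalise the counts to get an empirical transition ("state transfer") matrix.
P = counts / counts.sum(axis=1, keepdims=True)
print("One-step matrix:\n", np.round(P, 2))

# Under the Markov assumption, the two-step substitution matrix is simply P @ P.
print("Two-step matrix:\n", np.round(P @ P, 2))
```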

  7. Fuzzy sets, rough sets, and modeling evidence: Theory and Application. A Dempster-Shafer based approach to compromise decision making with multiattributes applied to product selection

    Science.gov (United States)

    Dekorvin, Andre

    1992-01-01

    The Dempster-Shafer theory of evidence is applied to a multiattribute decision making problem whereby the decision maker (DM) must compromise with available alternatives, none of which exactly satisfies his ideal. The decision mechanism is constrained by the uncertainty inherent in the determination of the relative importance of each attribute element and the classification of existing alternatives. The classification of alternatives is addressed through expert evaluation of the degree to which each element is contained in each available alternative. The relative importance of each attribute element is determined through pairwise comparisons of the elements by the decision maker and implementation of a ratio scale quantification method. Then the 'belief' and 'plausibility' that an alternative will satisfy the decision maker's ideal are calculated and combined to rank order the available alternatives. Application to the problem of selecting computer software is given.
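
    A small sketch of Dempster's rule of combination and the belief/plausibility calculation that underlies the ranking of alternatives; the frame of alternatives and the mass assignments below are toy placeholders, not the software-selection data of the report.

```python
FRAME = frozenset({"A", "B", "C"})          # toy set of alternatives (placeholder)

def combine(m1, m2):
    """Dempster's rule of combination for two mass functions over subsets of FRAME."""
    combined, conflict = {}, 0.0
    for s1, v1 in m1.items():
        for s2, v2 in m2.items():
            inter = s1 & s2
            if inter:
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

def belief(m, hypothesis):
    return sum(v for s, v in m.items() if s <= hypothesis)

def plausibility(m, hypothesis):
    return sum(v for s, v in m.items() if s & hypothesis)

# Two toy bodies of evidence about which alternative best matches the ideal.
m1 = {frozenset({"A"}): 0.5, frozenset({"A", "B"}): 0.3, FRAME: 0.2}
m2 = {frozenset({"A"}): 0.4, frozenset({"B"}): 0.3, FRAME: 0.3}

m = combine(m1, m2)
for alt in sorted(FRAME):
    h = frozenset({alt})
    print(f"{alt}: belief = {belief(m, h):.3f}, plausibility = {plausibility(m, h):.3f}")
```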

  8. The use of linear mixed models for analysis of repeated measurements applied to water-soluble carbohydrates in perennial ryegrass for seed production

    DEFF Research Database (Denmark)

    Gislum, René; Boelt, Birte; Zhang, Xia

    2009-01-01

    Repeated measurements of a response variable in crops or plants receiving different treatments are widely used in agricultural science. In this paper we analyse repeated measurements of the concentration of water-soluble carbohydrates in stem and ear of perennial ryegrass (Lolium perenne L....... The estimates of the water-soluble carbohydrates concentrations within the stem and ear datasets were similar for all three covariance structures, while the smallest standard errors were obtained using the compound symmetry covariance structure. As it was the goal to do parsimonious modelling more weight...... was given to the Bayesian information criteria than to the Akaike information criteria. Accordingly, the compound symmetry structure was chosen for the stem data and the unstructured structure was found to be the best structure for the ear data. A model check of the residuals showed...
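
    A minimal sketch of a repeated-measures linear mixed model with a compound-symmetry-like covariance (a random intercept per plot) fitted with statsmodels; the data are synthetic, and the study additionally compared unstructured and other covariance structures using information criteria such as those printed at the end.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

# Synthetic repeated measurements: water-soluble carbohydrate (WSC) concentration
# per plot over five sampling times (placeholders, not the ryegrass data).
plots, times = 12, 5
df = pd.DataFrame({
    "plot": np.repeat(np.arange(plots), times),
    "time": np.tile(np.arange(times), plots),
})
plot_effect = rng.normal(0.0, 1.5, plots)
df["wsc"] = 20.0 + 1.2 * df["time"] + plot_effect[df["plot"]] + rng.normal(0.0, 1.0, len(df))

# A random intercept per plot induces a compound-symmetry covariance of the
# repeated measures; ML fitting (reml=False) makes AIC/BIC comparable across models.
result = smf.mixedlm("wsc ~ time", df, groups=df["plot"]).fit(reml=False)
print(result.summary())
print("AIC:", result.aic, "BIC:", result.bic)
```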

  9. Applying Model Based Systems Engineering to NASA's Space Communications Networks

    Science.gov (United States)

    Bhasin, Kul; Barnes, Patrick; Reinert, Jessica; Golden, Bert

    2013-01-01

    System engineering practices for complex systems and networks now require that requirement, architecture, and concept-of-operations product development teams simultaneously harmonize their activities to provide timely, useful and cost-effective products. When dealing with complex systems of systems, traditional systems engineering methodology quickly falls short of achieving project objectives. This approach is encumbered by the use of a number of disparate hardware and software tools, spreadsheets and documents to grasp the concept of the network design and operation. In the case of NASA's space communication networks, since the networks are geographically distributed, and so are their subject matter experts, the team is challenged to create a common language and tools to produce its products. Using Model Based Systems Engineering methods and tools allows for a unified representation of the system in a model that enables a highly interrelated level of detail. To date, the Program System Engineering (PSE) team has been able to model each network from their top-level operational activities and system functions down to the atomic level through relational modeling decomposition. These models allow for a better understanding of the relationships between NASA's stakeholders, internal organizations, and impacts to all related entities due to integration and sustainment of existing systems. Understanding the existing systems is essential to accurate and detailed study of the integration options being considered. In this paper, we identify the challenges the PSE team faced in its quest to unify complex legacy space communications networks and their operational processes. We describe the initial approaches undertaken and the evolution toward model based system engineering applied to produce Space Communication and Navigation (SCaN) PSE products. We will demonstrate the practice of Model Based System Engineering applied to integrating space communication networks and the summary of its

  10. Chemical Product Design: A new challenge of applied thermodynamics

    DEFF Research Database (Denmark)

    Abildskov, Jens; Kontogeorgis, Georgios

    2004-01-01

    of such a product involving both solid-liquid phases and (non-equilibrium) metastable states. Thus, many of these products are colloidal systems of different types, e.g. liquid-liquid emulsions, suspensions, powders, solid and liquid dispersions, aerosols and sprays. The physical chemistry (thermodynamics......, stability) of such products is often as important as their manufacture, while a number of non-traditional manufacturing/ separation processes are of relevance, e.g. emulsification, foaming, gelation, granulation and crystallization. Today, serious gaps exist in our thermodynamic modelling abilities when we...... try to describe and understand chemical products with traditional thermodynamic models, typically applicable to problems of petrochemical industries. The purpose of this article is two-fold: first to present some current and future challenges in thermodynamic modelling towards chemical product design...

  11. Product model structure for generalized optimal design

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The framework of the generalized optimization product model with the core of network- and tree-hierarchical structure is advanced to improve the characteristics of the generalized optimal design. Based on the proposed node-repetition technique, a network-hierarchical structure is united with the tree-hierarchical structure to facilitate the modeling of serialization and combination products. The criteria for product decomposition are investigated. Seven tree nodes are defined for the construction of a general product model, and their modeling properties are studied in detail. The developed product modeling system is applied and examined successfully in the modeling practice of the generalized optimal design for a hydraulic excavator.

  12. Modelling Retail Floorspace Productivity

    NARCIS (Netherlands)

    A.R. Thurik (Roy); P. Kooiman

    1986-01-01

    textabstractThis research note presents a "switching regime" model to investigate the impact of environmental factors on floorspace productivity of individual retail stores. The model includes independent supply and demand functions, which are incorporated within a sales maximizing framework. Unlike

  13. Applied TRIZ in Improving Productivity in Textile Industry

    Directory of Open Access Journals (Sweden)

    Ahmad Aminah

    2017-01-01

    Full Text Available TRIZ is a methodology and a collection of problem-solving tools and strategies that has been used in many fields. This paper proposes the TRIZ method for improving productivity in the textile industry. It focuses on the packing department of a textile company situated in Malaysia. The process was monitored and the problem was observed. The TRIZ method is applied to this problem using functional analysis and the trimming method. A comparison between before and after implementation is made in order to evaluate the effect on productivity.

  14. Digital prototyping technique applied for redesigning plastic products

    Science.gov (United States)

    Pop, A.; Andrei, A.

    2015-11-01

    After products are on the market for some time, they often need to be redesigned to meet new market requirements. New products are generally derived from similar but outdated products. Redesigning a product is an important part of the production and development process. The purpose of this paper is to show that using modern technology, like Digital Prototyping, in industry is an effective way to produce new products. This paper tries to demonstrate and highlight the effectiveness of the concept of Digital Prototyping, both to reduce the design time of a new product and the costs required for implementing this step. The results of this paper show that using Digital Prototyping techniques in designing a new product from an existing product available on the market offers a significant reduction in manufacturing time and cost. The ability to simulate and test a new product with modern CAD-CAM programs in all aspects of production (design of the 3D model, simulation of structural resistance, analysis of the injection process and beautification) offers a helpful tool for engineers. The whole process can be carried out by one skilled engineer quickly and effectively.

  15. Linear model applied to the evaluation of pharmaceutical stability data

    Directory of Open Access Journals (Sweden)

    Renato Cesar Souza

    2013-09-01

    Full Text Available The expiry date on the packaging of a product gives the consumer confidence that the product will retain its identity, content, quality and purity throughout its period of validity. The definition of this term in the pharmaceutical industry is based on stability data obtained during product registration. Accordingly, this work aims to apply linear regression, according to guideline ICH Q1E (2003), to evaluate some aspects of a product undergoing the registration phase in Brazil. For this purpose, the evaluation was carried out at the development center of a multinational company in Brazil, with samples of three different batches containing two active ingredients in two different packages. Based on the preliminary results obtained, it was possible to observe the difference in the degradation tendency of the product in the two packages and the relationship between the variables studied, adding knowledge so that new linear models can be developed and applied to other products.
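
    A hedged sketch of the ICH Q1E idea referenced above, with invented stability data and an assumed lower acceptance limit of 95% of label claim: fit a linear degradation model and take the shelf life as the time where the one-sided 95% lower confidence bound on the mean response crosses the limit.

```python
# Illustrative shelf-life estimation in the spirit of ICH Q1E (data and limit are assumptions).
import numpy as np
from scipy import stats

months = np.array([0, 3, 6, 9, 12, 18])
assay  = np.array([100.1, 99.4, 98.9, 98.2, 97.8, 96.5])   # % of label claim (made up)
limit  = 95.0                                               # assumed lower specification limit

res = stats.linregress(months, assay)
n = len(months)
t_crit = stats.t.ppf(0.95, df=n - 2)
resid = assay - (res.intercept + res.slope * months)
s = np.sqrt(np.sum(resid**2) / (n - 2))
sxx = np.sum((months - months.mean())**2)

def lower_bound(t):
    """One-sided 95% lower confidence bound on the mean assay at time t."""
    se = s * np.sqrt(1.0 / n + (t - months.mean())**2 / sxx)
    return res.intercept + res.slope * t - t_crit * se

grid = np.linspace(0, 60, 601)                 # scan a 5-year horizon
below = grid[lower_bound(grid) < limit]
if below.size:
    print(f"slope = {res.slope:.3f} %/month, estimated shelf life ≈ {below[0]:.1f} months")
else:
    print("lower confidence bound stays above the limit over the scanned horizon")
```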

  16. Reduction of the complexity of product modelling by modularisation

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup

    1998-01-01

    The complexity in handling product aspects in design and production may be reduced by using approaches which are applied in the field of modular engineering. This unit-oriented "spelling" of products, leading to product models with encapsulation, is introduced.

  17. Modelling Product Families for Product Configuration Systems with Product Variant Master

    DEFF Research Database (Denmark)

    Mortensen, Niels Henrik; Hvam, Lars; Haug, Anders

    2010-01-01

    This article presents an evaluation of applying a suggested method for modelling product families for product configuration, based on theory for modelling mechanical products, systems theory and object-oriented modelling. The modelling technique includes a so-called product variant master and CRC-cards, developed in cooperation with several industrial companies. The article refers to experiences from applying the modelling technique in three different companies. Based upon these experiences, the utility of the product variant master and CRC-cards is evaluated. Significance: product configuration systems are increasingly used in industrial companies as a means for efficient design of customer-tailored products. The design and implementation of product configuration systems is a new and challenging task for industrial companies and calls for a scientifically based framework to support the modelling…

  18. Product Development Process Modeling

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    The use of Concurrent Engineering and other modern methods of product development and maintenance requires that a large number of time-overlapped "processes" be performed by many people. However, successfully describing and optimizing these processes is becoming ever more difficult to achieve. The perspective of industrial process theory (the definition of process) and the perspective of process implementation (process transition, accumulation, and inter-operations between processes) are used to survey the method used to build a multi-view base process model.

  19. Applying mechanistic models in bioprocess development

    DEFF Research Database (Denmark)

    Lencastre Fernandes, Rita; Bodla, Vijaya Krishna; Carlquist, Magnus;

    2013-01-01

    The available knowledge on the mechanisms of a bioprocess system is central to process analytical technology. In this respect, mechanistic modeling has gained renewed attention, since a mechanistic model can provide an excellent summary of available process knowledge. Such a model therefore...

  20. Applied Creativity: The Creative Marketing Breakthrough Model

    Science.gov (United States)

    Titus, Philip A.

    2007-01-01

    Despite the increasing importance of personal creativity in today's business environment, few conceptual creativity frameworks have been presented in the marketing education literature. The purpose of this article is to advance the integration of creativity instruction into marketing classrooms by presenting an applied creative marketing…

  1. Product Knowledge Modelling and Management

    DEFF Research Database (Denmark)

    Zhang, Y.; MacCallum, K. J.; Duffy, Alex

    1996-01-01

    The term Product Knowledge is used to refer to two related but distinct concepts: the knowledge of a specific product (Specific Product Knowledge) and the knowledge of a product domain (Product Domain Knowledge). Modelling and managing Product Knowledge is an essential part of carrying out design… function-oriented design. Both Specific Product Knowledge and Product Domain Knowledge are modelled at two levels, a meta-model and an information level. Following that, a computer-based scheme to manage the proposed product knowledge models within a dynamically changing environment is presented.

  2. Applying MDL to Learning Best Model Granularity

    CERN Document Server

    Gao, Q; Vitanyi, P; Gao, Qiong; Li, Ming; Vitanyi, Paul

    2000-01-01

    The Minimum Description Length (MDL) principle is solidly based on a provably ideal method of inference using Kolmogorov complexity. We test how the theory behaves in practice on a general problem in model selection: that of learning the best model granularity. The performance of a model depends critically on the granularity, for example the choice of precision of the parameters. Too high precision generally involves modeling of accidental noise and too low precision may lead to confusion of models that should be distinguished. This precision is often determined ad hoc. In MDL the best model is the one that most compresses a two-part code of the data set: this embodies ``Occam's Razor.'' In two quite different experimental settings the theoretical value determined using MDL coincides with the best value found experimentally. In the first experiment the task is to recognize isolated handwritten characters in one subject's handwriting, irrespective of size and orientation. Based on a new modification of elastic...
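
    A toy illustration of the two-part MDL selection described above, not the handwriting experiments of the paper: the "granularity" is the number of histogram bins for a density model, chosen by minimising model cost plus data cost in bits. The parameter-cost term (k-1)/2 * log2(n) is a common crude approximation and is an assumption here.

```python
# Two-part MDL selection of histogram granularity on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
data = np.clip(np.concatenate([rng.normal(0.3, 0.05, 500),
                               rng.normal(0.7, 0.10, 500)]), 0.0, 1.0)
n = len(data)

def description_length(k: int) -> float:
    counts, edges = np.histogram(data, bins=k, range=(0.0, 1.0))
    widths = np.diff(edges)
    probs = counts / n
    idx = np.clip(np.digitize(data, edges) - 1, 0, k - 1)   # bin index of each point
    density = probs[idx] / widths[idx] + 1e-12               # avoid log(0)
    data_bits = -np.sum(np.log2(density))                    # L(data | model)
    model_bits = 0.5 * (k - 1) * np.log2(n)                  # crude L(model)
    return model_bits + data_bits

best_k = min(range(2, 101), key=description_length)
print("MDL-selected number of bins:", best_k)
```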

  3. Biplot models applied to cancer mortality rates.

    Science.gov (United States)

    Osmond, C

    1985-01-01

    "A graphical method developed by Gabriel to display the rows and columns of a matrix is applied to tables of age- and period-specific cancer mortality rates. It is particularly useful when the pattern of age-specific rates changes with time. Trends in age-specific rates and changes in the age distribution are identified as projections. Three examples [from England and Wales] are given."

  4. Professional applied physical training of future specialists of agricultural production

    Directory of Open Access Journals (Sweden)

    Karabanov Y.A.

    2015-01-01

    Full Text Available Purpose: to develop and experimentally validate the content, methods and forms of physical training of future specialists in agricultural production, taking into account an advanced course of professionally applied physical preparation by means of kettlebell sport. Material: the study involved 141 students; the duration of the study was 5 years. Results: it was found that the use of exercises with kettlebells of different weights promoted a significant increase in indicators of flexibility, strength and coordination abilities of the students. The legitimacy of using such means of physical education for developing the muscle strength of the upper body, back, legs and abdominals was confirmed. These muscles are the most heavily loaded in the professional activities of mechanical engineers. Conclusions: the program meets the basic criteria for the formation of a physical education curriculum. The program promotes the development of professionally applied physical qualities and motor skills and improves the physical performance of students.

  5. Applying the Sport Education Model to Tennis

    Science.gov (United States)

    Ayvazo, Shiri

    2009-01-01

    The physical education field abounds with theoretically sound curricular approaches such as fitness education, skill theme approach, tactical approach, and sport education. In an era that emphasizes authentic sport experiences, the Sport Education Model includes unique features that sets it apart from other curricular models and can be a valuable…

  6. Advanced Production Planning Models

    Energy Technology Data Exchange (ETDEWEB)

    JONES,DEAN A.; LAWTON,CRAIG R.; KJELDGAARD,EDWIN A.; WRIGHT,STEPHEN TROY; TURNQUIST,MARK A.; NOZICK,LINDA K.; LIST,GEORGE F.

    2000-12-01

    This report describes the innovative modeling approach developed as a result of a 3-year Laboratory Directed Research and Development project. The overall goal of this project was to provide an effective suite of solvers for advanced production planning at facilities in the nuclear weapons complex (NWC). We focused our development activities on problems related to operations at the DOE's Pantex Plant. These types of scheduling problems appear in many contexts other than Pantex--both within the NWC (e.g., Neutron Generators) and in other commercial manufacturing settings. We successfully developed an innovative and effective solution strategy for these types of problems. We have tested this approach on actual data from Pantex, and from Org. 14000 (Neutron Generator production). This report focuses on the mathematical representation of the modeling approach and presents three representative studies using Pantex data. Results associated with the Neutron Generator facility will be published in a subsequent SAND report. The approach to task-based scheduling described here represents a significant addition to the literature for large-scale, realistic scheduling problems in a variety of production settings.

  7. Applying fuel cell experience to sustainable power products

    Science.gov (United States)

    King, Joseph M.; O'Day, Michael J.

    Fuel cell power plants have demonstrated high efficiency, environmental friendliness, excellent transient response, and superior reliability and durability in spacecraft and stationary applications. Broader application of fuel cell technology promises significant contribution to sustainable global economic growth, but requires improvement to size, cost, fuel flexibility and operating flexibility. International Fuel Cells (IFC) is applying lessons learned from delivery of more than 425 fuel cell power plants and 3 million h of operation to the development of product technology which captures that promise. Key findings at the fuel cell power plant level include: (1) ancillary components account for more than 40% of the weight and nearly all unscheduled outages of hydrocarbon-fuelled power plants; a higher level of integration and simplification is required to achieve reasonable characteristics, (2) hydrocarbon fuel cell power plant components are highly interactive; the fuel processing approach and power plant operating pressure are major determinants of overall efficiency, and (3) achieving the durability required for heavy duty vehicles and stationary applications requires simultaneous satisfaction of electrochemical, materials and mechanical considerations in the design of the cell stack and other power plant components. Practical designs must minimize application specific equipment. Related lessons for stationary fuel cell power plants include: (1) within fuel specification limits, natural gas varies widely in heating value, minor constituents such as oxygen and nitrogen content and trace compounds such as the odorant; (2) city water quality varies widely; recovery of product water for process use avoids costly, complicated and site-specific water treatment systems, but water treatment is required to eliminate impurities and (3) the embedded protection functions for reliable operation of fuel cell power conditioners meet or exceed those required for connection to

  8. Applied mathematics: Models, Discretizations, and Solvers

    Institute of Scientific and Technical Information of China (English)

    D.E. Keyes

    2007-01-01

    Computational plasma physicists inherit decades of developments in mathematical models, numerical algorithms, computer architecture, and software engineering, whose recent coming together marks the beginning of a new era of large-scale simulation.

  9. Applied modelling and computing in social science

    CERN Document Server

    Povh, Janez

    2015-01-01

    In social science outstanding results are yielded by advanced simulation methods, based on state of the art software technologies and an appropriate combination of qualitative and quantitative methods. This book presents examples of successful applications of modelling and computing in social science: business and logistic process simulation and optimization, deeper knowledge extractions from big data, better understanding and predicting of social behaviour and modelling health and environment changes.

  10. Applying Machine Trust Models to Forensic Investigations

    Science.gov (United States)

    Wojcik, Marika; Venter, Hein; Eloff, Jan; Olivier, Martin

    Digital forensics involves the identification, preservation, analysis and presentation of electronic evidence for use in legal proceedings. In the presence of contradictory evidence, forensic investigators need a means to determine which evidence can be trusted. This is particularly true in a trust model environment where computerised agents may make trust-based decisions that influence interactions within the system. This paper focuses on the analysis of evidence in trust-based environments and the determination of the degree to which evidence can be trusted. The trust model proposed in this work may be implemented in a tool for conducting trust-based forensic investigations. The model takes into account the trust environment and parameters that influence interactions in a computer network being investigated. Also, it allows for crimes to be reenacted to create more substantial evidentiary proof.

  11. Proactive Modeling of Market, Product and Production Architectures

    DEFF Research Database (Denmark)

    Mortensen, Niels Henrik; Hansen, Christian Lindschou; Hvam, Lars;

    2011-01-01

    This paper presents an operational model that allows description of market, product and production architectures. The main feature of this model is the ability to describe both structural and functional aspects of architectures. The structural aspect is an answer to the question: what constitutes the architecture, e.g. standard designs, design units and interfaces? The functional aspect is an answer to the question: what is the behaviour of the architecture, what is it able to do, i.e. which products at which performance levels can be derived from the architecture? Among the most important benefits of this model is the explicit ability to describe what the architecture is prepared for, and what it is not prepared for, concerning development of future derivative products. The model has been applied in a large-scale global product development project. Among the most important benefits is its contribution to…

  12. Support vector machine applied in QSAR modelling

    Institute of Scientific and Technical Information of China (English)

    MEI Hu; ZHOU Yuan; LIANG Guizhao; LI Zhiliang

    2005-01-01

    Support vector machine (SVM), partial least squares (PLS), and back-propagation artificial neural network (ANN) methods were employed to establish QSAR models of two dipeptide datasets. In order to validate the predictive capabilities of the resulting models on external data, both internal and external validations were performed. The division of each dataset into training and test sets was carried out by D-optimal design. The results showed that SVM behaved well in both calibration and prediction. For the dataset of 48 bitter-tasting dipeptides (BTD), the results obtained by support vector regression (SVR) were superior to those obtained by PLS in both calibration and prediction. When compared with the BP artificial neural network, SVR showed less calibration power but more predictive capability. For the dataset of angiotensin-converting enzyme (ACE) inhibitors, the results obtained by SVM regression were equivalent to those obtained by PLS and the BP artificial neural network. In both datasets, SVR using a linear kernel function performed as well as SVR using a radial basis kernel function. The results indicate broad prospects for the application of SVM to QSAR modelling.
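
    A hedged sketch of the kind of comparison described, using synthetic descriptors rather than the dipeptide datasets: cross-validate SVR with linear and RBF kernels against PLS regression and compare predictive R².

```python
# Cross-validated comparison of SVR (linear and RBF kernels) with PLS on synthetic QSAR-like data.
import numpy as np
from sklearn.svm import SVR
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.normal(size=(48, 10))                                  # 48 "dipeptides" x 10 descriptors
y = X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.normal(size=48)        # synthetic activity

models = {
    "SVR (linear)": make_pipeline(StandardScaler(), SVR(kernel="linear", C=1.0)),
    "SVR (RBF)":    make_pipeline(StandardScaler(), SVR(kernel="rbf", C=1.0, gamma="scale")),
    "PLS (3 comp)": make_pipeline(StandardScaler(), PLSRegression(n_components=3)),
}
for name, model in models.items():
    q2 = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: mean cross-validated R^2 = {q2.mean():.3f}")
```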

  13. Revised Reynolds Stress and Triple Product Models

    Science.gov (United States)

    Olsen, Michael E.; Lillard, Randolph P.

    2017-01-01

    Revised versions of Lag methodology Reynolds-stress and triple product models are applied to accepted test cases to assess the improvement, or lack thereof, in the prediction capability of the models. The Bachalo-Johnson bump flow is shown as an example for this abstract submission.

  14. Models and Modelling Tools for Chemical Product and Process Design

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    2016-01-01

    The design, development and reliability of a chemical product and the process to manufacture it, need to be consistent with the end-use characteristics of the desired product. One of the common ways to match the desired product-process characteristics is through trial and error based experiments......-based framework is that in the design, development and/or manufacturing of a chemical product-process, the knowledge of the applied phenomena together with the product-process design details can be provided with diverse degrees of abstractions and details. This would allow the experimental resources......, are the needed models for such a framework available? Or, are modelling tools that can help to develop the needed models available? Can such a model-based framework provide the needed model-based work-flows matching the requirements of the specific chemical product-process design problems? What types of models...

  15. Modeling Novo Nordisk Production Systems

    DEFF Research Database (Denmark)

    Miller, Thomas Dedenroth

    1997-01-01

    This report describes attributes of models and systems, and how models can be used for the description of production systems. Special attention is given to the 'Theory of Domains'.

  16. Finite element models applied in active structural acoustic control

    NARCIS (Netherlands)

    Oude Nijhuis, Marco H.H.; Boer, de André; Rao, Vittal S.

    2002-01-01

    This paper discusses the modeling of systems for active structural acoustic control. The finite element method is applied to model structures including the dynamics of piezoelectric sensors and actuators. A model reduction technique is presented to make the finite element model suitable for controll

  17. Metoder for Modellering, Simulering og Regulering af Større Termiske Processer anvendt i Sukkerproduktion. Methods for Modelling, Simulation and Control of Large Scale Thermal Systems Applied in Sugar Production

    DEFF Research Database (Denmark)

    Nielsen, Kirsten Mølgaard; Nielsen, Jens Frederik Dalsgaard

    A realtime simulator for a crystallization process in a sugar plant has been developed. The simulator handles the normal working conditions relevant to control engineers. A non-linear dynamic model based on mass and energy balances has been developed, and the model parameters have been adjusted to data measured on a Danish sugar plant. The sections of the actual processes are internally modelled as separate modules according to the system apparatus. All components are modelled by ordinary differential equations and algebraic equations. The simulator consists…

  18. PRODUCT STRUCTURE DIGITAL MODEL

    Directory of Open Access Journals (Sweden)

    V.M. Sineglazov

    2005-02-01

    Full Text Available Research results on the representation of product structure by means of the CADDS5 computer-aided design (CAD) system, the Product Data Management Optegra (PDM) system and the Product Life Cycle Management Windchill (PLM) system are examined in this work. Analysis of structure component development and its storage in the various systems is carried out. Algorithms of structure transformation required for correct representation of the structure are considered. Management analysis of the electronic mockup presentation of the product structure is carried out for the Windchill system.

  19. Essays on Applied Resource Economics Using Bioeconomic Optimization Models

    Science.gov (United States)

    Affuso, Ermanno

    With rising demographic growth, there is increasing interest in analytical studies that assess alternative policies to provide an optimal allocation of scarce natural resources while ensuring environmental sustainability. This dissertation consists of three essays in applied resource economics that are interconnected methodologically within the agricultural production sector of Economics. The first chapter examines the sustainability of biofuels by simulating and evaluating an agricultural voluntary program that aims to increase the land use efficiency in the production of biofuels of first generation in the state of Alabama. The results show that participatory decisions may increase the net energy value of biofuels by 208% and reduce emissions by 26%; significantly contributing to the state energy goals. The second chapter tests the hypothesis of overuse of fertilizers and pesticides in U.S. peanut farming with respect to other inputs and address genetic research to reduce the use of the most overused chemical input. The findings suggest that peanut producers overuse fungicide with respect to any other input and that fungi resistant genetically engineered peanuts may increase the producer welfare up to 36.2%. The third chapter implements a bioeconomic model, which consists of a biophysical model and a stochastic dynamic recursive model that is used to measure potential economic and environmental welfare of cotton farmers derived from a rotation scheme that uses peanut as a complementary crop. The results show that the rotation scenario would lower farming costs by 14% due to nitrogen credits from prior peanut land use and reduce non-point source pollution from nitrogen runoff by 6.13% compared to continuous cotton farming.

  20. MODIFIED GENETIC ALGORITHM APPLIED TO SOLVE PRODUCT FAMILY OPTIMIZATION PROBLEM

    Institute of Scientific and Technical Information of China (English)

    CHEN Chunbao; WANG Liya

    2007-01-01

    The product family design problem solved by evolutionary algorithms is discussed. A successful product family design method should achieve an optimal tradeoff among a set of competing objectives, which involves maximizing commonality across the family of products and optimizing the performance of each product in the family. A 2-level chromosome structured genetic algorithm (2LCGA) is proposed to solve this class of problems and its performance is analyzed by comparing its results with those obtained with other methods. By interpreting the chromosome as a 2-level linear structure, the variable commonality genetic algorithm (GA) is constructed to vary the amount of platform commonality and automatically search across varying levels of commonality for the platform while trying to resolve the tradeoff between commonality and individual product performance within the product family during the optimization process. By incorporating a commonality assessment index into the problem formulation, the 2LCGA optimizes the product platform and its corresponding family of products in a single stage, which can yield improvements in the overall performance of the product family compared with two-stage approaches (where the first stage involves determining the best settings for the platform variables and values of the unique variables are found for each product in the second stage). The scope of the algorithm is also expanded by introducing a classification mechanism to allow multiple platforms to be considered during product family optimization, offering opportunities for superior overall designs through more efficacious tradeoffs between commonality and performance. The effectiveness of the 2LCGA is demonstrated through the design of a family of universal electric motors and comparison against previous results.
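
    A greatly simplified illustration of the two-level chromosome idea, not the paper's 2LCGA: level 1 flags which design variables are shared across the family (the platform), level 2 holds the per-product values, and the fitness trades commonality against how far each product sits from its own ideal design. All numbers, weights and the GA settings are assumptions.

```python
# Toy two-level-chromosome GA for a platform-commonality vs. performance tradeoff.
import random

N_PRODUCTS, N_VARS = 3, 4
IDEAL = [[0.2, 0.8, 0.5, 0.1], [0.3, 0.6, 0.4, 0.9], [0.25, 0.7, 0.45, 0.5]]  # per-product targets

def random_individual():
    shared = [random.random() < 0.5 for _ in range(N_VARS)]                           # level 1
    values = [[random.random() for _ in range(N_VARS)] for _ in range(N_PRODUCTS)]    # level 2
    return shared, values

def fitness(ind):
    shared, values = ind
    loss = 0.0
    for p in range(N_PRODUCTS):
        for v in range(N_VARS):
            x = values[0][v] if shared[v] else values[p][v]   # shared vars come from product 0
            loss += (x - IDEAL[p][v]) ** 2
    return 0.05 * sum(shared) - loss                          # reward commonality, punish deviation

def mutate(ind, rate=0.1):
    shared, values = ind
    shared = [(not s) if random.random() < rate else s for s in shared]
    values = [[min(1.0, max(0.0, x + random.gauss(0, 0.1))) if random.random() < rate else x
               for x in row] for row in values]
    return shared, values

def crossover(a, b):
    cut = random.randrange(1, N_VARS)
    shared = a[0][:cut] + b[0][cut:]
    values = [ra[:cut] + rb[cut:] for ra, rb in zip(a[1], b[1])]
    return shared, values

population = [random_individual() for _ in range(60)]
for _ in range(200):
    population.sort(key=fitness, reverse=True)
    parents = population[:20]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(40)]
    population = parents + children

best = max(population, key=fitness)
print("shared (platform) variables:", best[0], "fitness:", round(fitness(best), 4))
```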

  1. Uncovering product development competence by applying the laddering technique

    DEFF Research Database (Denmark)

    Jensen, Bjarne; Harmsen, Hanne

    This paper addresses companies' lack of implementation of success factors in new product development. Drawing on theories in the competence perspective and an exploratory empirical study, the paper points to two major areas that have not been covered by previous studies on new product development...

  2. Applying life cycle management of colombian cocoa production

    Directory of Open Access Journals (Sweden)

    Oscar Orlando Ortiz-R

    2014-03-01

    Full Text Available The present research aims to evaluate the usefulness of the application of Life Cycle Management in the agricultural sector focusing on the environmental and socio-economic aspects of decision making in the Colombian cocoa production. Such appraisal is based on the application of two methodological tools: Life Cycle Assessment, which considers environmental impacts throughout the life cycle of the cocoa production system, and Taguchi Loss Function, which measures the economic impact of a process' deviation from production targets. Results show that appropriate improvements in farming practices and supply consumption can enhance decision-making in the agricultural cocoa sector towards sustainability. In terms of agri-business purposes, such qualitative shift allows not only meeting consumer demands for environmentally friendly products, but also increasing the productivity and competitiveness of cocoa production, all of which has helped Life Cycle Management gain global acceptance. Since farmers have an important role in improving social and economic indicators at the national level, more attention should be paid to the upgrading of their cropping practices. Finally, one fundamental aspect of national cocoa production is the institutional and governmental support available for farmers in face of socio-economic or technological needs.
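
    A quick sketch of the quadratic Taguchi loss function used alongside the LCA, with invented cocoa-yield figures: L(y) = k (y - T)², where T is the production target and k is derived from the cost A of missing the target by a tolerable deviation d (k = A / d²).

```python
# Quadratic Taguchi loss for deviation from a production target (all figures assumed).
def taguchi_loss(y: float, target: float, cost_at_tolerance: float, tolerance: float) -> float:
    k = cost_at_tolerance / tolerance ** 2
    return k * (y - target) ** 2

# Example: target yield 500 kg/ha of cocoa, a $200 loss when output deviates by 50 kg/ha.
for yield_kg_ha in (420, 470, 500, 530):
    print(yield_kg_ha, "kg/ha ->", round(taguchi_loss(yield_kg_ha, 500, 200, 50), 1), "USD loss")
```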

  3. Teaching students to apply multiple physical modeling methods

    NARCIS (Netherlands)

    Wiegers, T.; Verlinden, J.C.; Vergeest, J.S.M.

    2014-01-01

    Design students should be able to explore a variety of shapes before elaborating one particular shape. Current modelling courses don’t address this issue. We developed the course Rapid Modelling, which teaches students to explore multiple shape models in a short time, applying different methods and

  4. Nonlinear Eddy Viscosity Models applied to Wind Turbine Wakes

    DEFF Research Database (Denmark)

    Laan, van der, Paul Maarten; Sørensen, Niels N.; Réthoré, Pierre-Elouan;

    2013-01-01

    The linear k−ε eddy viscosity model and modified versions of two existing nonlinear eddy viscosity models are applied to single wind turbine wake simulations using a Reynolds Averaged Navier-Stokes code. Results are compared with field wake measurements. The nonlinear models give better results...

  5. Experience in applying lean production concepts in the service sector

    Directory of Open Access Journals (Sweden)

    Dyrina Evgeniya

    2016-01-01

    Full Text Available This article describes the experience of implementing lean production tools in a service company. The authors based the research on the work of the chef, built an Ishikawa diagram for one of the most important problems of the company and proposed a cyclogram of the cooks' work. Problems were identified using this technology after analyzing domestic and foreign experience in implementing lean production. The algorithm for implementing lean production and the tools necessary for achieving the goals of the enterprise were developed on the basis of the experience gained and the analysis of theoretical materials, and were tested at the small business «White Dragon».

  6. Applying the means-end chain concept to product development

    DEFF Research Database (Denmark)

    Søndergaard, Helle Alsted

    This paper proposes that the means-end chain (MEC) approach can influence the use of market information and inter-functional communication in new product development (NPD). The central question is whether market information represented by means-end chain data can be a vehicle for inter-functional communication.

  7. Comparison of two multiaxial fatigue models applied to dental implants

    Directory of Open Access Journals (Sweden)

    JM. Ayllon

    2015-07-01

    Full Text Available This paper presents two multiaxial fatigue life prediction models applied to a commercial dental implant. One model is called Variable Initiation Length Model and takes into account both the crack initiation and propagation phases. The second model combines the Theory of Critical Distance with a critical plane damage model to characterise the initiation and initial propagation of micro/meso cracks in the material. This paper discusses which material properties are necessary for the implementation of these models and how to obtain them in the laboratory from simple test specimens. It also describes the FE models developed for the stress/strain and stress intensity factor characterisation in the implant. The results of applying both life prediction models are compared with experimental results arising from the application of ISO-14801 standard to a commercial dental implant.

  8. IMPROVEMENT OF QUALITY IN PRODUCTION PROCESS BY APPLYING KAIKAKU METHOD

    Directory of Open Access Journals (Sweden)

    Milan Radenkovic

    2013-12-01

    Full Text Available In this paper, the Kaikaku method is presented: its essence, introduction, principles and ways of implementation in real systems. The main point is how the Kaikaku method influences quality. A practical example from the furniture industry presents one way to implement the Kaikaku method and shows how it influences the quality improvement of the production process.

  9. Research on Digital Product Modeling Key Technologies of Digital Manufacturing

    Institute of Scientific and Technical Information of China (English)

    DING Guoping; ZHOU Zude; HU Yefa; ZHAO Liang

    2006-01-01

    With the globalization and diversification of the market and the rapid development of Information Technology (IT) and Artificial Intelligence (AI), the digital revolution of manufacturing is coming. One of the key technologies in digital manufacturing is product digital modeling. This paper first analyzes the information and features of the product digital model at each stage of the product whole lifecycle, and then examines three critical technologies of digital modeling in digital manufacturing: product modeling, the standard for the exchange of product model data (STEP) and digital product data management. The potential significance of the product digital model in digital manufacturing is then summarized: the product digital model integrates the primary features of each stage of the product whole lifecycle based on graphic features, applies STEP as the data exchange mechanism, and establishes a PDM system to manage the large amount of complicated and dynamic product data so as to implement product digital model data exchange, sharing and integration.

  10. Applying Network Technology to Improve TV News Production Mode

    Institute of Scientific and Technical Information of China (English)

    冷劲松; 林成栋

    2003-01-01

    With the development of database and computer network technology, the traditional TV news production mode (TVNPM) faces great challenges. Up to now, the evolution of the TVNPM has gone through two stages. In the beginning, TV news was produced completely by hand, referred to in this paper as the pipelining TVNPM. This production mode is limited by space and time, its production cycle is very time-consuming, and it requires a great deal of coordination between departments. Subsequently, thanks to applications of database technology, a new TVNPM appeared, named the pooled-information-resource TVNPM. Compared with the pipelining TVNPM, this mode promotes information sharing. However, with the development of network technology, especially the Intranet and the Internet, the pooled-information-resource TVNPM has come under strong pressure, and a new TVNPM needs to be devised. This new TVNPM must support information sharing, remote collaboration and interaction in communications so as to improve group work efficiency. In this paper, we present such a new TVNPM, namely the Network TVNPM, give a suite of system solutions to support it, introduce the new workflow, and finally analyze the advantages of the Network TVNPM.

  11. Empirical modeling and data analysis for engineers and applied scientists

    CERN Document Server

    Pardo, Scott A

    2016-01-01

    This textbook teaches advanced undergraduate and first-year graduate students in Engineering and Applied Sciences to gather and analyze empirical observations (data) in order to aid in making design decisions. While science is about discovery, the primary paradigm of engineering and "applied science" is design. Scientists are in the discovery business and want, in general, to understand the natural world rather than to alter it. In contrast, engineers and applied scientists design products, processes, and solutions to problems. That said, statistics, as a discipline, is mostly oriented toward the discovery paradigm. Young engineers come out of their degree programs having taken courses such as "Statistics for Engineers and Scientists" without any clear idea as to how they can use statistical methods to help them design products or processes. Many seem to think that statistics is only useful for demonstrating that a device or process actually does what it was designed to do. Statistics courses emphasize creati...

  12. The HPT Model Applied to a Kayak Company's Registration Process

    Science.gov (United States)

    Martin, Florence; Hall, Herman A., IV; Blakely, Amanda; Gayford, Matthew C.; Gunter, Erin

    2009-01-01

    This case study describes the step-by-step application of the traditional human performance technology (HPT) model at a premier kayak company located on the coast of North Carolina. The HPT model was applied to address lost revenues related to three specific business issues: misinformed customers, dissatisfied customers, and guides not showing up…

  13. An applied general equilibrium model for Dutch agribusiness policy analysis.

    NARCIS (Netherlands)

    Peerlings, J.H.M.

    1993-01-01

    The purpose of this thesis was to develop a basic static applied general equilibrium (AGE) model to analyse the effects of agricultural policy changes on Dutch agribusiness. In particular the effects on inter-industry transactions, factor demand, income, and trade are of interest.The model is fairly

  14. Applying the ARCS Motivation Model in Technological and Vocational Education

    Science.gov (United States)

    Liao, Hung-Chang; Wang, Ya-huei

    2008-01-01

    This paper describes the incorporation of Keller's ARCS (Attention, Relevance, Confidence, and Satisfaction) motivation model into traditional classroom instruction-learning process. Viewing that technological and vocational students have low confidence and motivation in learning, the authors applied the ARCS motivation model not only in the…

  15. Applying the means-end chain concept to product development

    DEFF Research Database (Denmark)

    Søndergaard, Helle Alsted

    This paper proposes that the means-end chain (MEC) approach can influence the use of market information and inter-functional communication in new product development (NPD). The central question is whether market information represented by means-end chain data can be a vehicle for inter-functional communication. This paper describes a PhD project that - through case studies and action research in two companies - aims at investigating the effects on communication and attitudes to communication by those involved when introducing means-end chain data as market information in NPD.

  16. LEARNING SEMANTICS-ENHANCED LANGUAGE MODELS APPLIED TO UNSUPERVISED WSD

    Energy Technology Data Exchange (ETDEWEB)

    VERSPOOR, KARIN [Los Alamos National Laboratory; LIN, SHOU-DE [Los Alamos National Laboratory

    2007-01-29

    An N-gram language model aims at capturing statistical syntactic word order information from corpora. Although the concept of language models has been applied extensively to handle a variety of NLP problems with reasonable success, the standard model does not incorporate semantic information, and consequently limits its applicability to semantic problems such as word sense disambiguation. We propose a framework that integrates semantic information into the language model schema, allowing a system to exploit both syntactic and semantic information to address NLP problems. Furthermore, acknowledging the limited availability of semantically annotated data, we discuss how the proposed model can be learned without annotated training examples. Finally, we report on a case study showing how the semantics-enhanced language model can be applied to unsupervised word sense disambiguation with promising results.
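
    A loose toy interpretation of the idea, not the authors' framework: train a smoothed bigram model over tokens in which some words carry sense tags, then disambiguate a new occurrence by scoring the sentence with each candidate sense substituted in and picking the more probable one. The tiny corpus and sense labels are invented.

```python
# Toy sense disambiguation with an add-one-smoothed bigram model over sense-tagged tokens.
from collections import Counter
import math

corpus = [
    "deposit money in the bank/FINANCE".split(),
    "the bank/FINANCE approved the loan".split(),
    "fish near the bank/RIVER of the river".split(),
    "the bank/RIVER was muddy after rain".split(),
]
unigrams, bigrams, vocab = Counter(), Counter(), set()
for sent in corpus:
    tokens = ["<s>"] + sent + ["</s>"]
    vocab.update(tokens)
    unigrams.update(tokens[:-1])
    bigrams.update(zip(tokens[:-1], tokens[1:]))

def log_prob(sentence):
    tokens = ["<s>"] + sentence + ["</s>"]
    V = len(vocab)
    return sum(math.log((bigrams[(a, b)] + 1) / (unigrams[a] + V))
               for a, b in zip(tokens[:-1], tokens[1:]))

test = "fish near the ? of the stream".split()
senses = ["bank/FINANCE", "bank/RIVER"]
scores = {s: log_prob([s if t == "?" else t for t in test]) for s in senses}
print("chosen sense:", max(scores, key=scores.get))
```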

  17. Applying Functional Modeling for Accident Management of Nuclear Power Plant

    DEFF Research Database (Denmark)

    Lind, Morten; Zhang, Xinxin

    2014-01-01

    The paper investigates applications of functional modeling for accident management in complex industrial plants with special reference to nuclear power production. Main applications for information sharing among decision makers and decision support are identified. An overview of Multilevel Flow...

  18. Modeling in applied sciences a kinetic theory approach

    CERN Document Server

    Pulvirenti, Mario

    2000-01-01

    Modeling complex biological, chemical, and physical systems, in the context of spatially heterogeneous mediums, is a challenging task for scientists and engineers using traditional methods of analysis. Modeling in Applied Sciences is a comprehensive survey of modeling large systems using kinetic equations, and in particular the Boltzmann equation and its generalizations. An interdisciplinary group of leading authorities carefully develop the foundations of kinetic models and discuss the connections and interactions between model theories, qualitative and computational analysis and real-world applications. This book provides a thoroughly accessible and lucid overview of the different aspects, models, computations, and methodology for the kinetic-theory modeling process. Topics and features: integrated modeling perspective utilized in all chapters; fluid dynamics of reacting gases; self-contained introduction to kinetic models; Becker–Doring equations; nonlinear kinetic models with chemical reactions; Kinet...

  19. Feature Technology in Product Modeling

    Institute of Scientific and Technical Information of China (English)

    ZHANG Xu; NING Ruxin

    2006-01-01

    A unified feature definition is proposed. Feature is form-concentrated, and can be used to model product functionalities, assembly relations, and part geometries. The feature model is given and a feature classification is introduced including functional, assembly, structural, and manufacturing features. A prototype modeling system is developed in Pro/ENGINEER that can define the assembly and user-defined form features.

  20. Applying a realistic evaluation model to occupational safety interventions

    DEFF Research Database (Denmark)

    Pedersen, Louise Møller

    2017-01-01

    Background: Recent literature characterizes occupational safety interventions as complex social activities, applied in complex and dynamic social systems. Hence, the actual outcomes of an intervention will vary, depending on the intervention, the implementation process, context, personal… and qualitative methods. This revised model has, however, not been applied in a real-life context. Method: The model is applied in a controlled, four-component, integrated behaviour-based and safety culture-based safety intervention study (2008-2010) in a medium-sized wood manufacturing company. The interventions involve the company's safety committee, safety manager, safety groups and 130 workers. Results: The model provides a framework for more valid evidence of what works within injury prevention. Affective commitment and role behaviour among key actors are identified as crucial for the implementation

  1. A comparison of economic evaluation models as applied to geothermal energy technology

    Science.gov (United States)

    Ziman, G. M.; Rosenberg, L. S.

    1983-01-01

    Several cost estimation and financial cash flow models have been applied to a series of geothermal case studies. In order to draw conclusions about relative performance and applicability of these models to geothermal projects, the consistency of results was assessed. The model outputs of principal interest in this study were net present value, internal rate of return, or levelized breakeven price. The models used were VENVAL, a venture analysis model; the Geothermal Probabilistic Cost Model (GPC Model); the Alternative Power Systems Economic Analysis Model (APSEAM); the Geothermal Loan Guarantee Cash Flow Model (GCFM); and the GEOCOST and GEOCITY geothermal models. The case studies to which the models were applied include a geothermal reservoir at Heber, CA; a geothermal eletric power plant to be located at the Heber site; an alcohol fuels production facility to be built at Raft River, ID; and a direct-use, district heating system in Susanville, CA.
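
    A short sketch of the three output measures the models are compared on, using invented geothermal project cash flows: net present value, internal rate of return (found by bisection on the NPV), and a levelized breakeven price computed as discounted costs divided by discounted energy output.

```python
# NPV, IRR and levelized breakeven price for a hypothetical geothermal project.
def npv(rate, cash_flows):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.9, hi=1.0, tol=1e-6):
    """Bisection on NPV(rate) = 0; assumes a sign change on [lo, hi]."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(lo, cash_flows) * npv(mid, cash_flows) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

flows = [-5_000_000] + [900_000] * 10          # year-0 investment, then annual net revenue
cost_flows = [5_000_000] + [150_000] * 10      # capital, then O&M
energy_per_year = 40_000                       # MWh, assumed constant
rate = 0.10

pv_costs = npv(rate, cost_flows)
pv_energy = npv(rate, [0] + [energy_per_year] * 10)
print("NPV  :", round(npv(rate, flows)))
print("IRR  :", round(irr(flows), 4))
print("Levelized breakeven price ($/MWh):", round(pv_costs / pv_energy, 2))
```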

  2. Applying the health action process approach (HAPA) to the choice of health products: An exploratory study

    DEFF Research Database (Denmark)

    Krutulyte, Rasa; Grunert, Klaus G.; Scholderer, Joachim;

    This paper presents the results of a qualitative pilot study aimed at uncovering Danish consumers' motives for choosing health food. Schwarzer's (1992) health action process approach (HAPA) was applied to understand the process by which people choose health products. The research focused on the role of behavioural intention predictors such as risk perception, outcome expectations and self-efficacy. The model proved to be a useful framework for understanding consumers choosing health food and is substantial for further application to dietary choice issues.

  3. Methods for model selection in applied science and engineering.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.

    2004-10-01

    Mathematical models are developed and used to study the properties of complex systems and/or modify these systems to satisfy some performance requirements in just about every area of applied science and engineering. A particular reason for developing a model, e.g., performance assessment or design, is referred to as the model use. Our objective is the development of a methodology for selecting a model that is sufficiently accurate for an intended use. Information on the system being modeled is, in general, incomplete, so that there may be two or more models consistent with the available information. The collection of these models is called the class of candidate models. Methods are developed for selecting the optimal member from a class of candidate models for the system. The optimal model depends on the available information, the selected class of candidate models, and the model use. Classical methods for model selection, including the method of maximum likelihood and Bayesian methods, as well as a method employing a decision-theoretic approach, are formulated to select the optimal model for numerous applications. There is no requirement that the candidate models be random. Classical methods for model selection ignore model use and require data to be available. Examples are used to show that these methods can be unreliable when data is limited. The decision-theoretic approach to model selection does not have these limitations, and model use is included through an appropriate utility function. This is especially important when modeling high risk systems, where the consequences of using an inappropriate model for the system can be disastrous. The decision-theoretic method for model selection is developed and applied for a series of complex and diverse applications. These include the selection of the: (1) optimal order of the polynomial chaos approximation for non-Gaussian random variables and stationary stochastic processes, (2) optimal pressure load model to be
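
    A sketch of the classical, likelihood-based route to model selection that the report contrasts with its decision-theoretic approach (this is not the report's method): pick the order of a polynomial regression model by AIC on synthetic data.

```python
# AIC-based selection of polynomial model order on synthetic data.
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(-1, 1, 40)
y = 1.0 + 0.5 * x - 2.0 * x**2 + rng.normal(0, 0.2, x.size)    # true model is quadratic

def aic_for_order(k: int) -> float:
    coeffs = np.polyfit(x, y, k)
    resid = y - np.polyval(coeffs, x)
    n = x.size
    sigma2 = np.mean(resid**2)
    log_lik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return 2 * (k + 2) - 2 * log_lik                            # k+1 coefficients plus noise variance

best = min(range(0, 8), key=aic_for_order)
print("AIC-selected polynomial order:", best)
```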

  4. A Research on the E-commerce Applied to the Construction of Marketing Model

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    The functions of E-commerce are becoming more and more widely applied in many fields, which brings about new challenges and opportunities for the construction of marketing models. It is shown that the more E-commerce is applied to the construction of marketing, the more forecast precision the enterprises can acquire, which is very helpful for the production and marketing of enterprises. Therefore, research on E-commerce applied to the construction of marketing is popular today. This paper applie...

  5. Applied data analysis and modeling for energy engineers and scientists

    CERN Document Server

    Reddy, T Agami

    2011-01-01

    ""Applied Data Analysis and Modeling for Energy Engineers and Scientists"" discusses mathematical models, data analysis, and decision analysis in modeling. The approach taken in this volume focuses on the modeling and analysis of thermal systems in an engineering environment, while also covering a number of other critical areas. Other material covered includes the tools that researchers and engineering professionals will need in order to explore different analysis methods, use critical assessment skills and reach sound engineering conclusions. The book also covers process and system design and

  6. Phosphazenes for energy production and storage: Applied and exploratory synthesis

    Science.gov (United States)

    Hess, Andrew R.

    This dissertation involves progress toward phosphazene-based ion conducting materials with a focus on structure-property relationships to improve these materials. This dissertation also includes some more fundamental exploratory syntheses to probe the limits of phosphazene chemistry and discover structure-property relationships that may be useful in designing compounds to fulfill important technical requirements. Chapter 1 provides a brief introduction to polymers and polyphosphazenes as well as ion-conducting materials and the contribution of polyphosphazene chemistry to that field. Chapter 1 also provides a brief introduction to some analytical techniques. Chapter 2 begins with the use of organophosphates as stand-alone non-volatile and fire-retardant liquid electrolyte media for dye sensitized solar cells (DSSCs) as well as their use as plasticizer in polymer gel electrolytes intended for application in lithium batteries. These organophosphates are the smallest phosphorus containing model molecules investigated in this dissertation. A homologous series of oligoalkyleneoxy substituted phosphates was synthesized and the effect of the substituent chain length on viscosity and conductivity was investigated. Small, test-scale DSSCs were constructed and showed promising results with overall cell efficiencies of up to 3.6% under un-optimized conditions. Conductivity measurements were performed on polymer gel-electrolytes based on poly[bis(2-(2-methoxyethoxy)ethoxy)phosphazene] (MEEP) plasticized with the phosphate with the best combination of properties, using a system loaded with lithium trifluoromethanesulfonate as the charge carrier. In chapter 3 the effect of the cation of the charge carrier species on the anionic conductivity of DSSC type electrolytes is evaluated using hexakis(2-(2-methoxyethoxy)ethoxy)cyclotriphosphazene (MEE-trimer) as a small molecule model for MEEP. The iodides of lithium, sodium, and ammonium as well as the ionic liquid, 1-propyl-3

  7. Applying Functional Modeling for Accident Management of Nuclear Power Plant

    DEFF Research Database (Denmark)

    Lind, Morten; Zhang, Xinxin

    2014-01-01

    The paper investigates applications of functional modeling for accident management in complex industrial plant with special reference to nuclear power production. Main applications for information sharing among decision makers and decision support are identified. An overview of Multilevel Flow...... for information sharing and decision support in accidents beyond design basis is also indicated. A modelling example demonstrating the application of Multilevel Flow Modelling and reasoning for a PWR LOCA is presented....

  8. Mathematical models applied in inductive non-destructive testing

    Energy Technology Data Exchange (ETDEWEB)

    Wac-Wlodarczyk, A.; Goleman, R.; Czerwinski, D. [Technical University of Lublin, 20 618 Lublin, Nadbystrzycka St 38a (Poland); Gizewski, T. [Technical University of Lublin, 20 618 Lublin, Nadbystrzycka St 38a (Poland)], E-mail: t.gizewski@pollub.pl

    2008-10-15

    Non-destructive testing comprises a wide group of investigative methods for non-homogeneous materials. Computer tomography, ultrasonic, magnetic and inductive methods, still being developed, are widely applied in industry. In apparatus used for non-destructive tests, the analysis of signals is made on the basis of the complex system response. The response is linearized according to the model of the research system. In this paper, the authors discuss the applications of mathematical models applied in investigations of inductive magnetic materials. Statistical models and others gathered in similarity classes are taken into consideration. Investigation of mathematical models allows one to choose the correct method, which in consequence leads to a precise representation of the inner structure of the examined object. Inductive research of conductive media, especially those with ferromagnetic properties, is run with a high-frequency magnetic field (eddy-current method), which considerably decreases the penetration depth.
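
    The penetration-depth remark can be made concrete with the standard skin-depth relation δ = 1/√(π f μ σ); this is a textbook formula rather than something taken from the paper, and the material properties below are rough assumptions for a ferromagnetic steel.

```python
# Eddy-current skin depth versus excitation frequency.
import math

MU0 = 4e-7 * math.pi   # vacuum permeability, H/m

def skin_depth(freq_hz: float, mu_r: float, sigma_s_per_m: float) -> float:
    return 1.0 / math.sqrt(math.pi * freq_hz * mu_r * MU0 * sigma_s_per_m)

for f in (1e3, 1e4, 1e5):
    d = skin_depth(f, mu_r=100.0, sigma_s_per_m=5e6)   # assumed steel properties
    print(f"f = {f:8.0f} Hz -> skin depth ≈ {d*1000:.3f} mm")
```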

  9. Dynamical behavior of the Niedermayer algorithm applied to Potts models

    OpenAIRE

    Girardi, D.; Penna, T. J. P.; Branco, N. S.

    2012-01-01

    In this work we make a numerical study of the dynamic universality class of the Niedermayer algorithm applied to the two-dimensional Potts model with 2, 3, and 4 states. This algorithm updates clusters of spins and has a free parameter, $E_0$, which controls the size of these clusters, such that $E_0=1$ is the Metropolis algorithm and $E_0=0$ regains the Wolff algorithm, for the Potts model. For $-1

  10. Transtheoretical Model of Health Behavior Change Applied to Voice Therapy

    OpenAIRE

    2007-01-01

    Studies of patient adherence to health behavior programs, such as physical exercise, smoking cessation, and diet, have resulted in the formulation and validation of the Transtheoretical Model (TTM) of behavior change. Although widely accepted as a guide for the development of health behavior interventions, this model has not been applied to vocal rehabilitation. Because resolution of vocal difficulties frequently depends on a patient’s ability to make changes in vocal and health behaviors, th...

  11. Applying computer simulation models as learning tools in fishery management

    Science.gov (United States)

    Johnson, B.L.

    1995-01-01

    Computer models can be powerful tools for addressing many problems in fishery management, but uncertainty about how to apply models and how they should perform can lead to a cautious approach to modeling. Within this approach, we expect models to make quantitative predictions but only after all model inputs have been estimated from empirical data and after the model has been tested for agreement with an independent data set. I review the limitations to this approach and show how models can be more useful as tools for organizing data and concepts, learning about the system to be managed, and exploring management options. Fishery management requires deciding what actions to pursue to meet management objectives. Models do not make decisions for us but can provide valuable input to the decision-making process. When empirical data are lacking, preliminary modeling with parameters derived from other sources can help determine priorities for data collection. When evaluating models for management applications, we should attempt to define the conditions under which the model is a useful, analytical tool (its domain of applicability) and should focus on the decisions made using modeling results, rather than on quantitative model predictions. I describe an example of modeling used as a learning tool for the yellow perch Perca flavescens fishery in Green Bay, Lake Michigan.

  12. Remarks on orthotropic elastic models applied to wood

    Directory of Open Access Journals (Sweden)

    Nilson Tadeu Mascia

    2006-09-01

    Full Text Available Wood is generally considered an anisotropic material. In terms of engineering elastic models, wood is usually treated as an orthotropic material. This paper presents an analysis of the two principal anisotropic elastic models that are usually applied to wood. The first one, the linear orthotropic model, in which the material axes L (longitudinal), R (radial) and T (tangential) are coincident with the Cartesian axes (x, y, z), is the more widely accepted wood elastic model. The other one, the cylindrical orthotropic model, is more adequate to the growth characteristics of wood but mathematically more complex to adopt in practical terms. Specifically, owing to its importance for the elastic parameters of wood, this paper deals with the influence of fiber orientation in these models through an adequate transformation of coordinates. As a final result, some examples of the linear model, which show the variation of the elastic moduli, i.e., Young's modulus and shear modulus, with fiber orientation, are presented.
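
    A minimal sketch of the fibre-orientation effect in the linear orthotropic model, using the standard off-axis compliance transformation 1/E(θ) = cos⁴θ/E_L + sin⁴θ/E_R + (1/G_LR − 2ν_LR/E_L) sin²θ cos²θ; the property values are typical textbook figures for a softwood, not the paper's data.

```python
# Off-axis Young's modulus of an orthotropic material (wood, L-R plane) versus grain angle.
import math

E_L, E_R, G_LR, nu_LR = 12000.0, 900.0, 700.0, 0.35   # MPa and Poisson ratio, assumed values

def young_modulus(theta_deg: float) -> float:
    c, s = math.cos(math.radians(theta_deg)), math.sin(math.radians(theta_deg))
    compliance = (c**4 / E_L + s**4 / E_R
                  + (1.0 / G_LR - 2.0 * nu_LR / E_L) * s**2 * c**2)
    return 1.0 / compliance

for angle in (0, 15, 30, 45, 60, 90):
    print(f"grain angle {angle:2d} deg -> E ≈ {young_modulus(angle):7.0f} MPa")
```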

  13. Knowledge Growth: Applied Models of General and Individual Knowledge Evolution

    Science.gov (United States)

    Silkina, Galina Iu.; Bakanova, Svetlana A.

    2016-01-01

    The article considers the mathematical models of the growth and accumulation of scientific and applied knowledge since it is seen as the main potential and key competence of modern companies. The problem is examined on two levels--the growth and evolution of objective knowledge and knowledge evolution of a particular individual. Both processes are…

  14. Applying Particle Tracking Model In The Coastal Modeling System

    Science.gov (United States)

    2011-01-01

    [Recovered fragments from the source report, ERDC/CHL CHETN-IV-78, January 2011:] CMS-Flow is driven by … through the simulation. At the end of the simulation, about 65 percent of the released clay particles are considered "dead," which means that they are either permanently buried at the sea bed or have moved out of the model domain. Figure captions: Figure 1. CMS domain, grid, and bathymetry; Figure 11. Specifications of …

  15. Developing an Integrated Set of Production Planning and Control Models

    OpenAIRE

    Wang, Hui

    2012-01-01

    This paper proposes an integrated set of production planning and control models that can be applied in a push (make-to-stock) system. The integrated model includes forecasting, aggregate planning, materials requirements planning, inventory control, capacity planning and scheduling. It addresses the planning issues at three levels: strategic, tactical and operational. The model obtains the optimal production plan for each product type in each p...
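
    To make the materials-requirements-planning level of such an integrated push model concrete, here is a minimal, hypothetical netting sketch (lot-for-lot ordering, fixed lead time). It is an illustration only and does not reproduce the paper's integrated formulation; all period data are invented.

```python
# Minimal sketch of an MRP netting step; the period data and the
# lot-for-lot policy are illustrative assumptions, not the paper's model.
def mrp_net(gross_req, scheduled_receipts, on_hand, lead_time=1):
    """Return planned order releases per period (lot-for-lot)."""
    periods = len(gross_req)
    planned_releases = [0] * periods
    inventory = on_hand
    for t in range(periods):
        available = inventory + scheduled_receipts[t]
        net = gross_req[t] - available
        if net > 0:
            release_period = max(t - lead_time, 0)
            planned_releases[release_period] += net
            available += net          # planned receipt covers the shortage
        inventory = available - gross_req[t]
    return planned_releases

print(mrp_net(gross_req=[40, 0, 60, 20, 50],
              scheduled_receipts=[0, 30, 0, 0, 0],
              on_hand=25, lead_time=1))
```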

  16. Molecular modeling: An open invitation for applied mathematics

    Science.gov (United States)

    Mezey, Paul G.

    2013-10-01

    Molecular modeling methods provide a very wide range of challenges for innovative mathematical and computational techniques, where often high dimensionality, large sets of data, and complicated interrelations imply a multitude of iterative approximations. The physical and chemical basis of these methodologies involves quantum mechanics with several non-intuitive aspects, where classical interpretation and classical analogies are often misleading or outright wrong. Hence, instead of the everyday, common-sense approaches which work so well in engineering, in molecular modeling one often needs to rely on rather abstract mathematical constraints and conditions, again emphasizing the high level of reliance on applied mathematics. Yet, the interdisciplinary aspects of the field of molecular modeling also generate some inertia and a perhaps too conservative reliance on tried and tested methodologies, which is at least partially caused by less than up-to-date involvement in the newest developments in applied mathematics. It is expected that as more applied mathematicians take up the challenge of employing the latest advances of their field in molecular modeling, important breakthroughs may follow. In this presentation some of the current challenges of molecular modeling are discussed.

  17. Predictive control applied to an evaporator mathematical model

    Directory of Open Access Journals (Sweden)

    Daniel Alonso Giraldo Giraldo

    2010-07-01

    Full Text Available This paper outlines the design of a predictive control model (PCM) applied to a mathematical model of a falling-film evaporator with mechanical steam compression like those used in the dairy industry. The controller was designed using the Connoisseur software package and data gathered from the simulation of a non-linear mathematical model. A control law was obtained by minimising a cost function subject to dynamic system constraints, using a quadratic programming (QP) algorithm. A linear programming (LP) algorithm was used for finding a sub-optimal operating point for the process in the stationary state.
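
    A minimal sketch of the receding-horizon idea behind such a controller is shown below: a lifted prediction model and an unconstrained quadratic programme solved in closed form. The first-order plant, horizon and weights are assumptions for illustration; the actual design in the paper used the Connoisseur package, constraints and an LP stage that are not reproduced here.

```python
import numpy as np

# Minimal sketch of a predictive controller for a linear model, as a stand-in
# for the Connoisseur/QP design described in the abstract.  The first-order
# plant, horizon and weights below are illustrative assumptions.
A, B = 0.9, 0.1          # discrete-time plant x[k+1] = A*x[k] + B*u[k]
N = 10                   # prediction horizon
q, r = 1.0, 0.01         # tracking and control-effort weights
setpoint = 1.0

# Lifted prediction: X = F*x0 + Phi*U
F = np.array([[A ** (i + 1)] for i in range(N)])
Phi = np.zeros((N, N))
for i in range(N):
    for j in range(i + 1):
        Phi[i, j] = A ** (i - j) * B

x = 0.0
for k in range(30):
    ref = np.full(N, setpoint)
    # Unconstrained quadratic programme: closed-form least-squares solution
    H = q * Phi.T @ Phi + r * np.eye(N)
    g = q * Phi.T @ (ref - F.flatten() * x)
    U = np.linalg.solve(H, g)
    u = U[0]                      # receding horizon: apply only the first move
    x = A * x + B * u
print(f"state after 30 steps: {x:.3f}")
```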

  18. A procedure for Applying a Maturity Model to Process Improvement

    Directory of Open Access Journals (Sweden)

    Elizabeth Pérez Mergarejo

    2014-09-01

    Full Text Available A maturity model is an evolutionary roadmap for implementing the vital practices from one or more domains of organizational processes. The use of maturity models is scarce in the Latin-American context. This paper presents a procedure for applying the Process and Enterprise Maturity Model developed by Michael Hammer [1]. The procedure is divided into three steps: Preparation, Evaluation and Improvement plan. Hammer's maturity model, together with the proposed procedure, can be used by organizations to improve their processes, involving managers and employees.

  19. Applying Functional Modeling for Accident Management of Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Lind, Morten; Zhang Xinxin [Harbin Engineering University, Harbin (China)

    2014-08-15

    The paper investigates applications of functional modeling for accident management in complex industrial plants, with special reference to nuclear power production. The main applications for information sharing among decision makers and for decision support are identified. An overview of Multilevel Flow Modeling is given, the foundational means-end concepts are presented in detail, and the conditions for their proper use in modelling accidents are identified. It is shown that Multilevel Flow Modeling can be used for modelling and reasoning about design basis accidents. Its possible role for information sharing and decision support in accidents beyond the design basis is also indicated. A modelling example demonstrating the application of Multilevel Flow Modelling and reasoning for a PWR LOCA is presented.

  20. Shikimic acid production in Escherichia coli: From classical metabolic engineering strategies to omics applied to improve its production

    Directory of Open Access Journals (Sweden)

    Juan Andrés Martínez

    2015-09-01

    Full Text Available Shikimic acid (SA) is an intermediate of the SA pathway that is present in bacteria and plants. SA has gained great interest because it is a precursor in the synthesis of the drug oseltamivir phosphate (OSF), an efficient inhibitor of the neuraminidase enzyme of diverse seasonal influenza viruses, the avian influenza virus H5N1, and the human influenza virus H1N1. For the purposes of OSF production, SA is extracted from the pods of Chinese star anise plants (Illicium spp.), yielding up to 17% SA content (dry basis). The high demand for OSF needed to manage a major influenza outbreak is not adequately met by industrial production using SA from plant sources. As the SA pathway is present in the model bacterium Escherichia coli, several intuitively designed, metabolically engineered strains have been applied for its successful overproduction by biotechnological processes, resulting in strains producing up to 71 g/L of SA, with high conversion yields of up to 0.42 (mol SA/mol Glc), in both batch and fed-batch cultures using complex fermentation broths, including glucose as a carbon source and yeast extract. Global transcriptomic analyses have been performed on SA-producing strains, resulting in the identification of possible key target genes for the design of a rational strain improvement strategy. Because the possible target genes are involved in the transport, catabolism and interconversion of different carbon sources and metabolic intermediates outside the central carbon metabolism and SA pathways, as well as in diverse cellular stress responses, the development of rational strain improvement strategies based on omics data constitutes a challenging task for improving SA production in currently overproducing engineered strains. In this review, we discuss the main metabolic engineering strategies that have been applied for the development of efficient SA-producing strains, as well as the perspective of omics analysis focused on further strain improvement

  1. Polymer science applied to petroleum production; Ciencia de polimeros aplicada a producao de petroleo

    Energy Technology Data Exchange (ETDEWEB)

    Lucas, Elizabete F.; Mansur, Claudia R.E.; Garreto, Maria S.E.; Honse, Siller O.; Mazzeo, Claudia P.P. [Universidade Federal do Rio de Janeiro/ Instituto de Macromoleculas/ Laboratorio de Macromoleculas e Coloides na Industria de Petroleo, Rio de Janeiro, RJ (Brazil)], e-mail: elucas@ima.ufrj.br

    2011-07-01

    The petroleum production comprises several operations, from well drilling to oil and water treatment, in which polymer science is applied. This work focuses on the phase behavior of asphaltenes, which can be evaluated by precipitation tests and particle size determination. Recent research shows that the petroleum can be diluted with a specific model solvent without causing any changes in asphaltene phase behavior, and that a representative model system can be obtained if asphaltenes are extracted using an n-alkane as low as C1. The phase behavior of asphaltenes depends directly on the solubility parameter, which can be estimated for petroleum and asphaltenic fractions by microcalorimetry. More polar asphaltenes are not completely stabilized by less polar molecules, and this affects the stability of the A/O emulsions. There is a relationship between the amount of polar groups in the polymer chain and its capability of stabilizing/flocculating the asphaltenes, which affects the asphaltene particle sizes. (author)

  2. A general diagnostic model applied to language testing data.

    Science.gov (United States)

    von Davier, Matthias

    2008-11-01

    Probabilistic models with one or more latent variables are designed to report on a corresponding number of skills or cognitive attributes. Multidimensional skill profiles offer additional information beyond what a single test score can provide, if the reported skills can be identified and distinguished reliably. Many recent approaches to skill profile models are limited to dichotomous data and have made use of computationally intensive estimation methods such as Markov chain Monte Carlo, since standard maximum likelihood (ML) estimation techniques were deemed infeasible. This paper presents a general diagnostic model (GDM) that can be estimated with standard ML techniques and applies to polytomous response variables as well as to skills with two or more proficiency levels. The paper uses one member of a larger class of diagnostic models, a compensatory diagnostic model for dichotomous and partial credit data. Many well-known models, such as univariate and multivariate versions of the Rasch model and the two-parameter logistic item response theory model, the generalized partial credit model, as well as a variety of skill profile models, are special cases of this GDM. In addition to an introduction to this model, the paper presents a parameter recovery study using simulated data and an application to real data from the field test for TOEFL Internet-based testing.
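
    As a toy illustration of one of the special cases mentioned above, the sketch below fits a Rasch model by joint maximum likelihood on simulated dichotomous responses. The paper's GDM uses more general estimation; the data here are simulated, not TOEFL field-test data.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

# Illustrative sketch only: the Rasch model is one special case of the GDM
# mentioned in the abstract.  Data below are simulated, not from TOEFL iBT.
rng = np.random.default_rng(0)
n_persons, n_items = 200, 5
theta_true = rng.normal(size=n_persons)          # person abilities
b_true = np.linspace(-1, 1, n_items)             # item difficulties
X = (rng.random((n_persons, n_items)) < expit(theta_true[:, None] - b_true)).astype(float)

def neg_log_lik(params):
    theta, b = params[:n_persons], params[n_persons:]
    p = expit(theta[:, None] - b[None, :])
    return -np.sum(X * np.log(p) + (1 - X) * np.log(1 - p))

start = np.zeros(n_persons + n_items)
fit = minimize(neg_log_lik, start, method="L-BFGS-B")
b_hat = fit.x[n_persons:]
print("estimated item difficulties (centered):", np.round(b_hat - b_hat.mean(), 2))
```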

  3. Product models for the Construction industry

    DEFF Research Database (Denmark)

    Sørensen, Lars Schiøtt

    1996-01-01

    Different types of product models for the building sector were elaborated and grouped, some discussion of the different models was given, and a "definition" of product models was provided.

  4. Surface-bounded growth modeling applied to human mandibles

    DEFF Research Database (Denmark)

    Andresen, Per Rønsholt

    1999-01-01

    This thesis presents mathematical and computational techniques for three-dimensional growth modeling applied to human mandibles. The longitudinal shape changes make the mandible a complex bone. The teeth erupt and the condylar processes change direction, from pointing predominantly backward ... to yield a spatially dense field. Different methods for constructing the sparse field are compared; adaptive Gaussian smoothing is the preferred method since it is parameter-free and yields good results in practice. A new method, geometry-constrained diffusion, is used to simplify ... The most successful growth model is linear and based on results from shape analysis and principal component analysis. The growth model is tested in a cross-validation study with good results. The worst-case mean modeling error in the cross-validation study is 3.7 mm. It occurs when modeling the shape and size of a 12 years...

  5. Applied systems ecology: models, data, and statistical methods

    Energy Technology Data Exchange (ETDEWEB)

    Eberhardt, L L

    1976-01-01

    In this report, systems ecology is largely equated to mathematical or computer simulation modelling. The need for models in ecology stems from the necessity to have an integrative device for the diversity of ecological data, much of which is observational, rather than experimental, as well as from the present lack of a theoretical structure for ecology. Different objectives in applied studies require specialized methods. The best predictive devices may be regression equations, often non-linear in form, extracted from much more detailed models. A variety of statistical aspects of modelling, including sampling, are discussed. Several aspects of population dynamics and food-chain kinetics are described, and it is suggested that the two presently separated approaches should be combined into a single theoretical framework. It is concluded that future efforts in systems ecology should emphasize actual data and statistical methods, as well as modelling.

  6. Model Driven Mutation Applied to Adaptative Systems Testing

    CERN Document Server

    Bartel, Alexandre; Munoz, Freddy; Klein, Jacques; Mouelhi, Tejeddine; Traon, Yves Le

    2012-01-01

    Dynamically Adaptive Systems modify their behavior and structure in response to changes in their surrounding environment and according to an adaptation logic. Critical systems increasingly incorporate dynamic adaptation capabilities; examples include disaster relief and space exploration systems. In this paper, we focus on mutation testing of the adaptation logic. We propose a fault model for adaptation logics that classifies faults into environmental completeness and adaptation correctness. Since there are several adaptation logic languages relying on the same underlying concepts, the fault model is expressed independently from specific adaptation languages. Taking benefit from model-driven engineering technology, we express these common concepts in a metamodel and define the operational semantics of mutation operators at this level. Mutation is applied on model elements and model transformations are used to propagate these changes to a given adaptation policy in the chosen formalism. Preliminary resul...

  7. 40 CFR 63.11166 - What General Provisions apply to primary beryllium production facilities?

    Science.gov (United States)

    2010-07-01

    Section 63.11166 (Protection of Environment), Primary Nonferrous Metals Area Sources - Zinc, Cadmium, and Beryllium; Primary Beryllium Production Facilities: What General Provisions apply to primary beryllium production facilities? (a) You...

  8. Nonstandard Analysis Applied to Advanced Undergraduate Mathematics - Infinitesimal Modeling

    OpenAIRE

    Herrmann, Robert A.

    2003-01-01

    This is a Research and Instructional Development Project from the U.S. Naval Academy. In this monograph, the basic methods of nonstandard analysis for n-dimensional Euclidean spaces are presented. Specific rules are developed, and these methods and rules are applied to rigorous integral and differential modeling. The topics include Robinson infinitesimals, limited and infinite numbers; convergence theory, continuity, *-transfer, internal definition, hyperfinite summation, Riemann-Stieltjes int...

  9. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    Science.gov (United States)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).

  10. Mathematical modelling applied to LiDAR data

    Directory of Open Access Journals (Sweden)

    Javier Estornell

    2013-06-01

    Full Text Available The aim of this article is to explain the application of several mathematical calculations to LiDAR (Light Detection And Ranging) data to estimate vegetation parameters and to model the relief of a forest area in the town of Chiva (Valencia). To represent the surface that describes the topography of the area, morphological filters were first applied iteratively to select LiDAR ground points. From these data, a Triangulated Irregular Network (TIN) structure was applied to model the relief of the area. The canopy height model (CHM) was also calculated from the LiDAR data. This model allowed mapping bare soil, shrub and tree vegetation in the study area. In addition, biomass was estimated from measurements taken in the field in 39 circular plots of radius 0.5 m and the 95th percentile of the LiDAR height data included in each plot. The results indicated a strong relationship between the two variables (measured biomass and 95th percentile), with a coefficient of determination (R2) of 0.73. These results reveal the importance of using mathematical modelling to obtain information on the vegetation and land relief from LiDAR data.
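
    The plot-level workflow described above (95th-percentile LiDAR height as predictor, linear regression against field-measured biomass) can be sketched as follows; the arrays are synthetic stand-ins for the 39 field plots, not the study's data.

```python
import numpy as np

# Sketch of the plot-level workflow described above: take the 95th percentile
# of LiDAR return heights in each plot and regress field-measured biomass on
# it.  The arrays below are made-up stand-ins for the 39 field plots.
rng = np.random.default_rng(1)
plots = [rng.gamma(shape=2.0, scale=1.5, size=rng.integers(20, 80)) for _ in range(39)]
p95 = np.array([np.percentile(h, 95) for h in plots])          # LiDAR predictor
biomass = 0.8 * p95 + rng.normal(0, 0.5, size=39)              # simulated field data

slope, intercept = np.polyfit(p95, biomass, 1)                 # linear model
pred = slope * p95 + intercept
r2 = 1 - np.sum((biomass - pred) ** 2) / np.sum((biomass - biomass.mean()) ** 2)
print(f"biomass ~ {slope:.2f} * P95 + {intercept:.2f},  R^2 = {r2:.2f}")
```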

  11. Online traffic flow model applying dynamic flow-density relation

    CERN Document Server

    Kim, Y

    2002-01-01

    This dissertation describes a new approach of the online traffic flow modelling based on the hydrodynamic traffic flow model and an online process to adapt the flow-density relation dynamically. The new modelling approach was tested based on the real traffic situations in various homogeneous motorway sections and a motorway section with ramps and gave encouraging simulation results. This work is composed of two parts: first the analysis of traffic flow characteristics and second the development of a new online traffic flow model applying these characteristics. For homogeneous motorway sections traffic flow is classified into six different traffic states with different characteristics. Delimitation criteria were developed to separate these states. The hysteresis phenomena were analysed during the transitions between these traffic states. The traffic states and the transitions are represented on a states diagram with the flow axis and the density axis. For motorway sections with ramps the complicated traffic fl...

  12. Product State Modelling based on a Meta Production

    DEFF Research Database (Denmark)

    Larsen, Michael Holm; Sørensen, Christian; Langer, Gilad

    1999-01-01

    As products often deviate from their original design and specifications when being produced, adjustments of the product or process are required in order to meet specifications. A prerequisite for this adjustment is appropriate and effectively collected shop floor data. The Product State Model (PSM) is a product model that contains continuously updated data regarding the outcome of the production processes. The main contribution of this paper is a definition and a description of a Production Meta Product State Model (Production Meta PSM), using the Unified Modelling Language (UML). The meta model incorporates a set of characteristics associated with (1) the scope or application domain of the PSM, (2) the artefact or product, and (3) the events transforming the product and triggering product state changes. Moreover, the paper provides guidelines for a specialisation of the meta model with respect...

  13. Development of a production meta Product State Model

    DEFF Research Database (Denmark)

    Larsen, Michael Holm; Sørensen, Christian; Langer, Gilad

    1999-01-01

    As products often deviate from their original design and specifications when being produced, adjustments of the product or process are required in order to meet specifications. A prerequisite for this adjustment is appropriate and effectively collected shop floor data. The Product State Model (PSM) is a product model that contains continuously updated data regarding the outcome of the production processes. The main contribution of this paper is a definition and a description of a Production Meta Product State Model (Production Meta PSM), using the Unified Modelling Language (UML). The meta model incorporates a set of characteristics associated with (1) the scope or application domain of the PSM, (2) the artefact or product, and (3) the events transforming the product and triggering product state changes. Moreover, the paper provides guidelines for a specialisation of the meta model with respect...

  14. Semiclassical model for pion production by neutrons on nuclei

    CERN Document Server

    Sparrow, D A; Sternheim, M M

    1974-01-01

    A model for pion production by neutrons on nuclei is derived by a straightforward extension of the semiclassical model for pion production by protons, previously described by two of the present authors, Silbar and Sternheim (1973). Both models are then applied to compute pion production cross sections for nucleons incident on Pb, Cu and Al, and pion absorption cross sections in nuclear matter. Results are consistent with (unpublished) experimental data from CERN. (10 refs).

  15. Apply a hydrological model to estimate local temperature trends

    Science.gov (United States)

    Igarashi, Masao; Shinozawa, Tatsuya

    2014-03-01

    A continuous time series {f(x)}, such as a depth of water, is written f(x) = T(x) + P(x) + S(x) + C(x) in hydrological science, where T(x), P(x), S(x) and C(x) are called the trend, periodic, stochastic and catastrophic components, respectively. We simplify this model and apply it to local temperature data such as those given by E. Halley (1693), the UK (1853-2010), Germany (1880-2010) and Japan (1876-2010). We also apply the model to CO2 data. The model coefficients are evaluated by symbolic computation on a standard personal computer. The accuracy of the obtained nonlinear curve is evaluated by the arithmetic mean of the relative errors between the data and the estimates. E. Halley estimated the temperature of Gresham College from 11/1692 to 11/1693. The simplified model shows that the temperature at that time was rather cold compared with recent London temperatures. The UK and Germany data sets show that the maximum and minimum temperatures increased slowly from the 1890s to the 1940s, increased rapidly from the 1940s to the 1980s and have been decreasing since the 1980s, with the exception of a few local stations. The trend for Japan is similar to these results.
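
    A minimal sketch of fitting the simplified trend-plus-periodic decomposition by ordinary least squares is given below; the temperature series, harmonic period and coefficients are synthetic assumptions, not the historical records analysed in the paper.

```python
import numpy as np

# Sketch of the simplified decomposition f(x) = T(x) + P(x): a linear trend
# plus one harmonic, fitted by least squares.  The synthetic series below
# stands in for the historical temperature records cited above.
years = np.arange(1880, 2011)
temp = 8.0 + 0.006 * (years - 1880) + 0.3 * np.sin(2 * np.pi * years / 60.0) \
       + np.random.default_rng(2).normal(0, 0.2, years.size)

# Design matrix: intercept, linear trend, and a long-period harmonic pair
period = 60.0
A = np.column_stack([np.ones_like(years, dtype=float),
                     years - years[0],
                     np.sin(2 * np.pi * years / period),
                     np.cos(2 * np.pi * years / period)])
coef, *_ = np.linalg.lstsq(A, temp, rcond=None)
fit = A @ coef
mean_rel_err = np.mean(np.abs((temp - fit) / temp))
print(f"trend = {coef[1]*10:.3f} deg per decade, mean relative error = {mean_rel_err:.3%}")
```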

  16. Simple predictive electron transport models applied to sawtoothing plasmas

    Science.gov (United States)

    Kim, D.; Merle, A.; Sauter, O.; Goodman, T. P.

    2016-05-01

    In this work, we introduce two simple transport models to evaluate the time evolution of electron temperature and density profiles during sawtooth cycles (i.e. over a sawtooth period time-scale). Since the aim of these simulations is to estimate reliable profiles within a short calculation time, two simplified ad-hoc models have been developed. The goal for these models is to rely on a few easy-to-check free parameters, such as the confinement time scaling factor and the profiles’ averaged scale-lengths. Due to the simplicity and short calculation time of the models, it is expected that these models can also be applied to real-time transport simulations. We show that it works well for Ohmic and EC heated L- and H-mode plasmas. The differences between these models are discussed and we show that their predictive capabilities are similar. Thus only one model is used to reproduce with simulations the results of sawtooth control experiments on the TCV tokamak. For the sawtooth pacing, the calculated time delays between the EC power off and sawtooth crash time agree well with the experimental results. The map of possible locking range is also well reproduced by the simulation.

  17. Three-Dimensional Gravity Model Applied to Underwater Navigation

    Institute of Scientific and Technical Information of China (English)

    YAN Lei; FENG Hao; DENG Zhongliang; GAO Zhengbing

    2004-01-01

    At present, new integrated navigation, which uses the location function of a reference gravity anomaly map to control the errors of the inertial navigation system (INS), has been developed in marine navigation. It is named the gravity-aided INS. Both the INS and the real-time computation of gravity anomalies need a 3-D marine normal gravity model. Conventionally, a reduction method applied in geophysical surveying is directly introduced into the processing of observed data. This reduction does not separate the anomaly from normal gravity in the observed data, so errors cannot be avoided. The 3-D marine normal gravity model was derived from the J2 gravity model and is suitable for regions whose depth is less than 1000 m.

  18. Curve Fitting And Interpolation Model Applied In Nonel Dosage Detection

    Directory of Open Access Journals (Sweden)

    Jiuling Li

    2013-06-01

    Full Text Available Curve fitting and interpolation models are applied to Nonel dosage detection for the first time in this paper, and the gray level of the continuous explosive in the Nonel is forecast. Traditional infrared equipment establishes a relationship between explosive dosage and light intensity, but its forecast accuracy is very low. Therefore, gray prediction models based on curve fitting and on interpolation are framed separately, and the deviations of the different models are compared. Drawing on the features of the sample library, the higher-precision cubic polynomial fitting curve is used to predict gray values, and gray values for 5 mg-28 mg Nonel are calculated in MATLAB. With these predicted values, the dosage detection operations are simplified and the defect missing rate of the Nonel is reduced. Finally, the quality of the Nonel is improved.
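
    The cubic-fit step can be illustrated in a few lines of Python (the paper used MATLAB); the dosage-gray calibration pairs below are invented for illustration only.

```python
import numpy as np

# Sketch of the cubic-fit step described above: gray value as a cubic
# polynomial of explosive dosage.  The calibration pairs are invented for
# illustration; the paper's MATLAB computation covered 5 mg-28 mg.
dosage = np.array([5, 10, 15, 20, 25, 28], dtype=float)     # mg (assumed samples)
gray   = np.array([182, 160, 141, 125, 112, 106], dtype=float)

coeffs = np.polyfit(dosage, gray, deg=3)                     # cubic least squares
predict = np.poly1d(coeffs)
for d in (8, 12, 22):
    print(f"dosage {d:2d} mg -> predicted gray {predict(d):.1f}")
```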

  19. Applying a Dynamic Resource Supply Model in a Smart Grid

    Directory of Open Access Journals (Sweden)

    Kaiyu Wan

    2014-09-01

    Full Text Available Dynamic resource supply is a complex issue to resolve in a cyber-physical system (CPS). In our previous work, a resource model called the dynamic resource supply model (DRSM) was proposed to handle resource specification, management and allocation in CPS. In this paper, we integrate the DRSM with a service-oriented architecture and apply it to a smart grid (SG), one of the most complex CPS examples. We give the detailed design of the SG for electricity charging requests and electricity allocation between plug-in hybrid electric vehicles (PHEVs) and the DRSM through the Android system. In the design, we explain a mechanism for electricity consumption with data collection and re-allocation through a ZigBee network. With this design, we verify the correctness of the resource model for the expected electricity allocation.

  20. Remote sensing applied to numerical modelling. [water resources pollution

    Science.gov (United States)

    Sengupta, S.; Lee, S. S.; Veziroglu, T. N.; Bland, R.

    1975-01-01

    Progress and remaining difficulties in the construction of predictive mathematical models of large bodies of water as ecosystems are reviewed. Surface temperature is at present the only variable that can be measured accurately and reliably by remote sensing techniques, but satellite infrared data are of sufficient resolution for macro-scale modeling of oceans and large lakes, and airborne radiometers are useful in meso-scale analysis (of lakes, bays, and thermal plumes). Finite-element and finite-difference techniques applied to the solution of relevant coupled time-dependent nonlinear partial differential equations are compared, and the specific problem of the Biscayne Bay and environs ecosystem is tackled in a finite-difference treatment using the rigid-lid model and a rigid-line grid system.

  1. Dynamic Decision Making for Graphical Models Applied to Oil Exploration

    CERN Document Server

    Martinelli, Gabriele; Hauge, Ragnar

    2012-01-01

    We present a framework for sequential decision making in problems described by graphical models. The setting is given by dependent discrete random variables with associated costs or revenues. In our examples, the dependent variables are the potential outcomes (oil, gas or dry) when drilling a petroleum well. The goal is to develop an optimal selection strategy that incorporates a chosen utility function within an approximated dynamic programming scheme. We propose and compare different approximations, from simple heuristics to more complex iterative schemes, and we discuss their computational properties. We apply our strategies to oil exploration over multiple prospects modeled by a directed acyclic graph, and to a reservoir drilling decision problem modeled by a Markov random field. The results show that the suggested strategies clearly improve the simpler intuitive constructions, and this is useful when selecting exploration policies.

  2. Experimental Validation of Simplified Free Jet Turbulence Models Applied to the Vocal Tract

    CERN Document Server

    Grandchamp, Xavier; Pelorson, Xavier

    2008-01-01

    Sound production due to turbulence is widely shown to be an important phenomenon involved in, among others, fricatives, singing, whispering and speech pathologies. In spite of its relevance, turbulent flow is not considered in classical physical speech production models, which mostly deal with voiced sound production. The current study presents preliminary results of an experimental validation of simplified turbulence models used to estimate the time-mean velocity distribution in a free jet downstream of a tube outlet. Aiming at a future application in speech production, the influence of typical vocal tract shape parameters on the velocity distribution is explored experimentally and theoretically: the tube shape, the length and the degree and geometry of the constriction. Simplified theoretical predictions are obtained by applying similarity solutions of the two-dimensional boundary layer theory to a plane and a circular free jet in still air. The orifice velocity and shape are the main model input quantities. Results are discussed...

  3. Applying fuzzy analytic network process in quality function deployment model

    Directory of Open Access Journals (Sweden)

    Mohammad Ali Afsharkazemi

    2012-08-01

    Full Text Available In this paper, we propose an empirical study of QFD implementation in which fuzzy numbers are used to handle the uncertainty associated with different components of the proposed model. We implement a fuzzy analytic network process to find the relative importance of various criteria, and using fuzzy numbers we calculate the relative importance of these factors. The proposed model uses the fuzzy matrix and the house of quality to study product development in QFD, as well as the second phase, i.e. part deployment. In most research, the primary focus is only on CRs when implementing quality function deployment, and other criteria such as production costs, manufacturing costs etc. are disregarded. The results of using the fuzzy analytic network process based on the QFD model at the Daroupat packaging company to develop PVDC show that the most important indexes are waterproofing, resistant pill packages, and production cost. In addition, the PVDC coating is the most important index from the company experts' point of view.

  4. Applying Mechanistic Dam Breach Models to Historic Levee Breaches

    Directory of Open Access Journals (Sweden)

    Risher Paul

    2016-01-01

    Full Text Available Hurricane Katrina elevated levee risk in the US national consciousness, motivating agencies to assess and improve their levee risk assessment methodology. Accurate computation of the flood flow magnitude and timing associated with a levee breach remains one of the most difficult and uncertain components of levee risk analysis. Contemporary methods are largely empirical and approximate, introducing substantial uncertainty into the damage and life loss models. Levee breach progressions are often extrapolated to the final width and breach formation time based on limited experience with past breaches or using regression equations developed from a limited database of dam failures. Physically based embankment erosion models could improve levee breach modeling. However, while several mechanistic embankment breach models are available, they were developed for dams. Several aspects of the levee breach problem are distinct, departing from dam breach assumptions. This study applies three embankment models developed for dam breach analysis (DL Breach, HR BREACH, and WinDAM C) to historic levee breaches with observed (or inferred) breach rates, assessing the limitations and applicability of each model to the levee breach problem.

  5. Product Family Modelling for Manufacturing Planning

    DEFF Research Database (Denmark)

    Jørgensen, Kaj Asbjørn; Petersen, Thomas Ditlev; Nielsen, Kjeld;

    2011-01-01

    To enable product configuration of a product family, it is important to develop a model of the selected product family. From such a model, a common practice is to build a product configurator with which customers can specify individual products from the family. To get further utilisation of the product family model, however, the model should be enriched with data for planning and execution of the manufacturing processes. The idea is that, when any individual product is specified using the product configurator, a product model can be extracted with all data necessary for planning...

  6. A new HBV-model applied to an arctic watershed

    Energy Technology Data Exchange (ETDEWEB)

    Bruland, O.

    1995-12-31

    This paper describes the HBV-model, which was developed in the Nordic joint venture project "Climate change and energy production". The HBV-model is a precipitation-runoff model made mainly to create runoff forecasts for hydroelectric power plants. The model has been tested in an arctic watershed, the Bayelva drainage basin at Svalbard. The model was calibrated using data for the period 1989-1993 and tested on data for the period 1974-1978. For both periods, snow melt, rainfall and glacier melt events are well predicted. The largest disagreement between observed and simulated runoff occurred on warm days with heavy rain. This may be due to the precipitation measurements, which may not be representative of such events. Measurements show a larger negative glacier mass balance than the simulated one, although the parameters controlling glacier melt in the model are set high. Glacier mass balance simulations in which the temperature index depends on albedo and radiation are more accurate and improve model efficiency. 5 refs., 4 figs., 1 table

  7. APPLYING A JUST-IN-TIME INTEGRATED SUPPLY CHAIN MODEL WITH INVENTORY AND WASTE REDUCTION CONSIDERATIONS

    Directory of Open Access Journals (Sweden)

    Li-Hsing Ho

    2013-01-01

    Full Text Available Just-In-Time (JIT) has been playing an important role in supply chain environments. Countless firms have applied JIT in production to gain and maintain a competitive advantage. This study introduces an innovative model which integrates inventory and quality assurance in a JIT supply chain. The approach assumes that manufacturing will produce some defective items and that those products will not influence the buyer's purchase policy. The vendor absorbs all the inspection costs. A function for the expected annual total cost is used to minimize the total cost and the nonconforming fraction. Finally, a numerical example further confirms this model.
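
    As a hedged illustration of minimizing an expected annual total cost, the sketch below uses a generic EOQ-style cost with a defective fraction and vendor-borne inspection cost; it is not the paper's exact integrated JIT model, and all parameter values are assumptions.

```python
import numpy as np

# Hedged sketch: a generic annual-cost function for a lot-sizing setting with
# a defective fraction, minimized numerically over the lot size.  It is not
# the paper's exact model; all parameter values are assumptions.
D = 48000.0      # annual demand (units)
K = 150.0        # setup/ordering cost per lot
h = 2.5          # holding cost per good unit per year
p = 0.02         # expected defective fraction
c_i = 0.05       # inspection cost per unit (borne by the vendor)

def annual_cost(Q):
    good = Q * (1 - p)                       # usable units per lot
    setups = D / good                        # lots needed per year
    return setups * K + h * good / 2.0 + c_i * Q * setups

Q_grid = np.linspace(200, 5000, 2000)
Q_star = Q_grid[np.argmin(annual_cost(Q_grid))]
print(f"approximate optimal lot size: {Q_star:.0f} units, "
      f"annual cost: {annual_cost(Q_star):.0f}")
```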

  8. Model of the Product Development Lifecycle.

    Energy Technology Data Exchange (ETDEWEB)

    He, Sunny L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Roe, Natalie H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wood, Evan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Nachtigal, Noel M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Helms, Jovana [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-10-01

    While the increased use of Commercial Off-The-Shelf information technology equipment has presented opportunities for improved cost effectiveness and flexibility, the corresponding loss of control over the product's development creates unique vulnerabilities and security concerns. Of particular interest is the possibility of a supply chain attack. A comprehensive model for the lifecycle of hardware and software products is proposed based on a survey of existing literature from academic, government, and industry sources. Seven major lifecycle stages are identified and defined: (1) Requirements, (2) Design, (3) Manufacturing for hardware and Development for software, (4) Testing, (5) Distribution, (6) Use and Maintenance, and (7) Disposal. The model is then applied to examine the risk of attacks at various stages of the lifecycle.

  9. Structuring as a Basis for Product Modelling

    DEFF Research Database (Denmark)

    Mortensen, Niels Henrik; Hansen, Claus Thorp

    1999-01-01

    Structure means the way in which things are built up. A composite product does not exhibit one structure, but hides in its structure of parts several different structuring principles, which fit the production, service, transport, etc. The structuring of product models is complex, with many influencing factors. This paper identifies four factors that influence the structure of a product model: genetics, functionality/property, product life and product assortment. Three principles that support the determination of product model structures are proposed.

  10. Nature preservation acceptance model applied to tanker oil spill simulations

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Ditlevsen, Ove Dalager

    2003-01-01

    The nature preservation acceptance model is exemplified by a study of oil spills due to simulated tanker collisions in the Danish straits. It is found that the distribution of the oil spill volume per spill is well represented by an exponential distribution both in Oeresund and in the Great Belt. When applied in the Poisson model, a risk profile reasonably close to the standard lognormal profile is obtained. Moreover, based on data pairs (volume, cost) for worldwide oil spills, it is inferred that the conditional distribution of the costs given the spill volume is well modeled by a lognormal distribution. By unconditioning on the exponential distribution of the single oil spill, a risk profile for the costs is obtained that is indistinguishable from the standard lognormal risk profile. Finally, the question of formulating a public risk acceptance criterion is addressed following Ditlevsen, and it is argued that a Nature Preservation Willingness Index can...

  11. Optimal control applied to a thoraco-abdominal CPR model.

    Science.gov (United States)

    Jung, Eunok; Lenhart, Suzanne; Protopopescu, Vladimir; Babbs, Charles

    2008-06-01

    The techniques of optimal control are applied to a validated blood circulation model of cardiopulmonary resuscitation (CPR), consisting of a system of seven difference equations. In this system, the non-homogeneous forcing terms are chest and abdominal pressures acting as the 'controls'. We seek to maximize the blood flow, as measured by the pressure difference between the thoracic aorta and the right atrium. By applying optimal control methods, we characterize the optimal waveforms for external chest and abdominal compression during cardiac arrest and CPR in terms of the solutions of the circulation model and of the corresponding adjoint system. Numerical results are given for various scenarios. The optimal waveforms confirm the previously discovered positive effects of active decompression and interposed abdominal compression. These waveforms can be implemented with manual (Lifestick-like) and mechanical (vest-like) devices to achieve levels of blood flow substantially higher than those provided by standard CPR, a technique which, despite its long history, is far from optimal.

  12. Applying the luminosity function statistics in the fireshell model

    Science.gov (United States)

    Rangel Lemos, L. J.; Bianco, C. L.; Ruffini, R.

    2015-12-01

    The luminosity function (LF) statistics applied to the data of BATSE, GBM/Fermi and BAT/Swift are the theme of this work. The LF is a powerful statistical tool for extracting useful information from astrophysical samples, and the key point of this statistical analysis lies in the detector sensitivity, which we have analyzed carefully. We applied the LF statistics to three GRB classes predicted by the Fireshell model, producing predicted distributions of the peak flux N(Fpk), redshift N(z) and peak luminosity N(Lpk) for these classes; we also used three GRB rates. We looked for differences among the distributions and indeed found some. We performed a comparison between the predicted and observed distributions (with and without redshifts), for which we had to build a list of 217 GRBs with known redshifts. Our goal is to turn GRBs into a standard candle; one alternative is to find a correlation between the isotropic luminosity and the Band peak spectral energy (Liso - Epk).

  13. A procedure for building product models

    DEFF Research Database (Denmark)

    Hvam, Lars; Riis, Jesper; Malis, Martin;

    2001-01-01

    This article presents a procedure for building product models to support the specification processes dealing with sales, design of product variants and production preparation. The procedure includes, as the first phase, an analysis and redesign of the business processes which are to be supported with product models. The next phase includes an analysis of the product assortment and the set-up of a so-called product master. Finally, the product model is designed and implemented using object-oriented modelling. The procedure is developed in order to ensure that the product models constructed are fit...

  14. A bidirectional coupling procedure applied to multiscale respiratory modeling

    Science.gov (United States)

    Kuprat, A. P.; Kabilan, S.; Carson, J. P.; Corley, R. A.; Einstein, D. R.

    2013-07-01

    pressure applied to the multiple sets of ODEs. In both the simplified geometry and in the imaging-based geometry, the performance of the method was comparable to that of monolithic schemes, in most cases requiring only a single CFD evaluation per time step. Thus, this new accelerator allows us to begin combining pulmonary CFD models with lower-dimensional models of pulmonary mechanics with little computational overhead. Moreover, because the CFD and lower-dimensional models are totally separate, this framework affords great flexibility in terms of the type and breadth of the adopted lower-dimensional model, allowing the biomedical researcher to appropriately focus on model design. Research funded by the National Heart and Blood Institute Award 1RO1HL073598.

  15. A Bidirectional Coupling Procedure Applied to Multiscale Respiratory Modeling.

    Science.gov (United States)

    Kuprat, A P; Kabilan, S; Carson, J P; Corley, R A; Einstein, D R

    2013-07-01

    In this study, we present a novel multiscale computational framework for efficiently linking multiple lower-dimensional models describing the distal lung mechanics to imaging-based 3D computational fluid dynamics (CFD) models of the upper pulmonary airways in order to incorporate physiologically appropriate outlet boundary conditions. The framework is an extension of the Modified Newton's Method with nonlinear Krylov accelerator developed by Carlson and Miller [1, 2, 3]. Our extensions include the retention of subspace information over multiple timesteps, and a special correction at the end of a timestep that allows for corrections to be accepted with verified low residual with as little as a single residual evaluation per timestep on average. In the case of a single residual evaluation per timestep, the method has zero additional computational cost compared to uncoupled or unidirectionally coupled simulations. We expect these enhancements to be generally applicable to other multiscale coupling applications where timestepping occurs. In addition we have developed a "pressure-drop" residual which allows for stable coupling of flows between a 3D incompressible CFD application and another (lower-dimensional) fluid system. We expect this residual to also be useful for coupling non-respiratory incompressible fluid applications, such as multiscale simulations involving blood flow. The lower-dimensional models that are considered in this study are sets of simple ordinary differential equations (ODEs) representing the compliant mechanics of symmetric human pulmonary airway trees. To validate the method, we compare the predictions of hybrid CFD-ODE models against an ODE-only model of pulmonary airflow in an idealized geometry. Subsequently, we couple multiple sets of ODEs describing the distal lung to an imaging-based human lung geometry. Boundary conditions in these models consist of atmospheric pressure at the mouth and intrapleural pressure applied to the multiple sets

  16. Model output statistics applied to wind power prediction

    Energy Technology Data Exchange (ETDEWEB)

    Joensen, A.; Giebel, G.; Landberg, L. [Risoe National Lab., Roskilde (Denmark); Madsen, H.; Nielsen, H.A. [The Technical Univ. of Denmark, Dept. of Mathematical Modelling, Lyngby (Denmark)

    1999-03-01

    Being able to predict the output of a wind farm online for a day or two in advance has significant advantages for utilities, such as a better ability to schedule fossil-fuelled power plants and a better position on electricity spot markets. In this paper, prediction methods based on Numerical Weather Prediction (NWP) models are considered. The spatial resolution used in NWP models implies that these predictions are not valid locally at a specific wind farm. Furthermore, due to the non-stationary nature and complexity of the processes in the atmosphere, and occasional changes of NWP models, the deviation between the predicted and the measured wind will be time dependent. If observational data are available, and if the deviation between the predictions and the observations exhibits systematic behavior, this should be corrected for; if statistical methods are used, this approach is usually referred to as MOS (Model Output Statistics). The influence of atmospheric turbulence intensity, topography, prediction horizon length and auto-correlation of wind speed and power is considered, and to take the time variations into account, adaptive estimation methods are applied. Three estimation techniques are considered and compared: extended Kalman filtering, recursive least squares and a new modified recursive least squares algorithm. (au) EU-JOULE-3. 11 refs.
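
    One of the adaptive estimators mentioned, recursive least squares with a forgetting factor, can be sketched as a simple MOS correction mapping NWP wind forecasts to local observations; the data and the forgetting factor below are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

# Sketch of an adaptive MOS correction: recursive least squares with a
# forgetting factor maps the NWP-predicted wind speed to the locally observed
# wind speed.  Data and the forgetting factor are assumptions.
lam = 0.99                       # forgetting factor (tracks time-varying bias)
theta = np.zeros(2)              # [offset, gain] of the linear correction
P = np.eye(2) * 1000.0           # inverse information matrix

rng = np.random.default_rng(3)
nwp = rng.uniform(3, 15, 500)                        # NWP wind speed forecasts
obs = 0.85 * nwp + 1.2 + rng.normal(0, 0.5, 500)     # local measurements

for u, y in zip(nwp, obs):
    x = np.array([1.0, u])                           # regressor
    err = y - x @ theta
    gain = P @ x / (lam + x @ P @ x)
    theta = theta + gain * err
    P = (P - np.outer(gain, x) @ P) / lam

print(f"estimated correction: obs ~ {theta[1]:.2f} * NWP + {theta[0]:.2f}")
```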

  17. TCSC impedance regulator applied to the second benchmark model

    Energy Technology Data Exchange (ETDEWEB)

    Hamel, J.P.; Dessaint, L.A. [Ecole de Technologie Superieure, Montreal, PQ (Canada). Dept. of Electrical Engineering; Champagne, R. [Ecole de Technologie Superieure, Montreal, PQ (Canada). Dept. of Software and IT Engineering; Pare, D. [Institut de Recherche d' Hydro-Quebec, Varennes, PQ (Canada)

    2008-07-01

    Due to the combination of electrical demand growth and the high cost of building new power transmission lines, series compensation is increasingly used in power systems all around the world. Series compensation has been proposed as a new way to transfer more power on existing lines. By adding series compensation to an existing line (a relatively small change), the power transfer can be increased significantly. One of the means used for line compensation is the addition of capacitive elements in series with the line. This paper presented a thyristor-controlled series capacitor (TCSC) model that used impedance as reference, had individual controls for each phase, included a linearization module and considered only the fundamental frequency for impedance computations, without using any filter. The model's dynamic behavior was validated by applying it to the second benchmark model for subsynchronous resonance (SSR). Simulation results from the proposed model, obtained using EMTP-RV and SimPowerSystems were demonstrated. It was concluded that SSR was mitigated by the proposed approach. 19 refs., 19 figs.

  18. A soil-plant model applied to phytoremediation of metals.

    Science.gov (United States)

    Lugli, Francesco; Mahler, Claudio Fernando

    2016-01-01

    This study reports a phytoremediation pot experiment using an open-source program. Unsaturated water flow was described by the Richards equation and solute transport by the advection-dispersion equation. Sink terms in the governing flow and transport equations accounted for root water and solute uptake, respectively. The experimental data were related to the application of Vetiver grass to soil contaminated by metal ions. Sensitivity analysis revealed that, due to the specific experimental set-up (bottom flux not allowed), the hydraulic model parameters did not influence root water (and contaminant) uptake. In contrast, the results were highly correlated with the plant's solar radiation interception efficiency (leaf area index). The amounts of metals accumulated in the plant tissue were compared to the numerical values of cumulative uptake. Pb(2+) and Zn(2+) uptake was satisfactorily described using a passive model. However, for Ni(2+) and Cd(2+), a specific calibration of the active uptake model was necessary. The calibrated MM parameters for Ni(2+), Cd(2+), and Pb(2+) were compared to values in the literature, generally suggesting lower rates and saturation advance. A parameter (saturation ratio) was introduced to assess the efficiency of contaminant uptake. Numerical analysis applying actual field conditions showed the limitation of the active model, namely its independence of the transpiration rate.

  19. Modeling of continuous strip production by rheocasting

    Science.gov (United States)

    Matsumiya, T.; Flemings, M. C.

    1981-03-01

    A process was experimentally and mathematically modeled for continuous and direct production of metal strip from its molten state by the use of Rheocasting. The process comprises 1) continuous production of a Rheocast semisolid alloy, and 2) direct shaping of the semisolid into strip. Sn-15 pct Pb was used as the modeling alloy. Crack formation and surface quality of the strip produced depend on fraction solid and deformation force. Continuous, sound strip could be obtained with good surface quality when fraction solid was between 0.50 and 0.70 and deformation force did not exceed a given maximum. Sheet thickness depends on deformation force, fraction solid, rotor rate of Rheocaster and production line speed. At constant deformation force, sheet thickness increases as fraction solid increases, rotor rate decreases and line speed is reduced. Sheet thickness is larger in the center than in the edge, but the difference is reduced by applying edgers. Some segregation of lead toward the edges is observed, and the segregation increases as amount of deformation is increased. A mathematical model for heat flow, solidification and deformation was constructed. The model predicts the point of completion of solidification in the strip and sheet thickness as a function of deformation force and line speed. Calculations are in good agreement with experimental results.

  20. Modeling of continuous strip production by rheocasting

    Energy Technology Data Exchange (ETDEWEB)

    Matsumiya, T.; Flemings, M.C.

    1981-03-01

    A process was experimentally and mathematically modeled for continuous and direct production of metal strip from its molten state by the use of Rheocasting. The process comprises 1) continuous production of a Rheocast semisolid alloy, and 2) direct shaping of the semisolid into strip. Sn-15 pct Pb was used as the modeling alloy. Crack formation and surface quality of the strip produced depend on fraction solid and deformation force. Continuous, sound strip could be obtained with good surface quality when fraction solid was between 0.50 and 0.70 and deformation force did not exceed a given maximum. Sheet thickness depends on deformation force, fraction solid, rotor rate of Rheocaster and production line speed. At constant deformation force, sheet thickness increases as fraction solid increases, rotor rate decreases and line speed is reduced. Sheet thickness is larger in the center than in the edge, but the difference is reduced by applying edgers. Some segregation of lead toward the edges is observed, and the segregation increases as amount of deformation is increased. A mathematical model for heat flow, solidification and deformation was constructed. The model predicts the point of completion of solidification in the strip and sheet thickness as a function of deformation force and line speed. Calculations are in good agreement with experimental results.

  1. Applying the model of excellence in dental healthcare

    Directory of Open Access Journals (Sweden)

    Tekić Jasmina

    2015-01-01

    Full Text Available Introduction. Models of excellence are considered a practical management tool that should help a variety of organizations, including dental ones, to measure the quality of the services they provide and so define their position in relation to excellence. The quality of healthcare is the degree to which the system of healthcare and health services increases the likelihood of a positive treatment outcome. Objective. The aim of the present study was to define a model of excellence in the field of dental healthcare (DHC) in the Republic of Serbia and to suggest a DHC model whose services will have the characteristics of outstanding service in dental practice. Methods. In this study a specially designed questionnaire was used to assess the maturity level of quality management applied in healthcare organizations of the Republic of Serbia. The questionnaire consists of 13 units and a total of 240 questions. Results. The results of the study were discussed in four areas: (1) defining the main criteria and sub-criteria, (2) the elements of excellence of DHC in the Republic of Serbia, (3) the quality of DHC in the Republic of Serbia, and (4) defining the framework of the model of excellence for DHC in the Republic of Serbia. The main criteria which defined the framework and implementation model of excellence in the field of DHC in Serbia were: leadership, management, human resources, policy and strategy, other resources, processes, patients' satisfaction, employees' satisfaction, impact on society, and business results. The model has two main parts: the possibilities for the first five criteria and the options for the other four criteria. Conclusion. Excellence in the DHC business, as well as excellence of the dental services provided, is increasingly becoming the norm and good practice, and progressively less the exception.

  2. Comparison of delay difference model and surplus production model applied to albacore (Thunnus alalunga) in the South Atlantic Ocean

    Institute of Scientific and Technical Information of China (English)

    张魁; 陈作志; 黄梓荣; 许友伟

    2015-01-01

    We applied a surplus production model and a delay difference model to data for the southern Atlantic albacore (Thunnus alalunga) stock. The results show that the delay difference model captured the annual fluctuations of catch per unit effort (CPUE) better than the Schaefer model. The Akaike information criterion (AIC) also indicates that the delay difference model performed better. We calculated 80% confidence intervals for the maximum sustainable yield (MSY) of 21,756-23,408 t (median 22,490 t) and 26,116-28,959 t (median 27,520 t) with the delay difference model and the Schaefer model, respectively. The biological reference points show that the southern Atlantic albacore stock was in a good state before 1985 but was overfished from 1985 to 2005; after that, it rebuilt gradually but still requires careful management. The delay difference model gave more effective and more conservative results than the surplus production model.
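
    For reference, the Schaefer surplus production model used in the comparison has the well-known equilibrium result MSY = rK/4; the sketch below iterates the standard biomass update under an assumed constant catch. The parameter values are invented for illustration and are not the stock-assessment estimates quoted above.

```python
import numpy as np

# Sketch of the Schaefer surplus-production dynamics used as one of the two
# models above; r, K and the catch series are invented, only MSY = r*K/4 and
# the biomass update follow the standard model.
r, K = 0.25, 400000.0            # intrinsic growth rate and carrying capacity (assumed)
msy = r * K / 4.0
print(f"MSY under these assumed parameters: {msy:.0f} t")

biomass = K
catches = np.full(30, 22000.0)   # constant annual catch below the assumed MSY
for C in catches:
    surplus = r * biomass * (1.0 - biomass / K)
    biomass = max(biomass + surplus - C, 1.0)
print(f"biomass after 30 years of this catch: {biomass:.0f} t "
      f"({biomass / K:.0%} of carrying capacity)")
```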

  3. Spectral Aging Model Applied to Meteosat First Generation Visible Band

    Directory of Open Access Journals (Sweden)

    Ilse Decoster

    2014-03-01

    Full Text Available The Meteosat satellites have been operational since the early eighties, creating so far a continuous time period of observations of more than 30 years. In order to use this data for climate data records, a consistent calibration is necessary between the consecutive instruments. Studies have shown that the Meteosat First Generation (MFG) satellites (1982–2006) suffer from in-flight degradation which is spectral in nature and is not corrected by the official calibration of EUMETSAT. Continuing previously published work by the same authors, this paper applies the spectral aging model to a set of clear-sky and cloudy targets, and derives the model parameters for all six MFG satellites (Meteosat-2 to -7). Several problems have been encountered, both due to the instrument and due to geophysical occurrences, and these are discussed and illustrated here in detail. The paper shows how the spectral aging model is an improvement compared to the EUMETSAT calibration method, with a stability of 1%–2% for Meteosat-4 to -7, which increases up to 6% for ocean sites using the full MFG time period.

  4. Dynamical behavior of the Niedermayer algorithm applied to Potts models

    Science.gov (United States)

    Girardi, D.; Penna, T. J. P.; Branco, N. S.

    2012-08-01

    In this work, we make a numerical study of the dynamic universality class of the Niedermayer algorithm applied to the two-dimensional Potts model with 2, 3, and 4 states. This algorithm updates clusters of spins and has a free parameter, E0, which controls the size of these clusters, such that E0=1 is the Metropolis algorithm and E0=0 regains the Wolff algorithm, for the Potts model. For -1 ≤ E0 ≤ 0, only clusters of equal spins can be formed: we show that the mean size of the clusters of (possibly) turned spins initially grows with the linear size of the lattice, L, but eventually saturates at a given lattice size L˜, which depends on E0. For L≥L˜, the Niedermayer algorithm is in the same dynamic universality class as the Metropolis one, i.e., they have the same dynamic exponent. For E0>0, spins in different states may be added to the cluster, but the dynamic behavior is less efficient than for the Wolff algorithm (E0=0). Therefore, our results show that the Wolff algorithm is the best choice for Potts models, when compared to Niedermayer's generalization.
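
    As a rough illustration of the cluster updates discussed here, the sketch below implements the E0 = 0 (Wolff) limit for the two-dimensional ferromagnetic q-state Potts model, where only equal-spin neighbours join the cluster with probability p_add = 1 - exp(-beta*J). The lattice size, temperature and number of updates are assumptions for demonstration; the full Niedermayer generalization with arbitrary E0 is not reproduced.

```python
import numpy as np

def wolff_update(spins, q, beta, J=1.0, rng=None):
    """One Wolff cluster update for the 2D ferromagnetic q-state Potts model.

    This is the E0 = 0 limit discussed in the abstract: only equal-spin
    neighbours may join the cluster, with probability p_add = 1 - exp(-beta*J).
    """
    rng = rng or np.random.default_rng()
    L = spins.shape[0]
    p_add = 1.0 - np.exp(-beta * J)

    seed = (int(rng.integers(L)), int(rng.integers(L)))
    old_state = spins[seed]
    new_state = (old_state + rng.integers(1, q)) % q  # any state different from the old one

    stack, in_cluster = [seed], {seed}
    while stack:
        i, j = stack.pop()
        spins[i, j] = new_state
        for ni, nj in (((i + 1) % L, j), ((i - 1) % L, j), (i, (j + 1) % L), (i, (j - 1) % L)):
            if (ni, nj) not in in_cluster and spins[ni, nj] == old_state and rng.random() < p_add:
                in_cluster.add((ni, nj))
                stack.append((ni, nj))
    return len(in_cluster)

# Example: a few updates on a small lattice (illustrative parameters).
rng = np.random.default_rng(0)
q, L, beta = 3, 32, 1.005  # beta near the q=3 critical coupling is an assumption
spins = rng.integers(q, size=(L, L))
sizes = [wolff_update(spins, q, beta, rng=rng) for _ in range(100)]
print("mean cluster size:", sum(sizes) / len(sizes))
```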

  5. Production economic models of fisheries

    DEFF Research Database (Denmark)

    Andersen, Jesper Levring

    The overall purpose of this PhD thesis is to investigate different aspects of fishermen’s behaviour using production economic models at the individual and industry levels. Three parts make up this thesis. The first part provides an overview of the thesis. The second part consists of four papers...... or fishing location. Behaviour can be viewed as being determined by the fishermen’s objectives subject to different restrictions, given by physical resources, time, mental capacity and information, and institutions. The review of the extensive literature gives reasonable support to the neoclassical...

  6. Simulating the fate of fall- and spring-applied poultry litter nitrogen in corn production

    Science.gov (United States)

    Monitoring the fate of N derived from manures applied to fertilize crops is difficult, time consuming, and relatively expensive. But computer simulation models can help understand the interactions among various N processes in the soil-plant system and determine the fate of applied N. The RZWQM2 was ...

  7. Product directivity models for parametric loudspeakers.

    Science.gov (United States)

    Shi, Chuang; Gan, Woon-Seng

    2012-03-01

    In a recent work, the beamsteering characteristics of parametric loudspeakers were validated in an experiment. It was shown that based on the product directivity model, the locations and amplitudes of the mainlobe and grating lobes could be predicted within acceptable errors. However, the measured amplitudes of sidelobes have not been able to match the theoretical results accurately. In this paper, the original theories behind the product directivity model are revisited, and three modified product directivity models are proposed: (i) the advanced product directivity model, (ii) the exponential product directivity model, and (iii) the combined product directivity model. The proposed product directivity models take the radii of equivalent Gaussian sources into account and obtain better predictions of sidelobes for the difference frequency waves. From the comparison between measurement results and numerical solutions, all the proposed models outperform the original product directivity model in terms of selected sidelobe predictions by about 10 dB.

  8. Hierarchic stochastic modelling applied to intracellular Ca2+ signals.

    Directory of Open Access Journals (Sweden)

    Gregor Moenke

    Full Text Available Important biological processes like cell signalling and gene expression have noisy components and are very complex at the same time. Mathematical analysis of such systems has often been limited to the study of isolated subsystems, or approximations are used that are difficult to justify. Here we extend a recently published method (Thurley and Falcke, PNAS 2011) which is formulated in observable system configurations instead of molecular transitions. This reduces the number of system states by several orders of magnitude and avoids fitting of kinetic parameters. The method is applied to Ca2+ signalling. Ca2+ is a ubiquitous second messenger transmitting information by stochastic sequences of concentration spikes, which arise by coupling of subcellular Ca2+ release events (puffs). We derive analytical expressions for a mechanistic Ca2+ model, based on recent data from live cell imaging, and calculate Ca2+ spike statistics in dependence on cellular parameters like stimulus strength or number of Ca2+ channels. The new approach substantiates a generic Ca2+ model, which is a very convenient way to simulate Ca2+ spike sequences with correct spiking statistics.

  9. 75 FR 47592 - Final Test Guideline; Product Performance of Skin-applied Insect Repellents of Insect and Other...

    Science.gov (United States)

    2010-08-06

    ... AGENCY Final Test Guideline; Product Performance of Skin-applied Insect Repellents of Insect and Other... Product Performance of Skin-applied Insect Repellents of Insect and Other Arthropods Test Guidelines... ``Product Performance of Skin-applied Insect Repellents of Insects and Other Arthropods'' (OPPTS...

  10. Biases in simulation of the rice phenology models when applied in warmer climates

    Science.gov (United States)

    Zhang, T.; Li, T.; Yang, X.; Simelton, E.

    2015-12-01

    The current model inter-comparison studies highlight the differences in projections between crop models when they are applied to warmer climates, but these studies do not show how the accuracy of the models would change in such projections, because adequate observations under largely diverse growing season temperatures (GST) are often unavailable. Here, we investigate the potential changes in the accuracy of rice phenology models when these models are applied to a significantly warmer climate. We collected phenology data from 775 trials with 19 cultivars in 5 Asian countries (China, India, Philippines, Bangladesh and Thailand). Each cultivar encompasses phenology observations under diverse GST regimes. For a given rice cultivar in different trials, the GST difference reaches 2.2 to 8.2°C, which allows us to calibrate the models under lower GST and validate them under higher GST (i.e., warmer climates). Four common phenology models representing the major algorithms for simulating rice phenology were used, and three model calibration experiments were conducted. The results suggest that the bilinear and beta models produced a gradually increasing phenology bias (Figure) and a doubled yield bias per percent increase in phenology bias, whereas the growing-degree-day (GDD) and exponential models maintained a comparatively constant bias when applied in warmer climates (Figure). Moreover, the phenology bias of the bilinear and beta models did not decrease with increasing GST even when all data were used to calibrate the models. This suggests that variations in phenology bias are primarily attributable to intrinsic properties of the respective phenology model rather than to the calibration dataset. We therefore conclude that the GDD and exponential models have a better chance of predicting rice phenology, and thus production, correctly under warmer climates, supporting effective agricultural adaptation to and mitigation of climate change.
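
    A minimal sketch of the growing-degree-day (GDD) approach that the abstract finds most robust is given below; the base temperature, GDD requirement and temperature series are assumed values, not those of the 775 trials.

```python
# Minimal growing-degree-day (GDD) phenology sketch.
# A stage is predicted on the day accumulated GDD reaches a cultivar-specific
# requirement; the base temperature and requirement below are illustrative
# assumptions, not values from the study.

def predict_stage_day(daily_mean_temps, gdd_requirement, t_base=10.0):
    """Return the first day (1-indexed) when accumulated GDD meets the requirement."""
    accumulated = 0.0
    for day, t_mean in enumerate(daily_mean_temps, start=1):
        accumulated += max(0.0, t_mean - t_base)
        if accumulated >= gdd_requirement:
            return day
    return None  # requirement not met within the record

# Example: a warmer season shortens the predicted duration.
baseline = [22.0] * 150
warmer = [t + 2.0 for t in baseline]
print(predict_stage_day(baseline, gdd_requirement=1400))  # day 117 with these numbers
print(predict_stage_day(warmer, gdd_requirement=1400))    # earlier under warming
```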

  11. A grand model for chemical product design

    DEFF Research Database (Denmark)

    Fung, Ka Y.; Ng, Ka M.; Zhang, Lei;

    2016-01-01

    Chemical engineering has been expanding its focus from primarily business-to-business (B2B) products to business-to-consumer (B2C) products. The production of B2B products generally emphasizes process design and optimization, whereas the production of B2C products focuses on product quality......, ingredients and structure. Market and competitive analysis, government policies and regulations have to be explicitly considered in product design. All these considerations are accounted for in the Grand Product Design Model, which consists of a process model, a property model, a quality model, a cost model...... product composition changes with market conditions. Another is a hand lotion that illustrates how product quality affects the profit. (C) 2016 Elsevier Ltd. All rights reserved....

  12. [The bioethical principlism model applied in pain management].

    Science.gov (United States)

    Souza, Layz Alves Ferreira; Pessoa, Ana Paula da Costa; Barbosa, Maria Alves; Pereira, Lilian Varanda

    2013-03-01

    An integrative literature review was developed with the purpose to analyze the scientific production regarding the relationships between pain and the principles of bioethics (autonomy, beneficence, nonmaleficence and justice). Controlled descriptors were used in three international data sources (LILACS, SciELO, MEDLINE), in April of 2012, totaling 14 publications categorized by pain and autonomy, pain and beneficence, pain and nonmaleficence, pain and justice. The adequate relief of pain is a human right and a moral issue directly related with the bioethical principlism standard model (beneficence, non-maleficence, autonomy and justice). However, many professionals overlook the pain of their patients, ignoring their ethical role when facing suffering. It was concluded that principlism has been neglected in the care of patients in pain, showing the need for new practices to change this setting.

  13. Extending product modeling methods for integrated product development

    DEFF Research Database (Denmark)

    Bonev, Martin; Wörösch, Michael; Hauksdóttir, Dagný

    2013-01-01

    Despite great efforts within the modeling domain, the majority of methods often address the uncommon design situation of an original product development. However, studies illustrate that development tasks are predominantly related to redesigning, improving, and extending already existing products.... Updated design requirements then have to be made explicit and mapped against the existing product architecture. In this paper, existing methods are adapted and extended through linking updated requirements to suitable product models. By combining several established modeling techniques, such as the DSM...... and PVM methods, in the presented Product Requirement Development model, some of the individual drawbacks of each method could be overcome. Based on the UML standard, the model enables the representation of complex hierarchical relationships in a generic product model. At the same time it uses matrix...

  14. Applying the INN model to the MaxClique problem

    Energy Technology Data Exchange (ETDEWEB)

    Grossman, T.

    1993-09-01

    Max-Clique is the problem of finding the largest clique in a given graph. It is not only NP-hard, but, as recent results suggest, even hard to approximate. Nevertheless it is still very important to develop and test practical algorithms that will find approximate solutions for the maximum clique problem on various graphs stemming from numerous applications. Indeed, many different types of algorithmic approaches are applied to that problem. Several neural networks and related algorithms were applied recently to combinatorial optimization problems in general and to the Max-Clique problem in particular. These neural nets are dynamical systems which minimize a cost (or computational "energy") function that represents the optimization problem, the Max-Clique in our case. Therefore they all belong to the class of integer programming algorithms surveyed in the Pardalos and Xue review. The work presented here is a development and improvement of a neural network algorithm that was introduced recently. In the previous work, we have considered two Hopfield type neural networks, the INN and the HcN, and their application to the max-clique problem. In this paper, I concentrate on the INN network and present an improved version of the t-A algorithm that was introduced in the previous work. The rest of this paper is organized as follows: in section 2, I describe the INN model and how it implements a given graph. In section 3, it is characterized in terms of graph theory. In particular, the stable states of the network are mapped to the maximal cliques of its underlying graph. In section 4, I present the t-Annealing algorithm and an improved version of it, the Adaptive t-Annealing. Several experiments done with these algorithms on benchmark graphs are reported in section 5, and the efficiency of the new algorithm is demonstrated. I conclude with a short discussion.

  15. From LCA to PSS – Making leaps towards sustainability by applying product/service-system thinking in product development

    DEFF Research Database (Denmark)

    Bey, Niki; McAloone, Timothy Charles

    2006-01-01

    Life Cycle Assessment (LCA) is the standardised and globally recognised tool for quantifying environmental impact of goods and services. A key aspect in LCA is the consideration of whole life cycle systems. The application of LCA in product development inherently comprises the quest for optimisations on all system levels. However, as the act of ecodesign conventionally focuses on physical products, the search for potential optimisations is usually directed ‘downwards’, i.e. towards lower system levels, resulting in optimised components within products rather than optimised products within their surrounding systems. This paper will exemplify that when broadening the ecodesign horizon to environmental product/service-system (PSS) design, there is a better possibility of applying a system-oriented life cycle thinking approach, and therefore a potential to yield extreme improvements towards...

  16. A procedure for Building Product Models

    DEFF Research Database (Denmark)

    Hvam, Lars

    1999-01-01

    , easily adaptable concepts and methods from data modeling (object oriented analysis) and domain modeling (product modeling). The concepts are general and can be used for modeling all types of specifications in the different phases in the product life cycle. The modeling techniques presented have been...

  17. Cellular Automata Models Applied to the Study of Landslide Dynamics

    Science.gov (United States)

    Liucci, Luisa; Melelli, Laura; Suteanu, Cristian

    2015-04-01

    Landslides are caused by complex processes controlled by the interaction of numerous factors. Increasing efforts are being made to understand the spatial and temporal evolution of this phenomenon, and the use of remote sensing data is making significant contributions to improving forecasting. This paper studies landslides seen as complex dynamic systems, in order to investigate their potential Self Organized Critical (SOC) behavior, and in particular, scale-invariant aspects of processes governing the spatial development of landslides and their temporal evolution, as well as the mechanisms involved in driving the system and keeping it in a critical state. For this purpose, we build Cellular Automata Models, which have been shown to be capable of reproducing the complexity of real world features using a small number of variables and simple rules, thus allowing for the reduction of the number of input parameters commonly used in the study of processes governing landslide evolution, such as those linked to the geomechanical properties of soils. This type of model has already been successfully applied in studying the dynamics of other natural hazards, such as earthquakes and forest fires. The basic structure of the model is composed of three modules: (i) An initialization module, which defines the topographic surface at time zero as a grid of square cells, each described by an altitude value; the surface is acquired from real Digital Elevation Models (DEMs). (ii) A transition function, which defines the rules used by the model to update the state of the system at each iteration. The rules use a stability criterion based on the slope angle and introduce a variable describing the weakening of the material over time, caused for example by rainfall. The weakening brings some sites of the system out of equilibrium thus causing the triggering of landslides, which propagate within the system through local interactions between neighboring cells. By using different rates of
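
    A minimal cellular-automaton sketch in the spirit of the model described here is shown below: cells on a synthetic elevation grid weaken over time, and a cell whose slope towards its lowest neighbour exceeds the weakening-reduced threshold sheds material downslope. All rules and parameter values are illustrative assumptions, not the model of the study.

```python
import numpy as np

# Minimal cellular-automaton sketch of slope-driven failure on a synthetic
# elevation grid. Each iteration, cells weaken; a cell whose slope towards its
# lowest neighbour exceeds a weakening-reduced threshold moves part of its
# material downslope. All rules and parameters are illustrative assumptions.

rng = np.random.default_rng(0)
L = 50
z = np.add.outer(np.linspace(30, 0, L), np.zeros(L)) + rng.normal(0, 0.3, (L, L))
weakening = np.zeros((L, L))
critical_slope, rain_rate, transfer = 1.2, 0.02, 0.5

def step(z, weakening):
    weakening += rain_rate                        # e.g. progressive rainfall weakening
    failed = 0
    for i in range(1, L - 1):
        for j in range(1, L - 1):
            # steepest-descent neighbour (4-connected)
            nbrs = [(i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)]
            ni, nj = min(nbrs, key=lambda c: z[c])
            slope = z[i, j] - z[ni, nj]
            if slope > critical_slope - weakening[i, j]:
                dz = transfer * slope
                z[i, j] -= dz                     # failed cell sheds material
                z[ni, nj] += dz
                weakening[i, j] = 0.0             # failed cell resets
                failed += 1
    return failed

sizes = [step(z, weakening) for _ in range(100)]
print("failed cells per step (last 10):", sizes[-10:])
```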

  18. Modelling of virtual production networks

    Directory of Open Access Journals (Sweden)

    2011-03-01

    Full Text Available Nowadays many companies, especially small and medium-sized enterprises (SMEs), specialize in a limited field of production. Manufacturing better, faster and cheaper therefore requires forming virtual production networks of cooperating enterprises. Apart from that, some production orders cannot be realized because no single company has sufficient production potential. In this case virtual production networks of cooperating companies can realize these production orders. These networks have larger production capacity and many different resources, and can therefore realize many more production orders together than each company could separately. Such an organization allows high-quality products to be executed, while the maintenance costs of production capacity and the resources used are not so high. In this paper a methodology for rapid prototyping of virtual production networks is proposed. It allows production orders to be executed on time while considering existing logistic constraints.

  19. Applying tobacco carcinogen and toxicant biomarkers in product regulation and cancer prevention.

    Science.gov (United States)

    Hecht, Stephen S; Yuan, Jian-Min; Hatsukami, Dorothy

    2010-06-21

    Tobacco carcinogen and toxicant biomarkers are metabolites or protein or DNA adducts of specific compounds in tobacco products. Highly reliable analytical methods, based mainly on mass spectrometry, have been developed and applied in large studies of many of these biomarkers. A panel of tobacco carcinogen and toxicant biomarkers is suggested here, and typical values for smokers and nonsmokers are summarized. This panel of biomarkers has potential applications in the new and challenging area of tobacco product regulation and in the development of rational approaches to cancer prevention by establishing carcinogen and toxicant uptake and excretion in people exposed to tobacco products.

  20. The POWHEG method applied to top pair production and decays at the ILC

    CERN Document Server

    Latunde-Dada, Oluseyi

    2008-01-01

    We study the effects of gluon radiation in top pair production and their decays for e+e- annihilation at the ILC. To achieve this we apply the POWHEG method and interface our results to the Monte Carlo event generator Herwig++. We consider a center-of-mass energy of 500 GeV and compare decay correlations and bottom quark distributions before hadronization.

  1. Benthic microalgal production in the Arctic: Applied methods and status of the current database

    DEFF Research Database (Denmark)

    Glud, Ronnie Nøhr; Woelfel, Jana; Karsten, Ulf;

    2009-01-01

    The current database on benthic microalgal production in Arctic waters comprises 10 peer-reviewed and three unpublished studies. Here, we compile and discuss these datasets, along with the applied measurement approaches used. The latter is essential for robust comparative analysis and to clarify ...

  2. The economic production lot size model with several production rates

    DEFF Research Database (Denmark)

    Larsen, Christian

    We study an extension of the economic production lot size model, where more than one production rate can be used during a cycle. The production rates and their corresponding runtimes are decision variables. We decompose the problem into two subproblems. First, we show that all production rates should be chosen in the interval between the demand rate and the production rate which minimizes unit production costs, and should be used in increasing order. Then, given the production rates, we derive closed-form solutions for the optimal runtimes as well as the minimum average cost. Finally we...

  3. BCS-Hubbard model applied to anisotropic superconductors

    Energy Technology Data Exchange (ETDEWEB)

    Millan, J.S., E-mail: smillan@pampano.unacar.mx [Facultad de Ingenieria, Universidad Autonoma del Carmen, Cd. del Carmen, 24180 Campeche (Mexico); Perez, L.A. [Instituto de Fisica, Universidad Nacional Autonoma de Mexico, A.P. 20-364, 01000, Mexico D.F. (Mexico); Wang, C. [Instituto de Investigaciones en Materiales, Universidad Nacional Autonoma de Mexico, A.P. 70-360, 04510, Mexico D.F. (Mexico)

    2011-11-15

    The BCS formalism applied to a Hubbard model, including correlated hoppings, is used to study d-wave superconductors. The theoretical Tc vs. n relationship is compared with experimental data from BiSr2-xLaxCuO6+δ and La2-xSrxCuO4. The results suggest a nontrivial correlation between the hole and the doping concentrations. Based on the BCS formalism, we study the critical temperature (Tc) as a function of electron density (n) in a square lattice by means of a generalized Hubbard model, in which first-neighbor (Δt) and second-neighbor (Δt3) correlated-hopping interactions are included in addition to the repulsive Coulomb ones. We compare the theoretical Tc vs. n relationship with experimental data for the cuprate superconductors BiSr2-xLaxCuO6+δ (BSCO) and La2-xSrxCuO4 (LSCO). The theory agrees very well with the BSCO data despite the complicated association between Sr concentration (x) and hole doping (p). For the LSCO system, it is observed that in the underdoped regime the Tc vs. n behavior can be associated with different systems with small variations of t'. For the overdoped regime, a more complicated dependence n = 1 - p/2 fits better than n = 1 - p. On the other hand, it is proposed that the second-neighbor hopping ratio (t'/t) should be replaced by the effective mean-field hopping ratio t'MF/tMF, which can be very sensitive to small changes of t' due to the doping.

  4. Applied genre analysis: a multi-perspective model

    Directory of Open Access Journals (Sweden)

    Vijay K Bhatia

    2002-04-01

    Full Text Available Genre analysis can be viewed from two different perspectives: it may be seen as a reflection of the complex realities of the world of institutionalised communication, or it may be seen as a pedagogically effective and convenient tool for the design of language teaching programmes, often situated within simulated contexts of classroom activities. This paper makes an attempt to understand and resolve the tension between these two seemingly contentious perspectives to answer the question: "Is generic description a reflection of reality, or a convenient fiction invented by applied linguists?". The paper also discusses issues related to the nature and use of linguistic description in a genre-based educational enterprise, claiming that instead of using generic descriptions as models for linguistic reproduction of conventional forms to respond to recurring social contexts, as is often the case in many communication based curriculum contexts, they can be used as analytical resource to understand and manipulate complex inter-generic and multicultural realisations of professional discourse, which will enable learners to use generic knowledge to respond to novel social contexts and also to create new forms of discourse to achieve pragmatic success as well as other powerful human agendas.

  5. International Conference on Applied Mathematics, Modeling and Computational Science & Annual meeting of the Canadian Applied and Industrial Mathematics

    CERN Document Server

    Bélair, Jacques; Kunze, Herb; Makarov, Roman; Melnik, Roderick; Spiteri, Raymond J

    2016-01-01

    Focusing on five main groups of interdisciplinary problems, this book covers a wide range of topics in mathematical modeling, computational science and applied mathematics. It presents a wealth of new results in the development of modeling theories and methods, advancing diverse areas of applications and promoting interdisciplinary interactions between mathematicians, scientists, engineers and representatives from other disciplines. The book offers a valuable source of methods, ideas, and tools developed for a variety of disciplines, including the natural and social sciences, medicine, engineering, and technology. Original results are presented on both the fundamental and applied level, accompanied by an ample number of real-world problems and examples emphasizing the interdisciplinary nature and universality of mathematical modeling, and providing an excellent outline of today’s challenges. Mathematical modeling, with applied and computational methods and tools, plays a fundamental role in modern science a...

  6. Kinetic models for fermentative hydrogen production: A review

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Jianlong; Wan, Wei [Laboratory of Environmental Technology, INET, Tsinghua University, Beijing 100084 (China)

    2009-05-15

    The kinetic models were developed and applied for fermentative hydrogen production. They were used to describe the progress of a batch fermentative hydrogen production process, to investigate the effects of substrate concentration, inhibitor concentration, temperature, pH, and dilution rate on the process of fermentative hydrogen production, and to establish the relationship among the substrate degradation rate, the hydrogen-producing bacteria growth rate and the product formation rate. This review showed that the modified Gompertz model was widely used to describe the progress of a batch fermentative hydrogen production process, while the Monod model was widely used to describe the effects of substrate concentration on the rates of substrate degradation, hydrogen-producing bacteria growth and hydrogen production. The Arrhenius model was widely used to describe the effects of temperature on fermentative hydrogen production, while the modified Han-Levenspiel model was used to describe the effects of inhibitor concentration on fermentative hydrogen production. The Andrew model was used to describe the effects of H+ concentration on the specific hydrogen production rate, while the Luedeking-Piret model and its modified form were widely used to describe the relationship between the hydrogen-producing bacteria growth rate and the product formation rate. Finally, some suggestions for future work with these kinetic models were proposed. (author)
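
    As an illustration of the modified Gompertz model mentioned above, the sketch below fits H(t) = P*exp(-exp(Rm*e/P*(lambda - t) + 1)) to synthetic cumulative hydrogen data with scipy; the parameter values and noise level are assumptions, not results from the reviewed studies.

```python
import numpy as np
from scipy.optimize import curve_fit

def modified_gompertz(t, P, Rm, lam):
    """Cumulative hydrogen production H(t) = P*exp(-exp(Rm*e/P*(lam - t) + 1))."""
    return P * np.exp(-np.exp(Rm * np.e / P * (lam - t) + 1.0))

# Synthetic batch data (assumed values, for illustration only).
t = np.linspace(0, 48, 25)                              # hours
true = modified_gompertz(t, P=220.0, Rm=18.0, lam=6.0)
rng = np.random.default_rng(1)
observed = true + rng.normal(0.0, 3.0, size=t.size)

# Fit the three parameters: hydrogen potential P, maximum rate Rm, lag time lam.
popt, _ = curve_fit(modified_gompertz, t, observed, p0=[200.0, 10.0, 5.0])
print("P = %.1f mL, Rm = %.1f mL/h, lambda = %.1f h" % tuple(popt))
```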

  7. Alignment of Product Models and Product State Models - Integration of the Product Lifecycle Phases

    DEFF Research Database (Denmark)

    Larsen, Michael Holm; Kirkby, Lars Phillip; Vesterager, Johan

    1999-01-01

    The purpose of this paper is to discuss the integration of the Product Model (PM) and the Product State Model (PSM). Focus is on information exchange from the PSM to the PM within the manufacturing of a single ship. The paper distinguishes between information and knowledge integration. The paper ...... provides some overall strategies for integrating PM and PSM. The context of this discussion is a development project at Odense Steel Shipyard....

  8. Mixture experiment techniques for reducing the number of components applied for modeling waste glass sodium release

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, G.; Redgate, T. [Pacific Northwest National Lab., Richland, WA (United States). Statistics Group

    1997-12-01

    Statistical mixture experiment techniques were applied to a waste glass data set to investigate the effects of the glass components on Product Consistency Test (PCT) sodium release (NR) and to develop a model for PCT NR as a function of the component proportions. The mixture experiment techniques indicate that the waste glass system can be reduced from nine to four components for purposes of modeling PCT NR. Empirical mixture models containing four first-order terms and one or two second-order terms fit the data quite well, and can be used to predict the NR of any glass composition in the model domain. The mixture experiment techniques produce a better model in less time than required by another approach.
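
    The sketch below illustrates, on made-up composition data, how a Scheffé-type mixture model with four first-order terms and one second-order (cross-product) term can be fitted by ordinary least squares; the compositions, coefficients and response are assumptions and do not represent the PCT sodium release data.

```python
import numpy as np

# Minimal Scheffe-type mixture model sketch: four first-order terms plus one
# second-order (cross-product) term, fitted by ordinary least squares.
# The compositions and responses below are made up for illustration; they are
# not the waste glass PCT data from the study.

rng = np.random.default_rng(0)
n = 40
x = rng.dirichlet(alpha=[2, 2, 2, 2], size=n)            # component proportions sum to 1
y = x @ np.array([1.5, 0.4, 2.1, 0.9]) + 3.0 * x[:, 0] * x[:, 2] + rng.normal(0, 0.05, n)

design = np.column_stack([x, x[:, 0] * x[:, 2]])          # no intercept in a Scheffe model
coef, *_ = np.linalg.lstsq(design, y, rcond=None)
print("first-order terms:", np.round(coef[:4], 2))
print("cross-product term:", round(float(coef[4]), 2))
```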

  9. Model-driven and software product line engineering

    CERN Document Server

    Royer, Jean-Claude

    2013-01-01

    Many approaches to creating Software Product Lines have emerged that are based on Model-Driven Engineering. This book introduces both Software Product Lines and Model-Driven Engineering, which have separate success stories in industry, and focuses on the practical combination of them. It describes the challenges and benefits of merging these two software development trends and provides the reader with a novel approach and practical mechanisms to improve software development productivity. The book is aimed at engineers and students who wish to understand and apply software product lines

  10. A unified framework for benchmark dose estimation applied to mixed models and model averaging

    DEFF Research Database (Denmark)

    Ritz, Christian; Gerhard, Daniel; Hothorn, Ludwig A.

    2013-01-01

    This article develops a framework for benchmark dose estimation that allows intrinsically nonlinear dose-response models to be used for continuous data in much the same way as is already possible for quantal data. This means that the same dose-response model equations may be applied to both...

  11. Product Recommendation System Based on Personal Preference Model Using CAM

    Science.gov (United States)

    Murakami, Tomoko; Yoshioka, Nobukazu; Orihara, Ryohei; Furukawa, Koichi

    Product recommendation systems are realized by applying business rules acquired through data mining techniques. Business rules, such as demographic patterns of purchase, can cover groups of users with a tendency to purchase certain products, but it is difficult to recommend products adapted to various personal preferences by utilizing such rules alone. In addition, it is very costly to gather the large volume of high-quality survey data that is necessary for good recommendation based on a personal preference model. A method for collecting kansei information automatically, without questionnaire surveys, is therefore required. Constructing a personal preference model from limited preference data is also necessary, since it is costly for the user to input preference data. In this paper, we propose a product recommendation system based on kansei information extracted by text mining and on a user preference model constructed by Category-guided Adaptive Modeling, CAM for short. CAM is a feature construction method that can generate new features spanning a space in which examples with the same label are close together and examples with different labels are far apart. CAM makes it possible to construct a personal preference model even with limited information on liked and disliked categories. In the system, a retrieval agent gathers the products' specifications and a user agent manages the preference model and the user's likes and dislikes. Kansei information about the products is obtained by applying text mining techniques to reputation documents about the products on web sites. We carry out experimental studies to confirm that the preference model obtained by our method performs effectively.

  12. Stochastic frontier production model with undesirable outputs:An application to an HIV immunology model

    Institute of Scientific and Technical Information of China (English)

    BIAN Fuping; DAI Min

    2005-01-01

    This paper extends stochastic frontier production theory to the case of multiple outputs and calculates efficiency measures using the production theory. We further apply this method to construct a stochastic frontier production model with undesirable outputs. Finally, the model is used in an HIV immunology model and efficient drug treatment strategies are then explored. All the models are estimated using the maximum likelihood estimation method. Stochastic errors are considered in this model, which is an advantage over other deterministic efficiency models. Some of our conclusions agree with those published in related papers.

  13. Applying Discourse Analysis in ELT: a Five Cs Model

    Institute of Scientific and Technical Information of China (English)

    肖巧慧

    2009-01-01

    Based on a discussion of definitions of discourse analysis, discourse is regarded as consisting of five layered elements: cohesion, coherence, culture, critique and context. Moreover, we focus on applying DA in ELT.

  14. Modelling and using product architectures in mechatronic product development

    DEFF Research Database (Denmark)

    Bruun, Hans Peter Lomholt; Mortensen, Niels Henrik

    The objective of the paper is to determine the role of a product architecture modelling tool to support communication and to form the basis for developing and maintaining product structures for improving development practices of complex products. This paper contains descriptions, observations, and lessons learned from a case study in which the author tested a modelling tool to represent a product's architecture during product development in a larger Danish company. The reasons leading to the use of the specific model and its terminology are described and illustrated. The paper supports two fundamental theoretical viewpoints: theories of technical systems and theories of design processes. In this framing, the paper addresses the engineering activity of developing products supported by product architecture representations. The paper includes the description of a visual architecture representation...

  15. Reliability measures for indexed semi-Markov chains applied to wind energy production

    CERN Document Server

    D'Amico, Guglielmo; Prattico, Flavio

    2013-01-01

    The computation of dependability measures is a crucial point in the planning and development of a wind farm. In this paper we address the issue of energy production by a wind turbine using an indexed semi-Markov chain as a model of wind speed. We present the mathematical model and describe the data and technical characteristics of a commercial wind turbine (Aircon HAWT-10kW). We show how to compute some of the main dependability measures such as the reliability, availability and maintainability functions. We compare the results of the model with the real energy production obtained from data available at the Lastem station (Italy), sampled every 10 minutes.

  16. An Optimization Model for Product Placement on Product Listing Pages

    Directory of Open Access Journals (Sweden)

    Yan-Kwang Chen

    2014-01-01

    Full Text Available The design of product listing pages is a key component of Website design because it has significant influence on the sales volume of a Website. This study focuses on product placement in designing product listing pages. Product placement concerns how vendors of online stores place their products on the product listing pages to maximize profit. This problem is very similar to the offline shelf management problem. Since product information on a Web page is typically communicated through text and images, visual stimuli such as color, shape, size, and spatial arrangement often affect the visual attention of online shoppers and, in turn, influence their eventual purchase decisions. In view of the above, this study synthesizes the visual attention literature and the theory of shelf-space allocation to develop a mathematical programming model, solved with genetic algorithms, for finding optimal solutions to the focused problem. The validity of the model is illustrated with example problems.
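
    A toy genetic-algorithm sketch of the product placement idea is shown below: products are assigned to page slots so that high-margin items occupy slots with high assumed visual-attention weight. The margins, attention weights and GA settings are illustrative assumptions, not the formulation or data used in the study.

```python
import random

# Toy genetic-algorithm sketch for product placement: assign products to page
# slots so that high-margin products land in slots with high visual attention.
# The attention weights, margins and GA settings are illustrative assumptions.

margins = [12, 7, 30, 18, 5, 22, 9, 14]                       # profit per product
attention = [1.0, 0.8, 0.7, 0.55, 0.45, 0.35, 0.3, 0.25]      # slot attention weights

def fitness(order):
    return sum(attention[slot] * margins[p] for slot, p in enumerate(order))

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + [p for p in b if p not in a[:cut]]        # keeps a valid permutation

def mutate(order):
    i, j = random.sample(range(len(order)), 2)
    order[i], order[j] = order[j], order[i]

random.seed(0)
pop = [random.sample(range(len(margins)), len(margins)) for _ in range(30)]
for _ in range(200):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]
    children = [crossover(random.choice(parents), random.choice(parents)) for _ in range(20)]
    for c in children:
        if random.random() < 0.3:
            mutate(c)
    pop = parents + children

best = max(pop, key=fitness)
print("best placement:", best, "fitness:", round(fitness(best), 2))
```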

  17. Applying the POWHEG method to top pair production and decays at the ILC

    CERN Document Server

    Latunde-Dada, Oluseyi

    2008-01-01

    We study the effects of gluon radiation in top pair production and their decays for e+e- annihilation at the ILC. To achieve this we apply the POWHEG method and interface our results to the Monte Carlo event generator Herwig++. We consider a center-of-mass energy of √s = 500 GeV and compare decay correlations and bottom quark and anti-quark distributions before hadronization.

  18. MASS CUSTOMIZATION and PRODUCT MODELS

    DEFF Research Database (Denmark)

    Svensson, Carsten; Malis, Martin

    2003-01-01

    to the product. Through the application of a mass customization strategy, companies have a unique opportunity to create increased customer satisfaction. In a customized production, knowledge and information have to be easily accessible since every product is a unique combination of information. If the dream...... of a customized alternative instead of a uniform mass-produced product is to become a reality, then cross-organizational efficiency must be kept at a competitive level. This is the real challenge for mass customization. A radical restructuring of both the internal and the external knowledge management systems...

  19. Robust Decision-making Applied to Model Selection

    Energy Technology Data Exchange (ETDEWEB)

    Hemez, Francois M. [Los Alamos National Laboratory

    2012-08-06

    The scientific and engineering communities are relying more and more on numerical models to simulate increasingly complex phenomena. Selecting a model, from among a family of models that meets the simulation requirements, presents a challenge to modern-day analysts. To address this concern, a framework anchored in info-gap decision theory is adopted. The framework proposes to select models by examining the trade-offs between prediction accuracy and sensitivity to epistemic uncertainty. The framework is demonstrated on two structural engineering applications by asking the following question: Which model, of several numerical models, approximates the behavior of a structure when the parameters that define each of those models are unknown? One observation is that models that are nominally more accurate are not necessarily more robust, and their accuracy can deteriorate greatly depending upon the assumptions made. It is posited that, as reliance on numerical models increases, establishing robustness will become as important as demonstrating accuracy.
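
    A toy info-gap style calculation in the spirit of this record is sketched below: for each candidate model, the robustness is the largest uncertainty horizon for which the worst-case prediction error over an interval uncertainty set still satisfies a performance requirement. The candidate models, nominal parameters and requirement are made up for illustration.

```python
import numpy as np

# Toy info-gap style robustness sketch: for each candidate model, find the
# largest uncertainty horizon alpha such that the worst-case error over an
# interval uncertainty set around the nominal parameter still meets the
# requirement. Models, nominal values and the requirement are made up.

def worst_case_error(model, nominal_p, alpha, x_obs=2.0, y_obs=5.1):
    ps = np.linspace(nominal_p - alpha, nominal_p + alpha, 201)
    return max(abs(model(x_obs, p) - y_obs) for p in ps)

def robustness(model, nominal_p, error_requirement, alpha_grid):
    feasible = [a for a in alpha_grid
                if worst_case_error(model, nominal_p, a) <= error_requirement]
    return max(feasible) if feasible else 0.0

def linear_model(x, p):
    return p * x

def quadratic_model(x, p):
    return p * x ** 2

alphas = np.linspace(0.0, 2.0, 201)
for name, model, p0 in [("linear", linear_model, 2.5), ("quadratic", quadratic_model, 1.3)]:
    r = robustness(model, p0, error_requirement=1.0, alpha_grid=alphas)
    print(name, "robustness:", round(float(r), 3))
```

    With these made-up numbers both models are about equally accurate at the nominal parameter, yet the linear one tolerates a larger uncertainty horizon, illustrating the abstract's point that nominal accuracy and robustness need not coincide.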

  20. Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.

    Science.gov (United States)

    Biggs, Matthew B; Papin, Jason A

    2013-01-01

    Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.

  1. Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.

    Directory of Open Access Journals (Sweden)

    Matthew B Biggs

    Full Text Available Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.

  2. Integrated modelling of crop production and nitrate leaching with the Daisy model

    DEFF Research Database (Denmark)

    Manevski, Kiril; Børgesen, Christen Duus; Li, Xiaoxin

    2016-01-01

    An integrated modelling strategy was designed and applied to the Soil-Vegetation-Atmosphere Transfer model Daisy for simulation of crop production and nitrate leaching under a pedo-climatic and agronomic environment different from that of the model's original parameterisation. The points of significance...

  3. CRC-cards for Product Modelling

    DEFF Research Database (Denmark)

    Hvam, Lars; Riis, Jesper; Hansen, Benjamin Loer

    2003-01-01

    This paper describes the CRC (class, responsibility, collaboration) modelling process for building product models. A product model is normally represented in an IT system which contains data, information and knowledge on industrial products and their life cycle properties e.g. manufacturing, transportation, service and decommissioning. A main challenge when building product models is to collect and document the product related data, information and knowledge in a structured way. CRC cards are index cards (or computerized versions of these) which are used to record proposed classes, the behavior of the classes, their responsibilities, and their relationship to other classes (collaboration). CRC modelling gives an effective, low-tech method for domain-experts, programmers and users to work closely together to identify, structure, understand and document a product model. CRC cards were originally...

  4. A review of studies applying environmental impact assessment methods on fruit production systems.

    Science.gov (United States)

    Cerutti, Alessandro K; Bruun, Sander; Beccaro, Gabriele L; Bounous, Giancarlo

    2011-10-01

    Although many aspects of environmental accounting methodologies in food production have already been investigated, the application of environmental indicators in the fruit sector is still rare and no consensus can be found on the preferred method. On the contrary, widely diverging approaches have been taken to several aspects of the analyses, such as data collection, handling of scaling issues, and goal and scope definition. This paper reviews studies assessing the sustainability or environmental impacts of fruit production under different conditions and identifies aspects of fruit production that are of environmental importance. Four environmental assessment methods which may be applied to assess fruit production systems are evaluated, namely Life Cycle Assessment, Ecological Footprint Analysis, Emergy Analysis and Energy Balance. In the 22 peer-reviewed journal articles and two conference articles applying one of these methods in the fruit sector that were included in this review, a total of 26 applications of environmental impact assessment methods are described. These applications differ concerning e.g. overall objective, set of environmental issues considered, definition of system boundaries and calculation algorithms. Due to the relatively high variability in study cases and approaches, it was not possible to identify any one method as being better than the others. However, remarks on methodologies and suggestions for standardisation are given and the environmental burdens of fruit systems are highlighted.

  5. Method of product portfolio analysis based on optimization models

    Directory of Open Access Journals (Sweden)

    V.M. Lozyuk

    2011-12-01

    Full Text Available The research is devoted to optimizing the structure of the product portfolio of a trading company using the principles of investment modelling. We further develop investment portfolio optimization models, using the well-known Markowitz and Sharpe methods to determine the optimal portfolio of a trading company. Adapted to the goods market, the models in this study can be applied to the business of trading companies.
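
    To illustrate the Markowitz/Sharpe ideas referred to above, the sketch below computes minimum-variance and tangency weights for a small set of product groups treated as portfolio assets; the expected returns and covariance matrix are made-up numbers, not data from the study.

```python
import numpy as np

# Minimal Markowitz-style sketch applied to product groups of a trading
# company: expected margins play the role of returns. The numbers are made up
# for illustration and are not from the study.

mu = np.array([0.08, 0.12, 0.05, 0.10])          # expected return per product group
cov = np.array([[0.04, 0.01, 0.00, 0.01],
                [0.01, 0.09, 0.01, 0.02],
                [0.00, 0.01, 0.02, 0.00],
                [0.01, 0.02, 0.00, 0.06]])

ones = np.ones(len(mu))
inv = np.linalg.inv(cov)

# Global minimum-variance weights: w = C^-1 1 / (1' C^-1 1)
w_minvar = inv @ ones / (ones @ inv @ ones)

# Tangency (maximum Sharpe ratio) weights for a zero risk-free rate: w proportional to C^-1 mu
w_tan = inv @ mu / (ones @ inv @ mu)

print("min-variance weights:", np.round(w_minvar, 3))
print("tangency weights:   ", np.round(w_tan, 3))
```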

  6. Adequateness of applying the Zmijewski model on Serbian companies

    Directory of Open Access Journals (Sweden)

    Pavlović Vladan

    2012-12-01

    Full Text Available The aim of the paper is to determine the accuracy of the predictions of the Zmijewski model for Serbian companies on an eligible sample. At the same time, the paper identifies the model's strengths, weaknesses and the limitations of its possible application. Bearing in mind that the economic environment in Serbia is not similar to that of the United States at the time the model was developed, the Zmijewski model is surprisingly accurate in the case of Serbian companies. The accuracy was slightly weaker than the model's results in the U.S. in its original form, but much better than the results the model gave in the U.S. in the periods 1988-1991 and 1992-1999. The model also gave better results in Serbia than in Croatia, even though the model was adjusted for Croatia.

  7. Applying Meta-Analysis to Structural Equation Modeling

    Science.gov (United States)

    Hedges, Larry V.

    2016-01-01

    Structural equation models play an important role in the social sciences. Consequently, there is an increasing use of meta-analytic methods to combine evidence from studies that estimate the parameters of structural equation models. Two approaches are used to combine evidence from structural equation models: A direct approach that combines…

  8. Ontological Relations and the Capability Maturity Model Applied in Academia

    Science.gov (United States)

    de Oliveira, Jerônimo Moreira; Campoy, Laura Gómez; Vilarino, Lilian

    2015-01-01

    This work presents a new approach to the discovery, identification and connection of ontological elements within the domain of characterization in learning organizations. In particular, the study can be applied to contexts where organizations require planning, logic, balance, and cognition in knowledge creation scenarios, which is the case for the…

  9. Applying the Job Characteristics Model to the College Education Experience

    Science.gov (United States)

    Kass, Steven J.; Vodanovich, Stephen J.; Khosravi, Jasmine Y.

    2011-01-01

    Boredom is one of the most common complaints among university students, with studies suggesting its link to poor grades, drop out, and behavioral problems. Principles borrowed from industrial-organizational psychology may help prevent boredom and enrich the classroom experience. In the current study, we applied the core dimensions of the job…

  10. Architecture Descriptions. A Contribution to Modeling of Production System Architecture

    DEFF Research Database (Denmark)

    Jepsen, Allan Dam; Hvam, Lars

    The subject of this PhD dissertation is architecture-centric design and the description of production system architecture. Companies are facing demands for the development and production of new products at an ever increasing rate, as the market life of products decreases and the rate at which... on the underlying principles of a production system's design; and despite the existence of established architecture and platform theories and practices within product design, there is still a need for a better understanding of the architecture phenomenon itself, and certainly how it applies within production systems. ... The viewpoints provide a set of model kinds to frame select architecture-related concerns relating to the production capability and the design of the technical system. With the contribution to architecture description there follows a need to support exchange and processing of architecture information within...

  11. Applying CBR to machine tool product configuration design oriented to customer requirements

    Science.gov (United States)

    Wang, Pengjia; Gong, Yadong; Xie, Hualong; Liu, Yongxian; Nee, Andrew Yehching

    2016-03-01

    Product customization is a trend in the current market-oriented manufacturing environment. However, deduction from customer requirements to design results and evaluation of design alternatives are still heavily reliant on the designer's experience and knowledge. To solve the problem of fuzziness and uncertainty of customer requirements in product configuration, an analysis method based on the grey rough model is presented. The customer requirements can be converted into technical characteristics effectively. In addition, an optimization decision model for product planning is established to help the enterprises select the key technical characteristics under the constraints of cost and time to serve the customer to maximal satisfaction. A new case retrieval approach that combines the self-organizing map and fuzzy similarity priority ratio method is proposed in case-based design. The self-organizing map can reduce the retrieval range and increase the retrieval efficiency, and the fuzzy similarity priority ratio method can evaluate the similarity of cases comprehensively. To ensure that the final case has the best overall performance, an evaluation method of similar cases based on grey correlation analysis is proposed to evaluate similar cases to select the most suitable case. Furthermore, a computer-aided system is developed using MATLAB GUI to assist the product configuration design. The actual example and result on an ETC series machine tool product show that the proposed method is effective, rapid and accurate in the process of product configuration. The proposed methodology provides a detailed instruction for the product configuration design oriented to customer requirements.

  12. Applying CBR to machine tool product configuration design oriented to customer requirements

    Science.gov (United States)

    Wang, Pengjia; Gong, Yadong; Xie, Hualong; Liu, Yongxian; Nee, Andrew Yehching

    2017-01-01

    Product customization is a trend in the current market-oriented manufacturing environment. However, deduction from customer requirements to design results and evaluation of design alternatives are still heavily reliant on the designer's experience and knowledge. To solve the problem of fuzziness and uncertainty of customer requirements in product configuration, an analysis method based on the grey rough model is presented. The customer requirements can be converted into technical characteristics effectively. In addition, an optimization decision model for product planning is established to help the enterprises select the key technical characteristics under the constraints of cost and time to serve the customer to maximal satisfaction. A new case retrieval approach that combines the self-organizing map and fuzzy similarity priority ratio method is proposed in case-based design. The self-organizing map can reduce the retrieval range and increase the retrieval efficiency, and the fuzzy similarity priority ratio method can evaluate the similarity of cases comprehensively. To ensure that the final case has the best overall performance, an evaluation method of similar cases based on grey correlation analysis is proposed to evaluate similar cases to select the most suitable case. Furthermore, a computer-aided system is developed using MATLAB GUI to assist the product configuration design. The actual example and result on an ETC series machine tool product show that the proposed method is effective, rapid and accurate in the process of product configuration. The proposed methodology provides a detailed instruction for the product configuration design oriented to customer requirements.

  13. Planning Horizon for Production Inventory Models with Production Rate Dependent on Demand and Inventory Level

    Directory of Open Access Journals (Sweden)

    Jennifer Lin

    2013-01-01

    Full Text Available This paper discusses why the selection of a finite planning horizon is preferable to an infinite one for a replenishment policy of production inventory models. In a production inventory model, the production rate is dependent on both the demand rate and the inventory level. When there is an exponentially decreasing demand, the application of an infinite planning horizon model is not suitable. The emphasis of this paper is threefold. First, while pointing out questionable results from a previous study, we propose a corrected infinite planning horizon inventory model for the first replenishment cycle. Second, while investigating the optimal solution for the minimization problem, we found that the infinite planning horizon should not be applied when dealing with an exponentially decreasing demand. Third, we developed a new production inventory model under a finite planning horizon for practitioners. Numerical examples are provided to support our findings.

  14. Applying XML for designing and interchanging information for multidimensional model

    Institute of Scientific and Technical Information of China (English)

    Lu Changhui; Deng Su; Zhang Weiming

    2005-01-01

    In order to exchange and share information among data warehouse conceptual models, and to build a solid base for the integration and sharing of metadata, a new multidimensional conceptual model based on XML is presented and its DTD is defined, which can describe the various semantic characteristics of a multidimensional conceptual model. Based on the UML-based multidimensional conceptual modeling technique, a mapping algorithm between the XML-based multidimensional conceptual model and the UML class diagram is described, and an application base for the wide use of this technique is given.

  15. Applying Model Checking to Industrial-Sized PLC Programs

    CERN Document Server

    AUTHOR|(CDS)2079190; Darvas, Daniel; Blanco Vinuela, Enrique; Tournier, Jean-Charles; Bliudze, Simon; Blech, Jan Olaf; Gonzalez Suarez, Victor M

    2015-01-01

    Programmable logic controllers (PLCs) are embedded computers widely used in industrial control systems. Ensuring that PLC software complies with its specification is a challenging task. Formal verification has become a recommended practice to ensure the correctness of safety-critical software but is still underused in industry due to the complexity of building and managing formal models of real applications. In this paper, we propose a general methodology to perform automated model checking of complex properties expressed in temporal logics (e.g. CTL, LTL) on PLC programs. This methodology is based on an intermediate model (IM), meant to transform PLC programs written in various standard languages (ST, SFC, etc.) to different modeling languages of verification tools. We present the syntax and semantics of the IM and the transformation rules of the ST and SFC languages to the nuXmv model checker passing through the intermediate model. Finally, two real case studies of CERN PLC programs, written mainly in th...

  16. Modeling the polymer product maceration

    Science.gov (United States)

    Ahunov, D. N.; Karpova, M. N.

    2014-12-01

    The article presents a mass-transfer simulation procedure as applied to the automation of manufacturing process control, describes a simulator of the polymer product maceration process, and reports results from the software system developed to implement this simulator.

  17. Investigating the productivity model for clinical nurses.

    Science.gov (United States)

    Dehghan Nayeri, Nahid; Hooshmand Bahabadi, Abbas; Kazemnejad, Anoshirvan

    2014-01-01

    One of the main objectives of quantitative research is the assessment of models developed in qualitative studies. Validating a model by testing it shows whether the designed model represents the observed facts. Hence, this study was conducted to test the clinical nurses' productivity model proposed for Iranian nurses. The sample consisted of 360 nurses of Tehran University of Medical Sciences. The research tool was a questionnaire measuring the components of clinical nurses' productivity. After completing the psychometric evaluation of the instrument and collecting the participants' responses, the factors introduced in the questionnaire were named, and LISREL path analysis was performed to analyze the components of the model. The results of the model test revealed an internal relationship among the different components of the model. Regression analysis showed that each unit increase in the components of the model added to the central variable of the productivity model, human resources. Altogether, the model components explained 20% of the variance in clinical nurses' productivity. This study found that the key component of productivity is human resources, which is reciprocally related to the other components of the model. Therefore, managers can promote productivity by using efficient strategies to improve human resource patterns.

  18. Geographically Weighted Logistic Regression Applied to Credit Scoring Models

    Directory of Open Access Journals (Sweden)

    Pedro Henrique Melo Albuquerque

    Full Text Available This study used real data from a Brazilian financial institution on transactions involving Consumer Direct Credit (CDC), granted to clients residing in the Distrito Federal (DF), to construct credit scoring models via Logistic Regression and Geographically Weighted Logistic Regression (GWLR) techniques. The aims were: to verify whether the factors that influence credit risk differ according to the borrower’s geographic location; to compare the set of models estimated via GWLR with the global model estimated via Logistic Regression, in terms of predictive power and financial losses for the institution; and to verify the viability of using the GWLR technique to develop credit scoring models. The metrics used to compare the models developed via the two techniques were the AICc informational criterion, the accuracy of the models, the percentage of false positives, the sum of the value of false positive debt, and the expected monetary value of portfolio default compared with the monetary value of defaults observed. The models estimated for each region in the DF were distinct in their variables and coefficients (parameters), with it being concluded that credit risk was influenced differently in each region in the study. The Logistic Regression and GWLR methodologies presented very close results, in terms of predictive power and financial losses for the institution, and the study demonstrated viability in using the GWLR technique to develop credit scoring models for the target population in the study.
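    The record does not include the authors' implementation, but the following sketch illustrates the core idea of geographically weighted logistic regression: fitting a locally weighted logistic model around a chosen location, with Gaussian kernel weights on distance. The synthetic borrower data, bandwidth and variable names are assumptions for illustration only.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-in for the borrower data: coordinates, two covariates, default flag
n = 500
coords = rng.uniform(0, 10, size=(n, 2))                       # borrower locations
X = rng.normal(size=(n, 2))                                    # e.g. standardized income, indebtedness
logit = 0.5 * X[:, 0] - 1.0 * X[:, 1] + 0.2 * coords[:, 0]     # risk varies with location
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

def gwlr_at(point, X, y, coords, bandwidth=2.0):
    """Fit a locally weighted logistic regression centred on `point`,
    with Gaussian kernel weights on the distance to each observation."""
    d = np.linalg.norm(coords - point, axis=1)
    w = np.exp(-(d / bandwidth) ** 2)                          # Gaussian spatial kernel
    model = LogisticRegression().fit(X, y, sample_weight=w)
    return model.coef_[0], model.intercept_[0]

coef, intercept = gwlr_at(np.array([2.0, 5.0]), X, y, coords)
print("local coefficients:", coef, "intercept:", intercept)
```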

  19. Applying the General Linear Model to Repeated Measures Problems.

    Science.gov (United States)

    Pohlmann, John T.; McShane, Michael G.

    The purpose of this paper is to demonstrate the use of the general linear model (GLM) in problems with repeated measures on a dependent variable. Such problems include pretest-posttest designs, multitrial designs, and groups by trials designs. For each of these designs, a GLM analysis is demonstrated wherein full models are formed and restrictions…

  20. [Applying multilevel models in evaluation of bioequivalence (I)].

    Science.gov (United States)

    Liu, Qiao-lan; Shen, Zhuo-zhi; Chen, Feng; Li, Xiao-song; Yang, Min

    2009-12-01

    This study aims to explore the application value of multilevel models for bioequivalence evaluation. Using a real example of 2 x 4 cross-over experimental design in evaluating bioequivalence of antihypertensive drug, this paper explores complex variance components corresponding to criteria statistics in existing methods recommended by FDA but obtained in multilevel models analysis. Results are compared with those from FDA standard Method of Moments, specifically on the feasibility and applicability of multilevel models in directly assessing the bioequivalence (ABE), the population bioequivalence (PBE) and the individual bioequivalence (IBE). When measuring ln (AUC), results from all variance components of the test and reference groups such as total variance (sigma(TT)(2) and sigma(TR)(2)), between-subject variance (sigma(BT)(2) and sigma(BR)(2)) and within-subject variance (sigma(WT)(2) and sigma(WR)(2)) estimated by simple 2-level models are very close to those that using the FDA Method of Moments. In practice, bioequivalence evaluation can be carried out directly by multilevel models, or by FDA criteria, based on variance components estimated from multilevel models. Both approaches produce consistent results. Multilevel models can be used to evaluate bioequivalence in cross-over test design. Compared to FDA methods, this one is more flexible in decomposing total variance into sub components in order to evaluate the ABE, PBE and IBE. Multilevel model provides a new way into the practice of bioequivalence evaluation.

  1. Community Mobilization Model Applied to Support Grandparents Raising Grandchildren

    Science.gov (United States)

    Miller, Jacque; Bruce, Ann; Bundy-Fazioli, Kimberly; Fruhauf, Christine A.

    2010-01-01

    This article discusses the application of a community mobilization model through a case study of one community's response to address the needs of grandparents raising grandchildren. The community mobilization model presented is one that is replicable in addressing diverse community identified issues. Discussed is the building of the partnerships,…

  2. Hydrologic and water quality terminology as applied to modeling

    Science.gov (United States)

    A survey of literature and examination in particular of terminology use in a previous special collection of modeling calibration and validation papers has been conducted to arrive at a list of consistent terminology recommended for writing about hydrologic and water quality model calibration and val...

  3. Trailing edge noise model applied to wind turbine airfoils

    DEFF Research Database (Denmark)

    Bertagnolio, Franck

    The aim of this work is firstly to provide a quick introduction to the theory of noise generation relevant to wind turbine technology, with a focus on trailing edge noise. Secondly, the so-called TNO trailing edge noise model developed by Parchen [1] is described in more detail. The model...

  4. 21 CFR 111.165 - What requirements apply to a product received for packaging or labeling as a dietary supplement...

    Science.gov (United States)

    2010-04-01

    ... product that you receive for packaging or labeling as a dietary supplement (and for distribution rather... shipment of the received product to ensure that the received product is consistent with your purchase order... 21 Food and Drugs 2 2010-04-01 2010-04-01 false What requirements apply to a product received...

  5. Selection of productivity improvement techniques via mathematical modeling

    Directory of Open Access Journals (Sweden)

    Mahassan M. Khater

    2011-07-01

    Full Text Available This paper presents a new mathematical model to select an optimal combination of productivity improvement techniques. The proposed model considers a four-stage productivity cycle, and productivity is assumed to be a linear function of fifty-four improvement techniques. The model is implemented for a real-world case study of a manufacturing plant. The resulting problem is formulated as a mixed integer program which can be solved to optimality using traditional methods. The preliminary results of the implementation indicate that productivity can be improved through a change in equipment, and the model can easily be applied to both manufacturing and service industries.
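    A minimal sketch of the kind of mixed integer selection problem described, written with the PuLP modelling library, is given below. The technique names, productivity gains, costs and budget are invented placeholders; the paper's actual model covers fifty-four techniques and a four-stage productivity cycle.

```python
from pulp import LpProblem, LpMaximize, LpVariable, lpSum, LpBinary

# Illustrative data: productivity gain and cost per candidate improvement technique
techniques = ["5S", "TPM", "SMED", "Kaizen", "Layout"]
gain = {"5S": 0.03, "TPM": 0.08, "SMED": 0.05, "Kaizen": 0.04, "Layout": 0.06}
cost = {"5S": 10, "TPM": 60, "SMED": 35, "Kaizen": 15, "Layout": 40}
budget = 80

prob = LpProblem("technique_selection", LpMaximize)
x = {t: LpVariable(f"select_{t}", cat=LpBinary) for t in techniques}

prob += lpSum(gain[t] * x[t] for t in techniques)              # maximize total productivity gain
prob += lpSum(cost[t] * x[t] for t in techniques) <= budget    # budget constraint

prob.solve()
chosen = [t for t in techniques if x[t].value() > 0.5]
print("selected techniques:", chosen)
```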

  6. Modeling Sustainability in Product Development and Commercialization

    Science.gov (United States)

    Carlson, Robert C.; Rafinejad, Dariush

    2008-01-01

    In this article, the authors present the framework of a model that integrates strategic product development decisions with the product's impact on future conditions of resources and the environment. The impact of a product on stocks of nonrenewable sources and sinks is linked in a feedback loop to the cost of manufacturing and using the product…

  7. Blue sky catastrophe as applied to modeling of cardiac rhythms

    Science.gov (United States)

    Glyzin, S. D.; Kolesov, A. Yu.; Rozov, N. Kh.

    2015-07-01

    A new mathematical model for the electrical activity of the heart is proposed. The model represents a special singularly perturbed three-dimensional system of ordinary differential equations with one fast and two slow variables. A characteristic feature of the system is that its solution performs nonclassical relaxation oscillations and simultaneously undergoes a blue sky catastrophe bifurcation. Both these factors make it possible to achieve a phenomenological proximity between the time dependence of the fast component in the model and an ECG of the human heart.

  8. Manifold learning techniques and model reduction applied to dissipative PDEs

    CERN Document Server

    Sonday, Benjamin E; Gear, C William; Kevrekidis, Ioannis G

    2010-01-01

    We link nonlinear manifold learning techniques for data analysis/compression with model reduction techniques for evolution equations with time scale separation. In particular, we demonstrate a "nonlinear extension" of the POD-Galerkin approach to obtaining reduced dynamic models of dissipative evolution equations. The approach is illustrated through a reaction-diffusion PDE, and the performance of different simulators on the full and the reduced models is compared. We also discuss the relation of this nonlinear extension with the so-called "nonlinear Galerkin" methods developed in the context of Approximate Inertial Manifolds.

  9. Forecasting coconut production in the Philippines with ARIMA model

    Science.gov (United States)

    Lim, Cristina Teresa

    2015-02-01

    The study aimed to project the situation of the coconut industry in the Philippines over the coming years by applying the Autoregressive Integrated Moving Average (ARIMA) method. Data on coconut production, one of the major industrial crops of the country, for the period 1990 to 2012 were analyzed using time-series methods. Autocorrelation (ACF) and partial autocorrelation functions (PACF) were calculated for the data, and an appropriate Box-Jenkins autoregressive moving average model was fitted. The validity of the model was tested using standard statistical techniques. The forecasting power of the autoregressive moving average (ARMA) model was then used to forecast coconut production for the following eight years.
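    A minimal sketch of the Box-Jenkins workflow described above, using the statsmodels ARIMA implementation, is shown below. The production series is synthetic (the record does not include the Philippine coconut data) and the (1, 1, 1) order is an assumed placeholder for the order that would be chosen from the ACF/PACF analysis.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Synthetic stand-in for annual coconut production, 1990-2012 (million tonnes)
years = pd.date_range("1990", "2012", freq="YS")
rng = np.random.default_rng(1)
production = pd.Series(
    14 + 0.05 * np.arange(len(years)) + rng.normal(0, 0.3, len(years)),
    index=years,
)

# Fit a Box-Jenkins ARIMA model; the (p, d, q) order would normally be chosen
# from the ACF/PACF diagnostics mentioned in the abstract
model = ARIMA(production, order=(1, 1, 1)).fit()
forecast = model.forecast(steps=8)                 # forecast the next eight years
print(forecast)
```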

  10. Joint regression analysis and AMMI model applied to oat improvement

    Science.gov (United States)

    Oliveira, A.; Oliveira, T. A.; Mejza, S.

    2012-09-01

    In our work we present an application of some biometrical methods useful in genotype stability evaluation, namely the AMMI model, Joint Regression Analysis (JRA) and multiple comparison tests. A genotype stability analysis of oat (Avena sativa L.) grain yield was carried out using data from the Portuguese Plant Breeding Board on 22 different genotypes grown during the years 2002, 2003 and 2004 at six locations. In Ferreira et al. (2006) the authors state the relevance of regression models and of the Additive Main Effects and Multiplicative Interactions (AMMI) model for studying and estimating phenotypic stability effects. As computational techniques we use the Zigzag algorithm to estimate the regression coefficients and the agricolae package available in R for the AMMI model analysis.

  11. Pressure Sensitive Paint Applied to Flexible Models Project

    Science.gov (United States)

    Schairer, Edward T.; Kushner, Laura Kathryn

    2014-01-01

    One gap in current pressure-measurement technology is a high-spatial-resolution method for accurately measuring pressures on spatially and temporally varying wind-tunnel models such as Inflatable Aerodynamic Decelerators (IADs), parachutes, and sails. Conventional pressure taps only provide sparse measurements at discrete points and are difficult to integrate with the model structure without altering structural properties. Pressure Sensitive Paint (PSP) provides pressure measurements with high spatial resolution, but its use has been limited to rigid or semi-rigid models. Extending the use of PSP from rigid surfaces to flexible surfaces would allow direct, high-spatial-resolution measurements of the unsteady surface pressure distribution. Once developed, this new capability will be combined with existing stereo photogrammetry methods to simultaneously measure the shape of a dynamically deforming model in a wind tunnel. Presented here are the results and methodology for using PSP on flexible surfaces.

  12. Modular Modelling and Simulation Approach - Applied to Refrigeration Systems

    DEFF Research Database (Denmark)

    Sørensen, Kresten Kjær; Stoustrup, Jakob

    2008-01-01

    This paper presents an approach to modelling and simulation of the thermal dynamics of a refrigeration system, specifically a reefer container. A modular approach is used and the objective is to increase the speed and flexibility of the developed simulation environment. The refrigeration system is divided into components where the inputs and outputs are described by a set of XML files that can be combined into a composite system model that may be loaded into MATLAB. A set of tools that allows the user to easily load the model and run a simulation are provided. The results show a simulation speed-up of more than a factor of three by partitioning the model into smaller parts, and thereby isolating fast and slow dynamics. As a cost there is a reduction in accuracy which in the example considered is less than one percent.

  13. Opto-physiological modeling applied to photoplethysmographic cardiovascular assessment.

    Science.gov (United States)

    Hu, Sijung; Azorin-Peris, Vicente; Zheng, Jia

    2013-01-01

    This paper presents opto-physiological (OP) modeling and its application in cardiovascular assessment techniques based on photoplethysmography (PPG). Existing contact point measurement techniques, i.e., pulse oximetry probes, are compared with the next generation non-contact and imaging implementations, i.e., non-contact reflection and camera-based PPG. The further development of effective physiological monitoring techniques relies on novel approaches to OP modeling that can better inform the design and development of sensing hardware and applicable signal processing procedures. With the help of finite-element optical simulation, fundamental research into OP modeling of photoplethysmography is being exploited towards the development of engineering solutions for practical biomedical systems. This paper reviews a body of research comprising two OP models that have led to significant progress in the design of transmission mode pulse oximetry probes, and approaches to 3D blood perfusion mapping for the interpretation of cardiovascular performance.

  14. Lithospheric structure models applied for locating the Romanian seismic events

    Directory of Open Access Journals (Sweden)

    V. Oancea

    1994-06-01

    Full Text Available The paper presents our attempts to improve the locations obtained for local seismic events, using refined lithospheric structure models. The location program (based on the Geiger method) assumes a known model. The program is run for several seismic sequences which occurred in different regions of the Romanian territory, using for each of the sequences three velocity models: (1) seven layers of constant seismic wave velocity, as an average structure of the lithosphere for the whole territory; (2) site-dependent structure (below each station), based on geophysical and geological information on the crust; (3) curves describing the variation of propagation velocities with depth in the lithosphere, characterizing the seven structural units delineated on the Romanian territory. The results obtained using the different velocity models are compared. Station corrections are computed for each data set. Finally, the locations determined for some quarry blasts are compared with the real ones.

  15. A Model-based Prognostics Approach Applied to Pneumatic Valves

    Data.gov (United States)

    National Aeronautics and Space Administration — Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain...

  16. A Model-Based Prognostics Approach Applied to Pneumatic Valves

    Data.gov (United States)

    National Aeronautics and Space Administration — Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain...

  17. Simple queueing model applied to the city of Portland

    Energy Technology Data Exchange (ETDEWEB)

    Simon, P.M.; Nagel, K. [Los Alamos National Lab., NM (United States)]|[Santa Fe Inst., NM (United States)

    1998-07-31

    The authors present a simple traffic micro-simulation model that captures the effects of capacity cut-off, i.e. the queue build-up that occurs when demand exceeds capacity, and queue spillback, i.e. the fact that queues can spill back across intersections when a congested link fills up. They derive and explain the model's fundamental diagrams. The simulation is used to simulate traffic on the emme/2 network of the Portland (Oregon) metropolitan region (20,000 links). Demand is generated by a simplified home-to-work assignment which generates about half a million trips for the AM peak. Route assignment is done by iterative feedback between micro-simulation and router. Relaxation of the route assignment for the above problem can be achieved within about half a day of computing time on a desktop workstation.
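    The following sketch, assuming a single road link, illustrates the two mechanisms named in the abstract: capacity cut-off (queue build-up when demand exceeds capacity) and spillback (vehicles refused when the link's storage is full). It is a toy illustration, not the authors' Portland micro-simulation.

```python
from collections import deque

def simulate_link(arrivals, capacity_per_step, storage_limit, steps):
    """Queue model of one road link: at most `capacity_per_step` vehicles leave
    per time step, and vehicles are refused (spill back) once the link is full."""
    queue = deque()
    refused, served = 0, 0
    for t in range(steps):
        for _ in range(arrivals[t]):
            if len(queue) < storage_limit:
                queue.append(t)                    # vehicle enters the link
            else:
                refused += 1                       # link full: spillback upstream
        for _ in range(min(capacity_per_step, len(queue))):
            queue.popleft()                        # vehicle leaves at link capacity
            served += 1
    return served, refused, len(queue)

demand = [8, 8, 8, 2, 2, 2]                        # demand exceeds capacity at first
print(simulate_link(demand, capacity_per_step=5, storage_limit=10, steps=len(demand)))
```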

  18. Opto-Physiological Modeling Applied to Photoplethysmographic Cardiovascular Assessment

    Directory of Open Access Journals (Sweden)

    Sijung Hu

    2013-01-01

    Full Text Available This paper presents opto-physiological (OP) modeling and its application in cardiovascular assessment techniques based on photoplethysmography (PPG). Existing contact point measurement techniques, i.e., pulse oximetry probes, are compared with the next generation non-contact and imaging implementations, i.e., non-contact reflection and camera-based PPG. The further development of effective physiological monitoring techniques relies on novel approaches to OP modeling that can better inform the design and development of sensing hardware and applicable signal processing procedures. With the help of finite-element optical simulation, fundamental research into OP modeling of photoplethysmography is being exploited towards the development of engineering solutions for practical biomedical systems. This paper reviews a body of research comprising two OP models that have led to significant progress in the design of transmission mode pulse oximetry probes, and approaches to 3D blood perfusion mapping for the interpretation of cardiovascular performance.

  19. A model of provenance applied to biodiversity datasets

    OpenAIRE

    Amanqui, Flor K; De Nies, Tom; Dimou, Anastasia; Verborgh, Ruben; Mannens, Erik; Van De Walle, Rik; Moreira, Dilvan

    2016-01-01

    Nowadays, the Web has become one of the main sources of biodiversity information. An increasing number of biodiversity research institutions add new specimens and their related information to their biological collections and make this information available on the Web. However, mechanisms which are currently available provide insufficient provenance of biodiversity information. In this paper, we propose a new biodiversity provenance model extending the W3C PROV Data Model. Biodiversity data is...

  20. Availability modeling methodology applied to solar power systems

    Science.gov (United States)

    Unione, A.; Burns, E.; Husseiny, A.

    1981-01-01

    Availability is discussed as a measure for estimating the expected performance for solar- and wind-powered generation systems and for identifying causes of performance loss. Applicable analysis techniques, ranging from simple system models to probabilistic fault tree analysis, are reviewed. A methodology incorporating typical availability models is developed for estimating reliable plant capacity. Examples illustrating the impact of design and configurational differences on the expected capacity of a solar-thermal power plant with a fossil-fired backup unit are given.

  1. Fleet Replacement Squadron consolidation : a cost model applied.

    OpenAIRE

    Maholchic, Robert M.

    1991-01-01

    The consolidation of Fleet Replacement Squadrons (FRS) represents one method of achieving planned force reductions. This thesis utilizes the Cost of Base Realignment Actions (COBRA) cost model to develop cost estimates for determination of the cost effective site location. The A-6 FRS consolidation is used as a case study. Data were compiled using completed Functional Wing studies as well as local information sources. A comparison between the cost estimates provided by the COBRA cost model fo...

  2. Tensegrity applied to modelling the motion of viruses

    Institute of Scientific and Technical Information of China (English)

    Cretu Simona-Mariana; Brinzan Gabriela-Catalina

    2011-01-01

    A considerable number of viruses' structures have been discovered and more are expected to be identified. Different viruses' symmetries can be observed at the nanoscale level. The mechanical models of some viruses realised by scientists are described in this paper, none of which has taken into consideration the internal deformation of subsystems. The authors' models for some viruses' elements are introduced, with rigid and flexible links, which reproduce the movements of viruses including internal deformations of the subunits.

  3. SPH method applied to high speed cutting modelling

    OpenAIRE

    LIMIDO, Jérôme; Espinosa, Christine; Salaün, Michel; Lacome, Jean-Luc

    2007-01-01

    The purpose of this study is to introduce a new approach to high speed cutting numerical modelling. A Lagrangian smoothed particle hydrodynamics (SPH)-based model is carried out using the Ls-Dyna software. SPH is a meshless method, thus large material distortions that occur in the cutting problem are easily managed and SPH contact control permits a "natural" workpiece/chip separation. The developed approach is compared to machining dedicated code results and experimental data. The SPH cutting...

  4. The J3 SCR model applied to resonant converter simulation

    Science.gov (United States)

    Avant, R. L.; Lee, F. C. Y.

    1985-01-01

    The J3 SCR model is a continuous topology computer model for the SCR. Its circuit analog and parameter estimation procedure are uniformly applicable to popular computer-aided design and analysis programs such as SPICE2 and SCEPTRE. The circuit analog is based on the intrinsic three pn junction structure of the SCR. The parameter estimation procedure requires only manufacturer's specification sheet quantities as a data base.

  5. Order and Disorder in Product Innovation Models

    NARCIS (Netherlands)

    Pina e Cunha, Miguel; Gomes, Jorge F.S.

    2003-01-01

    This article argues that the conceptual development of product innovation models goes hand in hand with paradigmatic changes in the field of organization science. Remarkable similarities in the change of organizational perspectives and product innovation models are noticeable. To illustrate how chan

  6. Validation of models with constant bias: an applied approach

    Directory of Open Access Journals (Sweden)

    Salvador Medina-Peralta

    2014-06-01

    Full Text Available Objective. This paper presents extensions to the statistical validation method based on the procedure of Freese for the case where a model shows constant bias (CB) in its predictions, and illustrates the method with data from a new mechanistic model that predicts weight gain in cattle. Materials and methods. The extensions are the hypothesis tests and maximum anticipated error for the alternative approach, and the confidence interval for a quantile of the distribution of errors. Results. The model evaluated showed CB; once the CB is removed, and with a confidence level of 95%, the magnitude of the error does not exceed 0.575 kg. Therefore, the validated model can be used to predict the daily weight gain of cattle, although it will require an adjustment in its structure, given the presence of CB, to increase the accuracy of its forecasts. Conclusions. The confidence interval for the 1-α quantile of the distribution of errors after correcting the constant bias allows determining the upper limit for the magnitude of the prediction error and using it to evaluate the evolution of the model in forecasting the system. The confidence interval approach to validating a model is more informative than the hypothesis tests for the same purpose.
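    The sketch below illustrates, on invented observed-versus-predicted data, the general idea of removing a constant bias and then bounding the magnitude of the remaining prediction error with a normal-theory quantile. It is only similar in spirit to the Freese-based procedure of the paper, whose exact formulas are not reproduced in this record.

```python
import numpy as np
from scipy import stats

# Illustrative observed vs. predicted daily weight gains (kg); not the paper's data
observed  = np.array([0.82, 0.75, 0.90, 0.88, 0.70, 0.95, 0.85, 0.78])
predicted = np.array([0.95, 0.90, 1.02, 1.01, 0.84, 1.10, 0.99, 0.92])

errors = predicted - observed
bias = errors.mean()                               # constant bias (CB)
corrected = errors - bias                          # errors with the constant bias removed

# Normal-theory estimate of the magnitude the corrected error should not exceed
# with roughly 95% confidence (a simple stand-in for the paper's quantile bound)
alpha = 0.05
sigma = corrected.std(ddof=1)
max_error = stats.norm.ppf(1 - alpha / 2) * sigma
print(f"constant bias: {bias:.3f} kg, max anticipated error: {max_error:.3f} kg")
```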

  7. A SIMULATION OF THE PENICILLIN G PRODUCTION BIOPROCESS APPLYING NEURAL NETWORKS

    Directory of Open Access Journals (Sweden)

    A.J.G. da Cruz

    1997-12-01

    Full Text Available The production of penicillin G by Penicillium chrysogenum IFO 8644 was simulated employing a feedforward neural network with three layers. The neural network training procedure used an algorithm combining two procedures: random search and backpropagation. The results of this approach were very promising, and it was observed that the neural network was able to accurately describe the nonlinear behavior of the process. Besides, the results showed that this technique can be successfully applied to control process algorithms due to its long processing time and its flexibility in the incorporation of new data

  8. Agent-Based Modelling applied to 5D model of the HIV infection

    Directory of Open Access Journals (Sweden)

    Toufik Laroum

    2016-12-01

    The simplest model was the 3D mathematical model. But the complexity of this phenomenon and the diversity of cells and actors which affect its evolution require the use of new approaches such as the multi-agent approach that we have applied in this paper. The results of our simulator on the 5D model are promising because they are consistent with biological knowledge. Therefore, the proposed approach is well suited to the study of population dynamics in general and could help to understand and predict the dynamics of HIV infection.

  9. Investigation of sulphur isotope variation due to different processes applied during uranium ore concentrate production.

    Science.gov (United States)

    Krajkó, Judit; Varga, Zsolt; Wallenius, Maria; Mayer, Klaus; Konings, Rudy

    The applicability and limitations of sulphur isotope ratio as a nuclear forensic signature have been studied. The typically applied leaching methods in uranium mining processes were simulated for five uranium ore samples and the n(³⁴S)/n(³²S) ratios were measured. The sulphur isotope ratio variation during uranium ore concentrate (UOC) production was also followed using two real-life sample sets obtained from industrial UOC production facilities. Once the major source of sulphur is revealed, its appropriate application for origin assessment can be established. Our results confirm the previous assumption that process reagents have a significant effect on the n(³⁴S)/n(³²S) ratio, thus the sulphur isotope ratio is in most cases a process-related signature.

  10. Development of in-situ product removal strategies in biocatalysis applying scaled-down unit operations

    DEFF Research Database (Denmark)

    Heintz, Søren; Börner, Tim; Ringborg, Rolf Hoffmeyer;

    2017-01-01

    An experimental platform based on scaled-down unit operations combined in a plug-and-play manner enables easy and highly flexible testing of advanced biocatalytic process options such as in-situ product removal (ISPR) process strategies. In such a platform it is possible to compartmentalize different process steps while operating it as a combined system, giving the possibility to test and characterize the performance of novel process concepts and biocatalysts with minimal influence of inhibitory products. Here the capabilities of performing process development by applying scaled-down unit operations are demonstrated by semi-automatically characterizing ω-transaminases in a scaled-down packed-bed reactor (PBR) module, showing MPPA as a strong inhibitor. To overcome the inhibition, a two-step liquid-liquid extraction (LLE) ISPR concept was tested using scaled-down unit operations combined in a plug-and-play manner. Through the tested ISPR concept...

  11. Applying artificial vision models to human scene understanding

    Directory of Open Access Journals (Sweden)

    Elissa Michele Aminoff

    2015-02-01

    Full Text Available How do we understand the complex patterns of neural responses that underlie scene understanding? Studies of the network of brain regions held to be scene-selective – the parahippocampal/lingual region (PPA), the retrosplenial complex (RSC), and the occipital place area (TOS) – have typically focused on single visual dimensions (e.g., size), rather than the high-dimensional feature space in which scenes are likely to be neurally represented. Here we leverage well-specified artificial vision systems to explicate a more complex understanding of how scenes are encoded in this functional network. We correlated similarity matrices within three different scene-spaces arising from: (1) BOLD activity in scene-selective brain regions; (2) behaviorally measured judgments of visually-perceived scene similarity; and (3) several different computer vision models. These correlations revealed: (1) models that relied on mid- and high-level scene attributes showed the highest correlations with the patterns of neural activity within the scene-selective network; (2) NEIL and SUN – the models that best accounted for the patterns obtained from PPA and TOS – were different from the GIST model that best accounted for the pattern obtained from RSC; (3) the best performing models outperformed behaviorally-measured judgments of scene similarity in accounting for neural data. One computer vision method – NEIL (Never-Ending-Image-Learner), which incorporates visual features learned as statistical regularities across web-scale numbers of scenes – showed significant correlations with neural activity in all three scene-selective regions and was one of the two models best able to account for variance in the PPA and TOS. We suggest that these results are a promising first step in explicating more fine-grained models of neural scene understanding, including developing a clearer picture of the division of labor among the components of the functional scene-selective brain network.
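    A minimal sketch of the similarity-matrix correlation analysis (comparing a model-derived scene-space with a neural one over matching scene pairs) is given below. The matrices are random placeholders standing in for the BOLD, behavioural and computer-vision similarity structures described in the abstract.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_scenes = 20

def random_similarity(n, rng):
    """Placeholder symmetric similarity matrix (stands in for neural,
    behavioural, or computer-vision scene-space similarities)."""
    a = rng.normal(size=(n, n))
    s = (a + a.T) / 2
    np.fill_diagonal(s, 1.0)
    return s

neural = random_similarity(n_scenes, rng)
model = random_similarity(n_scenes, rng)

# Correlate only the upper triangles (the diagonal is uninformative)
iu = np.triu_indices(n_scenes, k=1)
rho, p = spearmanr(neural[iu], model[iu])
print(f"model-to-neural similarity correlation: rho={rho:.3f}, p={p:.3f}")
```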

  12. State and parameter estimation based on a nonlinear filter applied to an industrial process control of ethanol production

    Directory of Open Access Journals (Sweden)

    Meleiro L.A.C.

    2000-01-01

    Full Text Available Most advanced computer-aided control applications rely on good dynamic process models. The performance of the control system depends on the accuracy of the model used. Typically, such models are developed by conducting off-line identification experiments on the process. These identification experiments often result in input-output data with a small output signal-to-noise ratio, and using these data leads to inaccurate model parameter estimates [1]. In this work, a multivariable adaptive self-tuning controller (STC) was developed for a biotechnological process application. Due to the difficulties involving the measurements, or the excessive number of variables normally found in industrial processes, it is proposed to develop "soft sensors" based fundamentally on artificial neural networks (ANN). A second proposed approach relies on hybrid models, which result from the association of deterministic models (which incorporate the available prior knowledge about the process being modeled) with artificial neural networks. In this case, kinetic parameters, which are very hard to determine accurately in real-time operation of industrial plants, were obtained using ANN predictions. These methods are especially suitable for the identification of time-varying and nonlinear models. This advanced control strategy was applied to a fermentation process producing ethyl alcohol (ethanol) on an industrial scale. The reaction rates considered for substrate consumption and for cell and ethanol production were validated with industrial data for typical operating conditions. The results obtained show that the procedure proposed in this work has great potential for application.

  13. Marketing Modeling for New Products

    NARCIS (Netherlands)

    C. Hernández-Mireles (Carlos)

    2010-01-01

    textabstractThis thesis addresses the analysis of new or very recent marketing data and the introduction of new marketing models. We present a collection of models that are useful to analyze (1) the optimal launch time of new and dominant technologies, (2) the triggers, speed and timing of new produ

  14. A Model-Based Prognostics Approach Applied to Pneumatic Valves

    Science.gov (United States)

    Daigle, Matthew J.; Goebel, Kai

    2011-01-01

    Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain knowledge of the system, its components, and how they fail by casting the underlying physical phenomena in a physics-based model that is derived from first principles. Uncertainty cannot be avoided in prediction, therefore, algorithms are employed that help in managing these uncertainties. The particle filtering algorithm has become a popular choice for model-based prognostics due to its wide applicability, ease of implementation, and support for uncertainty management. We develop a general model-based prognostics methodology within a robust probabilistic framework using particle filters. As a case study, we consider a pneumatic valve from the Space Shuttle cryogenic refueling system. We develop a detailed physics-based model of the pneumatic valve, and perform comprehensive simulation experiments to illustrate our prognostics approach and evaluate its effectiveness and robustness. The approach is demonstrated using historical pneumatic valve data from the refueling system.
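    The following sketch illustrates the particle-filter idea behind model-based prognostics: track a degrading state from noisy measurements, then propagate the particles forward to a failure threshold to estimate remaining useful life. The linear degradation model, noise levels and threshold are invented stand-ins for the pneumatic valve physics of the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented degradation model: a friction-like parameter drifts upward until it
# crosses a failure threshold (stand-in for the valve physics in the paper)
true_drift, threshold, n_particles = 0.02, 2.0, 1000
particles = rng.normal(1.0, 0.05, n_particles)       # initial damage state per particle
drifts = rng.normal(0.02, 0.01, n_particles)         # unknown drift, one hypothesis per particle
weights = np.ones(n_particles) / n_particles

state = 1.0
for t in range(20):                                   # filtering on noisy measurements
    state += true_drift
    z = state + rng.normal(0, 0.03)                   # noisy measurement
    particles += drifts + rng.normal(0, 0.01, n_particles)      # propagate particles
    weights *= np.exp(-0.5 * ((z - particles) / 0.03) ** 2)     # likelihood update
    weights /= weights.sum()
    idx = rng.choice(n_particles, n_particles, p=weights)       # resample
    particles, drifts = particles[idx], drifts[idx]
    weights = np.ones(n_particles) / n_particles

# Prognostics: propagate each particle forward until it crosses the failure threshold
rul = np.clip(np.ceil((threshold - particles) / np.maximum(drifts, 1e-6)), 0, None)
print("median remaining useful life (steps):", np.median(rul))
```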

  15. A Model-based Prognostics Approach Applied to Pneumatic Valves

    Directory of Open Access Journals (Sweden)

    Matthew J. Daigle

    2011-01-01

    Full Text Available Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain knowledge of the system, its components, and how they fail by casting the underlying physical phenomena in a physics-based model that is derived from first principles. Uncertainty cannot be avoided in prediction, therefore, algorithms are employed that help in managing these uncertainties. The particle filtering algorithm has become a popular choice for model-based prognostics due to its wide applicability, ease of implementation, and support for uncertainty management. We develop a general model-based prognostics methodology within a robust probabilistic framework using particle filters. As a case study, we consider a pneumatic valve from the Space Shuttle cryogenic refueling system. We develop a detailed physics-based model of the pneumatic valve, and perform comprehensive simulation experiments to illustrate our prognostics approach and evaluate its effectiveness and robustness. The approach is demonstrated using historical pneumatic valve data from the refueling system.

  16. Continuous Molecular Fields Approach Applied to Structure-Activity Modeling

    CERN Document Server

    Baskin, Igor I

    2013-01-01

    The Method of Continuous Molecular Fields is a universal approach to predict various properties of chemical compounds, in which molecules are represented by means of continuous fields (such as electrostatic, steric, electron density functions, etc). The essence of the proposed approach consists in performing statistical analysis of functional molecular data by means of joint application of kernel machine learning methods and special kernels which compare molecules by computing overlap integrals of their molecular fields. This approach is an alternative to traditional methods of building 3D structure-activity and structure-property models based on the use of fixed sets of molecular descriptors. The methodology of the approach is described in this chapter, followed by its application to building regression 3D-QSAR models and conducting virtual screening based on one-class classification models. The main directions of the further development of this approach are outlined at the end of the chapter.

  17. Mathematical modeling applied to the left ventricle of heart

    CERN Document Server

    Ranjbar, Saeed

    2014-01-01

    Background: How can mathematics help us to understand the mechanism of cardiac motion? The best known approach is to take a mathematical model of the fibered structure, insert it into a more-or-less complex model of cardiac architecture, and then study the resulting fibers of activation that propagate through the myocardium. In our paper, we have attempted to create novel software capable of demonstrating a left ventricular (LV) model in normal hearts. Method: Echocardiography was performed on 70 healthy volunteers. Data evaluated included: velocity (radial, longitudinal, rotational and vector point), displacement (longitudinal and rotational), strain rate (longitudinal and circumferential) and strain (radial, longitudinal and circumferential) of all 16 LV myocardial segments. Using these data, force vectors of myocardial samples were estimated by MATLAB software, interfaced in the echocardiograph system. Dynamic orientation contraction (through the cardiac cycle) of every individual myocardial fiber could ...

  18. Applying meta-analysis to structural equation modeling.

    Science.gov (United States)

    Hedges, Larry V

    2016-06-01

    Structural equation models play an important role in the social sciences. Consequently, there is an increasing use of meta-analytic methods to combine evidence from studies that estimate the parameters of structural equation models. Two approaches are used to combine evidence from structural equation models: a direct approach that combines structural coefficients and an indirect approach that first combines correlation matrices and estimates structural coefficients from the combined correlation matrix. When there is no heterogeneity across studies, direct estimation of structural coefficients from several studies is an appealing approach. Heterogeneity of correlation matrices across studies presents both practical and conceptual problems. An alternative approach to heterogeneity is suggested as an example of how to better handle heterogeneity in this context. Copyright © 2016 John Wiley & Sons, Ltd.

  19. Differential Evolution algorithm applied to FSW model calibration

    Science.gov (United States)

    Idagawa, H. S.; Santos, T. F. A.; Ramirez, A. J.

    2014-03-01

    Friction Stir Welding (FSW) is a solid state welding process that can be modelled using a Computational Fluid Dynamics (CFD) approach. These models use adjustable parameters to control the heat transfer and the heat input to the weld. These parameters are used to calibrate the model and they are generally determined using the conventional trial and error approach. Since this method is not very efficient, we used the Differential Evolution (DE) algorithm to successfully determine these parameters. In order to improve the success rate and to reduce the computational cost of the method, this work studied different characteristics of the DE algorithm, such as the evolution strategy, the objective function, the mutation scaling factor and the crossover rate. The DE algorithm was tested using a friction stir weld performed on a UNS S32205 Duplex Stainless Steel.
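    A minimal sketch of parameter calibration with SciPy's differential evolution implementation is shown below. The `thermal_model` surrogate, the thermocouple positions and the parameter bounds are assumptions used in place of the CFD weld model and experimental temperatures of the study.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Placeholder for the CFD weld model: returns predicted peak temperatures at a few
# thermocouple positions given two adjustable heat-input parameters
def thermal_model(params, positions):
    heat_input, transfer_coeff = params
    return heat_input * np.exp(-transfer_coeff * positions)

positions = np.array([2.0, 4.0, 6.0, 8.0])           # mm from the weld line
measured = thermal_model([950.0, 0.25], positions)   # synthetic "experimental" data

def objective(params):
    # sum of squared differences between model predictions and measurements
    return np.sum((thermal_model(params, positions) - measured) ** 2)

result = differential_evolution(objective,
                                bounds=[(500.0, 1500.0), (0.05, 1.0)],
                                strategy="best1bin", mutation=(0.5, 1.0),
                                recombination=0.7, seed=3)
print("calibrated parameters:", result.x)
```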

  20. Applying Sewage Sludge to Eucalyptus grandis Plantations: Effects on Biomass Production and Nutrient Cycling through Litterfall

    Directory of Open Access Journals (Sweden)

    Paulo Henrique Müller da Silva

    2011-01-01

    Full Text Available In most Brazilian cities sewage sludge is dumped into sanitary landfills, even though its use in forest plantations as a fertilizer and soil conditioner might be an interesting option. Sewage sludge applications might reduce the amounts of mineral fertilizers needed to sustain the productivity on infertile tropical soils. However, sewage sludge must be applied with care to crops to avoid soil and water pollution. The aim of our study was to assess the effects of dry and wet sewage sludges on the growth and nutrient cycling of Eucalyptus grandis plantations established on the most common soil type for Brazilian eucalypt plantations. Biomass production and nutrient cycling were studied over a 36-month period in a complete randomized block design. Four experimental treatments were compared: wet sewage sludge, dry sludge, mineral fertilizer, and no fertilizer applications. The two types of sludges as well as mineral fertilizer increased significantly the biomass of Eucalyptus trees. Wood biomass productions 36 months after planting were similar in the sewage sludge and mineral fertilization treatments (about 80 tons ha−1) and 86% higher than in the control treatment. Sewage sludge application also affected positively leaf litter production and significantly increased nutrient transfer among the components of the ecosystem.

  1. Safe production model for small mines

    Institute of Scientific and Technical Information of China (English)

    Calizaya F.; Suryanto S.

    2008-01-01

    This paper presents a "safe production model" that can be adopted by small mine operators to achieve their production targets safely and efficiently. The model consists of eight elements ranging from management commitment and leadership to safety accountability and communication. The model is developed considering the mine operators' resource limitations and the workers' training needs. The study concludes with a summary of a sample survey that was conducted to validate the model, estimate a parameter for each mine and determine its position on the safe production scale.

  2. The method of characteristics applied to analyse 2DH models

    NARCIS (Netherlands)

    Sloff, C.J.

    1992-01-01

    To gain insight into the physical behaviour of 2D hydraulic models (mathematically formulated as a system of partial differential equations), the method of characteristics is used to analyse the propagation of physical meaningful disturbances. These disturbances propagate as wave fronts along bichar

  3. An Analytical Model for Learning: An Applied Approach.

    Science.gov (United States)

    Kassebaum, Peter Arthur

    A mediated-learning package, geared toward non-traditional students, was developed for use in the College of Marin's cultural anthropology courses. An analytical model for learning was used in the development of the package, utilizing concepts related to learning objectives, programmed instruction, Gestalt psychology, cognitive psychology, and…

  4. Polarimetric SAR interferometry applied to land ice: modeling

    DEFF Research Database (Denmark)

    Dall, Jørgen; Papathanassiou, Konstantinos; Skriver, Henning

    2004-01-01

    This paper introduces a few simple scattering models intended for the application of polarimetric SAR interfer-ometry to land ice. The principal aim is to eliminate the penetration bias hampering ice sheet elevation maps generated with single-channel SAR interferometry. The polarimetric coherent...

  5. Robust model identification applied to type 1 diabetes

    DEFF Research Database (Denmark)

    Finan, Daniel Aaron; Jørgensen, John Bagterp; Poulsen, Niels Kjølstad;

    2010-01-01

    In many realistic applications, process noise is known to be neither white nor normally distributed. When identifying models in these cases, it may be more effective to minimize a different penalty function than the standard sum of squared errors (as in a least-squares identification method...

  6. Applied Bounded Model Checking for Interlocking System Designs

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Peleska, Jan; Pinger, Ralf

    2014-01-01

    of behavioural (operational) semantics. The former checks that the plant model – that is, the software components reflecting the physical components of the interlocking system – has been set up in an adequate way. The latter investigates trains moving through the network, with the objective to uncover potential...

  7. Applying an Employee-Motivation Model to Prevent Student Plagiarism.

    Science.gov (United States)

    Malouff, John M.; Sims, Randi L.

    1996-01-01

    A model based on Vroom's expectancy theory of employee motivation posits that instructors can prevent plagiarism by ensuring that students understand the rules of ethical writing, expect assignments to be manageable and have personal benefits, and expect plagiarism to be difficult and have important personal costs. (SK)

  8. A Decision-Making Model Applied to Career Counseling.

    Science.gov (United States)

    Olson, Christine; And Others

    1990-01-01

    A four-component model for career decision-making counseling relates each component to assessment questions and appropriate intervention strategies. The components are (1) conceptualization (definition of the problem); (2) enlargement of response repertoire (generation of alternatives); (3) identification of discriminative stimuli (consequences of…

  9. Improving Credit Scorecard Modeling Through Applying Text Analysis

    Directory of Open Access Journals (Sweden)

    Omar Ghailan

    2016-04-01

    Full Text Available In credit card scoring and loan management, the prediction of the applicant’s future behavior is an important decision support tool and a key factor in reducing the risk of loan default. A lot of data mining and classification approaches have been developed for the credit scoring purpose. To the best of our knowledge, building a credit scorecard by analyzing the textual data in the application form has not been explored so far. This paper proposes a comprehensive credit scorecard model technique that improves credit scorecard modeling through employing textual data analysis. This study uses a sample of loan application forms of a financial institution providing loan services in Yemen, which represents a real-world situation of credit scoring and loan management. The sample contains a set of Arabic textual data attributes defining the applicants. The credit scoring model based on text mining pre-processing and logistic regression techniques is proposed and evaluated through a comparison with a group of credit scorecard modeling techniques that use only the numeric attributes in the application form. The results show that adding the textual attributes analysis achieves higher classification effectiveness and outperforms the other traditional numerical data analysis techniques.
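    A minimal sketch of the idea, combining TF-IDF features from a free-text application field with numeric attributes in a single logistic regression scorecard, is given below. The toy loan records, column names and pipeline layout are illustrative assumptions, not the paper's Arabic data set or exact pre-processing.

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

# Invented loan applications: free-text purpose field plus numeric attributes
data = pd.DataFrame({
    "purpose": ["expand retail shop", "repay existing debts", "buy farm equipment",
                "wedding expenses", "new delivery vehicle", "cover medical bills"],
    "income": [900, 400, 1200, 500, 1100, 450],
    "amount": [300, 700, 400, 600, 350, 650],
    "default": [0, 1, 0, 1, 0, 1],
})

features = ColumnTransformer([
    ("text", TfidfVectorizer(), "purpose"),            # textual attributes
    ("num", "passthrough", ["income", "amount"]),      # numeric attributes
])
scorecard = Pipeline([("features", features),
                      ("clf", LogisticRegression(max_iter=1000))])

scorecard.fit(data[["purpose", "income", "amount"]], data["default"])
print(scorecard.predict_proba(data[["purpose", "income", "amount"]])[:, 1])
```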

  10. Dynamics Model Applied to Pricing Options with Uncertain Volatility

    Directory of Open Access Journals (Sweden)

    Lorella Fatone

    2012-01-01

    model is proposed. The data used to test the calibration problem included observations of asset prices over a finite set of (known) equispaced discrete time values. Statistical tests were used to estimate the statistical significance of the two parameters of the Black-Scholes model: the volatility and the drift. The effects of these estimates on the option pricing problem were investigated. In particular, the pricing of an option with uncertain volatility in the Black-Scholes framework was revisited, and a statistical significance was associated with the price intervals determined using the Black-Scholes-Barenblatt equations. Numerical experiments involving synthetic and real data were presented. The real data considered were the daily closing values of the S&P500 index and the associated European call and put option prices in the year 2005. The method proposed here for calibrating the Black-Scholes dynamics model could be extended to other science and engineering models that may be expressed in terms of stochastic dynamical systems.
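    For a plain European call (a convex payoff), evaluating the Black-Scholes formula at the endpoints of a volatility confidence interval already yields a price interval, which the sketch below uses as a simplified stand-in for the Black-Scholes-Barenblatt bounds discussed above. The parameter values are illustrative, not the 2005 S&P500 data of the paper.

```python
import numpy as np
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call option."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

# Illustrative parameters: spot, strike, maturity (years), risk-free rate
S, K, T, r = 1200.0, 1250.0, 0.5, 0.03
sigma_low, sigma_high = 0.12, 0.18                 # assumed volatility confidence interval

price_low = bs_call(S, K, T, r, sigma_low)
price_high = bs_call(S, K, T, r, sigma_high)
print(f"call price interval: [{price_low:.2f}, {price_high:.2f}]")
```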

  11. A Spatial Lattice Model Applied for Meteorological Visualization and Analysis

    Directory of Open Access Journals (Sweden)

    Mingyue Lu

    2017-03-01

    Full Text Available Meteorological information has obvious spatial-temporal characteristics. Although it is meaningful to employ a geographic information system (GIS) to visualize and analyze the meteorological information for better identification and forecasting of meteorological weather so as to reduce the meteorological disaster loss, modeling meteorological information based on a GIS is still difficult because meteorological elements generally have no stable shape or clear boundary. To date, there are still few GIS models that can satisfy the requirements of both meteorological visualization and analysis. In this article, a spatial lattice model based on sampling particles is proposed to support both the representation and analysis of meteorological information. In this model, a spatial sampling particle is regarded as the basic element that contains the meteorological information, and the location where the particle is placed with the time mark. The location information is generally represented using a point. As these points can be extended to a surface in two dimensions and a voxel in three dimensions, if these surfaces and voxels can occupy a certain space, then this space can be represented using these spatial sampling particles with their point locations and meteorological information. In this case, the full meteorological space can then be represented by arranging numerous particles with their point locations in a certain structure and resolution, i.e., the spatial lattice model, and extended at a higher resolution when necessary. For practical use, the meteorological space is logically classified into three types of spaces, namely the projection surface space, curved surface space, and stereoscopic space, and application-oriented spatial lattice models with different organization forms of spatial sampling particles are designed to support the representation, inquiry, and analysis of meteorological information within the three types of surfaces. Cases

  12. Applying Transtheoretical Model to Promote Physical Activities Among Women

    Science.gov (United States)

    Pirzadeh, Asiyeh; Mostafavi, Firoozeh; Ghofranipour, Fazllolah; Feizi, Awat

    2015-01-01

    Background: Physical activity is one of the most important indicators of health in communities but different studies conducted in the provinces of Iran showed that inactivity is prevalent, especially among women. Objectives: Inadequate regular physical activities among women, the importance of education in promoting the physical activities, and lack of studies on the women using transtheoretical model, persuaded us to conduct this study with the aim of determining the application of transtheoretical model in promoting the physical activities among women of Isfahan. Materials and Methods: This research was a quasi-experimental study which was conducted on 141 women residing in Isfahan, Iran. They were randomly divided into case and control groups. In addition to the demographic information, their physical activities and the constructs of the transtheoretical model (stages of change, processes of change, decisional balance, and self-efficacy) were measured at 3 time points; preintervention, 3 months, and 6 months after intervention. Finally, the obtained data were analyzed through t test and repeated measures ANOVA test using SPSS version 16. Results: The results showed that education based on the transtheoretical model significantly increased physical activities in 2 aspects of intensive physical activities and walking, in the case group over the time. Also, a high percentage of people have shown progress during the stages of change, the mean of the constructs of processes of change, as well as pros and cons. On the whole, a significant difference was observed over the time in the case group (P < 0.01). Conclusions: This study showed that interventions based on the transtheoretical model can promote the physical activity behavior among women. PMID:26834796

  13. Solvent effect modelling of isocyanuric products synthesis by chemometric methods

    OpenAIRE

    Havet, Jean-Louis; Billiau-Loreau, Myriam; Porte, Catherine; Delacroix, Alain

    2002-01-01

    Chemometric tools were used to generate the modelling of solvent effects on the N-alkylation of an isocyanuric acid salt. The method proceeded from a central composite design applied on the Carlson solvent classification using principal components analysis. The selectivity of the reaction was studied from the production of different substituted isocyanuric derivatives. Response graphs were obtained for each compound and used to devise a strategy for solvent selection. The prediction models wer...

  14. Reliability Measures of Second-Order Semi-Markov Chain Applied to Wind Energy Production

    Directory of Open Access Journals (Sweden)

    Guglielmo D'Amico

    2013-01-01

    Full Text Available We consider the problem of wind energy production by using a second-order semi-Markov chain in state and duration as a model of wind speed. The model used in this paper is based on our previous work, where we have shown the ability of second-order semi-Markov processes to reproduce statistical features of wind speed. Here we briefly present the mathematical model and describe the data and technical characteristics of a commercial wind turbine (Aircon HAWT-10 kW). We show how, by using our model, it is possible to compute some of the main dependability measures such as the reliability, availability, and maintainability functions. We compare, by means of Monte Carlo simulations, the results of the model with the real energy production obtained from data available from the Lastem station (Italy), sampled every 10 minutes. The computation of the dependability measures is a crucial point in the planning and development of a wind farm. Through our model, we show how the values of these quantities can be obtained both analytically and computationally.
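    The sketch below shows, in simplified form, the kind of Monte Carlo computation described: simulate wind-speed states with a Markov chain and accumulate energy through a turbine power curve. For brevity it uses a first-order chain and an invented transition matrix and power curve, whereas the paper employs a second-order semi-Markov chain and the Aircon HAWT-10 kW characteristics.

```python
import numpy as np

rng = np.random.default_rng(4)

# Wind-speed states (m/s, bin centres) and an illustrative transition matrix;
# the paper uses a second-order semi-Markov chain, this first-order chain is a sketch
states = np.array([2.0, 5.0, 8.0, 11.0, 14.0])
P = np.array([[0.6, 0.3, 0.1, 0.0, 0.0],
              [0.2, 0.5, 0.2, 0.1, 0.0],
              [0.1, 0.2, 0.4, 0.2, 0.1],
              [0.0, 0.1, 0.3, 0.4, 0.2],
              [0.0, 0.0, 0.2, 0.3, 0.5]])

def power_curve(v, cut_in=3.0, rated_speed=12.0, rated_kw=10.0, cut_out=20.0):
    """Very simplified power curve for a 10 kW turbine (illustrative only)."""
    if v < cut_in or v > cut_out:
        return 0.0
    return min(rated_kw, rated_kw * ((v - cut_in) / (rated_speed - cut_in)) ** 3)

# Monte Carlo simulation of energy produced over 1000 ten-minute intervals
state, energy_kwh = 1, 0.0
for _ in range(1000):
    state = rng.choice(len(states), p=P[state])
    energy_kwh += power_curve(states[state]) * (10 / 60)   # kW times hours
print(f"simulated energy production: {energy_kwh:.0f} kWh")
```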

  15. Applying Model Checking to Generate Model-Based Integration Tests from Choreography Models

    Science.gov (United States)

    Wieczorek, Sebastian; Kozyura, Vitaly; Roth, Andreas; Leuschel, Michael; Bendisposto, Jens; Plagge, Daniel; Schieferdecker, Ina

    Choreography models describe the communication protocols between services. Testing of service choreographies is an important task for the quality assurance of service-based systems as used e.g. in the context of service-oriented architectures (SOA). The formal modeling of service choreographies enables a model-based integration testing (MBIT) approach. We present MBIT methods for our service choreography modeling approach called Message Choreography Models (MCM). For the model-based testing of service choreographies, MCMs are translated into Event-B models and used as input for our test generator which uses the model checker ProB.

  16. Statistical Model Checking for Product Lines

    DEFF Research Database (Denmark)

    ter Beek, Maurice H.; Legay, Axel; Lluch Lafuente, Alberto

    2016-01-01

    We report on the suitability of statistical model checking for the analysis of quantitative properties of product line models by an extended treatment of earlier work by the authors. The type of analysis that can be performed includes the likelihood of specific product behaviour, the expected average cost of products (in terms of the attributes of the products’ features) and the probability of features to be (un)installed at runtime. The product lines must be modelled in QFLan, which extends the probabilistic feature-oriented language PFLan with novel quantitative constraints among features and on behaviour and with advanced feature installation options. QFLan is a rich process-algebraic specification language whose operational behaviour interacts with a store of constraints, neatly separating product configuration from product behaviour. The resulting probabilistic configurations and probabilistic...

  17. {sup 7}Be radioactive beam production at CIRCE and its utilization in basic and applied physics

    Energy Technology Data Exchange (ETDEWEB)

    Limata, Benedicta Normanna [Sezione di Napoli, INFN, Ed. G, Via Cintia, Napoli 80126 (Italy)], E-mail: limata@na.infn.it; Gialanella, Lucio; Leva, Antonino Di [Sezione di Napoli, INFN, Ed. G, Via Cintia, Napoli 80126 (Italy); Cesare, Nicola De [Sezione di Napoli, INFN, Ed. G, Via Cintia, Napoli 80126 (Italy); Dipartimento di Scienze della Vita, II Universita di Napoli, Via Vivaldi 43, Caserta 81100 (Italy); D' Onofrio, Antonio [Sezione di Napoli, INFN, Ed. G, Via Cintia, Napoli 80126 (Italy); Dipartimento di Scienze Ambientali, II Universita di Napoli, Via Vivaldi 43, Caserta 81100 (Italy); Gyurky, G. [ATOMKI, POB 51, Debrecen H-4001 (Hungary); Rolfs, Claus [Institut fuer Experimentalphysik III, RuhrUniversitaet, Universitatetstrasse 150, Bochum D-44780 (Germany); Romano, Mario [Sezione di Napoli, INFN, Ed. G, Via Cintia, Napoli 80126 (Italy); Dipartimento di Scienze Fisiche, Universita di Napoli Federico II, Ed. G Via Cintia, Napoli 80126 (Italy); Rogalla, Detlef [Institut fuer Experimentalphysik III, RuhrUniversitaet, Universitatetstrasse 150, Bochum D-44780 (Germany); Rossi, Cesare; Russo, Michele [DIME, Universita di Napoli Federico II, Via Claudio, Napoli 80126 (Italy); Somorjai, Endre [ATOMKI, POB 51, Debrecen H-4001 (Hungary); Terrasi, Filippo [Sezione di Napoli, INFN, Ed. G, Via Cintia, Napoli 80126 (Italy); Dipartimento di Scienze Ambientali, II Universita di Napoli, Via Vivaldi 43, Caserta 81100 (Italy)

    2008-05-15

    A pure {sup 7}Be beam with an energy E = 1-8 MeV is available for nuclear and applied physics at the 3 MV Pelletron tandem accelerator CIRCE in Caserta. The beam is produced using an offline technique. Typical analyzed beam intensities are about 2 ppA, using cathodes with an activity of the order of 200 MBq. The {sup 7}Be implantation has been used for both fundamental nuclear physics and applied physics. In particular, different metals have been implanted with {sup 7}Be in order to study the influence of the chemical composition and of the number of quasi-free electrons of the host material on the {sup 7}Be half-life. In the field of applied physics, the {sup 7}Be implantation turns out to be very interesting for wear measurement. In fact, in this case {sup 7}Be is used as a depth-sensitive tracer. The continuous detection of the sample activity during the wear allows a high sensitivity measurement of wearing speed. The {sup 7}Be beam production at CIRCE, the implantation procedure and the results obtained from the {sup 7}Be half-life measurements and the wear characterization of implanted steel samples are described.

  18. Applying learning theories and instructional design models for effective instruction.

    Science.gov (United States)

    Khalil, Mohammed K; Elkhider, Ihsan A

    2016-06-01

    Faculty members in higher education are involved in many instructional design activities without formal training in learning theories and the science of instruction. Learning theories provide the foundation for the selection of instructional strategies and allow for reliable prediction of their effectiveness. To achieve effective learning outcomes, the science of instruction and instructional design models are used to guide the development of instructional design strategies that elicit appropriate cognitive processes. Here, the major learning theories are discussed and selected examples of instructional design models are explained. The main objective of this article is to present the science of learning and instruction as theoretical evidence for the design and delivery of instructional materials. In addition, this article provides a practical framework for implementing those theories in the classroom and laboratory.

  19. Applying the Extended Parallel Process Model to workplace safety messages.

    Science.gov (United States)

    Basil, Michael; Basil, Debra; Deshpande, Sameer; Lavack, Anne M

    2013-01-01

    The extended parallel process model (EPPM) proposes fear appeals are most effective when they combine threat and efficacy. Three studies conducted in the workplace safety context examine the use of various EPPM factors and their effects, especially multiplicative effects. Study 1 was a content analysis examining the use of EPPM factors in actual workplace safety messages. Study 2 experimentally tested these messages with 212 construction trainees. Study 3 replicated this experiment with 1,802 men across four English-speaking countries-Australia, Canada, the United Kingdom, and the United States. The results of these three studies (1) demonstrate the inconsistent use of EPPM components in real-world work safety communications, (2) support the necessity of self-efficacy for the effective use of threat, (3) show a multiplicative effect where communication effectiveness is maximized when all model components are present (severity, susceptibility, and efficacy), and (4) validate these findings with gory appeals across four English-speaking countries.

  20. Consideration of an applied model of public health program infrastructure.

    Science.gov (United States)

    Lavinghouze, René; Snyder, Kimberly; Rieker, Patricia; Ottoson, Judith

    2013-01-01

    Systemic infrastructure is key to public health achievements. Individual public health program infrastructure feeds into this larger system. Although program infrastructure is rarely defined, it needs to be operationalized for effective implementation and evaluation. The Ecological Model of Infrastructure (EMI) is one approach to defining program infrastructure. The EMI consists of 5 core (Leadership, Partnerships, State Plans, Engaged Data, and Managed Resources) and 2 supporting (Strategic Understanding and Tactical Action) elements that are enveloped in a program's context. We conducted a literature search across public health programs to determine support for the EMI. Four of the core elements were consistently addressed, and the other EMI elements were intermittently addressed. The EMI provides an initial and partial model for understanding program infrastructure, but additional work is needed to identify evidence-based indicators of infrastructure elements that can be used to measure success and link infrastructure to public health outcomes, capacity, and sustainability.

  1. Structure Modeling and Validation applied to Source Physics Experiments (SPEs)

    Science.gov (United States)

    Larmat, C. S.; Rowe, C. A.; Patton, H. J.

    2012-12-01

    The U. S. Department of Energy's Source Physics Experiments (SPEs) comprise a series of small chemical explosions used to develop a better understanding of seismic energy generation and wave propagation for low-yield explosions. In particular, we anticipate improved understanding of the processes through which shear waves are generated by the explosion source. Three tests, 100, 1000 and 1000 kg yields respectively, were detonated in the same emplacement hole and recorded on the same networks of ground motion sensors in the granites of Climax Stock at the Nevada National Security Site. We present results for the analysis and modeling of seismic waveforms recorded close-in on five linear geophone lines extending radially from ground zero, having offsets from 100 to 2000 m and station spacing of 100 m. These records exhibit azimuthal variations of P-wave arrival times, and phase velocity, spreading and attenuation properties of high-frequency Rg waves. We construct a 1D seismic body-wave model starting from a refraction analysis of P-waves and adjusting to address time-domain and frequency-domain dispersion measurements of Rg waves between 2 and 9 Hz. The shallowest part of the structure we address using the arrival times recorded by near-field accelerometers residing within 200 m of the shot hole. We additionally perform a 2D modeling study with the Spectral Element Method (SEM) to investigate which structural features are most responsible for the observed variations, in particular anomalously weak amplitude decay in some directions of this topographically complicated locality. We find that a near-surface, thin, weathered layer of varying thickness and low wave speeds plays a major role on the observed waveforms. We anticipate performing full 3D modeling of the seismic near-field through analysis and validation of waveforms on the 5 radial receiver arrays.

  2. Cellular systems biology profiling applied to cellular models of disease.

    Science.gov (United States)

    Giuliano, Kenneth A; Premkumar, Daniel R; Strock, Christopher J; Johnston, Patricia; Taylor, Lansing

    2009-11-01

    Building cellular models of disease based on the approach of Cellular Systems Biology (CSB) has the potential to improve the process of creating drugs as part of the continuum from early drug discovery through drug development and clinical trials and diagnostics. This paper focuses on the application of CSB to early drug discovery. We discuss the integration of protein-protein interaction biosensors with other multiplexed, functional biomarkers as an example in using CSB to optimize the identification of quality lead series compounds.

  3. Applying OWA operator to model group behaviors in uncertain QFD

    OpenAIRE

    2013-01-01

    It is a crucial step to derive the priority order of design requirements (DRs) from customer requirements (CRs) in quality function deployment (QFD). However, it is not straightforward to prioritize DRs due to two types of uncertainties: human subjective perception and user variability. This paper proposes an OWA based group decision-making approach to uncertain QFD with an application to a flexible manufacturing system design. The proposed model performs computations solely based on the orde...
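
    The OWA aggregation step itself is compact: the weights are applied to the ranked positions of the arguments rather than to particular sources, which is what lets the operator capture different group attitudes. A minimal sketch follows; the ratings and weight vector are invented for illustration and are not taken from the paper.

```python
def owa(values, weights):
    """Ordered Weighted Averaging: weights apply to ranked positions, not to sources.
    Weights are assumed non-negative and summing to 1."""
    ordered = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(weights, ordered))

# Hypothetical example: four experts rate how strongly a design requirement supports
# a customer requirement (1-9 scale); the weight vector leans toward the higher ratings.
ratings = [7, 3, 9, 5]
weights = [0.4, 0.3, 0.2, 0.1]
print(owa(ratings, weights))   # 0.4*9 + 0.3*7 + 0.2*5 + 0.1*3 = 7.0
```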

  4. Applying Transtheoretical Model to Promote Physical Activities Among Women

    OpenAIRE

    2015-01-01

    Background: Physical activity is one of the most important indicators of health in communities but different studies conducted in the provinces of Iran showed that inactivity is prevalent, especially among women. Objectives: Inadequate regular physical activities among women, the importance of education in promoting the physical activities, and lack of studies on the women using transtheoretical model, persuaded us to conduct this study with the aim of determining the application of transtheo...

  5. Modeling a Thermoelectric Generator Applied to Diesel Automotive Heat Recovery

    Science.gov (United States)

    Espinosa, N.; Lazard, M.; Aixala, L.; Scherrer, H.

    2010-09-01

    Thermoelectric generators (TEGs) are outstanding devices for automotive waste heat recovery. Their packaging, lack of moving parts, and direct heat to electrical conversion are the main benefits. Usually, TEGs are modeled with a constant hot-source temperature. However, energy in exhaust gases is limited, thus leading to a temperature decrease as heat is recovered. Therefore thermoelectric properties change along the TEG, affecting performance. A thermoelectric generator composed of Mg2Si/Zn4Sb3 for high temperatures followed by Bi2Te3 for low temperatures has been modeled using engineering equation solver (EES) software. The model uses the finite-difference method with a strip-fins convective heat transfer coefficient. It has been validated on a commercial module with well-known properties. The thermoelectric connection and the number of thermoelements have been addressed as well as the optimum proportion of high-temperature material for a given thermoelectric heat exchanger. TEG output power has been estimated for a typical commercial vehicle at 90°C coolant temperature.

  6. A theoretical intellectual capital model applied to cities

    Directory of Open Access Journals (Sweden)

    José Luis Alfaro Navarro

    2013-06-01

    Full Text Available New Management Information Systems (MIS) are necessary at the local level as the main source of wealth creation. Therefore, tools and approaches that provide a full future vision of any organization should be a strategic priority for economic development. In this line, cities are “centers of knowledge and sources of growth and innovation”, and integrated urban development policies are necessary. These policies support communication networks and optimize location structures as strategies that provide opportunities for social and democratic participation for the citizens. This paper proposes a theoretical model to measure and evaluate the intellectual capital of cities, which allows us to determine what we must take into account to make cities a source of wealth, prosperity, welfare and future growth. Furthermore, local intellectual capital provides a long-run vision. Thus, in this paper we develop and explain how to implement a model to estimate intellectual capital in cities. In this sense, our proposal is to provide a model for measuring and managing intellectual capital using socio-economic indicators for cities. These indicators offer a long-term picture supported by a comprehensive strategy for those who occupy the local space, the infrastructure for implementation, and the management of the environment for its development.

  7. Experiences & Tools from Modeling Instruction Applied to Earth Sciences

    Science.gov (United States)

    Cervenec, J.; Landis, C. E.

    2012-12-01

    The Framework for K-12 Science Education calls for stronger curricular connections within the sciences, greater depth in understanding, and tasks higher on Bloom's Taxonomy. Understanding atmospheric sciences draws on core knowledge traditionally taught in physics, chemistry, and in some cases, biology. If this core knowledge is not conceptually sound, well retained, and transferable to new settings, understanding the causes and consequences of climate change becomes, for a student, a task of memorizing seemingly disparate facts. Fortunately, experiences and conceptual tools have been developed and refined in the nationwide network of Physics Modeling and Chemistry Modeling teachers to build the necessary understanding of conservation of mass, conservation of energy, the particulate nature of matter, kinetic molecular theory, and the particle model of light. Context-rich experiences are first introduced for students to construct an understanding of these principles, and then conceptual tools are deployed for students to resolve misconceptions and deepen their understanding. Using these experiences and conceptual tools takes an investment of instructional time, teacher training, and in some cases, re-envisioning the format of a science classroom. There are few financial barriers to implementation, and students gain a greater understanding of the nature of science by going through successive cycles of investigation and refinement of their thinking. This presentation shows how these experiences and tools could be used in an Earth Science course to support students in developing a conceptually rich understanding of the atmosphere and the connections happening within it.

  8. Simple Queueing Model Applied to the City of Portland

    Science.gov (United States)

    Simon, Patrice M.; Esser, Jörg; Nagel, Kai

    We use a simple traffic micro-simulation model based on queueing dynamics as introduced by Gawron [IJMPC, 9(3):393, 1998] in order to simulate traffic in Portland/Oregon. Links have a flow capacity, that is, they do not release more vehicles per second than is possible according to their capacity. This leads to queue build-up if demand exceeds capacity. Links also have a storage capacity, which means that once a link is full, vehicles that want to enter the link need to wait. This leads to queue spill-back through the network. The model is compatible with route-plan-based approaches such as TRANSIMS, where each vehicle attempts to follow its pre-computed path. Yet, both the data requirements and the computational requirements are considerably lower than for the full TRANSIMS microsimulation. Indeed, the model uses standard emme/2 network data, and runs about eight times faster than real time with more than 100 000 vehicles simultaneously in the simulation on a single Pentium-type CPU. We derive the model's fundamental diagrams and explain them. The simulation is used to simulate traffic on the emme/2 network of the Portland (Oregon) metropolitan region (20 000 links). Demand is generated by a simplified home-to-work destination assignment which generates about half a million trips for the morning peak. Route assignment is done by iterative feedback between micro-simulation and router. An iterative solution of the route assignment for the above problem can be achieved within about half a day of computing time on a desktop workstation. We compare results with field data and with results of traditional assignment runs by the Portland Metropolitan Planning Organization. Thus, with a model such as this one, it is possible to use a dynamic, activities-based approach to transportation simulation (such as in TRANSIMS) with affordable data and hardware. This should enable systematic research about the coupling of demand generation, route assignment, and micro-simulation.
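
    A minimal sketch of the two link constraints described above (a flow capacity limiting the vehicles released per step, and a storage capacity producing spill-back) might look as follows. The class, capacities and vehicle counts are invented; this is not TRANSIMS or emme/2 code.

```python
from collections import deque

class Link:
    """Toy link with the two constraints described in the abstract: a flow capacity
    (vehicles released per time step) and a storage capacity (causing spill-back)."""
    def __init__(self, flow_capacity, storage_capacity):
        self.flow_capacity = flow_capacity
        self.storage_capacity = storage_capacity
        self.queue = deque()

    def has_space(self):
        return len(self.queue) < self.storage_capacity

    def enter(self, vehicle):
        self.queue.append(vehicle)

    def release(self):
        """Release at most flow_capacity vehicles this time step (FIFO)."""
        return [self.queue.popleft()
                for _ in range(min(self.flow_capacity, len(self.queue)))]

# One step of moving vehicles from link a to link b; a vehicle stays queued on a
# (spill-back) if b is already full.
a = Link(flow_capacity=2, storage_capacity=10)
b = Link(flow_capacity=2, storage_capacity=3)
for v in range(6):
    a.enter(f"veh{v}")
for veh in a.release():
    if b.has_space():
        b.enter(veh)
    else:
        a.queue.appendleft(veh)   # could not enter: wait on the upstream link
print(len(a.queue), len(b.queue))  # -> 4 2
```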

  9. Optimization strategies in the modelling of SG-SMB applied to separation of phenylalanine and tryptophan

    Science.gov (United States)

    Diógenes Tavares Câmara, Leôncio

    2014-03-01

    The solvent-gradient simulated moving bed process (SG-SMB) is the new tendency for performance improvement compared to traditional isocratic solvent conditions. In such an SG-SMB process, the modulation of the solvent strength leads to a significant increase in purities and productivity, followed by a reduction in solvent consumption. A stepwise modelling approach was utilized in the representation of the interconnected chromatographic columns of the system, combined with a lumped mass transfer model between the solid and liquid phases. The influence of the solvent modifier was considered by applying the Abel model, which takes into account the effect of the modifier volume fraction on the partition coefficient. Correlation models of the mass transfer parameters were obtained through the retention times of the solutes according to the volume fraction of modifier. The modelling and simulations were carried out and compared to the experimental SG-SMB separation unit for the amino acids phenylalanine and tryptophan. The simulation results showed the great potential of the proposed modelling approach in the representation of such complex systems. The simulations fitted the experimental data of the amino acid concentrations well, both at the extract and at the raffinate. A new optimization strategy, which uses the phi-plot concept, was proposed for the determination of the best operating conditions.

  10. Product Modelling for Model-Based Maintenance

    NARCIS (Netherlands)

    Houten, van F.J.A.M.; Tomiyama, T.; Salomons, O.W.

    1998-01-01

    The paper describes the fundamental concepts of maintenance and the role that information technology can play in the support of maintenance activities. Function-Behaviour-State modelling is used to describe faults and deterioration of mechanisms in terms of user perception and measurable quantities.

  11. Neighborhood Design, Physical Activity, and Wellbeing: Applying the Walkability Model.

    Science.gov (United States)

    Zuniga-Teran, Adriana A; Orr, Barron J; Gimblett, Randy H; Chalfoun, Nader V; Guertin, David P; Marsh, Stuart E

    2017-01-13

    Neighborhood design affects lifestyle physical activity, and ultimately human wellbeing. There are, however, a limited number of studies that examine neighborhood design types. In this research, we examine four types of neighborhood designs: traditional development, suburban development, enclosed community, and cluster housing development, and assess their level of walkability and their effects on physical activity and wellbeing. We examine significant associations through a questionnaire (n = 486) distributed in Tucson, Arizona using the Walkability Model. Among the tested neighborhood design types, traditional development showed significant associations and the highest value for walkability, as well as for each of the two types of walking (recreation and transportation) representing physical activity. Suburban development showed significant associations and the highest mean values for mental health and wellbeing. Cluster housing showed significant associations and the highest mean value for social interactions with neighbors and for perceived safety from crime. Enclosed community did not obtain the highest means for any wellbeing benefit. The Walkability Model proved useful in identifying the walkability categories associated with physical activity and perceived crime. For example, the experience category was strongly and inversely associated with perceived crime. This study provides empirical evidence of the importance of including vegetation, particularly trees, throughout neighborhoods in order to increase physical activity and wellbeing. Likewise, the results suggest that regular maintenance is an important strategy to improve mental health and overall wellbeing in cities.

  12. ORGANIZING SCENARIO VARIABLES BY APPLYING THE INTERPRETATIVE STRUCTURAL MODELING (ISM)

    Directory of Open Access Journals (Sweden)

    Daniel Estima de Carvalho

    2009-10-01

    Full Text Available The scenario building method is a mode of thought, carried out in an optimized, strategic manner, based on trends and uncertain events concerning a large variety of potential results that may impact the future of an organization. In this study, the objective is to contribute towards a possible improvement of Godet's and Schoemaker's scenario preparation methods by employing Interpretative Structural Modeling (ISM) as a tool for the analysis of variables. Given that this is an exploratory theme, bibliographical research with tool definition and analysis, the extraction of examples from the literature, and a comparison exercise of the referred methods were undertaken. It was verified that ISM may substitute or complement the original tools for the analysis of scenario variables in Godet's and Schoemaker's methods, given the fact that it enables an in-depth analysis of the relations between variables in a shorter period of time, facilitating both the structuring and the construction of possible scenarios. Key-words: Strategy. Future studies. Interpretative Structural Modeling.
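
    The computational core of ISM is the reachability matrix, i.e. the transitive closure of the pairwise "influences" relation, followed by a level partition of the variables. The sketch below runs that generic procedure on an invented four-variable chain; it is not the authors' worked example.

```python
import numpy as np

# Hypothetical binary "influences" relation among four scenario variables V0..V3:
# A[i, j] = True means variable i directly influences variable j.
A = np.array([[0, 1, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [0, 0, 0, 0]], dtype=bool)

def reachability(adjacency):
    """Transitive closure (Warshall) including self-reachability: the reachability
    matrix used for ISM level partitioning."""
    n = len(adjacency)
    R = adjacency | np.eye(n, dtype=bool)
    for k in range(n):
        R = R | (R[:, k:k + 1] & R[k:k + 1, :])
    return R

R = reachability(A)
# Standard ISM level partition: a variable belongs to the current level when its
# reachability set is contained in its antecedent set (within the remaining variables).
levels, remaining = [], set(range(len(A)))
while remaining:
    level = {i for i in remaining
             if {j for j in remaining if R[i, j]} <= {j for j in remaining if R[j, i]}}
    levels.append(sorted(level))
    remaining -= level
print(levels)   # [[3], [2], [1], [0]] for this chain-like example
```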

  13. Minimization of Heat Energy Intensity in Food Production Companies Applying Sustainable Industrial Development Methods

    Directory of Open Access Journals (Sweden)

    Irina Kliopova

    2011-10-01

    Full Text Available The Lithuanian food and drink sector is characterized by high energy intensity, which is 29% higher than the EU average. At the confectionery plant chosen for the experiment, the environmental impact has been controlled and managed to the maximum extent by creating different procedures to reduce pollution. Assessment of the plant's environmental costs has revealed that energy costs make up the main part of the environmental costs (up to 55.4%). In recent years several energy efficiency projects have been implemented, allowing the plant's energy intensity to be reduced by up to 15%. An algorithm for the feasibility analysis of increasing the thermal energy efficiency of the plant was suggested, which could also be applied to other food industry plants. The demand for heat energy within the plant was evaluated for each technological process; the fuel and energy balance of the plant boiler-house was drawn up. It was revealed that large heat energy losses occur during heat energy production and usage. During the research period a control system for significant environmental aspects was suggested and its objective function was estimated. Several environmental alternatives were suggested for the optimization of the heat energy production processes. Three projects were chosen for the feasibility analysis. Results of the technical, economic and environmental evaluations of the Cleaner Production (CP) innovations, as well as the conclusions made, are presented in this article.

  14. Electrostatic Model Applied to ISS Charged Water Droplet Experiment

    Science.gov (United States)

    Stevenson, Daan; Schaub, Hanspeter; Pettit, Donald R.

    2015-01-01

    The electrostatic force can be used to create novel relative motion between charged bodies if it can be isolated from the stronger gravitational and dissipative forces. Recently, Coulomb orbital motion was demonstrated on the International Space Station by releasing charged water droplets in the vicinity of a charged knitting needle. In this investigation, the Multi-Sphere Method, an electrostatic model developed to study active spacecraft position control by Coulomb charging, is used to simulate the complex orbital motion of the droplets. When atmospheric drag is introduced, the simulated motion closely mimics that seen in the video footage of the experiment. The electrostatic force's inverse dependency on separation distance near the center of the needle lends itself to analytic predictions of the radial motion.
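
    To give a flavour of the Multi-Sphere idea, the sketch below sums pairwise Coulomb forces on a droplet from a short row of point charges standing in for the charged needle. All charges, positions and dimensions are invented, and the published method additionally solves for the charge distribution induced on the spheres, which is omitted here.

```python
import numpy as np

K = 8.9875517923e9   # Coulomb constant, N*m^2/C^2

def coulomb_force(q_droplet, r_droplet, sphere_charges, sphere_positions):
    """Net electrostatic force on the droplet from a set of point charges standing in
    for the charged needle (illustrative only; no induced-charge calculation)."""
    force = np.zeros(3)
    for q, r in zip(sphere_charges, sphere_positions):
        d = r_droplet - r
        force += K * q_droplet * q * d / np.linalg.norm(d) ** 3
    return force

# Hypothetical configuration: a 20 cm "needle" represented by 5 equal charges on the
# x-axis, and a droplet 5 cm above its centre.
needle_positions = [np.array([x, 0.0, 0.0]) for x in np.linspace(-0.10, 0.10, 5)]
needle_charges = [2e-9] * 5                 # 2 nC per sphere
droplet = np.array([0.0, 0.05, 0.0])        # 5 cm from the needle axis
print(coulomb_force(-1e-10, droplet, needle_charges, needle_positions))   # force in N
```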

  15. Virtual building environments (VBE) - Applying information modeling to buildings

    Energy Technology Data Exchange (ETDEWEB)

    Bazjanac, Vladimir

    2004-06-21

    A Virtual Building Environment (VBE) is a "place" where building industry project staffs can get help in creating Building Information Models (BIM) and in the use of virtual buildings. It consists of a group of industry software that is operated by industry experts who are also experts in the use of that software. The purpose of a VBE is to facilitate expert use of appropriate software applications in conjunction with each other to efficiently support multidisciplinary work. This paper defines BIM and virtual buildings, and describes VBE objectives, set-up and characteristics of operation. It informs about the VBE Initiative and the benefits from a couple of early VBE projects.

  16. Applying a Hybrid MCDM Model for Six Sigma Project Selection

    Directory of Open Access Journals (Sweden)

    Fu-Kwun Wang

    2014-01-01

    Full Text Available Six Sigma is a project-driven methodology; the projects that provide the maximum financial benefits and other impacts to the organization must be prioritized. Project selection (PS) is a type of multiple criteria decision making (MCDM) problem. In this study, we present a hybrid MCDM model combining the decision-making trial and evaluation laboratory (DEMATEL) technique, the analytic network process (ANP), and the VIKOR method to evaluate and improve Six Sigma projects, reducing performance gaps in each criterion and dimension. We consider the film printing industry of Taiwan as an empirical case. The results show that our approach not only selects the best project, but can also be used to analyze the gaps between existing performance values and aspiration levels, improving the gaps in each dimension and criterion based on the influential network relation map.
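
    Of the three combined techniques, VIKOR supplies the final compromise ranking. A bare-bones, crisp version of that step is sketched below with invented scores and weights; in the paper the criteria weights and influence structure come from DEMATEL and ANP rather than being fixed by hand.

```python
import numpy as np

def vikor(F, weights, v=0.5):
    """VIKOR ranking index Q for a decision matrix F (rows = projects, columns =
    benefit criteria). Lower Q indicates a better compromise solution. Only the
    core VIKOR step is shown; deriving the weights (via DEMATEL/ANP) is omitted."""
    f_best, f_worst = F.max(axis=0), F.min(axis=0)
    norm = (f_best - F) / (f_best - f_worst)
    S = (weights * norm).sum(axis=1)          # group utility
    R = (weights * norm).max(axis=1)          # individual regret
    Q = (v * (S - S.min()) / (S.max() - S.min())
         + (1 - v) * (R - R.min()) / (R.max() - R.min()))
    return Q

# Hypothetical scores of three Six Sigma project candidates on three benefit criteria.
F = np.array([[7.0, 5.0, 8.0],
              [6.0, 9.0, 6.0],
              [9.0, 6.0, 5.0]])
weights = np.array([0.5, 0.3, 0.2])
print(vikor(F, weights))   # the smallest value marks the preferred project
```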

  17. Applying direct observation to model workflow and assess adoption.

    Science.gov (United States)

    Unertl, Kim M; Weinger, Matthew B; Johnson, Kevin B

    2006-01-01

    Lack of understanding about workflow can impair health IT system adoption. Observational techniques can provide valuable information about clinical workflow. A pilot study using direct observation was conducted in an outpatient chronic disease clinic. The goals of the study were to assess workflow and information flow and to develop a general model of workflow and information behavior. Over 55 hours of direct observation showed that the pilot site utilized many of the features of the informatics systems available to them, but also employed multiple non-electronic artifacts and workarounds. Gaps existed between clinic workflow and informatics tool workflow, as well as between institutional expectations of informatics tool use and actual use. Concurrent use of both paper-based and electronic systems resulted in duplication of effort and inefficiencies. A relatively short period of direct observation revealed important information about workflow and informatics tool adoption.

  18. "Let's Move" campaign: applying the extended parallel process model.

    Science.gov (United States)

    Batchelder, Alicia; Matusitz, Jonathan

    2014-01-01

    This article examines Michelle Obama's health campaign, "Let's Move," through the lens of the extended parallel process model (EPPM). "Let's Move" aims to reduce the childhood obesity epidemic in the United States. Developed by Kim Witte, EPPM rests on the premise that people's attitudes can be changed when fear is exploited as a factor of persuasion. Fear appeals work best (a) when a person feels a concern about the issue or situation, and (b) when he or she believes to have the capability of dealing with that issue or situation. Overall, the analysis found that "Let's Move" is based on past health campaigns that have been successful. An important element of the campaign is the use of fear appeals (as it is postulated by EPPM). For example, part of the campaign's strategies is to explain the severity of the diseases associated with obesity. By looking at the steps of EPPM, readers can also understand the strengths and weaknesses of "Let's Move."

  19. Sensorless position estimator applied to nonlinear IPMC model

    Science.gov (United States)

    Bernat, Jakub; Kolota, Jakub

    2016-11-01

    This paper addresses the issue of estimating position for an ionic polymer metal composite (IPMC), a type of electroactive polymer (EAP). The key step is the construction of a sensorless mode considering only current feedback. This work takes into account nonlinearities caused by electrochemical effects in the material. Owing to a recent observer design technique, the authors obtained both a Lyapunov-function-based estimation law and a sliding mode observer. To accomplish the observer design, the IPMC model was identified through a series of experiments. The research comprises time domain measurements. The identification process was completed by means of geometric scaling of three test samples. In the proposed design, the estimated position accurately tracks the polymer position, which is illustrated by the experiments.

  20. Nonspherical Radiation Driven Wind Models Applied to Be Stars

    Science.gov (United States)

    Arauxo, F. X.

    1990-11-01

    ABSTRACT. In this work we present a model for the structure of a radiatively driven wind in the meridional plane of a hot star. Rotation effects and a simulation of viscous forces were included in the equations of motion. The line radiation force is considered with the inclusion of the finite-disk correction in self-consistent computations which also contain gravity darkening as well as distortion of the star by rotation. An application to a typical B1V star leads to mass-flux ratios between equator and pole of the order of 10 and mass loss rates in the range 5.10 to 10 Mo/yr. Our envelope models are flattened towards the equator and the wind terminal velocities in that region are rather high (1000 km/s). However, in the region near the star the equatorial velocity field is dominated by rotation. RESUMEN. A model is presented of the structure of a radiatively driven wind in the meridional plane of a hot star. Rotation effects and a simulation of viscous forces were included in the equations of motion. The line radiation force was considered, including the finite-disk correction, in self-consistent computations which include gravity darkening as well as distortion of the star by rotation. The application to a typical B1V star leads to mass-flux ratios between the equator and the pole of the order of 10 and mass loss rates in the range 5.10 to 10 Mo/yr. Our envelope models are flattened towards the equator and the wind terminal velocities in that region are rather high (1000 km/s). However, in the region near the star the equatorial velocity field is dominated by rotation. Key words: STARS-BE -- STARS-WINDS

  1. A physiological production model for cacao : results of model simulations

    NARCIS (Netherlands)

    Zuidema, P.A.; Leffelaar, P.A.

    2002-01-01

    CASE2 is a physiological model for cocoa (Theobroma cacao L.) growth and yield. This report introduces the CAcao Simulation Engine for water-limited production in a non-technical way and presents simulation results obtained with the model.

  2. Costs Models in Design and Manufacturing of Sand Casting Products

    CERN Document Server

    Perry, Nicolas; Bernard, Alain

    2010-01-01

    In the early phases of the product life cycle, cost control has become a major decision tool in the competitiveness of companies, due to global competition. After defining the problems related to these control difficulties, we will present an approach using a concept of cost entity related to the design and realization activities of the product. We will try to apply this approach to the field of the sand casting foundry. This work will highlight the enterprise modelling difficulties (the limits of a global cost modelling) and some specific limitations of the tool used for this development. Finally we will discuss the limits of a generic approach.

  3. Quantitative modeling of Cerenkov light production efficiency from medical radionuclides.

    Science.gov (United States)

    Beattie, Bradley J; Thorek, Daniel L J; Schmidtlein, Charles R; Pentlow, Keith S; Humm, John L; Hielscher, Andreas H

    2012-01-01

    There has been recent and growing interest in applying Cerenkov radiation (CR) for biological applications. Knowledge of the production efficiency and other characteristics of the CR produced by various radionuclides would help in assessing the feasibility of proposed applications and guide the choice of radionuclides. To generate this information we developed models of CR production efficiency based on the Frank-Tamm equation and models of CR distribution based on Monte-Carlo simulations of photon and β particle transport. All models were validated against direct measurements using multiple radionuclides and then applied to a number of radionuclides commonly used in biomedical applications. We show that two radionuclides, Ac-225 and In-111, which have been reported to produce CR in water, do not in fact produce CR directly. We also propose a simple means of using this information to calibrate high sensitivity luminescence imaging systems and show evidence suggesting that this calibration may be more accurate than methods in routine current use.
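
    The Frank-Tamm relation underlying the production-efficiency model can be integrated over a wavelength band to give the photon yield per unit path length for a given particle speed. The sketch below evaluates that textbook integral with assumed values of speed, refractive index and band; it is only the standard formula, not the authors' full efficiency model, which also folds in the β-particle energy spectrum and transport.

```python
import math

ALPHA = 1 / 137.035999   # fine-structure constant

def cerenkov_photons_per_m(beta, n, lam1_nm=400.0, lam2_nm=700.0):
    """Frank-Tamm photon yield per metre of path between two wavelengths for a
    particle with speed beta = v/c in a medium of refractive index n.
    Returns 0 below the Cerenkov threshold (beta * n <= 1)."""
    if beta * n <= 1.0:
        return 0.0
    lam1, lam2 = lam1_nm * 1e-9, lam2_nm * 1e-9
    return 2 * math.pi * ALPHA * (1 / lam1 - 1 / lam2) * (1 - 1 / (beta ** 2 * n ** 2))

# Illustrative numbers only: a fast electron (beta = 0.9) in water (n = 1.33) gives
# on the order of a hundred optical photons per centimetre in the 400-700 nm band.
print(cerenkov_photons_per_m(beta=0.9, n=1.33) / 100, "photons per cm")
```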

  4. Production management using the EFQM Excellence Model

    Directory of Open Access Journals (Sweden)

    Janja Škedel

    2016-09-01

    Full Text Available Research Question (RQ): Comparison of production management using the EFQM Excellence Model. Purpose: The aim of the research is, based on a comparison of production management using the EFQM Excellence Model, to establish identities and differences. The aim is to improve the management of production and, by using the model, come closer to excellence. Method: An identification method of benchmarking. Results: The results show tolerances, which represent an opportunity to improve production management towards excellence. Organization: Taking the results into account would be an asset to the organization. Society: The comparison method can also be used in the wider environment. Originality: The survey is unique and the first of its kind in the manufacturing organization. Limitations/Future Research: With this research we will gain improvements in production management through design excellence.

  5. Applying dispersive changes to Lagrangian particles in groundwater transport models

    Science.gov (United States)

    Konikow, Leonard F.

    2010-01-01

    Method-of-characteristics groundwater transport models require that changes in concentrations computed within an Eulerian framework to account for dispersion be transferred to moving particles used to simulate advective transport. A new algorithm was developed to accomplish this transfer between nodal values and advecting particles more precisely and realistically compared to currently used methods. The new method scales the changes and adjustments of particle concentrations relative to limiting bounds of concentration values determined from the population of adjacent nodal values. The method precludes unrealistic undershoot or overshoot for concentrations of individual particles. In the new method, if dispersion causes cell concentrations to decrease during a time step, those particles in the cell having the highest concentration will decrease the most, and those with the lowest concentration will decrease the least. The converse is true if dispersion is causing concentrations to increase. Furthermore, if the initial concentration on a particle is outside the range of the adjacent nodal values, it will automatically be adjusted in the direction of the acceptable range of values. The new method is inherently mass conservative.
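
    One possible reading of the scaling rule summarized above is sketched below: a cell-averaged dispersive change is distributed over the particles in proportion to their distance from the limiting bound, so that a decrease affects the highest-concentration particles most, and results are clamped to the bounds taken from adjacent nodal values. The function, the factor used to approximate the cell mean, and the numbers are illustrative guesses, not the published algorithm.

```python
def adjust_particles(particle_concs, delta_cell, c_min, c_max):
    """Distribute a cell-averaged dispersive change `delta_cell` over particle
    concentrations, weighting each particle by its distance from the limiting bound
    and clamping to [c_min, c_max] (an illustrative reading, not the published code)."""
    span = c_max - c_min
    adjusted = []
    for c in particle_concs:
        c = min(max(c, c_min), c_max)      # particles outside the range move toward it
        if span == 0:
            weight = 0.0
        elif delta_cell < 0:
            weight = (c - c_min) / span    # highest concentrations decrease the most
        else:
            weight = (c_max - c) / span    # lowest concentrations increase the most
        # The factor 2 roughly preserves the cell-mean change when the particle
        # concentrations are spread evenly across the admissible range.
        new_c = c + 2.0 * delta_cell * weight
        adjusted.append(min(max(new_c, c_min), c_max))
    return adjusted

# Toy example: dispersion lowers the cell-average concentration by 0.5 mg/L.
print(adjust_particles([1.0, 3.0, 5.0], delta_cell=-0.5, c_min=1.0, c_max=5.0))
# -> [1.0, 2.5, 4.0]
```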

  6. A spectrophotometric model applied to cluster galaxies: the WINGS dataset

    CERN Document Server

    Fritz, J; Bettoni, D; Cava, A; Couch, W J; D'Onofrio, M; Dressler, A; Fasano, G; Kjaergaard, P; Moles, M; Varela, J

    2007-01-01

    [Abridged] The WIde-field Nearby Galaxy-cluster Survey (WINGS) is a project aiming at the study of the galaxy populations in clusters in the local universe (0.04 < z < 0.07). [...] A distinctive feature of the model is the possibility of treating dust extinction as a function of age, allowing younger stars to be more obscured than older ones. Our technique, for the first time, takes into account this feature in a spectral fitting code. A set of template spectra spanning a wide range of star formation histories is built, with features closely resembling those of typical spectra in our sample in terms of spectral resolution, noise and wavelength coverage. Our method of analyzing these spectra allows us to test the reliability and the uncertainties related to each physical parameter we are inferring. The well-known degeneracy problem, i.e. the non-uniqu...

  7. Atomistic Method Applied to Computational Modeling of Surface Alloys

    Science.gov (United States)

    Bozzolo, Guillermo H.; Abel, Phillip B.

    2000-01-01

    The formation of surface alloys is a growing research field that, in terms of the surface structure of multicomponent systems, defines the frontier both for experimental and theoretical techniques. Because of the impact that the formation of surface alloys has on surface properties, researchers need reliable methods to predict new surface alloys and to help interpret unknown structures. The structure of surface alloys and when, and even if, they form are largely unpredictable from the known properties of the participating elements. No unified theory or model to date can infer surface alloy structures from the constituents properties or their bulk alloy characteristics. In spite of these severe limitations, a growing catalogue of such systems has been developed during the last decade, and only recently are global theories being advanced to fully understand the phenomenon. None of the methods used in other areas of surface science can properly model even the already known cases. Aware of these limitations, the Computational Materials Group at the NASA Glenn Research Center at Lewis Field has developed a useful, computationally economical, and physically sound methodology to enable the systematic study of surface alloy formation in metals. This tool has been tested successfully on several known systems for which hard experimental evidence exists and has been used to predict ternary surface alloy formation (results to be published: Garces, J.E.; Bozzolo, G.; and Mosca, H.: Atomistic Modeling of Pd/Cu(100) Surface Alloy Formation. Surf. Sci., 2000 (in press); Mosca, H.; Garces J.E.; and Bozzolo, G.: Surface Ternary Alloys of (Cu,Au)/Ni(110). (Accepted for publication in Surf. Sci., 2000.); and Garces, J.E.; Bozzolo, G.; Mosca, H.; and Abel, P.: A New Approach for Atomistic Modeling of Pd/Cu(110) Surface Alloy Formation. (Submitted to Appl. Surf. Sci.)). Ternary alloy formation is a field yet to be fully explored experimentally. The computational tool, which is based on

  8. Development of simple-to-apply biogas kinetic models for the co-digestion of food waste and maize husk.

    Science.gov (United States)

    Owamah, H I; Izinyon, O C

    2015-10-01

    Biogas kinetic models are often used to characterize substrate degradation and prediction of biogas production potential. Most of these existing models are however difficult to apply to substrates they were not developed for since their applications are usually substrate specific. Biodegradability kinetic (BIK) model and maximum biogas production potential and stability assessment (MBPPSA) model were therefore developed in this study for better understanding of the anaerobic co-digestion of food waste and maize husk for biogas production. Biodegradability constant (k) was estimated as 0.11 d(-1) using the BIK model. The results of maximum biogas production potential (A) obtained using the MBPPSA model were found to be in good correspondence, both in value and trend with the results obtained using the popular but complex modified Gompertz model for digesters B-1, B-2, B-3, B-4, and B-5. The (If) value of MBPPSA model also showed that digesters B-3, B-4, and B-5 were stable, while B-1 and B-2 were inhibited/unstable. Similar stability observation was also obtained using the modified Gompertz model. The MBPPSA model can therefore be used as an alternative model for anaerobic digestion feasibility studies and plant design.
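
    For reference, the modified Gompertz curve used as the benchmark in the comparison has a simple closed form; the sketch below evaluates it for invented parameter values (maximum potential A, peak rate Rmax and lag phase lambda), which are not taken from the study.

```python
import math

def modified_gompertz(t, A, Rmax, lam):
    """Cumulative biogas production B(t) from the modified Gompertz model:
    A = maximum biogas potential (mL), Rmax = maximum production rate (mL/day),
    lam = lag phase (days)."""
    return A * math.exp(-math.exp(Rmax * math.e / A * (lam - t) + 1))

# Hypothetical parameter values for a single digester (not taken from the paper).
A, Rmax, lam = 520.0, 35.0, 2.5
for day in (0, 5, 10, 20, 30):
    print(day, round(modified_gompertz(day, A, Rmax, lam), 1))
```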

  9. Production economic models of fisheries

    DEFF Research Database (Denmark)

    Andersen, Jesper Levring

    vessels is performed. The results indicate that these vessels seek to insure themselves against uncertainty by allowing for the highest flexibility in crew payments, followed by fuel costs, sales costs and costs for ice/provisions respectively. Accounting for changes in fishermen’s behaviour...... in the modelling framework presented here. The starting point for Paper 5 is the fact that management changes will most likely result in different behaviour by the fishermen. It is necessary to account for these changes, when evaluating the expected gains to be derived. Based on the Data Envelopment Analysis...

  10. Recent insights into the cell immobilization technology applied for dark fermentative hydrogen production.

    Science.gov (United States)

    Kumar, Gopalakrishnan; Mudhoo, Ackmez; Sivagurunathan, Periyasamy; Nagarajan, Dillirani; Ghimire, Anish; Lay, Chyi-How; Lin, Chiu-Yue; Lee, Duu-Jong; Chang, Jo-Shu

    2016-11-01

    The contribution and insights of the immobilization technology in the recent years with regards to the generation of (bio)hydrogen via dark fermentation have been reviewed. The types of immobilization practices, such as entrapment, encapsulation and adsorption, are discussed. Materials and carriers used for cell immobilization are also comprehensively surveyed. New development of nano-based immobilization and nano-materials has been highlighted pertaining to the specific subject of this review. The microorganisms and the type of carbon sources applied in the dark hydrogen fermentation are also discussed and summarized. In addition, the essential components of process operation and reactor configuration using immobilized microbial cultures in the design of varieties of bioreactors (such as fixed bed reactor, CSTR and UASB) are spotlighted. Finally, suggestions and future directions of this field are provided to assist the development of efficient, economical and sustainable hydrogen production technologies.

  11. Hybrid simulation models of production networks

    CERN Document Server

    Kouikoglou, Vassilis S

    2001-01-01

    This book is concerned with a most important area of industrial production, that of analysis and optimization of production lines and networks using discrete-event models and simulation. The book introduces a novel approach that combines analytic models and discrete-event simulation. Unlike conventional piece-by-piece simulation, this method observes a reduced number of events between which the evolution of the system is tracked analytically. Using this hybrid approach, several models are developed for the analysis of production lines and networks. The hybrid approach combines speed and accuracy for exceptional analysis of most practical situations. A number of optimization problems, involving buffer design, workforce planning, and production control, are solved through the use of hybrid models.

  12. Modelling Configuration Knowledge in Heterogeneous Product Families

    DEFF Research Database (Denmark)

    Queva, Matthieu Stéphane Benoit; Männistö, Tomi; Ricci, Laurent

    2011-01-01

    Product configuration systems play an important role in the development of Mass Customisation. The configuration of complex product families may nowadays involve multiple design disciplines, e.g. hardware, software and services. In this paper, we present a conceptual approach for modelling the va...

  13. Modeling Fission Product Sorption in Graphite Structures

    Energy Technology Data Exchange (ETDEWEB)

    Szlufarska, Izabela [University of Wisconsin, Madison, WI (United States); Morgan, Dane [University of Wisconsin, Madison, WI (United States); Allen, Todd [University of Wisconsin, Madison, WI (United States)

    2013-04-08

    The goal of this project is to determine changes in adsorption and desorption of fission products to/from nuclear-grade graphite in response to a changing chemical environment. First, the project team will employ first-principles calculations and thermodynamic analysis to predict the stability of fission products on graphite in the presence of structural defects commonly observed in very high-temperature reactor (VHTR) graphites. Desorption rates will be determined as a function of partial pressure of oxygen and iodine, relative humidity, and temperature. They will then carry out experimental characterization to determine the statistical distribution of structural features. This structural information will yield distributions of binding sites to be used as an input for a sorption model. Sorption isotherms calculated under this project will contribute to understanding of the physical bases of the source terms that are used in higher-level codes that model fission product transport and retention in graphite. The project will include the following tasks: Perform structural characterization of the VHTR graphite to determine crystallographic phases, defect structures and their distribution, volume fraction of coke, and amount of sp2 versus sp3 bonding. This information will be used as guidance for ab initio modeling and as input for sorptivity models; Perform ab initio calculations of binding energies to determine the stability of fission products on the different sorption sites present in nuclear graphite microstructures. The project will use density functional theory (DFT) methods to calculate binding energies in vacuum and in oxidizing environments. The team will also calculate the stability of iodine complexes with fission products on graphite sorption sites; Model graphite sorption isotherms to quantify the concentration of fission products in graphite. The binding energies will be combined with a Langmuir isotherm statistical model to predict the sorbed concentration of fission
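
    The last task described above, combining binding energies with a Langmuir isotherm, can be illustrated with a single-site sketch in which the equilibrium constant is given an Arrhenius-like dependence on the binding energy. The pre-factor, temperature, pressure and energies below are placeholders; a real model would derive the pre-factor from partition functions and use the computed distribution of site types.

```python
import math

KB = 8.617333262e-5   # Boltzmann constant, eV/K

def langmuir_coverage(partial_pressure, binding_energy_eV, temperature_K, K0=1.0):
    """Single-site Langmuir isotherm: fractional occupation of sorption sites.
    The equilibrium constant gets an Arrhenius-like dependence on the binding energy;
    K0 (in 1/pressure units) is a placeholder pre-factor."""
    K = K0 * math.exp(binding_energy_eV / (KB * temperature_K))
    return K * partial_pressure / (1.0 + K * partial_pressure)

# Illustrative numbers only: a strongly bound versus a weakly bound fission product
# at the same (very low) partial pressure and an elevated temperature.
p = 1e-12
for E_bind in (2.5, 1.0):
    print(E_bind, "eV ->", langmuir_coverage(p, E_bind, temperature_K=1200.0))
```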

  14. Model based sustainable production of biomethane

    OpenAIRE

    Biernacki, Piotr

    2014-01-01

    The main intention of this dissertation was to evaluate sustainable production of biomethane with use of mathematical modelling. To achieve this goal, widely acknowledged models like Anaerobic Digestion Model No.1 (ADM1), describing anaerobic digestion, and electrolyte Non-Random Two Liquid Model (eNRTL), for gas purification, were utilized. The experimental results, batch anaerobic digestion of different substrates and carbon dioxide solubility in 2-(Ethylamino)ethanol, were used to determin...

  15. Error-free pathology: applying lean production methods to anatomic pathology.

    Science.gov (United States)

    Condel, Jennifer L; Sharbaugh, David T; Raab, Stephen S

    2004-12-01

    The current state of our health care system calls for dramatic changes. In their pathology department, the authors believe these changes may be accomplished by accepting the long-term commitment of applying a lean production system. The ideal state of zero pathology errors is one that should be pursued by consistently asking, "Why can't we?" The philosophy of lean production systems began in the manufacturing industry: "All we are doing is looking at the time from the moment the customer gives us an order to the point when we collect the cash. And we are reducing that time line by removing non-value added wastes". The ultimate goals in pathology and overall health care are not so different. The authors' intention is to provide the patient (customer) with the most accurate diagnostic information in a timely and efficient manner. Their lead histotechnologist recently summarized this philosophy: she indicated that she felt she could sleep better at night knowing she truly did the best job she could. Her chances of making an error (in cutting or labeling) were dramatically decreased in the one-by-one continuous flow work process compared with previous practices. By designing a system that enables employees to be successful in meeting customer demand, and by empowering the frontline staff in the development and problem solving processes, one can meet the challenges of eliminating waste and build an improved, efficient system.

  16. [Aquatic ecosystem modelling approach: temperature and water quality models applied to Oualidia and Nador lagoons].

    Science.gov (United States)

    Idrissi, J Lakhdar; Orbi, A; Hilmi, K; Zidane, F; Moncef, M

    2005-07-01

    The objective of this work is to develop an aquatic ecosystem model and apply it to Moroccan lagoon systems. This model will keep us abreast of the yearly development of the main parameters that characterize these ecosystems while integrating all the data that have so far been acquired. Within this framework, a simulation model of the thermal system and a model of the water quality have been elaborated. These models, which have been simulated on the lagoon of Oualidia (north of Morocco) and validated on the lagoon of Nador (north-west Mediterranean), make it possible to predict the surface temperature cycles and the water quality parameters (dissolved oxygen and phytoplankton biomass) by using meteorological information, specific features, and in situ measurements in the studied sites. The elaborated model, called Zero-Dimensional, simulates the average behaviour of the site over time for state variables that are representative of the studied ecosystem. This model provides answers to the studied phenomena and is a suitable working tool thanks to its numerical simplicity.

  17. Dynamical Model of Weak Pion Production Reactions

    CERN Document Server

    Sato, T; Lee, T S H

    2003-01-01

    The dynamical model of pion electroproduction has been extended to investigate weak pion production reactions. The predicted cross sections of neutrino-induced pion production reactions are in good agreement with the existing data. We show that the renormalized (dressed) axial N-$\Delta$ form factor contains large dynamical pion cloud effects, and these renormalization effects are crucial in obtaining agreement with the data. We conclude that the N-$\Delta$ transitions predicted by the constituent quark model are consistent with the existing neutrino-induced pion production data in the $\Delta$ region.

  18. Applied Railway Optimization in Production Planning at DSB-S-tog - Tasks, Tools and Challenges

    DEFF Research Database (Denmark)

    Clausen, Jens

    2007-01-01

    Efficient public transportation is becoming increasingly vital for modern capitals. DSB S-tog a/s is the major supplier of rail traffic on the infrastructure of the city-rail network in Copenhagen. S-tog has experienced a demand for increasing volume and quality of the transportation offered... In addition we describe on-going efforts in using mathematical models in activities such as timetable design and work-force planning. We also identify some organizational key factors which have paved the way for extended use of optimization methods in railway production planning...

  19. Product modelling in the seafood industry

    DEFF Research Database (Denmark)

    Jonsdottir, Stella; Vesterager, Johan

    1997-01-01

    The paper addresses the aspects of Concurrent Engineering (CE) as a means to obtain integrated product development in the seafood industry. It is assumed that the future New Product Development (NPD) in seafood industry companies will shift from being retailer driven and reactive to be more company... assessments, speed up the process and ensure a constant renewal of the seafood products. The objective, therefore, is to estimate the suitability of the CE, and especially CE through product modelling, in the seafood industry as a means to obtain an integration of the entire chain, i.e., a business and market based integration obtained by the CE approach and tools. It is described how the knowledge and information of a seafood product can be modelled by using object oriented techniques.

  20. Model Proposition for the Fiscal Policies Analysis Applied in Economic Field

    Directory of Open Access Journals (Sweden)

    Larisa Preda

    2007-05-01

    Full Text Available This paper presents a study of fiscal policy applied in economic development. Correlations between macroeconomic and fiscal indicators constitute the first step in our analysis. The next step is a new model proposal for fiscal and budgetary choices. This model is applied to the data of the Romanian case.

  1. Interpretation models and charts of production profiles in horizontal wells

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    Stratified flow is common because of gravity segregation, and flow regimes are very complex because of borehole inclination; therefore, the conventional production logging tools cannot be effectively applied in horizontal wells, which significantly increases the difficulties in log interpretation. In this paper, firstly, the overseas progress in updated integration tools for horizontal wells and production profile interpretation methods is discussed in brief. Secondly, by means of theoretical study and experimental simulations, we have obtained a production profile interpretation model and experimental interpretation charts, which have been calibrated by the improved downhole technology and optimization methods. Finally, we have interpreted X-well with the production profile interpretation software designed by us, and it proves that the methods are useful for production profile interpretation in horizontal wells.

  2. Modelling Fungal Fermentations for Enzyme Production

    DEFF Research Database (Denmark)

    Albæk, Mads Orla; Gernaey, Krist; Hansen, Morten S.

    We have developed a process model of fungal fed-batch fermentations for enzyme production. In these processes, oxygen transfer rate is limiting and controls the substrate feeding rate. The model has been shown to describe cultivations of both Aspergillus oryzae and Trichoderma reesei strains in 550...

  3. Applying a Hybrid QFD-TOPSIS Method to Design Product in the Industry (Case Study in Sum Service Company)

    Directory of Open Access Journals (Sweden)

    Babak Haji Karimi

    2012-09-01

    Full Text Available The electronics industry, as an industry with high added value, and the television production industry in particular, as one of its pillars, play an important role in the economy of every country. Therefore, the aim of this paper is to illustrate how, using a combined QFD-TOPSIS model, organizations are able to design their products in accordance with consumer requirements, with a case study in the Sum Service Company. Quality Function Deployment (QFD) is one such extremely important quality management tool that is useful in product design and development. Traditionally, QFD rates the Design Requirements (DRs) with respect to customer needs and aggregates the ratings to obtain the relative importance scores of the DRs. An increasing number of studies emphasize the need to incorporate additional factors, such as cost and environmental impact, while calculating the relative importance of DRs. However, there is a variety of methodologies for deriving the relative importance of DRs when several additional factors are considered. TOPSIS (technique for order preference by similarity to ideal solution) is suggested for the purpose of this research. This research proposes a new TOPSIS approach for considering the rating of DRs with respect to CRs and several additional factors simultaneously. The proposed method is illustrated by a step-by-step procedure. The proposed fuzzy QFD-TOPSIS methodology was applied to the Sum Service Company in Iran.
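
    A brief illustration of the TOPSIS ranking step described above may help. The sketch below assumes a generic decision matrix, weights, and benefit/cost flags (all names and numbers are illustrative, not taken from the paper).

```python
import numpy as np

def topsis(decision_matrix, weights, benefit):
    """Rank alternatives with TOPSIS.

    decision_matrix: (alternatives x criteria) scores
    weights: criterion weights summing to 1
    benefit: True for benefit criteria, False for cost criteria
    """
    X = np.asarray(decision_matrix, dtype=float)
    w = np.asarray(weights, dtype=float)
    benefit = np.asarray(benefit, dtype=bool)

    # Vector-normalise each criterion column, then apply the weights.
    V = w * X / np.linalg.norm(X, axis=0)

    # Ideal and anti-ideal points depend on whether a criterion is benefit or cost.
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))

    d_plus = np.linalg.norm(V - ideal, axis=1)
    d_minus = np.linalg.norm(V - anti, axis=1)

    # Relative closeness to the ideal solution; higher is better.
    return d_minus / (d_plus + d_minus)

# Hypothetical example: 3 design requirements scored on 2 criteria.
scores = topsis([[7, 3], [9, 5], [6, 2]], weights=[0.6, 0.4], benefit=[True, False])
print(scores.argsort()[::-1])  # indices ranked from best to worst
```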

  4. Applying a Markov approach as a Lean Thinking analysis of waste elimination in a Rice Production Process

    Directory of Open Access Journals (Sweden)

    Eldon Glen Caldwell Marin

    2015-01-01

    Full Text Available The Markov Chains Model was proposed to analyze stochastic events when recursive cycles occur; for example, when rework in a continuous flow production affects the overall performance. Typically, the analysis of rework and scrap is done from a wasted material cost perspective and not from the perspective of wasted capacity that reduces throughput and economic value added (EVA). Also, we cannot find many cases of this application in agro-industrial production in Latin America, given the complexity of the calculations and the need for robust applications. This scientific work presents the results of a quasi-experimental research approach in order to explain how to apply DOE methods and Markov analysis in a rice production process located in Central America, evaluating the global effects of a single reduction in rework and scrap in one part of the whole line. The results show that in this case it is possible to evaluate benefits from a global throughput and EVA perspective and not only from a cost saving perspective, finding a relationship between operational indicators and corporate performance. However, it was found that it is necessary to analyze the Markov chain configuration with many rework points, and it is still relevant to take into account the effects on takt time and not only scrap costs.
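
    To make the rework analysis concrete, here is a minimal sketch of an absorbing Markov chain for a single processing step with rework and scrap; the transition probabilities are invented for illustration and do not come from the rice-mill study.

```python
import numpy as np

# States: 0 = processing, 1 = rework, 2 = finished (absorbing), 3 = scrap (absorbing).
# Hypothetical transition probabilities for one processing step.
P = np.array([
    [0.00, 0.15, 0.80, 0.05],   # from processing
    [0.70, 0.00, 0.25, 0.05],   # from rework (sent back to processing, finished, or scrapped)
    [0.00, 0.00, 1.00, 0.00],   # finished
    [0.00, 0.00, 0.00, 1.00],   # scrap
])

Q = P[:2, :2]                     # transitions among transient states
R = P[:2, 2:]                     # transitions into absorbing states
N = np.linalg.inv(np.eye(2) - Q)  # fundamental matrix: expected visits to each transient state
B = N @ R                         # absorption probabilities

print("Expected passes through processing/rework:", N[0])
print("P(finished), P(scrap) for a unit entering the line:", B[0])
```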

  5. Model-based optimization of production systems

    OpenAIRE

    2015-01-01

    Gas lift is one of the artificial lift techniques used in the oil and gas industry. The method is mostly applied in oil wells to improve oil recovery by lowering the bottomhole pressure. Normally there are multiple gas-lifted wells in a field, each requiring a certain amount of gas to be injected to achieve maximum oil production. Generally the amount of gas available is limited; it therefore has to be allocated per well in the best way possible to achieve maximum oil production in...

  6. Behavior and Design Intent Based Product Modeling

    Directory of Open Access Journals (Sweden)

    László Horváth

    2004-11-01

    Full Text Available A knowledge based modelling of mechanical products is presented for industrial CAD/CAM systems. An active model is proposed that comprises knowledge from modeling procedures, generic part models and engineers. Present day models of mechanical systems do not contain data about the background of human decisions. This situation motivated the authors' investigations into exchanging design intent information between engineers. Their concept was to extend product models so that they are capable of describing design intent information. Several human-computer and human-human communication issues were considered. The complex communication problem has been divided into four sub-problems, namely communication of the human intent source with the computer system, representation of human intent, exchange of intent data between modeling procedures, and communication of the represented intent with humans. The paper discusses the scenario of intelligent modeling based engineering. Then key concepts for the application of computational intelligence in computer model based engineering systems are detailed, including knowledge driven models as well as areas of their application. Next, behavior based models with intelligent content involving specifications and knowledge for the design processes are emphasized, and an active part modeling approach is proposed and possibilities for its application are outlined. Finally, design intent supported intelligent modeling is discussed.

  7. Framework for product knowledge and product related knowledge which supports product modelling for mass customization

    DEFF Research Database (Denmark)

    Riis, Jesper; Hansen, Benjamin Loer; Hvam, Lars

    2003-01-01

    The article presents a framework for product knowledge and product related knowledge which can be used to support the product modelling process which is needed for developing IT systems. These IT systems are important tools for many companies when they aim at achieving mass customization and pers...... on experience from product modelling projects in several companies. Among them for example companies manufacturing electronic switchboards, spray dryer systems and air conditioning equipment. The framework is divided into three views: the product knowledge view, the life phase system view and the transformation process view (“the meeting”). The persons (roles) involved in the product modelling process are for example: domain experts, change managers, model managers, project leaders, technical facilitators, process managers and software programmers. They need a framework during the product modelling process...

  8. Applying consequential LCA to support energy policy: land use change effects of bioenergy production.

    Science.gov (United States)

    Vázquez-Rowe, Ian; Marvuglia, Antonino; Rege, Sameer; Benetto, Enrico

    2014-02-15

    Luxembourg aims at complying with the EU objective of attaining a 14% use of bioenergy in the national grid by 2020. The increase of biomethane production from energy crops could be a valuable option in achieving this objective. However, the overall environmental benefit of such option is yet to be proven. Consequential Life Cycle Assessment (CLCA) has shown to be a useful tool to evaluate the environmental suitability of future energy scenarios and policies. The objective of this study was, therefore, to evaluate the environmental consequences of modifying the Luxembourgish agricultural system to increase maize production for biomethane generation. A total of 10 different scenarios were modelled using a partial equilibrium (PE) model to identify changes in land cultivation based on farmers' revenue maximisation, which were then compared to the baseline scenario, i.e. the state of the agricultural sector in 2009. The results were divided into three different consequential decision contexts, presenting differing patterns in terms of land use changes (LUCs) but with minor shifts in environmental impacts. Nevertheless, energy from maize production would imply substantially higher environmental impacts when compared with the current use of natural gas, mainly due to increases in climate change and agricultural land occupation impacts. The results are discussed based on the consequences they may generate on the bioenergy policy, the management of arable land, the changes in import-export flows in Luxembourg and LUCs in the domestic agricultural system. In addition, the specific PE+LCA method presented intends to be of use for other regional studies in which a high level of site-specific data is available.

  9. Applying Catastrophe Theory to an Information-Processing Model of Problem Solving in Science Education

    Science.gov (United States)

    Stamovlasis, Dimitrios; Tsaparlis, Georgios

    2012-01-01

    In this study, we test an information-processing model (IPM) of problem solving in science education, namely the working memory overload model, by applying catastrophe theory. Changes in students' achievement were modeled as discontinuities within a cusp catastrophe model, where working memory capacity was implemented as asymmetry and the degree…

  10. The Levels of Conceptual Interoperability Model: Applying Systems Engineering Principles to M&S

    CERN Document Server

    WANG, Wenguang; WANG, Weiping

    2009-01-01

    This paper describes the use of the Levels of Conceptual Interoperability Model (LCIM) as a framework for conceptual modeling and its descriptive and prescriptive uses. LCIM is applied to show its potential and shortcomings in the current simulation interoperability approaches, in particular the High Level Architecture (HLA) and Base Object Models (BOM). It emphasizes the need to apply rigorous engineering methods and principles and replace ad-hoc approaches.

  11. 30 CFR 206.112 - What adjustments and transportation allowances apply when I value oil production from my lease...

    Science.gov (United States)

    2010-07-01

    ... (b) of this section, also adjust the NYMEX price or ANS spot price for quality based on premiums or... apply when I value oil production from my lease using NYMEX prices or ANS spot prices? 206.112 Section... I value oil production from my lease using NYMEX prices or ANS spot prices? This section...

  12. Conceptual modelling approach of mechanical products based on functional surface

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    A modelling framework based on functional surfaces is presented to support the conceptual design of mechanical products. The framework organizes product information in an abstract and multilevel manner. It consists of two mapping processes: a function decomposition process and a form reconstitution process. The steady mapping relationship from function to form (function-functional surface-form) is realized by taking the functional surface as the middle layer. This greatly reduces the possibility of the combinatorial explosion that can occur during function decomposition and form reconstitution. Finally, CAD tools are developed and an auto-bender machine is used to demonstrate the proposed approach.

  13. A chaotic agricultural machines production growth model

    OpenAIRE

    Jablanović, Vesna D.

    2011-01-01

    Chaos theory, as a set of ideas, explains the structure in aperiodic, unpredictable dynamic systems. The basic aim of this paper is to provide a relatively simple agricultural machines production growth model that is capable of generating stable equilibrium, cycles, or chaos. A key hypothesis of this work is based on the idea that the coefficient π = 1 + α plays a crucial role in explaining local stability of the agricultural machines production, where α is an autonomous growth rate of the ag...
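
    The abstract does not give the model's exact functional form, but stability, cycles, and chaos governed by a single coefficient are characteristic of logistic-type growth maps; the sketch below is therefore a hedged illustration of that behaviour, not the paper's own equation.

```python
def iterate_growth(pi_coeff, x0=0.2, burn_in=200, keep=8):
    """Iterate a logistic-type growth map x_{t+1} = pi * x_t * (1 - x_t).

    Illustrative only: the coefficient pi here plays the role the abstract
    assigns to pi = 1 + alpha, but the exact model in the paper may differ.
    """
    x = x0
    for _ in range(burn_in):
        x = pi_coeff * x * (1.0 - x)
    tail = []
    for _ in range(keep):
        x = pi_coeff * x * (1.0 - x)
        tail.append(round(x, 4))
    return tail

for pi_coeff in (2.8, 3.2, 3.9):   # stable point, 2-cycle, chaotic regime
    print(pi_coeff, iterate_growth(pi_coeff))
```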

  14. Agro-ecological aspects when applying the remaining products from agricultural biogas processes as fertilizer in crop production

    Energy Technology Data Exchange (ETDEWEB)

    Bermejo Dominguez, Gabriela

    2012-06-11

    With the increase of biogas production in recent years, the amount of digestates or the remaining residues increased accordingly. Every year in Germany more than 50 million tons of digestates are produced, which are used as fertilizer. Thus nutrients return into the circulation of agricultural ecosystems. However, the agro-ecological effects have not been deeply researched until now. For this reason, the following parameters were quantified: the influence of dry and liquid fermentation products on the yield of three selected crops in comparison to or in combination with mineral-N-fertilizers in on-farm experiments; the growth, development and yield of two selected crops in comparison to mineral-N-fertilizer, liquid manure and farmyard manure in a randomized complete block design; selected soil organisms as compared to mineral-N-fertilizer, liquid manure and farmyard manure in a randomized complete block design. In addition, the mineralization of dry and wet digestates in comparison with liquid manure and farmyard manure was investigated in order to evaluate the effects of different fertilizers on the humus formation under controlled conditions. The 2-year results of on-farm experiments showed that for a sandy soil, the combination of digestates in autumn and mineral-N-fertilizer in spring for winter crops (wheat, rye and rape) brought the highest yields. The wet digestate achieved the highest dry-matter yield as the only fertilizer for maize in spring. In a clayey soil, the use of 150 kg ha^-1 N mineral-N-fertilizer brought the highest grain yield. These results were similar to the ones obtained by the application of dry digestates, if they were applied in two doses. Maize showed no significant differences between the dry-matter yields of the different treatments. The results in the field experiments from 2009 to 2011 showed that the effect of digestates on the yield of winter wheat and Sorghum sudanense was up to 15 % lower than the effect of the mineral

  15. A model for methane production in sewers.

    Science.gov (United States)

    Chaosakul, Thitirat; Koottatep, Thammarat; Polprasert, Chongrak

    2014-09-19

    Most sewers in developing countries are combined sewers which receive stormwater and effluent from septic tanks or cesspools of households and buildings. Although the wastewater strength in these sewers is usually lower than in developed countries, due to improper construction and maintenance the hydraulic retention time (HRT) can be relatively long, resulting in considerable greenhouse gas (GHG) production. This study proposed an empirical model to predict the quantity of methane production in gravity-flow sewers based on relevant parameters such as the surface area to volume ratio (A/V) of the sewer, hydraulic retention time (HRT) and wastewater temperature. The model was developed from field survey data of gravity-flow sewers located in a peri-urban area, central Thailand and validated with field data of a sewer system in the Gold Coast area, Queensland, Australia. Application of this model to improve construction and maintenance of gravity-flow sewers, to minimize GHG production and reduce global warming, is presented.
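
    The abstract names the predictors (A/V ratio, HRT, temperature) but not the fitted equation, so the sketch below only illustrates how such an empirical model could be fitted by least squares to field data; the data values and the linear form are assumptions, not the study's own model.

```python
import numpy as np

# Hypothetical field observations: [A/V ratio (1/m), HRT (h), temperature (degC)]
X = np.array([
    [8.0, 2.0, 28.0],
    [6.5, 4.5, 30.0],
    [9.0, 1.5, 27.0],
    [7.2, 6.0, 31.0],
    [5.8, 3.0, 29.0],
])
# Hypothetical measured methane production (g CH4 per m3 of wastewater)
y = np.array([2.1, 4.8, 1.7, 6.3, 3.0])

# Fit a linear empirical model y = b0 + b1*(A/V) + b2*HRT + b3*T by least squares.
A = np.column_stack([np.ones(len(X)), X])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
print("Fitted coefficients [b0, b1, b2, b3]:", np.round(coeffs, 3))
```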

  16. Clinical Productivity System - A Decision Support Model

    CERN Document Server

    Bennett, Casey C

    2012-01-01

    Purpose: The goal of this study was to evaluate the effects of a data-driven clinical productivity system that leverages Electronic Health Record (EHR) data to provide productivity decision support functionality in a real-world clinical setting. The system was implemented for a large behavioral health care provider seeing over 75,000 distinct clients a year. Design/methodology/approach: The key metric in this system is a "VPU", which simultaneously optimizes multiple aspects of clinical care. The resulting mathematical value of clinical productivity was hypothesized to tightly link the organization's performance to its expectations and, through transparency and decision support tools at the clinician level, effect significant changes in productivity, quality, and consistency relative to traditional models of clinical productivity. Findings: In only 3 months, every single variable integrated into the VPU system showed significant improvement, including a 30% rise in revenue, 10% rise in clinical percentage, a...

  17. STEP - Product Model Data Sharing and Exchange

    DEFF Research Database (Denmark)

    Kroszynski, Uri

    1998-01-01

    During the last fifteen years, a very large effort to standardize the product models employed in product design, manufacturing and other life-cycle phases has been undertaken. This effort has the acronym STEP, and resulted in the International Standard ISO-10303 "Industrial Automation Systems - Product Data Representation and Exchange", featuring at present some 30 released parts, and growing continuously. Many of the parts are Application Protocols (AP). This article presents an overview of STEP, based upon years of involvement in three ESPRIT projects, which contributed to the development...

  18. Modeling and analysis of biomass production systems

    Energy Technology Data Exchange (ETDEWEB)

    Mishoe, J.W.; Lorber, M.N.; Peart, R.M.; Fluck, R.C.; Jones, J.W.

    1984-01-01

    BIOMET is an interactive simulation model that is used to analyze specific biomass and methane production systems. The system model is composed of crop growth models and harvesting, transportation, conversion and economic submodels. By use of menus the users can configure the structure and set selected parameters of the system to analyze the effects of variables within the component models. For example, simulations of a water hyacinth system resulted in yields of 63, 48 and 37 Mg/ha/year for different harvest schedules. For napier grass, unit methane costs were $3.04, $2.86 and $2.98 for various yields of biomass. 10 references.

  19. On product cannibalization. A new Lotka-Volterra model for asymmetric competition in the ICTs

    OpenAIRE

    Guidolin, Mariangela; Guseo, Renato

    2016-01-01

    Product cannibalization is a well known phenomenon in marketing and new product development and describes the case when one product steals sales from a product pertaining to the same brand. In this paper we present a new Lotka-Volterra model with asymmetric competition, which is useful to describe cases of product cannibalization. We apply the model to the case of Apple Inc, where iPhone sales contributed to the decline of the iPad. Stimulated by this applied case, we studied a dffe...
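
    As a hedged illustration of the kind of dynamics such a model captures (the exact equations and parameters of the Guidolin-Guseo model are not given in the abstract), the sketch below integrates a generic Lotka-Volterra competition system in which product 2 damages product 1 much more than vice versa.

```python
import numpy as np
from scipy.integrate import solve_ivp

def cannibalization(t, y, r1=0.6, r2=0.8, K1=100.0, K2=120.0, a12=1.4, a21=0.2):
    """Generic asymmetric Lotka-Volterra competition (illustrative parameters only)."""
    x1, x2 = y
    dx1 = r1 * x1 * (1 - (x1 + a12 * x2) / K1)   # incumbent product, strongly affected
    dx2 = r2 * x2 * (1 - (x2 + a21 * x1) / K2)   # newer product, weakly affected
    return [dx1, dx2]

sol = solve_ivp(cannibalization, (0, 40), [30.0, 1.0], t_eval=np.linspace(0, 40, 9))
for t, x1, x2 in zip(sol.t, *sol.y):
    print(f"t={t:5.1f}  product1={x1:7.2f}  product2={x2:7.2f}")
```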

  20. Age and Creative Productivity: Nonlinear Estimation of an Information-Processing Model.

    Science.gov (United States)

    Simonton, Dean Keith

    1989-01-01

    Applied two-step cognitive model to relationship between age and creative productivity. Selected ideation and elaboration rates as information-processing parameters that define mathematical function which describes age curves and specifies their variance across disciplines. Applied non-linear estimation program to further validate model. Despite…
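
    The two-step formulation commonly associated with this model expresses productivity at career age t through an ideation rate and an elaboration rate; the form below is the widely cited version and is offered here only as background, since the abstract itself does not reproduce it.

```latex
% Creative productivity p(t) at career age t, with ideation rate a,
% elaboration rate b (a != b), and a scaling constant c that absorbs
% creative potential and disciplinary differences:
p(t) = c \left( e^{-a t} - e^{-b t} \right), \qquad b > a > 0
```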

  1. A Production Model for Construction: A Theoretical Framework

    Directory of Open Access Journals (Sweden)

    Ricardo Antunes

    2015-03-01

    Full Text Available The building construction industry faces challenges such as increasing project complexity and scope requirements combined with shorter deadlines. In addition, economic uncertainty and rising business competition, with a subsequent decrease in profit margins for the industry, demand the development of new approaches to construction management. However, the building construction sector relies on practices based on intuition and experience, overlooking the dynamics of its production system. Furthermore, researchers maintain that the construction industry has no history of the application of mathematical approaches to model and manage production. Much work has been carried out on how manufacturing practices apply to construction projects, mostly lean principles. Nevertheless, there has been little research to understand the fundamental mechanisms of production in construction. This study develops an in-depth literature review to examine the existing knowledge about production models and their characteristics in order to establish a foundation for dynamic production systems management in construction. As a result, a theoretical framework is proposed, which will be instrumental in the future development of mathematical production models aimed at predicting the performance and behaviour of dynamic project-based systems in construction.

  2. Statistical modelling of fine red wine production

    OpenAIRE

    María Rosa Castro; Marcelo Eduardo Echegaray; Rosa Ana Rodríguez; Stella Maris Udaquiola

    2010-01-01

    Producing wine is a very important economic activity in the province of San Juan in Argentina; it is therefore most important to predict production in relation to the quantity of raw material needed. This work was aimed at obtaining a model relating kilograms of crushed grape to the litres of wine so produced. Such a model will be used for predicting precise future values and confidence intervals for given quantities of crushed grapes. Data from a vineyard in the province of San Juan was ...

  3. Engineering of centrifugal dust-collectors based on parallel comparing tests applying computer modelling

    Science.gov (United States)

    Bulygin, Y. I.; Koronchik, D. A.; Abuzyarov, A. A.

    2015-09-01

    Researchers are currently giving serious consideration to questions related to atmosphere protection, in particular to the effectiveness of new designs of gas-cleaning cyclonic devices for suspended particulate matter (SPM). Engineering new devices is impossible without applying mathematical modelling methods, computer modelling, and physical models of the processes under study verified by full-scale tests.

  4. Design Automation Systems for Production Preparation : Applied on the Rotary Draw Bending Process

    OpenAIRE

    Johansson, Joel

    2008-01-01

    Intensive competition on the global market puts great pressure on manufacturing companies to develop and produce products that meet requirements from customers and investors. One key factor in meeting these requirements is the efficiency of the product development and the production preparation process. Design automation is a powerful tool to increase efficiency in these two processes. The benefits of automating the production preparation process are shortened lead-time, improved product perfo...

  5. Procurement model for copper and polymer electrical products

    Directory of Open Access Journals (Sweden)

    S. Sremac

    2013-10-01

    Full Text Available Procurement model for copper and polymer electrical products. The electrical cable structure (wire, insulation, filling and mantle) is defined in accordance with the technical specifications of the individual cable components in terms of the incorporated materials. Materials used in cable manufacture are copper, aluminum, rubber and polyvinyl chloride. One of the key issues in managing the flow of goods pertains to the timing of procurement. The combination of the two concepts can take advantage of the individual strengths of fuzzy logic and neural networks in hybrid systems of homogeneous structure. The model has high practical significance as, with minor modifications, it can be applied in any enterprise responsible for managing goods flows.

  6. Tensor product model transformation based decoupled terminal sliding mode control

    Science.gov (United States)

    Zhao, Guoliang; Li, Hongxing; Song, Zhankui

    2016-06-01

    The main objective of this paper is to propose a tensor product model transformation based decoupled terminal sliding mode controller design methodology. The methodology is divided into two steps. In the first step, tensor product model transformation is applied to the single-input-multi-output system and a parameter-varying weighted linear time-invariant system is obtained. Then, a decoupled terminal sliding mode controller is designed based on the linear time-invariant systems. The main novelty of this paper is that the nonsingular terminal sliding mode control design is based on a numerical model rather than an analytical one. Finally, simulations are carried out on a cart-pole system and on a translational-oscillation-with-rotational-actuator system.

  7. Design, product structuring and modelling of mechatronic products and systems

    DEFF Research Database (Denmark)

    Conrad, Finn; Sørensen, Torben

    2003-01-01

    Information Technology offers software and hardware for improvement of the engineering design, structuring and control systems, and industrial applications. The latest progress in IT makes integration of an overall design and manufacturing IT concept feasible and commercially attractive. An IT-tool concept for modelling, simulation and design of mechatronic products and systems is proposed in this paper. It builds on results from a Danish mechatronic research program on intelligent motion control as well as from the Esprit project SWING on IT-tools for rapid prototyping of fluid power components...

  8. Applications products of aviation forecast models

    Science.gov (United States)

    Garthner, John P.

    1988-01-01

    A service called the Optimum Path Aircraft Routing System (OPARS) supplies products based on output data from the Naval Oceanographic Global Atmospheric Prediction System (NOGAPS), a model run on a Cyber-205 computer. Temperatures and winds are extracted from the surface to 100 mb, approximately 55,000 ft. Forecast winds are available in six-hour time steps.

  9. Product modelling: '20 years of stalemate'?

    DEFF Research Database (Denmark)

    Galle, Per

    1998-01-01

    In a recent special issue of Design Studies Michael Ramscar, John Lee, and Helen Pain level a severe criticism against a field of research known as product modeling; a criticism that would be rather damaging if it were based on cogent arguments. I shall argue in this paper that it is not....

  10. Sampling from stochastic reservoir models constrained by production data

    Energy Technology Data Exchange (ETDEWEB)

    Hegstad, Bjoern Kaare

    1997-12-31

    When a petroleum reservoir is evaluated, it is important to forecast future production of oil and gas and to assess forecast uncertainty. This is done by defining a stochastic model for the reservoir characteristics, generating realizations from this model and applying a fluid flow simulator to the realizations. The reservoir characteristics define the geometry of the reservoir, initial saturation, petrophysical properties etc. This thesis discusses how to generate realizations constrained by production data, that is to say, the realizations should reproduce the observed production history of the petroleum reservoir within the uncertainty of these data. The topics discussed are: (1) Theoretical framework, (2) History matching, forecasting and forecasting uncertainty, (3) A three-dimensional test case, (4) Modelling transmissibility multipliers by Markov random fields, (5) Up scaling, (6) The link between model parameters, well observations and production history in a simple test case, (7) Sampling the posterior using optimization in a hierarchical model, (8) A comparison of Rejection Sampling and Metropolis-Hastings algorithm, (9) Stochastic simulation and conditioning by annealing in reservoir description, and (10) Uncertainty assessment in history matching and forecasting. 139 refs., 85 figs., 1 tab.
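
    Since the thesis compares rejection sampling with the Metropolis-Hastings algorithm for sampling the posterior, a minimal, generic Metropolis-Hastings sketch is given below; the Gaussian placeholder target and the proposal scale are assumptions for illustration and have nothing to do with the reservoir model itself.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_posterior(theta):
    """Placeholder log-posterior; in history matching this would combine the
    prior on reservoir parameters with the production-data misfit."""
    return -0.5 * np.sum((theta - 1.0) ** 2)

def metropolis_hastings(n_samples, dim=2, step=0.5):
    theta = np.zeros(dim)
    logp = log_posterior(theta)
    samples = []
    for _ in range(n_samples):
        proposal = theta + step * rng.standard_normal(dim)   # symmetric random walk
        logp_prop = log_posterior(proposal)
        if np.log(rng.uniform()) < logp_prop - logp:         # accept/reject step
            theta, logp = proposal, logp_prop
        samples.append(theta.copy())
    return np.array(samples)

chain = metropolis_hastings(5000)
print("Posterior mean estimate:", chain[1000:].mean(axis=0))  # discard burn-in
```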

  11. Advanced geospatial technologies applied to gravel-bed river mapping and modeling

    Science.gov (United States)

    Aggett, Graeme Richard

    Mapping and modeling of river channels is essential in defining the Channel Migration Zone (CMZ). CMZ delineation is necessary to mitigate hazards, create opportunities to protect riparian habitat, predict channel response to changing land cover and disturbances, and design more environmentally-aligned engineering structures. This provides a compelling challenge to the GIScientist because of the need to understand fluvial process dynamics in space and time, and the narrow, elongated, and sinuous geometry of fluvial systems which complicates data collection, management and modeling of digital data describing these. This requires creation, management and correlation of a vast array of data of varying density and quality. Research presented here develops and applies advanced geospatial data, technologies, and modeling to CMZ mapping of a dynamic gravel-bed river in the state of Washington, USA. Chapter 2 demonstrates how new, object-based image processing techniques enhance river mapping accuracies and data modeling opportunities by incorporating the spatial characteristics and relationships of hydrogeomorphic objects into the classification process, by fusing high resolution DEMs with image data, and by accounting for uncertainty. In chapter 3, development and assimilation of a high resolution topographic LiDAR-based DEM with a one-dimensional hydraulic model enables the avulsion hazard of a reach of the Naches River in the state of Washington to be determined for multiple flow and channel-change scenarios. The DEM is used to optimize performance of the 1D hydraulic model HEC-RAS, post-processed output of which facilitates calculation of spatially explicit shear stress (τ0) and specific stream power per unit bed area (ω). In Chapter 4 a new data intensive GIS-based framework for delineating CMZs is implemented and assessed. The approach incorporates historical maps, field-survey data, and LiDAR derived data products as well as a system design that provides a

  12. Hybrid optimization model of product concepts

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Deficiencies of applying the simple genetic algorithm to generate concepts were identified. Based on an analysis of conceptual design and the morphological matrix of an excavator, a hybrid optimization model for generating its concepts was proposed: an improved adaptive genetic algorithm was applied to explore excavator concepts in the search space of conceptual design, and a neural network was used to evaluate the fitness of the population. The optimization of concept generation was accomplished through the "evolution - evaluation" iteration. The results show that by using the hybrid optimization model, not only are the fitness evaluation and constraint conditions well processed, but the search precision and convergence speed of the optimization process are also greatly improved. An example is presented to demonstrate the advantages of the proposed method and associated algorithms.

  13. A Multiobjective Fuzzy Aggregate Production Planning Model Considering Real Capacity and Quality of Products

    Directory of Open Access Journals (Sweden)

    Najmeh Madadi

    2014-01-01

    Full Text Available In this study, an attempt has been made to develop a multiobjective fuzzy aggregate production planning (APP model that best serves those companies whose aim is to have the best utilization of their resources in an uncertain environment while trying to keep an acceptable degree of quality and customer service level simultaneously. In addition, the study takes into account the performance and availability of production lines. To provide the optimal solution to the proposed model, first it was converted to an equivalent crisp multiobjective model and then goal programming was applied to the converted model. At the final step, the IBM ILOG CPLEX Optimization Studio software was used to obtain the final result based on the data collected from an automotive parts manufacturing company. The comparison of results obtained from solving the model with and without considering the performance and availability of production lines, revealed the significant importance of these two factors in developing a real and practical aggregate production plan.

  14. Product quality-based eco-efficiency applied to digital cameras.

    Science.gov (United States)

    Park, Pil-Ju; Tahara, Kiyotaka; Inaba, Atsushi

    2007-04-01

    When calculating eco-efficiency, there is considerable confusion and controversy about what the product value is and how it should be quantified. We have proposed here a quantification method for eco-efficiency that derives the ratio of the product of the product quality and the life span of a product to its whole environmental impact based on Life Cycle Assessment (LCA). In this study, product quality was used as the product value and quantified by the following three steps: (1) normalization based on a value function, (2) determination of the subjective weighting factors of the attributes, and (3) calculation of the product quality of the chosen products. The applicability of the proposed method to an actual product was evaluated using digital cameras. The results show that the eco-efficiency values of products equipped with rechargeable batteries were higher than those of products that use alkaline batteries, because of higher quality values and lower environmental impacts. The sensitivity analysis shows that the proposed method was superior to the existing methods, because it enables the quality level of the chosen products to be identified by considering all products that have the same functions in the market, and because, when adding a new product, the calculated quality values in the proposed method do not have to be changed.
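
    A minimal sketch of the ratio described above follows; the normalisation step is only referenced in a comment and the example numbers are hypothetical, since the abstract specifies only that eco-efficiency is quality times life span divided by the life-cycle environmental impact.

```python
def eco_efficiency(quality_score, lifespan_years, environmental_impact):
    """Eco-efficiency as (quality x life span) / environmental impact.

    quality_score: dimensionless, e.g. a weighted, normalised quality value in [0, 1]
    lifespan_years: expected service life of the product
    environmental_impact: single-score LCA result (e.g. Pt or kg CO2-eq), > 0
    """
    return quality_score * lifespan_years / environmental_impact

# Hypothetical comparison of two camera designs.
camera_rechargeable = eco_efficiency(0.78, 5.0, 120.0)
camera_alkaline = eco_efficiency(0.65, 5.0, 180.0)
print(camera_rechargeable > camera_alkaline)  # True: higher quality, lower impact
```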

  15. Eta and kaon production in a chiral quark model

    CERN Document Server

    Golli, Bojan

    2016-01-01

    We apply a coupled-channel formalism incorporating quasi-bound quark-model states to calculate pion scattering into the eta N, K Lambda and K Sigma channels, as well as the eta p, eta n, K+Lambda, and K0Sigma+ photo-production processes. The meson-baryon and photon-baryon vertices are determined in an SU(3) version of the Cloudy Bag Model. Our model predicts sizable amplitudes in the P11, P13, P33 and S11 partial waves, in agreement with the latest MAID isobar model and the recent partial-wave analyses of the Bonn-Gatchina group. We are able to give a quark-model explanation for the apparent resonance at 1685 MeV in the eta n channel.

  16. Applying Statistical Models and Parametric Distance Measures for Music Similarity Search

    Science.gov (United States)

    Lukashevich, Hanna; Dittmar, Christian; Bastuck, Christoph

    Automatic derivation of similarity relations between music pieces is an inherent field of music information retrieval research. Due to the nearly unrestricted amount of musical data, real-world similarity search algorithms have to be highly efficient and scalable. A possible solution is to represent each music excerpt with a statistical model (e.g. a Gaussian mixture model) and thus to reduce the computational costs by applying parametric distance measures between the models. In this paper we discuss combinations of different parametric modelling techniques and distance measures and weigh the benefits of each one against the others.
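
    One common parametric distance in this setting is the Kullback-Leibler divergence between Gaussian components, which has a closed form; the sketch below computes a symmetrised KL distance between two multivariate Gaussians (the specific model/distance combination used in the paper is not stated here, so this is illustrative).

```python
import numpy as np

def kl_gaussian(mu0, cov0, mu1, cov1):
    """KL divergence KL(N0 || N1) between two multivariate Gaussians (closed form)."""
    k = len(mu0)
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (
        np.trace(cov1_inv @ cov0)
        + diff @ cov1_inv @ diff
        - k
        + np.log(np.linalg.det(cov1) / np.linalg.det(cov0))
    )

def symmetric_kl(mu0, cov0, mu1, cov1):
    """Symmetrised KL, usable as a (non-metric) distance between excerpt models."""
    return kl_gaussian(mu0, cov0, mu1, cov1) + kl_gaussian(mu1, cov1, mu0, cov0)

# Two hypothetical 2-D feature models (e.g. mean feature vectors and covariances).
d = symmetric_kl(np.array([0.0, 0.0]), np.eye(2),
                 np.array([1.0, -0.5]), 1.5 * np.eye(2))
print(round(d, 3))
```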

  17. Species distribution models for crop pollination: a modelling framework applied to Great Britain.

    Science.gov (United States)

    Polce, Chiara; Termansen, Mette; Aguirre-Gutiérrez, Jesus; Boatman, Nigel D; Budge, Giles E; Crowe, Andrew; Garratt, Michael P; Pietravalle, Stéphane; Potts, Simon G; Ramirez, Jorge A; Somerwill, Kate E; Biesmeijer, Jacobus C

    2013-01-01

    Insect pollination benefits over three quarters of the world's major crops. There is growing concern that observed declines in pollinators may impact on production and revenues from animal pollinated crops. Knowing the distribution of pollinators is therefore crucial for estimating their availability to pollinate crops; however, in general, we have an incomplete knowledge of where these pollinators occur. We propose a method to predict geographical patterns of pollination service to crops, novel in two elements: the use of pollinator records rather than expert knowledge to predict pollinator occurrence, and the inclusion of the managed pollinator supply. We integrated a maximum entropy species distribution model (SDM) with an existing pollination service model (PSM) to derive the availability of pollinators for crop pollination. We used nation-wide records of wild and managed pollinators (honey bees) as well as agricultural data from Great Britain. We first calibrated the SDM on a representative sample of bee and hoverfly crop pollinator species, evaluating the effects of different settings on model performance and on its capacity to identify the most important predictors. The importance of the different predictors was better resolved by SDM derived from simpler functions, with consistent results for bees and hoverflies. We then used the species distributions from the calibrated model to predict pollination service of wild and managed pollinators, using field beans as a test case. The PSM allowed us to spatially characterize the contribution of wild and managed pollinators and also identify areas potentially vulnerable to low pollination service provision, which can help direct local scale interventions. This approach can be extended to investigate geographical mismatches between crop pollination demand and the availability of pollinators, resulting from environmental change or policy scenarios.

  18. Species distribution models for crop pollination: a modelling framework applied to Great Britain.

    Directory of Open Access Journals (Sweden)

    Chiara Polce

    Full Text Available Insect pollination benefits over three quarters of the world's major crops. There is growing concern that observed declines in pollinators may impact on production and revenues from animal pollinated crops. Knowing the distribution of pollinators is therefore crucial for estimating their availability to pollinate crops; however, in general, we have an incomplete knowledge of where these pollinators occur. We propose a method to predict geographical patterns of pollination service to crops, novel in two elements: the use of pollinator records rather than expert knowledge to predict pollinator occurrence, and the inclusion of the managed pollinator supply. We integrated a maximum entropy species distribution model (SDM) with an existing pollination service model (PSM) to derive the availability of pollinators for crop pollination. We used nation-wide records of wild and managed pollinators (honey bees) as well as agricultural data from Great Britain. We first calibrated the SDM on a representative sample of bee and hoverfly crop pollinator species, evaluating the effects of different settings on model performance and on its capacity to identify the most important predictors. The importance of the different predictors was better resolved by SDM derived from simpler functions, with consistent results for bees and hoverflies. We then used the species distributions from the calibrated model to predict pollination service of wild and managed pollinators, using field beans as a test case. The PSM allowed us to spatially characterize the contribution of wild and managed pollinators and also identify areas potentially vulnerable to low pollination service provision, which can help direct local scale interventions. This approach can be extended to investigate geographical mismatches between crop pollination demand and the availability of pollinators, resulting from environmental change or policy scenarios.

  19. Food Safety Detection Methods Applied to National Special Rectification of Product Quality and Food Safety

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    A four-month period of national special rectification for product quality and food safety officially started on August 25, and was focused on eight fields, including those of agricultural products and processed foods.

  20. Applying unit process life cycle inventory (UPLCI) methodology in product/packaging combinations

    NARCIS (Netherlands)

    Oude Luttikhuis, E.J.; Toxopeus, M.E.; Overcash, M.; Nee, Andrew Y.C.; Song, Bin; Ong, Soh-Khim

    2013-01-01

    This paper discusses how the UPLCI approach can be used for determining the inventory of the manufacturing phases of product/packaging combinations. The UPLCI approach can make the inventory of the manufacturing process of the product that is investigated more accurate. The life cycle of product/pac

  1. A risk analysis model in concurrent engineering product development.

    Science.gov (United States)

    Wu, Desheng Dash; Kefan, Xie; Gang, Chen; Ping, Gui

    2010-09-01

    Concurrent engineering has been widely accepted as a viable strategy for companies to reduce time to market and achieve overall cost savings. This article analyzes various risks and challenges in product development under the concurrent engineering environment. A three-dimensional early warning approach for product development risk management is proposed by integrating graphical evaluation and review technique (GERT) and failure modes and effects analysis (FMEA). Simulation models are created to solve our proposed concurrent engineering product development risk management model. Solutions lead to identification of key risk controlling points. This article demonstrates the value of our approach to risk analysis as a means to monitor various risks typical in the manufacturing sector. This article has three main contributions. First, we establish a conceptual framework to classify various risks in concurrent engineering (CE) product development (PD). Second, we propose use of existing quantitative approaches for PD risk analysis purposes: GERT, FMEA, and product database management (PDM). Based on quantitative tools, we create our approach for risk management of CE PD and discuss solutions of the models. Third, we demonstrate the value of applying our approach using data from a typical Chinese motor company.
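
    As a quick illustration of the FMEA side of the proposed approach (the GERT network and the paper's own scoring scales are not reproduced here), the sketch below ranks hypothetical failure modes by the conventional risk priority number, RPN = severity x occurrence x detection.

```python
# Hypothetical failure modes for a concurrent-engineering product development task.
# Each entry: (failure mode, severity, occurrence, detection), all rated 1-10.
failure_modes = [
    ("Late supplier drawing release", 7, 6, 4),
    ("Tolerance stack-up error", 8, 3, 5),
    ("Requirement change not propagated", 9, 4, 6),
]

def rpn(severity, occurrence, detection):
    """Conventional FMEA risk priority number."""
    return severity * occurrence * detection

ranked = sorted(failure_modes, key=lambda fm: rpn(*fm[1:]), reverse=True)
for name, s, o, d in ranked:
    print(f"RPN={rpn(s, o, d):3d}  {name}")
```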

  2. Decision models in designing flexible production systems

    Directory of Open Access Journals (Sweden)

    Florescu Adriana

    2017-01-01

    Full Text Available A flexible production system is a complex whole that raises some issues in terms of its design and in relation to the conditions for implementing it. To implement a flexible production system, a configuration must be found that satisfies both economic and system performance requirements. The configuration which best meets the objectives of introducing a flexible production system must be sought in the set of alternatives defined and evaluated. In this paper we present a methodology for the configuration and complex evaluation of the analyzed system. Models will be developed which generate new alternative configurations, together with optimization and evaluation models of the performance of the flexible production system. This will create a framework for interactive, user-oriented decision support that can be used by management to solve this selection problem. The applicative character of the study consists in tracking the technological process in real time using the software package developed for the designed system, based on mathematical models for configuration and optimization of the system.

  3. Modelling the ATP production in mitochondria

    CERN Document Server

    Saa, Alberto

    2012-01-01

    We revisit here the mathematical model for ATP production in mitochondria introduced recently by Bertram, Pedersen, Luciani, and Sherman (BPLS) as a simplification of the more complete but intricate Magnus and Keizer's model. We correct some inaccuracies in the BPLS original approximations and then analyze some of the dynamical properties of the model. We infer from exhaustive numerical explorations that the enhanced BPLS equations have a unique attractor fixed point for physiologically acceptable ranges of mitochondrial variables and respiration inputs. We determine, in the stationary regime, the dependence of the mitochondrial variables on the respiration inputs, namely the cytosolic concentration of calcium ${\\rm Ca}_{\\rm c}$ and the substrate fructose 1,6-bisphosphate FBP. The same effect of calcium saturation reported for the original BPLS model is observed here. We find out, however, an interesting non-stationary effect: the inertia of the model tends to increase considerably for high concentrations of ...

  4. Streamlining environmental product declarations: a stage model

    Science.gov (United States)

    Lefebvre, Elisabeth; Lefebvre, Louis A.; Talbot, Stephane; Le Hen, Gael

    2001-02-01

    General public environmental awareness and education is increasing, therefore stimulating the demand for reliable, objective and comparable information about products' environmental performances. The recently published standard series ISO 14040 and ISO 14025 are normalizing the preparation of Environmental Product Declarations (EPDs) containing comprehensive information relevant to a product's environmental impact during its life cycle. So far, only a few environmentally leading manufacturing organizations have experimented with the preparation of EPDs (mostly from Europe), demonstrating its great potential as a marketing weapon. However, the preparation of EPDs is a complex process, requiring collection and analysis of massive amounts of information coming from disparate sources (suppliers, sub-contractors, etc.). In the foreseeable future, the streamlining of the EPD preparation process will require product manufacturers to adapt their information systems (ERP, MES, SCADA) in order to make them capable of gathering and transmitting the appropriate environmental information. It also requires strong functional integration all along the product supply chain in order to ensure that all the information is made available in a standardized and timely manner. The goal of the present paper is twofold: first to propose a transitional model towards green supply chain management and EPD preparation; second to identify key technologies and methodologies allowing the EPD process to be streamlined and subsequently the transition toward sustainable product development

  5. Fungal Morphology in Industrial Enzyme Production - Modelling and Monitoring

    DEFF Research Database (Denmark)

    Quintanilla, D.; Hagemann, T.; Hansen, K.

    2015-01-01

    Filamentous fungi are widely used in the biotechnology industry for the production of industrial enzymes. Thus, considerable work has been done with the purpose of characterizing these processes. The ultimate goal of these efforts is to be able to control and predict fermentation performance......, and on the way the data is interpreted, i.e. which models were applied. The main filamentous fungi used in industrial fermentation are introduced, ranging from Trichoderma reesei to Aspergillus species. Due to the fact that secondary metabolites, like antibiotics, are not to be considered bulk products, organisms...... like e.g. Penicillium chrysogenum are just briefly touched upon for the description of some characterization techniques. The potential for development of different morphological phenotypes is discussed as well, also in view of what this could mean to productivity and, equally important, the collection...

  6. Applying Modern Techniques and Carrying Out English Extracurricular Activities: On the Model United Nations Activity

    Institute of Scientific and Technical Information of China (English)

    XuXiaoyu; WangJian

    2004-01-01

    This paper is an introduction to the extracurricular activity of the Model United Nations in Northwestern Polytechnical University (NPU), and it focuses on the application of modern techniques in the activity and the pedagogical theories applied in it. An interview and questionnaire research will reveal the influence of the Model United Nations.

  7. To Be or Not to Be an Entrepreneur: Applying a Normative Model to Career Decisions

    Science.gov (United States)

    Callanan, Gerard A.; Zimmerman, Monica

    2016-01-01

    Reflecting the need for a better and broader understanding of the factors influencing the choices to enter into or exit an entrepreneurial career, this article applies a structured, normative model of career management to the career decision-making of entrepreneurs. The application of a structured model can assist career counselors, college career…

  8. How High Is the Tramping Track? Mathematising and Applying in a Calculus Model-Eliciting Activity

    Science.gov (United States)

    Yoon, Caroline; Dreyfus, Tommy; Thomas, Michael O. J.

    2010-01-01

    Two complementary processes involved in mathematical modelling are mathematising a realistic situation and applying a mathematical technique to a given realistic situation. We present and analyse work from two undergraduate students and two secondary school teachers who engaged in both processes during a mathematical modelling task that required…

  9. Applied modern biotechnology for cultivation of Ganoderma and development of their products.

    Science.gov (United States)

    Zhou, Xuan-Wei; Su, Kai-Qi; Zhang, Yong-Ming

    2012-02-01

    The white-rot basidiomycete Ganoderma spp. has long been used as a medicinal mushroom in Asia, and it has an array of pharmacological properties, including immunomodulatory activity. There have been many reports on its bioactive components and their pharmacological properties. In order to analyze the current status of Ganoderma products, the detailed process of cultivating Ganoderma spp. and developing their products is reviewed in this article. This includes breeding, cultivation, extraction of bioactive components, and processing of Ganoderma products. The article will broaden general knowledge of Ganoderma and provide a useful reference for research and industrial production.

  10. Collaborative Product Configuration Model in Networked Manufacturing Based on Semantic Web

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    Aiming at the requirements of collaborative product configuration design in networked manufacturing, a collaborative product configuration model based on the semantic web was explored. A semantic web-based structure of the collaborative product configuration model was proposed, and a product configuration design workflow model was constructed. The collaborative product configuration ontology was constructed by defining semantic-based metadata for collaborative product configuration information. The ontology was used for semantic annotation of information dispersed in the network, so that product configuration information can be shared between collaborating enterprises in networked manufacturing, improving the efficiency of distributed information exchange and the level of collaborative product development. The validity of the model was verified by applying it to a networked collaborative design platform.

  11. Chitosan and Its Derivatives Applied in Harvesting Microalgae for Biodiesel Production: An Outlook

    Directory of Open Access Journals (Sweden)

    Guanyi Chen

    2014-01-01

    Full Text Available Although oil-accumulating microalgae are a promising feedstock for biodiesel production, large-scale biodiesel production is not yet economically feasible. As harvesting accounts for an important part of total production cost, mass production of microalgae biodiesel requires an efficient low-energy harvesting strategy so as to make biodiesel production economically attractive. Chitosan has emerged as a favorable flocculating agent in harvesting of microalgae. The aim of this paper is to review current research on the application of chitosan and chitosan-derived materials for harvesting microalgae. This offers a starting point for future studies able to invalidate, confirm, or complete the actual findings and to improve knowledge in this field.

  12. Fourth order phase-field model for local max-ent approximants applied to crack propagation

    OpenAIRE

    Amiri, Fatemeh; Millán, Daniel; Arroyo Balaguer, Marino; Silani, Mohammad; Rabczuk, Timon

    2016-01-01

    We apply a fourth order phase-field model for fracture based on local maximum entropy (LME) approximants. The higher order continuity of the meshfree LME approximants allows the fourth order phase-field equations to be solved directly, without splitting the fourth order differential equation into two second order differential equations. We first show that the crack surface can be captured more accurately in the fourth order model. Furthermore, fewer nodes are needed for the fourth order model ...

  13. Modeling of power train by applying the virtual prototype concept; Kaso genkei ni yoru power train no model ka

    Energy Technology Data Exchange (ETDEWEB)

    Hiramatsu, S.; Harada, Y.; Arakawa, H.; Komori, S. [Mazda Motor Corp., Hiroshima (Japan); Sumida, S. [U-Shin Corp., Tokyo (Japan)

    1997-10-01

    This paper describes the simulation of a power train that includes models developed by applying the virtual prototype concept. Under this concept, subsystem models which consist of functional models and mechanism models are integrated into a total system model. This peculiarity in the model architecture, called the hierarchical structure, makes it easy to model a large-scale system with many units, systems and parts. Two kinds of computer simulations are performed: one is engine revolution fluctuation under accessory load input, and the other is gear changing by the automatic transmission. Both are verified to have sufficient accuracy. 2 refs., 12 figs.

  14. A Numerical Procedure for Model Identifiability Analysis Applied to Enzyme Kinetics

    DEFF Research Database (Denmark)

    Van Daele, Timothy; Van Hoey, Stijn; Gernaey, Krist

    2015-01-01

    The proper calibration of models describing enzyme kinetics can be quite challenging. In the literature, different procedures are available to calibrate these enzymatic models in an efficient way. However, in most cases the model structure is already decided on prior to the actual calibration exercise, thereby bypassing the challenging task of model structure determination and identification. Parameter identification problems can thus lead to ill-calibrated models with low predictive power and large model uncertainty. Every calibration exercise should therefore be preceded by a proper model...... and Pronzato (1997) and which can be easily set up for any type of model. In this paper the proposed approach is applied to the forward reaction rate of the enzyme kinetics proposed by Shin and Kim (1998). Structural identifiability analysis showed that no local structural model problems were occurring...

  15. Applying the natural disasters vulnerability evaluation model to the March 2011 north-east Japan earthquake and tsunami.

    Science.gov (United States)

    Ruiz Estrada, Mario Arturo; Yap, Su Fei; Park, Donghyun

    2014-07-01

    Natural hazards have a potentially large impact on economic growth, but measuring their economic impact is subject to a great deal of uncertainty. The central objective of this paper is to demonstrate a model--the natural disasters vulnerability evaluation (NDVE) model--that can be used to evaluate the impact of natural hazards on gross national product growth. The model is based on five basic indicators: natural hazards growth rates (αi), the national natural hazards vulnerability rate (ΩT), the natural disaster devastation magnitude rate (Π), the economic desgrowth rate (i.e. shrinkage of the economy) (δ), and the NHV surface. In addition, we apply the NDVE model to the north-east Japan earthquake and tsunami of March 2011 to evaluate its impact on the Japanese economy.

  16. Satellite-based terrestrial production efficiency modeling

    Directory of Open Access Journals (Sweden)

    Obersteiner Michael

    2009-09-01

    Full Text Available Abstract Production efficiency models (PEMs) are based on the theory of light use efficiency (LUE), which states that a relatively constant relationship exists between photosynthetic carbon uptake and radiation receipt at the canopy level. Challenges remain however in the application of the PEM methodology to global net primary productivity (NPP) monitoring. The objectives of this review are as follows: (1) to describe the general functioning of six PEMs (CASA, GLO-PEM, TURC, C-Fix, MOD17, and BEAMS) identified in the literature; (2) to review each model to determine potential improvements to the general PEM methodology; (3) to review the related literature on satellite-based gross primary productivity (GPP) and NPP modeling for additional possibilities for improvement; and (4) based on this review, propose items for coordinated research. This review noted a number of possibilities for improvement to the general PEM architecture, ranging from LUE to meteorological and satellite-based inputs. Current PEMs tend to treat the globe similarly in terms of physiological and meteorological factors, often ignoring unique regional aspects. Each of the existing PEMs has developed unique methods to estimate NPP and the combination of the most successful of these could lead to improvements. It may be beneficial to develop regional PEMs that can be combined under a global framework. The results of this review suggest the creation of a hybrid PEM could bring about a significant enhancement to the PEM methodology and thus terrestrial carbon flux modeling. Key items topping the PEM research agenda identified in this review include the following: LUE should not be assumed constant, but should vary by plant functional type (PFT) or photosynthetic pathway; evidence is mounting that PEMs should consider incorporating diffuse radiation; continue to pursue relationships between satellite-derived variables and LUE, GPP and autotrophic respiration (Ra); there is an urgent need for
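
    The light-use-efficiency logic underlying these PEMs can be summarised in a compact, generic form; the sketch below is a hedged illustration of that common structure (the variable names and down-regulation scalars are schematic, not those of any specific model listed above).

```python
def pem_npp(par, fapar, lue_max, t_scalar, w_scalar, ra_fraction):
    """Generic production-efficiency-model structure.

    par         incident photosynthetically active radiation (MJ m-2 per time step)
    fapar       fraction of PAR absorbed by the canopy (satellite-derived, 0-1)
    lue_max     maximum light use efficiency (g C per MJ APAR)
    t_scalar    temperature down-regulation scalar (0-1)
    w_scalar    water-stress down-regulation scalar (0-1)
    ra_fraction fraction of GPP lost to autotrophic respiration (0-1)
    """
    apar = par * fapar                       # absorbed PAR
    gpp = apar * lue_max * t_scalar * w_scalar
    npp = gpp * (1.0 - ra_fraction)          # NPP = GPP - Ra
    return gpp, npp

gpp, npp = pem_npp(par=250.0, fapar=0.6, lue_max=1.8, t_scalar=0.9, w_scalar=0.8, ra_fraction=0.45)
print(f"GPP = {gpp:.1f} g C m-2, NPP = {npp:.1f} g C m-2")
```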

  17. Quantitative modeling of Cerenkov light production efficiency from medical radionuclides.

    Directory of Open Access Journals (Sweden)

    Bradley J Beattie

    Full Text Available There has been recent and growing interest in applying Cerenkov radiation (CR) for biological applications. Knowledge of the production efficiency and other characteristics of the CR produced by various radionuclides would help in assessing the feasibility of proposed applications and guide the choice of radionuclides. To generate this information we developed models of CR production efficiency based on the Frank-Tamm equation and models of CR distribution based on Monte-Carlo simulations of photon and β particle transport. All models were validated against direct measurements using multiple radionuclides and then applied to a number of radionuclides commonly used in biomedical applications. We show that two radionuclides, Ac-225 and In-111, which have been reported to produce CR in water, do not in fact produce CR directly. We also propose a simple means of using this information to calibrate high sensitivity luminescence imaging systems and show evidence suggesting that this calibration may be more accurate than methods in routine current use.
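
    The Frank-Tamm relation referred to in the abstract, in its standard form for the number of Cerenkov photons emitted per unit path length and wavelength (a textbook expression, not reproduced from the paper itself), is

    $\frac{d^2N}{dx\,d\lambda} = \frac{2\pi\alpha z^2}{\lambda^2}\left(1 - \frac{1}{\beta^2 n^2(\lambda)}\right), \qquad \beta\, n(\lambda) > 1,$

    where $\alpha$ is the fine-structure constant, $z$ the particle charge, $\beta = v/c$ and $n(\lambda)$ the refractive index; the threshold condition $\beta n > 1$ explains why low-energy emissions produce little or no CR in water.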

  18. $Om$ diagnostic applied to scalar field models and slowing down of cosmic acceleration

    CERN Document Server

    Shahalam, M; Agarwal, Abhineet

    2015-01-01

    We apply the $Om$ diagnostic to models for dark energy based on scalar fields. In the case of power-law potentials, we demonstrate the possibility of slowing down the expansion of the Universe around the present epoch for a specific range in the parameter space. For these models, we also examine the issues concerning the age of the Universe. We use the $Om$ diagnostic to distinguish the $\Lambda$CDM model from non-minimally coupled scalar field, phantom field and generic quintessence models. Our study shows that $Om$ has zero, positive and negative curvatures for $\Lambda$CDM, phantom and quintessence models, respectively. We use an integrated database (SN+Hubble+BAO+CMB) for observational analysis and demonstrate that $Om$ is a useful diagnostic to apply to observational data.
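
    For reference, the $Om$ diagnostic is commonly defined (following the standard literature definition, not text taken from this paper) as

    $Om(z) \equiv \frac{h^2(z) - 1}{(1+z)^3 - 1}, \qquad h(z) = H(z)/H_0,$

    so that for $\Lambda$CDM $Om(z) = \Omega_{m0}$ is constant; it is this property that makes the sign of its curvature a null test separating $\Lambda$CDM from phantom and quintessence behaviour.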

  19. Making Faces - State-Space Models Applied to Multi-Modal Signal Processing

    DEFF Research Database (Denmark)

    Lehn-Schiøler, Tue

    2005-01-01

    The two main focus areas of this thesis are State-Space Models and multi-modal signal processing. The general State-Space Model is investigated and an addition to the class of sequential sampling methods is proposed. This new algorithm is denoted the Parzen Particle Filter. Furthermore, the Markov Chain Monte Carlo (MCMC) approach to filtering is examined and a scheme for MCMC to be used in on-line applications is proposed. In estimating parameters, it is shown that the EM-algorithm exhibits slow convergence, especially in the low noise limit. It is demonstrated how a general gradient optimizer can be applied to speed up convergence. The linear version of the State-Space Model, the Kalman Filter, is applied to multi-modal signal processing. It is demonstrated how a State-Space Model can be used to map from speech to lip movements. Besides the State-Space Model and the multi-modal ...
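
    Since the record centres on the linear State-Space Model, a minimal generic Kalman filter sketch is given below. It is not the thesis code: the matrices A, C, Q, R and the toy observation sequence are hypothetical placeholders, with the state x standing in for lip parameters and the observations y for speech features.

    ```python
    # Minimal Kalman filter sketch (generic, not the thesis implementation).
    import numpy as np

    def kalman_filter(y_seq, A, C, Q, R, x0, P0):
        """Run a standard Kalman filter over a sequence of observations."""
        x, P = x0, P0
        estimates = []
        for y in y_seq:
            # Predict
            x = A @ x
            P = A @ P @ A.T + Q
            # Update
            S = C @ P @ C.T + R                  # innovation covariance
            K = P @ C.T @ np.linalg.inv(S)       # Kalman gain
            x = x + K @ (y - C @ x)
            P = (np.eye(len(x0)) - K @ C) @ P
            estimates.append(x.copy())
        return np.array(estimates)

    # Tiny example with made-up 2-D state and 1-D observation
    A = np.array([[1.0, 0.1], [0.0, 1.0]])
    C = np.array([[1.0, 0.0]])
    Q, R = 0.01 * np.eye(2), np.array([[0.1]])
    y_seq = [np.array([v]) for v in np.sin(np.linspace(0, 3, 30))]
    xs = kalman_filter(y_seq, A, C, Q, R, x0=np.zeros(2), P0=np.eye(2))
    ```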

  20. On the spatial and temporal resolution of land cover products for applied use in wind resource mapping

    DEFF Research Database (Denmark)

    Hasager, Charlotte Bay; Badger, Merete; Dellwik, Ebba

    The suitability of Copernicus Global Land Service products for wind assessment is investigated using two approaches. In the first approach the CORINE land cover database and the pan-European high-resolution products were considered as input to atmospheric flow models. The CORINE data were used ...

  1. Optimization performance of an AnSBBR applied to biohydrogen production treating whey.

    Science.gov (United States)

    Lima, D M F; Lazaro, C Z; Rodrigues, J A D; Ratusznei, S M; Zaiat, M

    2016-03-15

    The present study investigated the influence of the influent concentration of substrate, feeding time and temperature on the production of biohydrogen from cheese whey in an AnSBBR with liquid phase recirculation. The highest hydrogen yield (0.80 molH2 molLactose(-1)) and productivity (660 mLH2 L(-1) d(-1)) were achieved for influent concentrations of 5400 mgCOD L(-1). No significant difference was noted in the biological hydrogen production for the feeding time conditions analyzed. The lowest temperature tested (15 °C) promoted the highest hydrogen yield and productivity (1.12 molH2 molLactose(-1) and 1080 mLH2 L(-1) d(-1)), and at the highest temperature (45 °C) hydrogen production did not occur. The indicator values for the hydrogen production obtained with this configuration were higher than those obtained in other studies using traditional configurations such as UASB and CSTR reactors. A phylogenetic analysis showed that the majority of the analyzed clones were similar to Clostridium. In addition, clones phylogenetically similar to the Lactobacillaceae family, notably Lactobacillus rhamnosus, and clones with sequences similar to Acetobacter indonesiensis were observed in small proportion in the reactor.

  2. Towards a model for protein production rates

    CERN Document Server

    Dong, J J; Zia, R K P

    2007-01-01

    In the process of translation, ribosomes read the genetic code on an mRNA and assemble the corresponding polypeptide chain. The ribosomes perform discrete directed motion which is well modeled by a totally asymmetric simple exclusion process (TASEP) with open boundaries. Using Monte Carlo simulations and a simple mean-field theory, we discuss the effect of one or two "bottlenecks" (i.e., slow codons) on the production rate of the final protein. Confirming and extending previous work by Chou and Lakatos, we find that the location and spacing of the slow codons can affect the production rate quite dramatically. In particular, we observe a novel "edge" effect, i.e., an interaction of a single slow codon with the system boundary. We focus in detail on ribosome density profiles and provide a simple explanation for the length scale which controls the range of these interactions.
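
    A minimal Monte Carlo sketch of an open-boundary TASEP with a single slow site is given below to illustrate the kind of simulation the abstract describes; the lattice size, entry/exit rates and bottleneck rate are hypothetical choices, not the paper's values.

    ```python
    # Minimal open-boundary TASEP sketch with one slow codon (illustrative only).
    import random

    def tasep_current(L=200, slow_site=100, q=0.2, alpha=0.6, beta=0.6,
                      sweeps=5000, seed=1):
        random.seed(seed)
        lattice = [0] * L            # 1 = ribosome present at that codon
        hops_out = 0
        for _ in range(sweeps):
            for _ in range(L + 1):   # random-sequential update
                i = random.randrange(-1, L)
                if i == -1:          # injection at the left boundary
                    if lattice[0] == 0 and random.random() < alpha:
                        lattice[0] = 1
                elif i == L - 1:     # exit at the right boundary (protein completed)
                    if lattice[L - 1] == 1 and random.random() < beta:
                        lattice[L - 1] = 0
                        hops_out += 1
                else:                # bulk hop, slowed down at the bottleneck
                    rate = q if i == slow_site else 1.0
                    if lattice[i] == 1 and lattice[i + 1] == 0 and random.random() < rate:
                        lattice[i], lattice[i + 1] = 0, 1
        return hops_out / sweeps     # protein production rate (exits per sweep)

    print(tasep_current())
    ```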

  3. Towards a Model for Protein Production Rates

    Science.gov (United States)

    Dong, J. J.; Schmittmann, B.; Zia, R. K. P.

    2007-07-01

    In the process of translation, ribosomes read the genetic code on an mRNA and assemble the corresponding polypeptide chain. The ribosomes perform discrete directed motion which is well modeled by a totally asymmetric simple exclusion process (TASEP) with open boundaries. Using Monte Carlo simulations and a simple mean-field theory, we discuss the effect of one or two "bottlenecks" (i.e., slow codons) on the production rate of the final protein. Confirming and extending previous work by Chou and Lakatos, we find that the location and spacing of the slow codons can affect the production rate quite dramatically. In particular, we observe a novel "edge" effect, i.e., an interaction of a single slow codon with the system boundary. We focus in detail on ribosome density profiles and provide a simple explanation for the length scale which controls the range of these interactions.

  4. A Covariant OBE Model for $\\eta$ Production in NN Collisions

    CERN Document Server

    Gedalin, E; Razdolskaya, L A

    1998-01-01

    A relativistic covariant one boson exchange model, previously applied to describe elastic nucleon-nucleon scattering, is extended to study $\eta$ production in NN collisions. The transition amplitude for the elementary BN->$\eta$N process, with B being the meson exchanged (B=$\pi$, $\sigma$, $\eta$, ...), corresponds to s and u-channels with a nucleon or a nucleon isobar N*(1535 MeV) in the intermediate states. Taking the relative phases of the various exchange amplitudes to be +1, the model reproduces the cross sections for the $NN\to X\eta$ reactions in a consistent manner. In the limit where ... all overall contributions from the exchange of pseudoscalar and scalar mesons with that of vector mesons cancel out. Consequently, many of the ambiguities in the model predictions due to unknown relative phases of different vector pseudoscalar exchanges are strongly reduced.

  5. Structure-selection techniques applied to continuous-time nonlinear models

    Science.gov (United States)

    Aguirre, Luis A.; Freitas, Ubiratan S.; Letellier, Christophe; Maquet, Jean

    2001-10-01

    This paper addresses the problem of choosing the multinomials that should compose a polynomial mathematical model starting from data. The mathematical representation used is a nonlinear differential equation of the polynomial type. Some approaches that have been used in the context of discrete-time models are adapted and applied to continuous-time models. Two examples are included to illustrate the main ideas. Models obtained with and without structure selection are compared using topological analysis. The main differences between structure-selected models and complete structure models are: (i) the former are more parsimonious than the latter, (ii) a predefined fixed-point configuration can be guaranteed for the former, and (iii) the former set of models produce attractors that are topologically closer to the original attractor than those produced by the complete structure models.

  6. Recent developments of surface complexation models applied to environmental aquatic chemistry

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Based on numerous recent references, the current developments in surface complexation, surface precipitation and the corresponding models (SCMs and SPMs) were reviewed. The contents involved a comparison of the surface charge composition and layer structure of the solid-solution interface for the classical 1-pK and 2-pK models. In addition, the fundamental concepts and relations of the new models, i.e., the multi-site complexation (MUSIC) and charge-distribution (CD) MUSIC models, were described as well. To avoid misuse or abuse, it must be emphasized that the applicability and limitations of each model should be considered carefully when selecting the concerned model(s). In addition, some new powerful techniques for surface characterization and analysis applied to model establishment and modification were also briefly introduced.

  7. Applying Value Stream Mapping Technique for Production Improvement in a Manufacturing Company: A Case Study

    Science.gov (United States)

    Jeyaraj, K. L.; Muralidharan, C.; Mahalingam, R.; Deshmukh, S. G.

    2013-01-01

    The purpose of this paper is to explain how value stream mapping (VSM) is helpful in lean implementation and to develop the road map to tackle improvement areas to bridge the gap between the existing state and the proposed state of a manufacturing firm. Through this case study, the existing state of manufacturing is mapped with the help of VSM process symbols and the biggest improvement areas, such as excessive takt time, production and lead time, are identified. Some modifications to the current state map are suggested and with these modifications the future state map is prepared. Further, takt time is calculated to set the pace of the production processes (see the relation below). This paper compares the current state and future state of a manufacturing firm and witnessed 20 % reduction in takt time, 22.5 % reduction in processing time, 4.8 % reduction in lead time, 20 % improvement in production, 9 % improvement in machine utilization, 7 % improvement in man power utilization, objective improvement in workers' skill level, and no change in the product and semi-finished product inventory level. The findings are limited due to the focused nature of the case study. This case study shows that VSM is a powerful tool for lean implementation and allows the industry to understand and continuously improve towards lean manufacturing.
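
    Takt time, the pace-setting quantity referred to above, is conventionally computed as available working time divided by customer demand; the figures in the example are illustrative only and not taken from the case study:

    $\text{takt time} = \frac{\text{net available working time per period}}{\text{customer demand per period}}, \qquad \text{e.g. } \frac{27\,600\ \text{s/shift}}{230\ \text{units/shift}} = 120\ \text{s/unit}.$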

  8. Applying Process-Based Models for Subsurface Flow Treatment Wetlands: Recent Developments and Challenges

    Directory of Open Access Journals (Sweden)

    Guenter Langergraber

    2016-12-01

    Full Text Available To date, only few process-based models for subsurface flow treatment wetlands have been developed. For modelling a treatment wetland, these models have to comprise a number of sub-models to describe water flow, pollutant transport, pollutant transformation and degradation, effects of wetland plants, and transport and deposition of suspended particulate matter. The two most advanced models are the HYDRUS Wetland Module and BIO-PORE. These two models are briefly described. This paper shows typical simulation results for vertical flow wetlands and discusses experiences and challenges using process-based wetland models in relation to the sub-models describing the most important wetland processes. It can be demonstrated that existing simulation tools can be applied for simulating processes in treatment wetlands. Most important for achieving a good match between measured and simulated pollutant concentrations is a good calibration of the water flow and transport models. Only after these calibrations have been made and the effect of the influent fractionation on simulation results has been considered, should changing the parameters of the biokinetic models be taken into account. Modelling the effects of wetland plants is possible and has to be considered when important. Up to now, models describing clogging are the least established models among the sub-models required for a complete wetland model and thus further development and research is required.

  9. Dilepton production with the SMASH model

    CERN Document Server

    Weil, Janus; Petersen, Hannah

    2016-01-01

    In this work the SMASH model is presented ("Simulating Many Accelerated Strongly-Interacting Hadrons"), a next-generation hadronic transport approach, which is designed to describe the non-equilibrium evolution of hadronic matter in heavy-ion collisions. We discuss first dilepton spectra obtained with SMASH in the few-GeV energy range of GSI/FAIR, where the dynamics of hadronic matter is dominated by the production and decay of various resonance states. In particular we show how electromagnetic transition form factors can emerge in a transport picture under the hypothesis of vector-meson dominance.

  10. Finite element modeling and analysis of piezo-integrated composite structures under large applied electric fields

    Science.gov (United States)

    Rao, M. N.; Tarun, S.; Schmidt, R.; Schröder, K.-U.

    2016-05-01

    In this article, we focus on static finite element (FE) simulation of piezoelectric laminated composite plates and shells, considering the nonlinear constitutive behavior of piezoelectric materials under large applied electric fields. Under the assumptions of small strains and large electric fields, the second-order nonlinear constitutive equations are used in the variational principle approach, to develop a nonlinear FE model. Numerical simulations are performed to study the effect of material nonlinearity for piezoelectric bimorph and laminated composite plates as well as cylindrical shells. In comparison to the experimental investigations existing in the literature, the results predicted by the present model agree very well. The importance of the present nonlinear model is highlighted especially in large applied electric fields, and it is shown that the difference between the results simulated by linear and nonlinear constitutive FE models cannot be omitted.

  11. Applying rough sets in word segmentation disambiguation based on maximum entropy model

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    To solve the complicated feature extraction and long distance dependency problem in Word Segmentation Disambiguation (WSD), this paper proposes to apply rough sets in WSD based on the Maximum Entropy model. Firstly, rough set theory is applied to extract the complicated features and long distance features, even from noisy or inconsistent corpora. Secondly, these features are added into the Maximum Entropy model, and consequently, the feature weights can be assigned according to the performance of the whole disambiguation model. Finally, the semantic lexicon is adopted to build class-based rough set features to overcome data sparseness. The experiment indicated that our method performed better than previous models, which got top rank in WSD in the 863 Evaluation in 2003. This system ranked first and second respectively in the MSR and PKU open tests in the Second International Chinese Word Segmentation Bakeoff held in 2005.

  12. Parametric Cost Estimation Utilizing Development-to-Production Relationship Applied to the Advanced Amphibious Assault Vehicle

    Science.gov (United States)

    1991-12-01

    ... time span between development and production to predict TFUs. The inclusion of a variable that explains time spent in development is compatible with ...

  13. Applying Classical Ethical Theories to Ethical Decision Making in Public Relations: Perrier's Product Recall.

    Science.gov (United States)

    Pratt, Cornelius B.

    1994-01-01

    Links ethical theories to the management of the product recall of the Perrier Group of America. Argues for a nonsituational theory-based eclectic approach to ethics in public relations to enable public relations practitioners, as strategic communication managers, to respond effectively to potentially unethical organizational actions. (SR)

  14. Multichannel Learning Research Applied to Principles of Television Production: A Review and Synthesis of the Literature.

    Science.gov (United States)

    Hanson, LuEtt

    1989-01-01

    Reviews multichannel learning research to find the best ways to combine audio and video in television to improve learning, and summarizes the research findings into principles for instructional television production. Highlights include the effects of redundancy on learning and on audience attention, message clarity, and problems in instructional…

  15. 2D Articulatory Velum Modeling Applied to Copy Synthesis of Sentences Containing Nasal Phonemes

    OpenAIRE

    Laprie, Yves; Elie, Benjamin; Tsukanova, Anastasiia

    2015-01-01

    International audience; Articulatory synthesis could become a valuable tool to investigate links between articulatory gestures and acoustic cues. This paper presents the construction of an articulatory model of the velum which is intended to complete a model already comprising other articulators. The velum contour was delineated and extracted from a thousand X-ray images corresponding to short sentences of French. A principal component analysis was applied in order to derive the main def...

  16. The k-ε-fP model applied to wind farms

    DEFF Research Database (Denmark)

    Laan, van der, Paul Maarten; Sørensen, Niels N.; Réthoré, Pierre-Elouan;

    2015-01-01

    The recently developed k-ε-fP eddy-viscosity model is applied to one on-shore and two off-shore wind farms. The results are compared with power measurements and results of the standard k-ε eddy-viscosity model. In addition, the wind direction uncertainty of the measurements is used to correct the model results with a Gaussian filter. The standard k-ε eddy-viscosity model underpredicts the power deficit of the first downstream wind turbines, whereas the k-ε-fP eddy-viscosity model shows a good agreement with the measurements. However, the difference in the power deficit predicted by the turbulence models becomes smaller for wind turbines that are located further downstream. Moreover, the difference between the capability of the turbulence models to estimate the wind farm efficiency reduces with increasing wind farm size and wind turbine spacing. Copyright © 2014 John Wiley & Sons, Ltd.
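
    The wind-direction correction mentioned in the abstract amounts to smearing the simulated power ratio with a Gaussian kernel over direction; a minimal sketch of that idea follows. The 5-degree standard deviation and the toy wake deficit are hypothetical, not values from the paper.

    ```python
    # Sketch: correct simulated wake results for wind-direction uncertainty by
    # convolving the power ratio with a Gaussian kernel over direction.
    import numpy as np

    def gaussian_filter_directions(directions_deg, power_ratio, sigma_deg=5.0):
        """Return the power ratio smeared over wind direction (periodic in 360 deg)."""
        smoothed = np.empty_like(power_ratio)
        for k, d0 in enumerate(directions_deg):
            # shortest angular distance, wrapping at 360 degrees
            delta = (directions_deg - d0 + 180.0) % 360.0 - 180.0
            w = np.exp(-0.5 * (delta / sigma_deg) ** 2)
            smoothed[k] = np.sum(w * power_ratio) / np.sum(w)
        return smoothed

    directions = np.arange(0.0, 360.0, 1.0)
    raw = 1.0 - 0.4 * np.exp(-0.5 * ((directions - 270.0) / 3.0) ** 2)  # toy wake deficit
    print(gaussian_filter_directions(directions, raw)[265:276])
    ```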

  17. From Add-On to Mainstream: Applying Distance Learning Models for ALL Students

    Science.gov (United States)

    Zai, Robert, III.; Wesley, Threasa L.

    2013-01-01

    The use of distance learning technology has allowed Northern Kentucky University's W. Frank Steely Library to remove traditional boundaries between both distance and on-campus students. An emerging model that applies these distance learning methodologies to all students has proven effective for enhancing reference and instructional services. This…

  18. Risk assessment and food allergy: the probabilistic model applied to allergens

    NARCIS (Netherlands)

    Spanjersberg, M.Q.I.; Kruizinga, A.G.; Rennen, M.A.J.; Houben, G.F.

    2007-01-01

    In order to assess the risk of unintended exposure to food allergens, traditional deterministic risk assessment is usually applied, leading to inconsequential conclusions such as 'an allergic reaction cannot be excluded'. TNO therefore developed a quantitative risk assessment model for allergens based on ...

  19. Applying the Transtheoretical Model to Reality Television: The Biggest Loser Case Study

    Science.gov (United States)

    Barry, Adam E.; Piazza-Gardner, Anna K.

    2012-01-01

    This teaching idea presents a heuristic example using reality television as a tool for applying health behavior theory. It utilizes The Biggest Loser (TBL) to provide "real world" cases depicting how individuals progress through/experience the Transtheoretical Model (TTM). Observing TBL contestants provides students practice grounding…

  20. Applying an Environmental Model to Address High-Risk Drinking: A Town/Gown Case Study

    Science.gov (United States)

    Bishop, John B.; Downs, Tracy T.; Cohen, Deborah

    2008-01-01

    This article provides a case study of a project by the University of Delaware and the City of Newark to apply an environmental model to address the excessive use of alcohol by college students. Data about changes in the behavior and experiences of students over a 10-year period are compared. The authors discuss some of the practical implications…

  1. Flipped Classroom Adapted to the ARCS Model of Motivation and Applied to a Physics Course

    Science.gov (United States)

    Asiksoy, Gülsüm; Özdamli, Fezile

    2016-01-01

    This study aims to determine the effect on the achievement, motivation and self-sufficiency of students of the flipped classroom approach adapted to Keller's ARCS (Attention, Relevance, Confidence and Satisfaction) motivation model and applied to a physics course. The study involved 66 students divided into two classes of a physics course. The…

  2. Active lubrication applied to radial gas journal bearings. Part 2: Modelling improvement and experimental validation

    DEFF Research Database (Denmark)

    Pierart, Fabián G.; Santos, Ilmar F.

    2016-01-01

    Actively-controlled lubrication techniques are applied to radial gas bearings aiming at enhancing one of their most critical drawbacks, their lack of damping. A model-based control design approach is presented using simple feedback control laws, i.e. proportional controllers. The design approach...

  3. Modeling the current distribution in HTS tapes with transport current and applied magnetic field

    NARCIS (Netherlands)

    Yazawa, Takashi; Rabbers, Jan-Jaap; Shevchenko, Oleg A.; Haken, ten Bennie; Kate, ten Herman H.J.; Maeda, Hideaki

    1999-01-01

    A numerical model is developed for the current distribution in a high temperature superconducting (HTS) tape, (Bi,Pb)2Sr2 Ca2Cu3Ox-Ag, subjected to a combination of a transport current and an applied magnetic field. This analysis is based on a two-dimensional formulation of Maxwell's equations in te

  4. Applying the Codependency Model to a Group for Families of Obsessive-Compulsive People.

    Science.gov (United States)

    Cooper, Marlene

    1995-01-01

    Applies codependency group model to families of obsessive-compulsive people based on view that these families are normal, feeling people who are trying to cope with unremitting stress. Clinical vignettes illustrate how these families are similar to families of alcoholics in their management of emotions and in their dysfunctional behaviors. (JBJ)

  5. Problems and advantages of applying the e-learning model to the teaching of English

    OpenAIRE

    Shaparenko, А.; Golikova, А.

    2013-01-01

    In this article we mention some potential and noted problems and advantages of applying the e-learning model to the teaching of English. In the area of foreign language teaching a lot has been done, but there are constant attempts for new solutions. Another option for e-learning is a hybrid course.

  6. Assessing options to increase water productivity in irrigated river basins using remote sensing and modelling tools

    NARCIS (Netherlands)

    Dam, van J.C.; Singh, R.; Bessembinder, J.J.E.; Leffelaar, P.A.; Bastiaanssen, W.G.M.; Jhorar, R.K.; Kroes, J.G.; Droogers, P.

    2006-01-01

    In regions where water is more scarce than land, the water productivity concept (e.g. crop yield per unit of water utilized) provides a useful framework to analyse crop production increase or water savings in irrigated agriculture. Generic crop and soil models were applied at field and regional scal

  7. Lean manufacturing and Toyota Production System terminology applied to the procurement of vascular stents in interventional radiology.

    Science.gov (United States)

    de Bucourt, Maximilian; Busse, Reinhard; Güttler, Felix; Wintzer, Christian; Collettini, Federico; Kloeters, Christian; Hamm, Bernd; Teichgräber, Ulf K

    2011-08-01

    OBJECTIVES: To apply the economic terminology of lean manufacturing and the Toyota Production System to the procurement of vascular stents in interventional radiology. METHODS: The economic- and process-driven terminology of lean manufacturing and the Toyota Production System is first presented, including information and product flow as well as value stream mapping (VSM), and then applied to an interdisciplinary setting of physicians, nurses and technicians from different medical departments to identify wastes in the process of endovascular stent procurement in interventional radiology. RESULTS: Using the so-called seven wastes approach of the Toyota Production System (waste of overproducing, waiting, transport, processing, inventory, motion and waste of defects and spoilage) as well as further waste characteristics (gross waste, process and method waste, and micro waste), wastes in the process of endovascular stent procurement in interventional radiology were identified and eliminated to create an overall smoother process from the procurement as well as from the medical perspective. CONCLUSION: Economic terminology of lean manufacturing and the Toyota Production System, especially VSM, can be used to visualise and better understand processes in the procurement of vascular stents in interventional radiology from an economic point of view.

  8. The Production Management Improvement Model of Small and Medium-sized Permanent Magnetic Materials Manufacturing Companies

    Institute of Scientific and Technical Information of China (English)

    TIAN Zhong-qu; ZHANG Shu-juan; JIN Ya-juan; REN Miao; AN Pei

    2016-01-01

    An integrated production planning and control model based on MRP II and JIT is put forward through analyzing the characteristics of magnetic materials manufacturing companies. A Master Production Schedule with limited capacity and an operational plan at workshop level, based on the basic data of the flow chart, are formulated by this model, which applies the JIT idea and is driven by customer order demand. Push production is adopted during the execution phase, combined with process flow cards. The model is helpful to reduce inventory, keep a certain flexibility of production and improve the continuity and equilibrium of the manufacturing process.

  9. Statistical modelling of fine red wine production

    Directory of Open Access Journals (Sweden)

    María Rosa Castro

    2010-05-01

    Full Text Available Producing wine is a very important economic activity in the province of San Juan in Argentina; it is therefore most important to predict production regarding the quantity of raw material needed. This work was aimed at obtaining a model relating kilograms of crushed grape to the litres of wine so produced. Such a model will be used for predicting precise future values and confidence intervals for determined quantities of crushed grapes. Data from a vineyard in the province of San Juan was thus used in this work. The sampling coefficient of correlation was calculated and a dispersion diagram was then constructed; this indicated a linear relationship between the litres of wine obtained and the kilograms of crushed grape. Two linear models were then adopted and variance analysis was carried out because the data came from normal populations having the same variance. The most appropriate model was obtained from this analysis; it was validated with experimental values, a good approach being obtained.
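
    A minimal least-squares sketch of the kind of linear model described (litres of wine versus kilograms of crushed grape) is shown below; the data points are made up for illustration and are not the San Juan vineyard data.

    ```python
    # Illustrative ordinary-least-squares fit of wine volume against grape mass.
    import numpy as np

    grapes_kg = np.array([1000, 1500, 2000, 2500, 3000, 3500], dtype=float)  # hypothetical
    wine_l    = np.array([ 650, 1000, 1320, 1660, 1980, 2330], dtype=float)  # hypothetical

    # Fit wine = b0 + b1 * grapes
    b1, b0 = np.polyfit(grapes_kg, wine_l, deg=1)
    r = np.corrcoef(grapes_kg, wine_l)[0, 1]          # sample correlation coefficient

    print(f"wine = {b0:.1f} + {b1:.3f} * grapes, r = {r:.3f}")
    print("prediction for 2800 kg of crushed grape:", b0 + b1 * 2800, "litres")
    ```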

  10. Forensic engineering: applying materials and mechanics principles to the investigation of product failures.

    Science.gov (United States)

    Hainsworth, S V; Fitzpatrick, M E

    2007-06-01

    Forensic engineering is the application of engineering principles or techniques to the investigation of materials, products, structures or components that fail or do not perform as intended. In particular, forensic engineering can involve providing solutions to forensic problems by the application of engineering science. A criminal aspect may be involved in the investigation but often the problems are related to negligence, breach of contract, or providing information needed in the redesign of a product to eliminate future failures. Forensic engineering may include the investigation of the physical causes of accidents or other sources of claims and litigation (for example, patent disputes). It involves the preparation of technical engineering reports, and may require giving testimony and providing advice to assist in the resolution of disputes affecting life or property. This paper reviews the principal methods available for the analysis of failed components and then gives examples of different component failure modes through selected case studies.

  11. Performance of different fire retardant products applied on Norway spruce tested in a Cone calorimeter

    Directory of Open Access Journals (Sweden)

    Kögl Josef

    2013-11-01

    Full Text Available On the European market there are several fire retardant products available which reach class B in the European classification system. The producers promise that their fire retardants are effective in reducing different reaction-to-fire parameters of wood, such as the time to ignition, the mass loss rate, the heat release rate, the total heat release, the charring rate and the flame spread. This paper discusses the performance of fire retardant products as pressure-impregnated wood, non-intumescent surface coatings and intumescent coatings on Norway spruce (Picea abies). The investigations are performed using a cone calorimeter test according to ISO 5660. The thermal exposures of the investigations are 50 kW/m2 and the standard ISO 834 test curve. As a result, information about the heat release rate, the mass loss rate and the total heat release for a duration of 900 seconds is presented in this paper.

  12. Quantification of Benefits and Cost from Applying a Product Configuration System

    DEFF Research Database (Denmark)

    Kristjansdottir, Katrin; Shafiee, Sara; Hvam, Lars

    This article aims at analyzing the long-term benefits and the cost of developing, implementing and maintaining product configuration systems (PCSs). The results presented indicate that over a 5-year period a case company has achieved significant savings as a result of the reduced workload of generating the products' specifications. In addition, the lead-time for generating products' specifications has been reduced, and indications of improved quality of the products' specifications and additional sales are identified. The research verifies the benefits described in the current literature and contributes by linking the benefits to the direct cost savings companies can expect from utilizing PCSs.

  13. Development of statistical methodologies applied to anthropometric data oriented towards the ergonomic design of products

    OpenAIRE

    Vinué Visús, Guillermo

    2014-01-01

    Ergonomics is the scientific discipline that studies the interactions between human beings and the elements of a system and presents multiple applications in areas such as clothing and footwear design or both working and household environments. In each of these sectors, knowing the anthropometric dimensions of the current target population is fundamental to ensure that products suit as well as possible most of the users who make up the population. Anthropometry refers to the study of the meas...

  14. Applying Behavioral Economics Concepts in Designing Usage-Based Car Insurance Products

    OpenAIRE

    Greenberg, Allen

    2010-01-01

    Behavioral economics, a discipline combining economics and psychology to explain consumer decision making, offers insights on how best to institute transportation pricing in a manner that is acceptable to drivers and also meets public policy objectives. As an example of how to use this relatively new discipline to enhance the acceptance and benefits of transportation pricing, its application to designing usage-based or pay-as-you-drive-and-you-save (PAYDAYS) insurance products is explored. Sp...

  15. Brand extensions : a marketing plan for the launching of a tea product line applied to Luso

    OpenAIRE

    Santos, João Paulo Coelho dos

    2013-01-01

    Project submitted as partial requirement for the conferral of Master of Science in Marketing / JEL Codes: Marketing M3, Health I19 This thesis project has the main goal of developing a strategic and operational marketing plan to support the launching of a tea product range by Luso directed to healthy minded consumers. The launching process will take place in Portuguese market during the summer of 2013. In this marketing plan was identified the potential market size and set actions for t...

  16. The production of subjectivities in the Net: Following the trail of a division of applied psychology

    OpenAIRE

    Leal Ferreira, Arthur Arruda; Universidade Federal do Rio de Janeiro; Foureaux, Bruno; Universidade Federal do Rio de Janeiro, Rio de Janeiro, Brasil; Torres Brandão, Julia; Universidade Federal do Rio de Janeiro, Rio de Janeiro, Brasil; Ruthes Sodré, Karoline; Universidade Federal do Rio de Janeiro, Rio de Janeiro, Brasil; Barbosa Verly Miguel, Marcus Vinicius; Universidade Federal do Rio de Janeiro, Rio de Janeiro, Brasil; Barbosa Pereira, Natalia; Universidade Federal do Rio de Janeiro, Rio de Janeiro, Brasil

    2013-01-01

    This paper seeks to bring to the stage the different modes of production of subjectivities engendered by clinical psychological practices and modes of translation and coordination between them. Such research is based on the conceptual Political Epistemology of Stengers and Despret and the Actor-Network Theory of Latour and Law. For these authors, scientific knowledge is produced not as a representation of reality through well-formed sentences, but as modes of articulation between researchers ...

  17. Modeling of Kefir Production with Fuzzy Logic

    Directory of Open Access Journals (Sweden)

    Hüseyin Nail Akgül

    2014-06-01

    Full Text Available The fermentation is ended at a pH value of 4.6 in the industrial production of kefir. In this study, the incubation temperature, the incubation time and the inoculum ratio of culture were chosen as the variable parameters of kefir. In conventional control systems, the value of pH can be found by trial and error. In these systems, if the number of input parameters is large, the trial-and-error method creates a system that is person-dependent as well as troublesome. Fuzzy logic can be used in such cases. Modeling studies with this fuzzy logic control are examined in two parts: the first part consists of fuzzy rules and membership functions, while the second part consists of defuzzification. An incubation temperature between 20 and 25 °C, an incubation period between 18 and 22 hours and an inoculum ratio of culture between 1 and 5% are selected as the optimum production conditions. Three separate fuzzy sets (triangular membership functions) are used to fuzzify the incubation temperature, the incubation time and the inoculum ratio of culture. Because each of the three input parameters has 3 membership functions, 3x3x3 = 27 rules are obtained by multiplying these numbers. The table of fuzzy rules was obtained using the method of Mamdani. The membership function values were determined by the weighted-average method, using the three trapezoidal membership functions created for defuzzification. The success of the system is assessed by comparing the numerical values obtained with the pH values that should be reached. Eventually, to achieve the desired pH value of 4.6 in the production of kefir, the use of fuzzy logic can decrease the workload of people and increase the productivity of the business.
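
    A minimal sketch of the triangular fuzzification and one Mamdani rule is given below; the input ranges are the ones quoted in the abstract, but the low/medium/high breakpoints and the example rule are hypothetical choices made only to illustrate the min-operator.

    ```python
    # Sketch of triangular fuzzification for the three kefir inputs and one Mamdani rule.

    def tri(x, a, b, c):
        """Triangular membership function with feet a, c and peak b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def fuzzify(value, lo, hi):
        """Return (low, medium, high) memberships on the range [lo, hi] (hypothetical sets)."""
        mid = (lo + hi) / 2.0
        return (tri(value, lo - (mid - lo), lo, mid),
                tri(value, lo, mid, hi),
                tri(value, mid, hi, hi + (hi - mid)))

    temp = fuzzify(23.0, 20.0, 25.0)     # incubation temperature in deg C
    time = fuzzify(20.0, 18.0, 22.0)     # incubation time in hours
    inoc = fuzzify(3.0, 1.0, 5.0)        # inoculum ratio in %

    # One of the 3 x 3 x 3 = 27 Mamdani rules:
    # IF temp is medium AND time is medium AND inoculum is medium THEN pH is "target"
    firing_strength = min(temp[1], time[1], inoc[1])
    print("rule firing strength:", firing_strength)
    ```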

  18. Process balance and product quality in the production of natural indigo from Polygonum tinctorium Ait. applying low-technology methods.

    Science.gov (United States)

    Bechtold, T; Turcanu, A; Geissler, S; Ganglberger, E

    2002-02-01

    Indigo is the most important blue component in the class of natural dyes for cellulose and protein fibres. In the moderate European climate Polygonum tinctorium Ait. could be an interesting source for natural indigo (Vat blue 1). Following a cultivation of the plant material a simple procedure for the extraction of the indigo precursor indican was investigated with regard to crop and quality of dye obtained. The dependence of the crop on the storage conditions of the harvested plant material was investigated. The results quantify the distinct sensitivity of the fresh material to the time of storage before extraction with regard to the amount of natural indigo obtained, the photometrically determined indigo content in the product and the shade and colour depth observed in standardised dyeing experiments. A basic set of data is presented, which describes the process in terms of consumption of energy, water and chemicals and organic waste released from the extraction step.

  19. Modelling the effects of the sterile insect technique applied to Eldana saccharina Walker in sugarcane

    Directory of Open Access Journals (Sweden)

    L Potgieter

    2012-12-01

    Full Text Available A mathematical model is formulated for the population dynamics of an Eldana saccharina Walker infestation of sugarcane under the influence of partially sterile released insects. The model describes the population growth of and interaction between normal and sterile E. saccharina moths in a temporally variable, but spatially homogeneous environment. The model consists of a deterministic system of difference equations subject to strictly positive initial data. The primary objective of this model is to determine suitable parameters in terms of which the above population growth and interaction may be quantified and according to which E. saccharina infestation levels and the associated sugarcane damage may be measured. Although many models have been formulated in the past describing the sterile insect technique, few of these models describe the technique for Lepidopteran species with more than one life stage and where F1-sterility is relevant. In addition, none of these models consider the technique when fully sterile females and partially sterile males are being released. The model formulated is also the first to describe the technique applied specifically to E. saccharina, and to consider the economic viability of applying the technique to this species. Pertinent decision support is provided to farm managers in terms of the best timing for releases, release ratios and release frequencies.
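
    For intuition only, a classic Knipling-style sterile-release difference equation is sketched below; it is a single-stage caricature of sterile insect dynamics, not the multi-stage, partially sterile E. saccharina model formulated in the paper, and all parameter values are hypothetical.

    ```python
    # Classic Knipling-style sterile-release difference equation (deliberately simplified).
    def simulate(F0=10000.0, growth=3.0, release=30000.0, carrying=1e6, generations=12):
        F = F0                      # fertile female population
        history = [F]
        for _ in range(generations):
            fertile_fraction = F / (F + release)          # dilution by sterile males
            F = growth * F * fertile_fraction * (1.0 - F / carrying)
            F = max(F, 0.0)
            history.append(F)
        return history

    for gen, f in enumerate(simulate()):
        print(f"generation {gen:2d}: {f:10.1f}")
    ```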

  20. DEFICIENT INFORMATION MODELING OF MECHANICAL PRODUCTS FOR CONCEPTUAL SHAPE DESIGN

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    To address the deficient nature of product information in conceptual design, a framework of deficient information modeling for conceptual shape design is put forward, which includes qualitative shape modeling (a qualitative solid model), uncertain shape modeling (an uncertain relation model) and imprecise shape modeling (an imprecise region model). In the framework, the qualitative solid model is the core, which represents qualitatively (using symbols) the conceptual shapes of mechanical products. The uncertain relation model, regarding domain relations as objects, and the imprecise region model, regarding domains as objects, are used to deal with the uncertain and imprecise issues respectively, which arise from qualitative shape modeling or exist in the product information itself.

  1. Proposal for product development model focused on ce certification methodology

    Directory of Open Access Journals (Sweden)

    Nathalia Marcia Goulart Pinheiro

    2015-09-01

    Full Text Available This paper presents a critical analysis comparing 21 product development models in order to identify whether these structures meet the demands of Product Certification of the European Community (CE). Furthermore, it presents a product development model comprising the steps in the models analyzed, including improvements in activities for the referred product certification. The proposed improvements are justified by the growing quest for the internationalization of products and processes within companies.

  2. Extension of internationalisation models drivers and processes for the globalisation of product development

    DEFF Research Database (Denmark)

    Søndergaard, Erik Stefan; Oehmen, Josef; Ahmed-Kristensen, Saeema

    2016-01-01

    This paper develops an extension to established production- and supply chain management focused internationalisation models. It applies explorative case studies in Danish and Chinese engineering firms to discover how the globalisation process of product development differs from Danish and Chinese perspectives. The paper uses internationalisation and global product development theory to explain similarities and differences in the approaches. Grounded in the case study results, a new model for internationalisation is proposed. The new model expands the internationalisation process model to include steps of product development and collaborative distributed development beyond sourcing, sales and production elements. The paper then provides propositions for how to further develop the suggested model, and how western companies can learn from the Chinese approaches and globalise their product development ...

  3. A generalised chemical precipitation modelling approach in wastewater treatment applied to calcite

    DEFF Research Database (Denmark)

    Mbamba, Christian Kazadi; Batstone, Damien J.; Flores Alsina, Xavier

    2015-01-01

    ... the present study aims to identify a broadly applicable precipitation modelling approach. The study uses two experimental platforms applied to calcite precipitating from synthetic aqueous solutions to identify and validate the model approach. Firstly, dynamic pH titration tests are performed to define ... of the mineral particulate state (Xcryst) and, for calcite, have a 2nd-order dependency (exponent n = 2.05 ± 0.29) on thermodynamic supersaturation (σ). Parameter analysis indicated that the model was more tolerant to a fast kinetic coefficient (kcryst) and so, in general, it is recommended that a large kcryst ...
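
    The kinetic rate expression implied by the abstract (first order in the mineral particulate state and roughly second order in supersaturation) can be written in the generic form

    $r_{\mathrm{cryst}} = k_{\mathrm{cryst}}\, X_{\mathrm{cryst}}\, \sigma^{n}, \qquad n \approx 2.05 \pm 0.29 \ \text{for calcite},$

    where $\sigma$ is the thermodynamic supersaturation; this is the general form suggested by the abstract, not an equation quoted verbatim from the paper.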

  4. A THIRD-ORDER BOUSSINESQ MODEL APPLIED TO NONLINEAR EVOLUTION OF SHALLOW-WATER WAVES

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The conventional Boussinesq model is extended to the third order in dispersion and nonlinearity. The new equations are shown to possess better linear dispersion characteristics. For the evolution of periodic waves over a constant depth, the computed wave envelops are spatially aperiodic and skew. The model is then applied to the study of wave focusing by a topographical lens and the results are compared with Whalin's (1971) experimental data as well as some previous results from the conventional Boussinesq model. Encouragingly, improved agreement with Whalin's experimental data is found.

  5. Event based uncertainty assessment in urban drainage modelling, applying the GLUE methodology

    DEFF Research Database (Denmark)

    Thorndahl, Søren; Beven, K.J.; Jensen, Jacob Birk

    2008-01-01

    In the present paper an uncertainty analysis of an application of the commercial urban drainage model MOUSE is conducted. Applying the Generalized Likelihood Uncertainty Estimation (GLUE) methodology, the model is conditioned on observation time series from two flow gauges as well as the occurrence of combined sewer overflow. The GLUE methodology is used to test different conceptual setups in order to determine if one model setup gives a better goodness of fit conditional on the observations than the other. Moreover, different methodological investigations of GLUE are conducted in order to test ...
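
    A minimal sketch of the GLUE recipe (Monte Carlo sampling, a behavioural threshold on a likelihood measure, and likelihood-weighted predictions) is given below with a toy linear model standing in for the MOUSE simulations; the threshold, likelihood measure and data are hypothetical.

    ```python
    # Minimal GLUE-style sketch with a toy model in place of the urban drainage model.
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.arange(1.0, 6.0)
    observed = np.array([1.0, 1.8, 2.9, 4.2, 4.9])            # made-up flow observations

    samples = rng.uniform(low=[0.5, -1.0], high=[1.5, 1.0], size=(5000, 2))
    sims = samples[:, 0:1] * t + samples[:, 1:2]               # toy model: q = a*t + b

    # Nash-Sutcliffe efficiency used as the (informal) likelihood measure
    nse = 1.0 - ((sims - observed) ** 2).sum(axis=1) / ((observed - observed.mean()) ** 2).sum()
    behavioural = nse > 0.7                                    # acceptance threshold
    weights = nse[behavioural] / nse[behavioural].sum()

    weighted_mean = weights @ sims[behavioural]                # likelihood-weighted prediction
    print("behavioural parameter sets:", behavioural.sum())
    print("weighted mean prediction:", np.round(weighted_mean, 2))
    ```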

  6. Dynamic plant uptake model applied for drip irrigation of an insecticide to pepper fruit plants

    DEFF Research Database (Denmark)

    Legind, Charlotte Nielsen; Kennedy, C. M.; Rein, Arno;

    2011-01-01

    ... irrigation, its application for a soil-applied insecticide and a sensitivity analysis of the model parameters. RESULTS: The model predicted the measured increase and decline of residues following two soil applications of an insecticide to peppers, with an absolute error between model and measurement ranging from 0.002 to 0.034 mg kg fw(-1). Maximum measured concentrations in pepper fruit were approximately 0.22 mg kg fw(-1). Temperature was the most sensitive component for predicting the peak and final concentration in pepper fruit, through its influence on soil and plant degradation rates ...

  7. Applying a Knowledge Management Modeling Tool for Manufacturing Vision (MV) Development

    DEFF Research Database (Denmark)

    Wang, Chengbo; Luxhøj, James T.; Johansen, John

    2004-01-01

    This paper introduces an empirical application of an experimental model for knowledge management within an organization, namely a case-based reasoning model for manufacturing vision development (CBRM). The model integrates the development process of manufacturing vision with the methodology of case ... that the CBRM is supportive to the decision-making process of applying and augmenting organizational knowledge. It provides a new angle to tackle strategic management issues within the manufacturing system of a business operation. It explores a new proposition within strategic manufacturing management by enriching ...

  8. Comparison among Models to Estimate the Shielding Effectiveness Applied to Conductive Textiles

    Directory of Open Access Journals (Sweden)

    Alberto Lopez

    2013-01-01

    Full Text Available The purpose of this paper is to present a comparison between two models, together with measurements, for calculating the shielding effectiveness of electromagnetic barriers, applying it to conductive textiles. Each one models a conductive textile as either (1) a wire mesh screen or (2) a compact material. Therefore, the objective is to perform an analysis of the models in order to determine which one is a better approximation for electromagnetic shielding fabrics. In order to provide results for the comparison, the shielding effectiveness of the sample has been measured by means of the standard ASTM D4935-99.
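
    Both models and the ASTM D4935 measurement express their result as shielding effectiveness, conventionally defined (standard definition, not specific to this paper) as

    $\mathrm{SE}\ (\mathrm{dB}) = 20 \log_{10}\left|\frac{E_i}{E_t}\right| = 10 \log_{10}\frac{P_i}{P_t},$

    i.e. the ratio of incident to transmitted field (or power) with and without the textile barrier in place.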

  9. The bi-potential method applied to the modeling of dynamic problems with friction

    Science.gov (United States)

    Feng, Z.-Q.; Joli, P.; Cros, J.-M.; Magnain, B.

    2005-10-01

    The bi-potential method has been successfully applied to the modeling of frictional contact problems in static cases. This paper presents an extension of this method for dynamic analysis of impact problems with deformable bodies. A first order algorithm is applied to the numerical integration of the time-discretized equation of motion. Using the Object-Oriented Programming (OOP) techniques in C++ and OpenGL graphical support, a finite element code including pre/postprocessor FER/Impact is developed. The numerical results show that, at the present stage of development, this approach is robust and efficient in terms of numerical stability and precision compared with the penalty method.

  10. Identification of profitable areas to apply product configuration systems in Engineering-To-Order companies

    DEFF Research Database (Denmark)

    Kristjansdottir, Katrin; Hvam, Lars; Shafiee, Sara

    2015-01-01

    This article suggests a systematic framework for identifying potential areas where Engineering-To-Order (ETO) companies may increase their profitability by implementing a Product Configuration System (PCS). In order to do so, a three-step framework is proposed based on the literature. The starting point is to conduct a profitability analysis to determine the accuracy of the cost estimations, and based on that the reason for the deviations across different projects is found. The next step is to generate the scope for different scenarios that aim to improve the current situation. Finally ...

  11. Reduction of Biological Sludge Production Applying an Alternating Oxic/anoxic Process in Water Line.

    Science.gov (United States)

    Eusebi, Anna Laura; Panigutti, Maximiliano; Battistoni, Paolo

    2016-06-01

    Alternating oxic/anoxic process, applied for the main objective of the improvement of nitrogen performances, was studied in terms of secondary effect of biomass reduction. The process was carried out in one real water resource recovery facility and the data were compared with the previous conventional period when a conventional process was adopted. The main mechanism of the process for the sludge minimization is recognized in the metabolic uncoupling. In fact, an increase of the specific oxygen uptake rate in the biological reactor was recorded stimulated by the change of the oxidation reduction potential environment. Moreover, the heterotrophic growth yield was measured equal to 0.385 kgVSS/kgCOD. The global percentage of reduction was tested with the mass balance of solids. The process is able to decrease the observed sludge yield up to 20%. The specific energy consumption was evaluated.

  12. Information Sharing In Shipbuilding based on the Product State Model

    DEFF Research Database (Denmark)

    Larsen, Michael Holm

    1999-01-01

    The paper provides a review of product modelling technologies and the overall architecture for the Product State Model (PSM) environment as a basis for how dynamically updated product data can improve control of production activities. Especially, the paper focuses on the circumstances prevailing...

  13. Global sensitivity analysis applied to drying models for one or a population of granules

    DEFF Research Database (Denmark)

    Mortier, Severine Therese F. C.; Gernaey, Krist; Thomas, De Beer;

    2014-01-01

    The development of mechanistic models for pharmaceutical processes is of increasing importance due to a noticeable shift toward continuous production in the industry. Sensitivity analysis is a powerful tool during the model building process. A global sensitivity analysis (GSA), exploring sensitivity in a broad parameter space, is performed to detect the most sensitive factors in two models, that is, one for drying of a single granule and one for the drying of a population of granules [using a population balance model (PBM)], which was extended by including the gas velocity as extra input ... compared to our earlier work. beta(2) was found to be the most important factor for the single particle model, which is useful information when performing model calibration. For the PBM model, the granule radius and gas temperature were found to be most sensitive. The former indicates that granulator ...

  14. Radioscopy applied to the improvement of industrial processes of quality control in the Brazilian footwear production

    Energy Technology Data Exchange (ETDEWEB)

    Fernandes, Marcela Tatiana Fernandes; Mello Filho, Mauro Otto de Cavalcanti, E-mail: mbeserra@cefet-rj.br, E-mail: maurootto@cefet-rj.br [Centro Federal de Educacao Tecnologica Celso Suckow da Fonseca (CEFET-RJ), Rio de Janeiro, RJ (Brazil); Raupp, Fernanda Maria Pereira, E-mail: fraupp@puc-rio.br [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio), Rio de Janeiro, RJ (Brazil). Departamento de Engenharia Industrial

    2013-07-01

    According to the Ministry of Development, Industry and Foreign Trade, China has led imports in the Brazilian footwear market over the last five years, representing 70% of total imports. Brazil has been recording declines in footwear exports; in 2011 there was an average reduction of 21.5% compared to 2010. Thus, Brazil has moved to the eighth position in the export market. Moreover, Asians have been improving the quality and technological level of their footwear for niche markets. It is well known that the introduction of new technologies into industrial organizations enables adding value to their products, making the organizations more competitive in the global market. In this work, we present a study on the use of the radioscopy technique to improve quality control in the Brazilian footwear industry. Already used by some international footwear manufacturers, aiming at the identification of foreign bodies and heel control, among other aspects, this technique brings innovation to the referred industry, since it is a non-destructive test approach that makes use of X-rays. We also propose a tool for the application of the radioscopy technique to improve quality control processes in footwear production, employing concepts of Failure Modes and Effects Analysis (FMEA). (author)

  15. A new glacier model resolving ice dynamics applied to the Alps

    Science.gov (United States)

    Maussion, Fabien; Marzeion, Ben

    2016-04-01

    Most regional and global glacier models rely on empirical scaling laws to account for glacier area and volume change with time. These scaling methods are computationally cheap and are statistically robust when applied to many glaciers, but their accuracy considerably lowers at the glacier or catchment scale. The nearest alternative in terms of complexity - glacier flowline modelling - requires significantly more information about the glacier geometry. Here we present a new open source glacier model applicable at regional to global scale implementing i) the determination of glacier centerlines, ii) the inversion of glacier bed topography, and iii) a multi-branch flowline model handling glacier tributaries. Using the HISTALP dataset as climatological input we apply the model in the Alps for 1800 to present and present new estimations of present-day and past glacier volume. The relatively large number of independent data available for validation in this region allow a critical discussion of the added value of our new approach. In particular, we will focus our discussion on two contradictory aspects inherent to any geoscientific model development: while our model clearly opens wide-ranging possibilities to better resolve the glacier processes, this new playground is associated with an increase in complexity, the number of calibration parameters, and…uncertainty?

  16. Applying a Hybrid QFD-TOPSIS Method to Design Product in the Industry (Case Study in Sum Service Company)

    OpenAIRE

    2012-01-01

    The electronics industry, as an industry with high added value, and the television production industry especially, as one of its pillars, play an important role in the economy of each country. Therefore, the aim of this paper is to illustrate how, using a combined QFD-TOPSIS model, organizations are able to design their product in accordance with the requirements of consumers, with a case study in the Sum Service Company. Quality Function Deployment (QFD) is one such extremely important quality management t...

  17. New Design Objective and Human Intent-based Management of Changes for Product Modeling

    Directory of Open Access Journals (Sweden)

    László Horváth

    2007-03-01

    Full Text Available This paper is concerned with the evaluation of effects of modeled object changes at development of product in virtual space. Modeling of engineering objects such as elements and structures of products, results of analyses and tests, processes for production, and customer services has reached the level where sophisticated descriptions and modeling procedures serve lifecycle management of product information (PLM). However, effective utilization of highly associative product models is impossible in current modeling because techniques are not available for tracking and evaluation of the high number of associative relationships in large product models. The author analyzed the above problem and considered inappropriate organization of product information as its main cause. In order to gain a solution in current modeling systems, a new method is proposed in this paper for change management. In this method, joint modeling of design objectives and human intent is applied for shape-centered products. As background information for the proposed modeling, the paper discusses research results in change management for product development in modeling environments. Following this, integrated modeling of closely related engineering objects is proposed as an extension to current industrial PLM systems. Next, design objective driven product change management is detailed. Finally, virtual space is outlined as a possible advanced application of the proposed change management with the capability of representation of human intent.

  18. Species Distribution Models for Crop Pollination: A Modelling Framework Applied to Great Britain

    NARCIS (Netherlands)

    C. Polce; M. Termansen; J. Aguirre-Gutiérrez; N.D. Boatman; G.E. Budge; A. Crowe; M.P. Garratt; S. Pietravalle; S.G. Potts; J. A. Ramirez; K.E. Somerwill; J.C. Biesmeijer

    2013-01-01

    Insect pollination benefits over three quarters of the world's major crops. There is growing concern that observed declines in pollinators may impact on production and revenues from animal pollinated crops. Knowing the distribution of pollinators is therefore crucial for estimating their availabilit

  19. APPLYING BLACK-BOX TESTING TO MODEL TRANSFORMATIONS IN THE MODEL DRIVEN ARCHITECTURE CONTEXT

    Directory of Open Access Journals (Sweden)

    Luciane Telinski Wiedermann Agner

    2014-01-01

    Full Text Available Testing model transformations has played a leading role with the dissemination of MDA in software development processes. Software testing based on black-box testing, together with the “category partitioning” method, can be used efficiently to conduct the verification of model transformations. This study applies software testing techniques to an ATL model transformation in the MDA context and points out their benefits. The black-box testing method was adapted to the MT-PROAPES model transformation, which is based on profiles and platform models. The platform models define the range of input models of MT-PROAPES and are used for the creation of the test cases. The test cases were selected so as to meet certain requirements and increase the ability to detect errors in the model transformation. This approach makes the test process more agile and does not require any abstraction of the behavioral properties of the transformations. The field of transformation testing and verification still faces significant challenges and requires a lot of research. Although it has some limitations, black-box testing adapts to various situations and allows integration with other test strategies.

  20. Modelling of composite concrete block pavement systems applying a cohesive zone model

    DEFF Research Database (Denmark)

    Skar, Asmus; Poulsen, Peter Noe

    This paper presents a numerical analysis of the fracture behaviour of the cement bound base material in composite concrete block pavement systems, using a cohesive zone model. The functionality of the proposed model is tested on experimental and numerical investigations of beam bending tests...

  1. Reynolds stress turbulence model applied to two-phase pressurized thermal shocks in nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Mérigoux, Nicolas, E-mail: nicolas.merigoux@edf.fr; Laviéville, Jérôme; Mimouni, Stéphane; Guingo, Mathieu; Baudry, Cyril

    2016-04-01

    Highlights: • NEPTUNE-CFD is used to model two-phase PTS. • k-ε model did produce some satisfactory results but also highlights some weaknesses. • A more advanced turbulence model has been developed, validated and applied for PTS. • Coupled with LIM, the first results confirmed the increased accuracy of the approach. - Abstract: Nuclear power plants are subjected to a variety of ageing mechanisms and, at the same time, exposed to potential pressurized thermal shock (PTS) – characterized by a rapid cooling of the internal Reactor Pressure Vessel (RPV) surface. In this context, NEPTUNE-CFD is used to model two-phase PTS and give an assessment of the structural integrity of the RPV. The first available choice was to use a standard first-order turbulence model (k-ε) to model the high-Reynolds-number flows encountered in Pressurized Water Reactor (PWR) primary circuits. In a first attempt, the use of the k-ε model did produce some satisfactory results in terms of condensation rate and temperature field distribution on integral experiments, but also highlighted some weaknesses in the way highly anisotropic turbulence is modelled. One way to improve the turbulence prediction – and consequently the temperature field distribution – is to opt for a more advanced Reynolds Stress turbulence Model. After various verification and validation steps on separated effects cases – co-current air/steam-water stratified flows in rectangular channels, water jet impingements on water pool free surfaces – this Reynolds Stress turbulence Model (R{sub ij}-ε SSG) has been applied for the first time to thermal free surface flows under industrial conditions on the COSI and TOPFLOW-PTS experiments. Coupled with the Large Interface Model, the first results confirmed the adequacy and increased accuracy of the approach in an industrial context.

  2. Pion production model - connection between dynamics and quark models

    Energy Technology Data Exchange (ETDEWEB)

    Lee, T.-S. H.; Sato, T.

    2000-05-17

    The authors discuss the difficulties in testing hadron models by using the N{sup *} parameters extracted from the empirical amplitude analyses of the {pi}N and {gamma}N reaction data. As an alternative, or perhaps a more advantageous approach, they present a Hamiltonian formulation that can relate the pion production dynamics and the constituent quark models of N{sup *} structure. The application of the approach in investigating the {Delta} and N{sup *}(S{sub 11}) excitations is reviewed. It is found that while the {Delta} excitation can be described satisfactorily, the {pi}N scattering in the S{sub 11} channel cannot be described by the constituent quark models based on either the one-gluon-exchange or one-meson-exchange mechanisms. A phenomenological quark-quark potential has been constructed to reproduce the S{sub 11} amplitude.

  3. Addressing dependability by applying an approach for model-based risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjorn Axel [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: bjorn.axel.gran@hrp.no; Fredriksen, Rune [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: rune.fredriksen@hrp.no; Thunem, Atoosa P.-J. [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: atoosa.p-j.thunem@hrp.no

    2007-11-15

    This paper describes how an approach for model-based risk assessment (MBRA) can be applied for addressing different dependability factors in a critical application. Dependability factors, such as availability, reliability, safety and security, are important when assessing the dependability degree of total systems involving digital instrumentation and control (I and C) sub-systems. In order to identify risk sources their roles with regard to intentional system aspects such as system functions, component behaviours and intercommunications must be clarified. Traditional risk assessment is based on fault or risk models of the system. In contrast to this, MBRA utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tried out within the telemedicine and e-commerce areas, and provided through a series of seven trials a sound basis for risk assessments. In this paper the results from the CORAS project are presented, and it is discussed how the approach for applying MBRA meets the needs of a risk-informed Man-Technology-Organization (MTO) model, and how methodology can be applied as a part of a trust case development.

  4. Modelling demand for crude oil products in Spain

    Energy Technology Data Exchange (ETDEWEB)

    Pedregal, D.J. [Escuela Tecnica Superior de Ingenieros Industriales and Instituto de Matematica Aplicada a la Ciencia y la Ingenieria (IMACI), Universidad de Castilla-La Mancha (UCLM), Avenida Camilo Jose Cela s/n, 13071 Ciudad Real (Spain); Dejuan, O.; Gomez, N.; Tobarra, M.A. [Facultad de Ciencias Economicas y Empresariales, Universidad de Castilla-La Mancha (UCLM) (Spain)

    2009-11-15

    This paper develops an econometric model of the demand for the five most important crude oil products in Spain. The aim is the estimation of a range of elasticities of such demands that would serve as the basis for an applied general equilibrium model used for forecasting energy demand in a broader framework. The main distinctive features of the system with respect to previous literature are (1) it takes advantage of monthly information coming from very different information sources and (2) multivariate unobserved components (UC) models are implemented, allowing for a separate analysis of long- and short-run relations. UC models decompose time series into a number of unobserved though economically meaningful components, mainly trend, seasonal and irregular. A module is added to this structure to take into account the influence of the exogenous variables necessary to compute price, cross and income elasticities. Since all models implemented are multivariate in nature, the demand components are allowed to interact among them through the system noises (similar to a seemingly unrelated equations model). The results show unambiguously that the main factor driving demand is real income, with prices having little impact on energy consumption. (author)
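
    As an illustration of the unobserved-components approach described above, the sketch below decomposes a synthetic monthly demand series into trend, seasonal and irregular components with exogenous price and income regressors, using statsmodels. The data and specification are invented, and the example is univariate rather than the multivariate system estimated in the paper.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical monthly series: log demand for one oil product, with log price and log income
rng = np.random.default_rng(0)
n = 180
log_price = np.cumsum(rng.normal(0, 0.02, n))
log_income = np.linspace(0, 0.5, n) + rng.normal(0, 0.01, n)
log_demand = 1.0 - 0.1 * log_price + 0.8 * log_income + rng.normal(0, 0.05, n)

exog = pd.DataFrame({"log_price": log_price, "log_income": log_income})
model = sm.tsa.UnobservedComponents(log_demand, level="local linear trend",
                                    seasonal=12, exog=exog)
res = model.fit(disp=False)

# With all variables in logs, the exogenous coefficients read as price and income elasticities
print(res.params[-2:])
```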

  5. A mechanistic particle flux model applied to the oceanic phosphorus cycle

    Directory of Open Access Journals (Sweden)

    T. DeVries

    2014-03-01

    Full Text Available The sinking and decomposition of particulate organic matter are critical processes in the ocean's biological pump, but are poorly understood and crudely represented in biogeochemical models. Here we present a mechanistic model for particle fluxes in the ocean that solves the evolution of the particle size distribution with depth. The model can represent a wide range of particle flux profiles, depending on the surface particle size distribution, the relationships between particle size, mass and velocity, and the rate of particle mass loss during decomposition. Spatially variable flux profiles are embedded in a data-constrained ocean circulation model, where the most uncertain parameters governing particle dynamics are tuned to achieve an optimal fit to the global distribution of phosphate. The resolution of spatially variable particle sizes has a significant effect on modeled organic matter production rates, increasing production in oligotrophic regions and decreasing production in eutrophic regions compared to a model that assumes spatially uniform particle sizes and sinking fluxes. The mechanistic particle model can reproduce global nutrient distributions better than, and sediment trap fluxes as well as, other commonly used empirical formulas. However, these independent data constraints cannot be simultaneously matched in a closed P budget commonly assumed in ocean models. Through a systematic addition of model processes, we show that the apparent discrepancy between particle flux and nutrient data can be resolved through P burial, but only if that burial is associated with a slowly decaying component of organic matter as might be achieved through protection by ballast minerals. Moreover, the model solution that best matches both datasets requires a larger rate of P burial (and compensating inputs than have been previously estimated. Our results imply a marine PO4 inventory with a residence time of a few thousand years, similar to that of the

  6. The promises and pitfalls of applying computational models to neurological and psychiatric disorders.

    Science.gov (United States)

    Teufel, Christoph; Fletcher, Paul C

    2016-10-01

    Computational models have become an integral part of basic neuroscience and have facilitated some of the major advances in the field. More recently, such models have also been applied to the understanding of disruptions in brain function. In this review, using examples and a simple analogy, we discuss the potential for computational models to inform our understanding of brain function and dysfunction. We argue that they may provide, in unprecedented detail, an understanding of the neurobiological and mental basis of brain disorders and that such insights will be key to progress in diagnosis and treatment. However, there are also potential problems attending this approach. We highlight these and identify simple principles that should always govern the use of computational models in clinical neuroscience, noting especially the importance of a clear specification of a model's purpose and of the mapping between mathematical concepts and reality.

  7. Hybrid nested sampling algorithm for Bayesian model selection applied to inverse subsurface flow problems

    Energy Technology Data Exchange (ETDEWEB)

    Elsheikh, Ahmed H., E-mail: aelsheikh@ices.utexas.edu [Institute for Computational Engineering and Sciences (ICES), University of Texas at Austin, TX (United States); Institute of Petroleum Engineering, Heriot-Watt University, Edinburgh EH14 4AS (United Kingdom); Wheeler, Mary F. [Institute for Computational Engineering and Sciences (ICES), University of Texas at Austin, TX (United States); Hoteit, Ibrahim [Department of Earth Sciences and Engineering, King Abdullah University of Science and Technology (KAUST), Thuwal (Saudi Arabia)

    2014-02-01

    A Hybrid Nested Sampling (HNS) algorithm is proposed for efficient Bayesian model calibration and prior model selection. The proposed algorithm combines the Nested Sampling (NS) algorithm, Hybrid Monte Carlo (HMC) sampling and gradient estimation using the Stochastic Ensemble Method (SEM). NS is an efficient sampling algorithm that can be used for Bayesian calibration and for estimating the Bayesian evidence for prior model selection. Nested sampling has the advantage of computational feasibility. Within the nested sampling algorithm, a constrained sampling step is performed. For this step, we utilize HMC to reduce the correlation between successive sampled states. HMC relies on the gradient of the logarithm of the posterior distribution, which we estimate using a stochastic ensemble method based on an ensemble of directional derivatives. SEM only requires forward model runs; the simulator is used as a black box and no adjoint code is needed. The developed HNS algorithm is successfully applied for Bayesian calibration and prior model selection of several nonlinear subsurface flow problems.
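
    To make the Nested Sampling building block concrete, here is a bare-bones sketch of the classic NS evidence estimate for a toy two-dimensional Gaussian likelihood under a uniform prior. The constrained sampling step is done by naive rejection; in the paper that step is replaced by Hamiltonian Monte Carlo with stochastic-ensemble gradient estimates, which this sketch does not attempt to reproduce, and the usual termination correction is omitted.

```python
import numpy as np

def log_likelihood(theta):
    # Toy unimodal likelihood: standard normal in 2-D (assumption for illustration)
    return -0.5 * np.sum(theta ** 2)

def nested_sampling(n_live=200, n_iter=1000, dim=2, seed=0):
    """Minimal Nested Sampling estimate of the log-evidence."""
    rng = np.random.default_rng(seed)
    live = rng.uniform(-5, 5, size=(n_live, dim))           # draws from the uniform prior
    live_logl = np.array([log_likelihood(t) for t in live])
    log_z = -np.inf
    log_shell = np.log(1.0 - np.exp(-1.0 / n_live))          # log prior-mass shrinkage per step
    for i in range(n_iter):
        worst = np.argmin(live_logl)
        log_z = np.logaddexp(log_z, log_shell - i / n_live + live_logl[worst])
        # Constrained step: replace the worst point by a prior draw with higher likelihood
        while True:
            cand = rng.uniform(-5, 5, dim)
            if log_likelihood(cand) > live_logl[worst]:
                break
        live[worst], live_logl[worst] = cand, log_likelihood(cand)
    return log_z

print(nested_sampling())
```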

  8. The Motivational Knowledge Management Model: proposal to apply it in the library sector

    Directory of Open Access Journals (Sweden)

    Daniel López-Fernández

    2016-12-01

    Full Text Available In professional environments, attention paid to aspects such as supervisory styles, interpersonal relationships and workers eagerness can have a positive impact on employee motivation and, consequently, on their performance and well-being. To achieve this, knowledge management models such as those presented here can be applied. This model generates diagnoses of motivation and recommendations for improvement, both systematically and scientifically. Consequently, it is especially useful for managers and human resource departments. The proposed model can be adapted to different kinds of professional groups, including those in library and documentation services. The suitability, reliability and usefulness of the proposed model have been empirically checked through case studies with 92 students and 166 professionals. The positive results allow us to conclude that the model is effective and useful for assessing and improving motivation.

  9. Applying the ISO 9126 Model to the Evaluation of an E-learning System in Iran

    Directory of Open Access Journals (Sweden)

    Hossein Pedram

    2012-03-01

    Full Text Available One of the models presented in the field of e-learning quality systems is the ISO 9126 model, which is applied in this research to evaluate the e-learning system of Amirkabir University. The model provides six main variables for evaluation, each of which is measured by several other indicators. The ISO 9126 model parameters were therefore turned into a questionnaire, which was distributed to and completed by a sample of 120 experts and students of Amirkabir University. Based on the results, an effective, direct and meaningful relationship exists between the e-learning system of Amirkabir University and each of the six factors of the model. Also, based on the results, the e-learning system of Amirkabir University was affected most, in order, by maintainability, efficiency, portability, functionality, usability and reliability.

  10. Positive Mathematical Programming Approaches – Recent Developments in Literature and Applied Modelling

    Directory of Open Access Journals (Sweden)

    Thomas Heckelei

    2012-05-01

    Full Text Available This paper reviews and discusses the more recent literature and application of Positive Mathematical Programming in the context of agricultural supply models. Specifically, advances in the empirical foundation of parameter specifications as well as the economic rationalisation of PMP models – both criticized in earlier reviews – are investigated. Moreover, the paper provides an overview on a larger set of models with regular/repeated policy application that apply variants of PMP. Results show that most applications today avoid arbitrary parameter specifications and rely on exogenous information on supply responses to calibrate model parameters. However, only few approaches use multiple observations to estimate parameters, which is likely due to the still considerable technical challenges associated with it. Equally, we found only limited reflection on the behavioral or technological assumptions that could rationalise the PMP model structure while still keeping the model’s advantages.

  11. Applying Forecast Models from the Center for Integrated Space Weather Modeling

    Science.gov (United States)

    Gehmeyr, M.; Baker, D. N.; Millward, G.; Odstrcil, D.

    2007-12-01

    The Center for Integrated Space Weather Modeling (CISM) has developed three forecast models (FMs) for the Sun-Earth chain. They have been matured by various degrees toward the operational stage. The Sun-Earth FM suite comprises empirical and physical models: the Planetary Equivalent Amplitude (AP-FM), the Solar Wind (SW- FM), and the Geospace (GS-FM) models. We give a brief overview of these forecast models and touch briefly on the associated validation studies. We demonstrate the utility of the models: AP-FM supporting the operations of the AIM (Aeronomy of Ice in the Mesosphere) mission soon after launch; SW-FM providing assistance with the interpretation of the STEREO beacon data; and GS-FM combining model and observed data to characterize the aurora borealis. We will then discuss space weather tools in a more general sense, point out where the current capabilities and shortcomings are, and conclude with a look forward to what areas need improvement to facilitate better real-time forecasts.

  12. SPI drought class prediction using log-linear models applied to wet and dry seasons

    Science.gov (United States)

    Moreira, Elsa E.

    2016-08-01

    A log-linear modelling for 3-dimensional contingency tables was used with categorical time series of SPI drought class transitions for prediction of monthly drought severity. Standardized Precipitation Index (SPI) time series at the 12- and 6-month time scales were computed for 10 precipitation time series relative to GPCC datasets with 2.5° spatial resolution located over Portugal and with 112 years of length (1902-2014). The aim was modelling two-month step class transitions for the wet and dry seasons of the year and then obtaining probability ratios - odds - as well as their respective confidence intervals to estimate how probable a transition is compared to another. The prediction results produced by the modelling applied to the wet and dry seasons separately, for the 6- and the 12-month SPI time scales, were compared with the results produced by the same modelling without the split, using skill scores computed for the entire time series length. Results point to good prediction performances, ranging from 70 to 80% in the percentage of corrects (PC) and 50-70% in the Heidke skill score (HSS), with the highest scores obtained when the modelling is applied to the SPI12. The wet and dry season distinction introduced in the modelling brought improvements in the predictions of about 0.9-4% in the PC and 1.3-6.8% in the HSS, with the highest improvements obtained in the SPI6 application.
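
    The two verification measures quoted above have simple closed forms. The snippet below computes the percentage of corrects (PC) and the Heidke skill score (HSS) from a predicted-versus-observed contingency table of drought classes; the table entries are hypothetical and only illustrate the calculation.

```python
import numpy as np

def percentage_correct(table):
    """PC: share of predictions on the diagonal of the predicted-vs-observed table."""
    table = np.asarray(table, dtype=float)
    return 100.0 * np.trace(table) / table.sum()

def heidke_skill_score(table):
    """HSS: accuracy relative to the accuracy expected by chance."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    hit_rate = np.trace(table) / n
    chance = np.sum(table.sum(axis=0) * table.sum(axis=1)) / n ** 2
    return (hit_rate - chance) / (1.0 - chance)

# Hypothetical 3-class verification table (rows: predicted class, columns: observed class)
table = [[60, 8, 2],
         [10, 25, 5],
         [3, 6, 11]]
print(percentage_correct(table), heidke_skill_score(table))
```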

  13. Appropriate Mathematical Model of DC Servo Motors Applied in SCARA Robots

    Directory of Open Access Journals (Sweden)

    Attila L. Bencsik

    2004-11-01

    Full Text Available In the first part of the presentation detailed description of the modular technical system built up of electric components and end-effectors is given. Each of these components was developed at different industrial companies separately. The particular mechatronic unit under consideration was constructed by the use of the appropriate mathematical model of these units. The aim of this presentation is to publish the results achieved by the use of a mathematical modeling technique invented and applied in the development of different mechatronic units as drives and actuators. The unified model describing the whole system was developed with the integration of the models valid to the particular components. In the phase of testing the models a program approximating typical realistic situations in terms of work-loads and physical state of the system during operation was developed and applied. The main innovation here presented consists in integrating the conclusions of professional experiences the developers gained during their former R&D activity in different professional environments. The control system is constructed on the basis of classical methods, therefore the results of the model investigations can immediately be utilized by the developer of the whole complex system, which for instance may be an industrial robot.

  14. Applying risk and resilience models to predicting the effects of media violence on development.

    Science.gov (United States)

    Prot, Sara; Gentile, Douglas A

    2014-01-01

    Although the effects of media violence on children and adolescents have been studied for over 50 years, they remain controversial. Much of this controversy is driven by a misunderstanding of causality that seeks the cause of atrocities such as school shootings. Luckily, several recent developments in risk and resilience theories offer a way out of this controversy. Four risk and resilience models are described, including the cascade model, dose-response gradients, pathway models, and turning-point models. Each is described and applied to the existing media effects literature. Recommendations for future research are discussed with regard to each model. In addition, we examine current developments in theorizing that stressors have sensitizing versus steeling effects and recent interest in biological and gene by environment interactions. We also discuss several of the cultural aspects that have supported the polarization and misunderstanding of the literature, and argue that applying risk and resilience models to the theories and data offers a more balanced way to understand the subtle effects of media violence on aggression within a multicausal perspective.

  15. Developing, Applying, and Evaluating Models for Rapid Screening of Chemical Exposures

    DEFF Research Database (Denmark)

    Arnot, J.; Shin, H.; Ernstoff, Alexi;

    2015-01-01

    to limited exposure data there is limited information on chemical use patterns and production and emission quantities. These data gaps require the application of mass balance, statistical and quantitative structure-activity relationship (QSAR) models to predict exposure and exposure potential for humans...

  16. Applying the Achievement Orientation Model to the Job Satisfaction of Teachers of the Gifted

    Science.gov (United States)

    Siegle, Del; McCoach, D. Betsy; Shea, Kelly

    2014-01-01

    Factors associated with motivation and satisfaction aid in understanding the processes that enhance achievement and productivity. Siegle and McCoach (2005) proposed a motivational model for understanding student achievement and underachievement that included self-perceptions in three areas (meaningfulness [goal valuation], self-efficacy, and…

  17. SIGMA: A Knowledge-Based Simulation Tool Applied to Ecosystem Modeling

    Science.gov (United States)

    Dungan, Jennifer L.; Keller, Richard; Lawless, James G. (Technical Monitor)

    1994-01-01

    The need for better technology to facilitate building, sharing and reusing models is generally recognized within the ecosystem modeling community. The Scientists' Intelligent Graphical Modelling Assistant (SIGMA) creates an environment for model building, sharing and reuse which provides an alternative to more conventional approaches which too often yield poorly documented, awkwardly structured model code. The SIGMA interface presents the user a list of model quantities which can be selected for computation. Equations to calculate the model quantities may be chosen from an existing library of ecosystem modeling equations, or built using a specialized equation editor. Inputs for the equations may be supplied by data or by calculation from other equations. Each variable and equation is expressed using ecological terminology and scientific units, and is documented with explanatory descriptions and optional literature citations. Automatic scientific unit conversion is supported and only physically-consistent equations are accepted by the system. The system uses knowledge-based semantic conditions to decide which equations in its library make sense to apply in a given situation, and supplies these to the user for selection. The equations and variables are graphically represented as a flow diagram which provides a complete summary of the model. Forest-BGC, a stand-level model that simulates photosynthesis and evapo-transpiration for conifer canopies, was originally implemented in Fortran and subsequently re-implemented using SIGMA. The SIGMA version reproduces daily results and also provides a knowledge base which greatly facilitates inspection, modification and extension of Forest-BGC.

  18. Parameters Investigation of Mathematical Model of Productivity for Automated Line with Availability by DMAIC Methodology

    Directory of Open Access Journals (Sweden)

    Tan Chan Sin

    2014-01-01

    Full Text Available Automated lines are widely applied in industry, especially for mass production of products with little variety. Productivity is one of the important criteria for an automated line, as well as for industry in general, since it directly represents outputs and profits. Productivity must be forecast accurately in order to meet customer demand, and the forecast is calculated using a mathematical model. A mathematical model of productivity with availability for automated lines has been introduced to express productivity in terms of a single level of reliability for stations and mechanisms. Since this mathematical model of productivity with availability cannot get close enough to the actual productivity, owing to the loss parameters it does not consider, the model needs to be enhanced with those missing parameters. This paper presents the productivity loss parameters investigated using the DMAIC (Define, Measure, Analyze, Improve, and Control) concept and the PACE Prioritization Matrix (Priority, Action, Consider, and Eliminate). The investigated parameters are important for further improving the mathematical model of productivity with availability and for developing a robust mathematical model of productivity for automated lines.
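
    The exact form of the authors' productivity-with-availability model is not reproduced here, but the general idea of degrading a theoretical cycle rate by availability and by additional loss parameters can be sketched as follows; the functional form and all numbers are assumptions made purely for illustration.

```python
def availability(mtbf_hours, mttr_hours):
    """Availability from mean time between failures and mean time to repair."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def productivity_with_losses(cycle_time_s, avail, speed_loss=0.0, defect_rate=0.0):
    """Illustrative productivity (parts/hour) of an automated line: the theoretical
    rate is reduced by availability and by extra loss parameters of the kind the
    paper sets out to identify (assumed multiplicative form)."""
    theoretical_rate = 3600.0 / cycle_time_s
    return theoretical_rate * avail * (1.0 - speed_loss) * (1.0 - defect_rate)

a = availability(mtbf_hours=40.0, mttr_hours=2.0)
print(round(productivity_with_losses(cycle_time_s=12.0, avail=a,
                                     speed_loss=0.05, defect_rate=0.02), 1))
```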

  19. Applying trait-based models to achieve functional targets for theory-driven ecological restoration.

    Science.gov (United States)

    Laughlin, Daniel C

    2014-07-01

    Manipulating community assemblages to achieve functional targets is a key component of restoring degraded ecosystems. The response-and-effect trait framework provides a conceptual foundation for translating restoration goals into functional trait targets, but a quantitative framework has been lacking for translating trait targets into assemblages of species that practitioners can actually manipulate. This study describes new trait-based models that can be used to generate ranges of species abundances to test theories about which traits, which trait values and which species assemblages are most effective for achieving functional outcomes. These models are generalisable, flexible tools that can be widely applied across many terrestrial ecosystems. Examples illustrate how the framework generates assemblages of indigenous species to (1) achieve desired community responses by applying the theories of environmental filtering, limiting similarity and competitive hierarchies, or (2) achieve desired effects on ecosystem functions by applying the theories of mass ratios and niche complementarity. Experimental applications of this framework will advance our understanding of how to set functional trait targets to achieve the desired restoration goals. A trait-based framework provides restoration ecology with a robust scaffold on which to apply fundamental ecological theory to maintain resilient and functioning ecosystems in a rapidly changing world.

  20. Mathematical Model for the Selection of Processing Parameters in Selective Laser Sintering of Polymer Products

    OpenAIRE

    Ana Pilipović; Igor Drstvenšek; Mladen Šercer

    2014-01-01

    Additive manufacturing (AM) is increasingly applied in the development projects from the initial idea to the finished product. The reasons are multiple, but what should be emphasised is the possibility of relatively rapid manufacturing of the products of complicated geometry based on the computer 3D model of the product. There are numerous limitations primarily in the number of available materials and their properties, which may be quite different from the properties of the material of the fi...

  1. Investigations of Bread Production with Postponed Staling Applying Instrumental Measurements of Bread Crumb Color

    Directory of Open Access Journals (Sweden)

    Vladimir S. Popov

    2009-10-01

    Full Text Available Crumb color quality characteristics of breads of different compositions (whole grain, rye, barley and diet bread) were investigated at 24-hour intervals during three days after bread preparation by means of a MOM-color 100 tristimulus photo colorimeter, in the CIE, CIELab, ANLAB and Hunter systems. The highest value of average reflectance y (%) was found for barley bread (immediately after preparation), so it can be said that this sample was “conditionally” the lightest. The lowest values of y (%) were found for diet bread, so it can be considered “conditionally” the darkest product. The colors of all investigated bread samples were lighter after three days of keeping compared to day 0. Changes of average reflectance of bread samples packed in polyethylene packaging with keeping time can be described by a linear equation (correlation coefficient 0.99). The dominant wavelengths of barley and diet bread confirm the presence of yellow pigment. The color quality of the mentioned kinds of bread depends on the processes taking place during bread staling and on the raw material composition of the bread (flour). Color quality measurements can be used as an easy auxiliary method for screening in the development of slower-staling bread.

  2. Comparative Study of Various E. coli Strains for Biohydrogen Production Applying Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    Péter Bakonyi

    2012-01-01

    Full Text Available The proper strategy to establish efficient hydrogen-producing biosystems is the biochemical and physiological characterization of hydrogen-producing microbes, followed by metabolic engineering in order to give extraordinary properties to the strains and, finally, bioprocess optimization to realize enhanced hydrogen fermentation capability. In the present paper, we aim to show the utility of both strain engineering and process optimization through a comparative study of wild-type and genetically modified E. coli strains, where the effect of two major operational factors (substrate concentration and pH) on bioH2 production was investigated by experimental design, and response surface methodology (RSM) was used to determine the suitable conditions in order to obtain maximum yields. The results revealed that by employing the genetically engineered E. coli (DJT 135) strain under optimized conditions (pH: 6.5; formate conc.: 1.25 g/L), 0.63 mol H2/mol formate could be attained, which was 1.5 times higher compared to the wild-type E. coli (XL1-BLUE), which produced 0.42 mol H2/mol formate (pH: 6.4; formate conc.: 1.3 g/L).
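
    A minimal sketch of the response surface step is given below: a full quadratic model is fitted by least squares to hypothetical pH/formate/yield data and the fitted surface is scanned for the condition giving the highest predicted hydrogen yield. The data points, and therefore the resulting optimum, are invented and do not correspond to the values reported in the paper.

```python
import numpy as np

# Hypothetical design points: [pH, formate g/L] and measured H2 yield (mol H2/mol formate)
X = np.array([[5.5, 0.75], [5.5, 1.75], [7.5, 0.75], [7.5, 1.75],
              [6.5, 1.25], [6.5, 1.25], [5.1, 1.25], [7.9, 1.25],
              [6.5, 0.54], [6.5, 1.96]])
y = np.array([0.31, 0.35, 0.33, 0.36, 0.62, 0.60, 0.28, 0.30, 0.27, 0.33])

def design(X):
    """Full quadratic response surface terms: 1, x1, x2, x1^2, x2^2, x1*x2."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

beta, *_ = np.linalg.lstsq(design(X), y, rcond=None)

# Scan the fitted surface on a grid to locate the predicted optimum
g1, g2 = np.meshgrid(np.linspace(5.1, 7.9, 50), np.linspace(0.5, 2.0, 50))
grid = np.column_stack([g1.ravel(), g2.ravel()])
pred = design(grid) @ beta
print(grid[np.argmax(pred)])  # pH and formate concentration with highest predicted yield
```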

  3. Production optimisation in the petrochemical industry by hierarchical multivariate modelling

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Magnus; Furusjoe, Erik; Jansson, Aasa

    2004-06-01

    This project demonstrates the advantages of applying hierarchical multivariate modelling in the petrochemical industry in order to increase knowledge of the total process. The models indicate possible ways to optimise the process regarding the use of energy and raw material, which is directly linked to the environmental impact of the process. The refinery of Nynaes Refining AB (Goeteborg, Sweden) has acted as a demonstration site in this project. The models developed for the demonstration site resulted in: Detection of an unknown process disturbance and suggestions of possible causes; Indications on how to increase the yield in combination with energy savings; The possibility to predict product quality from on-line process measurements, making the results available at a higher frequency than customary laboratory analysis; Quantification of the gradually lowered efficiency of heat transfer in the furnace and increased fuel consumption as an effect of soot build-up on the furnace coils; Increased knowledge of the relation between production rate and the efficiency of the heat exchangers. This report is one of two reports from the project. It contains a technical discussion of the result with some degree of detail. A shorter and more easily accessible report is also available, see IVL report B1586-A.

  4. Production TTR modeling and dynamic buckling analysis

    Institute of Scientific and Technical Information of China (English)

    Hugh Liu; John Wei; Edward Huang

    2013-01-01

    In a typical tension leg platform (TLP) design, the top tension factor (TTF), measuring the top tension of a top tensioned riser (TTR) relative to its submerged weight in water, is one of the most important design parameters that has to be specified properly. While a very small TTF may lead to excessive vortex induced vibration (VIV), clashing issues and possible compression close to the seafloor, an unnecessarily high TTF may translate into excessive riser cost and vessel payload, and even has impacts on the TLP sizing and design in general. In the process of a production TTR design, it is found that its outer casing can be subjected to compression in a worst-case scenario with some extreme metocean and hardware conditions. The present paper shows how finite element analysis (FEA) models using beam elements and two different software packages (Flexcom and ABAQUS) are constructed to simulate the TTR properly, and especially the pipe-in-pipe effects. An ABAQUS model with hybrid elements (beam elements globally + shell elements locally) can be used to investigate how the outer casing behaves under compression. It is shown for the specified TTR design, even with its outer casing being under some local compression in the worst-case scenario, dynamic buckling would not occur; therefore the TTR design is adequate.

  5. A STUDY ON NEW PRODUCT DEMAND FORECASTING BASED ON BASS DIFFUSION MODEL

    Directory of Open Access Journals (Sweden)

    Zuhaimy Ismail

    2013-01-01

    Full Text Available A forecasting model of new product demand has been developed and applied to forecast new vehicle demand in Malaysia. Since the publication of the Bass model in 1969, innovation diffusion theory has sparked considerable research among marketing science scholars, operational researchers and mathematicians. This study considers the Bass model for forecasting the diffusion of new products or an innovation in Malaysian society. The objective of the proposed model is to represent the level of spread of new products among a given set of the society in terms of a simple mathematical function of the time elapsed since the introduction of the new product. With the limited amount of data available for new products, a robust Bass model was developed to forecast the sales volume. A procedure for the proposed diffusion model was designed and the parameters were estimated. Results obtained by applying the proposed model and numerical calculation show that the proposed Bass diffusion model is robust and effective for forecasting demand of new products. This study concludes that the newly developed Bass diffusion demand function has contributed significantly to forecasting the diffusion of new products.
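
    For reference, the cumulative form of the Bass model and a least-squares fit to a small, hypothetical vehicle-registration series are sketched below; the figures and the fitting procedure are illustrative only and are not the data or the estimation method used in the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def bass_cumulative(t, m, p, q):
    """Cumulative adoptions N(t) = m * F(t) for market potential m,
    coefficient of innovation p and coefficient of imitation q."""
    e = np.exp(-(p + q) * t)
    return m * (1.0 - e) / (1.0 + (q / p) * e)

# Hypothetical cumulative registrations of a new vehicle model (thousands, years 1-10)
t = np.arange(1, 11, dtype=float)
sales = np.array([12, 30, 58, 98, 150, 205, 255, 292, 315, 328], dtype=float)

(m, p, q), _ = curve_fit(bass_cumulative, t, sales,
                         p0=[400.0, 0.03, 0.4], bounds=(1e-6, [2000.0, 1.0, 2.0]))
print(m, p, q)
print(bass_cumulative(11, m, p, q) - bass_cumulative(10, m, p, q))  # next-year forecast
```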

  6. Applying an Information Processing Model to Measure the Effectiveness of a Mailed Circular Advertisement.

    Science.gov (United States)

    1982-01-01

    and if the direction of flow of the process is as stated, Aaker & Day (1974) came to some interesting conclusions. They found that advertising ... the sample. Nevertheless, the study did indicate positive effects of a mailed circular ad. REFERENCES: Aaker, D. A., & Day, G. S. A dynamic model of relationships among advertising, consumer awareness, attitudes, and behavior. Journal of Applied Psychology, 1974, 39, 281-286. Aaker

  7. An Equation of State for Fluids by Applying the Tower—Well Potential Model

    Institute of Scientific and Technical Information of China (English)

    Zeng Xiangdong; Shang Demin; et al.

    1994-01-01

    A simple theoretical equation of state is derived by applying the Tower-well potential model for the molecular distribution, based on the generalized van der Waals partition function. It needs only three molecular parameters, which have distinct physical meanings. The resulting equation of state predicts rather well the vapor pressures, saturated liquid volumes, saturated vapor volumes and PVT thermodynamic properties of polar and structurally complex molecules over a wide temperature and pressure range.

  8. Steganography Algorithm in Different Colour Model Using an Energy Adjustment Applied with Discrete Wavelet Transform

    Directory of Open Access Journals (Sweden)

    Carvajal-Gamez

    2012-09-01

    Full Text Available When color images are processed in different color models for implementing steganographic algorithms, it is important to study the quality of the host and retrieved images, since digital filters are typically used and may visibly deform the images. When a steganographic algorithm is used, the numerical calculations performed by the computer cause errors and alterations in the test images, so we apply a proposed scaling factor, depending on the number of bits of the image, to adjust these errors.

  9. Steganography Algorithm in Different Colour Model Using an Energy Adjustment Applied with Discrete Wavelet Transform

    Directory of Open Access Journals (Sweden)

    B.E. Carvajal-Gámez

    2012-08-01

    Full Text Available When color images are processed in different color models for implementing steganographic algorithms, it is important to study the quality of the host and retrieved images, since digital filters are typically used and may visibly deform the images. When a steganographic algorithm is used, the numerical calculations performed by the computer cause errors and alterations in the test images, so we apply a proposed scaling factor, depending on the number of bits of the image, to adjust these errors.

  10. Applying nonlinear MODM model to supply chain management with quantity discount policy under complex fuzzy environment

    Directory of Open Access Journals (Sweden)

    Zhe Zhang

    2014-06-01

    Full Text Available Purpose: The aim of this paper is to deal with supply chain management (SCM) with a quantity discount policy under a complex fuzzy environment, which is characterized by bi-fuzzy variables. By taking into account the strategy and the process of decision making, a bi-fuzzy nonlinear multiple objective decision making (MODM) model is presented to solve the proposed problem. Design/methodology/approach: The bi-fuzzy variables in the MODM model are transformed into trapezoidal fuzzy variables by the DMs' degrees of optimism ?1 and ?2, which are subsequently de-fuzzified by the expected value index. For solving the complex nonlinear model, a multi-objective adaptive particle swarm optimization algorithm (MO-APSO) is designed as the solution method. Findings: The proposed model and algorithm are applied to a typical example of an SCM problem to illustrate their effectiveness. Based on the sensitivity analysis of the results, the bi-fuzzy nonlinear MODM SCM model is shown to be sensitive to the possibility level ?1. Practical implications: The study focuses on SCM under a complex fuzzy environment, which has great practical significance. Therefore, the bi-fuzzy MODM model and MO-APSO can be further applied to SCM problems with a quantity discount policy. Originality/value: The bi-fuzzy variable is employed in the nonlinear MODM model of SCM to characterize the hybrid uncertain environment, and this work is original. In addition, a hybrid crisp approach is proposed to transform the model into an equivalent crisp one by means of the DMs' degree of optimism and the expected value index. Since the MODM model considers the bi-fuzzy environment and the quantity discount policy, this paper has great practical significance.
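
    The expected-value defuzzification mentioned above has a particularly simple form for trapezoidal fuzzy numbers, sketched here. Treating the expected value as the arithmetic mean of the four defining points is a common convention and is assumed for illustration rather than taken from the paper.

```python
def expected_value_trapezoidal(a, b, c, d):
    """Expected value of a trapezoidal fuzzy number (a, b, c, d),
    taken here as the mean of its four defining points (assumed convention)."""
    return (a + b + c + d) / 4.0

# Hypothetical bi-fuzzy demand already reduced to a trapezoidal number
print(expected_value_trapezoidal(80, 95, 105, 120))  # crisp value fed to the MODM model
```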

  11. Developing engineering processes through integrated modelling of product and process

    DEFF Research Database (Denmark)

    Nielsen, Jeppe Bjerrum; Hvam, Lars

    2012-01-01

    This article aims at developing an operational tool for integrated modelling of product assortments and engineering processes in companies making customer specific products. Integrating a product model in the design of engineering processes will provide a deeper understanding of the engineering activities as well as insight into how product features affect the engineering processes. The article suggests possible ways of integrating models of products with models of engineering processes. The models have been tested and further developed in an action research study carried out in collaboration...

  12. Correction of approximation errors with Random Forests applied to modelling of aerosol first indirect effect

    Directory of Open Access Journals (Sweden)

    A. Lipponen

    2013-04-01

    Full Text Available In atmospheric models, due to their computational time or resource limitations, physical processes have to be simulated using reduced models. The use of a reduced model, however, induces errors to the simulation results. These errors are referred to as approximation errors. In this paper, we propose a novel approach to correct these approximation errors. We model the approximation error as an additive noise process in the simulation model and employ the Random Forest (RF regression algorithm for constructing a computationally low cost predictor for the approximation error. In this way, the overall simulation problem is decomposed into two separate and computationally efficient simulation problems: solution of the reduced model and prediction of the approximation error realization. The approach is tested for handling approximation errors due to a reduced coarse sectional representation of aerosol size distribution in a cloud droplet activation calculation. The results show a significant improvement in the accuracy of the simulation compared to the conventional simulation with a reduced model. The proposed approach is rather general and extension of it to different parameterizations or reduced process models that are coupled to geoscientific models is a straightforward task. Another major benefit of this method is that it can be applied to physical processes that are dependent on a large number of variables making them difficult to be parameterized by traditional methods.
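
    The core idea - learn the approximation error of a reduced model from paired model runs and add the predicted error back at simulation time - can be sketched with scikit-learn as follows. The "full" and "reduced" models here are toy analytic functions standing in for the detailed and coarse sectional aerosol calculations of the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

def full_model(x):
    return np.sin(3 * x[:, 0]) * x[:, 1] + 0.2 * x[:, 2]   # stand-in for the detailed model

def reduced_model(x):
    return x[:, 0] * x[:, 1]                                # stand-in for the cheap reduced model

# Training set of approximation errors built from paired runs of both models
X_train = rng.uniform(0, 1, size=(2000, 3))
err_train = full_model(X_train) - reduced_model(X_train)
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, err_train)

# At simulation time: run only the reduced model and add the predicted error
X_new = rng.uniform(0, 1, size=(5, 3))
corrected = reduced_model(X_new) + rf.predict(X_new)
print(np.abs(corrected - full_model(X_new)))  # residual error after the correction
```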

  13. Nonlinear models applied to seed germination of Rhipsalis cereuscula Haw (Cactaceae

    Directory of Open Access Journals (Sweden)

    Terezinha Aparecida Guedes

    2014-09-01

    Full Text Available The objective of this analysis was to fit germination data of Rhipsalis cereuscula Haw seeds to the Weibull model with three parameters using Frequentist and Bayesian methods. Five parameterizations were compared using the Bayesian analysis to fit a prior distribution. The parameter estimates from the Frequentist method were similar to the Bayesian responses considering the following non-informative a priori distributions for the parameter vectors: gamma (10³, 10³) in the model M1, normal (0, 10⁶) in the model M2, uniform (0, Lsup) in the model M3, exp (μ) in the model M4 and Lnormal (μ, 10⁶) in the model M5. However, to achieve convergence in the models M4 and M5, we applied the μ from the estimates of the Frequentist approach. The best models fitted by the Bayesian method were M1 and M3. The adequacy of these models was based on the advantages over the Frequentist method, such as the reduced computational effort and the possibility of comparison.
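
    As a frequentist counterpart to the analysis described above, the sketch below fits one common three-parameter Weibull growth curve to hypothetical cumulative germination counts with scipy. The parameterization, starting values and data are assumptions for illustration and do not reproduce the five parameterizations compared in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_germination(t, a, b, c):
    """One common 3-parameter Weibull curve for cumulative germination:
    a = asymptotic germination, b = scale (time), c = shape (assumed form)."""
    return a * (1.0 - np.exp(-(t / b) ** c))

# Hypothetical cumulative number of germinated seeds over days
t = np.array([2, 4, 6, 8, 10, 12, 14, 16, 18, 20], dtype=float)
germinated = np.array([1, 4, 10, 19, 28, 34, 38, 40, 41, 41], dtype=float)

(a, b, c), _ = curve_fit(weibull_germination, t, germinated, p0=[42.0, 8.0, 2.0])
print(a, b, c)  # frequentist point estimates, comparable to the Bayesian posterior summaries
```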

  14. Applying forces to elastic network models of large biomolecules using a haptic feedback device.

    Science.gov (United States)

    Stocks, M B; Laycock, S D; Hayward, S

    2011-03-01

    Elastic network models of biomolecules have proved to be relatively good at predicting global conformational changes particularly in large systems. Software that facilitates rapid and intuitive exploration of conformational change in elastic network models of large biomolecules in response to externally applied forces would therefore be of considerable use, particularly if the forces mimic those that arise in the interaction with a functional ligand. We have developed software that enables a user to apply forces to individual atoms of an elastic network model of a biomolecule through a haptic feedback device or a mouse. With a haptic feedback device the user feels the response to the applied force whilst seeing the biomolecule deform on the screen. Prior to the interactive session normal mode analysis is performed, or pre-calculated normal mode eigenvalues and eigenvectors are loaded. For large molecules this allows the memory and number of calculations to be reduced by employing the idea of the important subspace, a relatively small space of the first M lowest frequency normal mode eigenvectors within which a large proportion of the total fluctuation occurs. Using this approach it was possible to study GroEL on a standard PC as even though only 2.3% of the total number of eigenvectors could be used, they accounted for 50% of the total fluctuation. User testing has shown that the haptic version allows for much more rapid and intuitive exploration of the molecule than the mouse version.
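
    The "important subspace" idea can be illustrated with a small linear-response calculation: the static displacement produced by an applied force is expanded in the lowest-frequency normal modes only. The Hessian below is a random positive-definite stand-in for a real elastic network model, so only the mechanics of the truncation is shown, not the software described in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for an elastic network Hessian (symmetric positive definite);
# a real application would build this from the C-alpha contact topology.
n = 30
A = rng.normal(size=(n, n))
hessian = A @ A.T + 0.1 * np.eye(n)
eigvals, eigvecs = np.linalg.eigh(hessian)

def response_in_subspace(force, n_modes):
    """Static displacement u = sum_i (f . v_i / lambda_i) v_i over the first n_modes modes."""
    v = eigvecs[:, :n_modes]
    lam = eigvals[:n_modes]
    return v @ ((v.T @ force) / lam)

force = np.zeros(n)
force[0] = 1.0                              # pull on one coordinate, as with the haptic device
u_full = response_in_subspace(force, n)     # response using all modes
u_sub = response_in_subspace(force, 5)      # response restricted to the important subspace
print(np.linalg.norm(u_sub - u_full) / np.linalg.norm(u_full))  # relative truncation error
```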

  15. An Integrated Optimization Design Method Based on Surrogate Modeling Applied to Diverging Duct Design

    Science.gov (United States)

    Hanan, Lu; Qiushi, Li; Shaobin, Li

    2016-12-01

    This paper presents an integrated optimization design method in which uniform design, response surface methodology and genetic algorithm are used in combination. In detail, uniform design is used to select the experimental sampling points in the experimental domain and the system performance is evaluated by means of computational fluid dynamics to construct a database. After that, response surface methodology is employed to generate a surrogate mathematical model relating the optimization objective and the design variables. Subsequently, genetic algorithm is adopted and applied to the surrogate model to acquire the optimal solution in the case of satisfying some constraints. The method has been applied to the optimization design of an axisymmetric diverging duct, dealing with three design variables including one qualitative variable and two quantitative variables. The method of modeling and optimization design performs well in improving the duct aerodynamic performance and can be also applied to wider fields of mechanical design and seen as a useful tool for engineering designers, by reducing the design time and computation consumption.
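
    A compressed sketch of the surrogate-based loop - sample the design space, fit a quadratic response surface, then search the cheap surrogate with an evolutionary optimizer - is given below. The "CFD" evaluation is a toy analytic function, random sampling stands in for uniform design, and scipy's differential evolution stands in for the genetic algorithm used in the paper.

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)

def expensive_cfd(x):
    """Toy stand-in for the CFD evaluation of duct performance (to be minimized)."""
    return (x[..., 0] - 0.3) ** 2 + 2.0 * (x[..., 1] - 0.7) ** 2 + 0.05 * x[..., 0] * x[..., 1]

# 1) Sampling plan over the normalized design domain (random points instead of uniform design)
X = rng.uniform(0, 1, size=(40, 2))
y = expensive_cfd(X)

# 2) Quadratic response surface as the surrogate model
def features(X):
    x1, x2 = X[..., 0], X[..., 1]
    return np.stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2], axis=-1)

beta, *_ = np.linalg.lstsq(features(X), y, rcond=None)
surrogate = lambda x: features(np.atleast_2d(x))[0] @ beta

# 3) Evolutionary search on the cheap surrogate only
result = differential_evolution(surrogate, bounds=[(0, 1), (0, 1)], seed=0)
print(result.x, expensive_cfd(result.x))
```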

  16. A simple mathematical model of society collapse applied to Easter Island

    Science.gov (United States)

    Bologna, M.; Flores, J. C.

    2008-02-01

    In this paper we consider a mathematical model for the evolution and collapse of the Easter Island society. Based on historical reports, the available primary resources consisted almost exclusively of trees, so we describe the inhabitants and the resources as an isolated dynamical system. A mathematical and numerical analysis of the Easter Island community collapse is performed. In particular, we analyze the critical values of the fundamental parameters, and a demographic curve is presented. The technological parameter, quantifying the exploitation of the resources, is calculated and applied to the case of another extinguished civilization (Copán Maya), confirming the consistency of the adopted model.
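
    In the same spirit as the paper, a toy isolated resource-population system can be integrated numerically to produce a demographic curve with growth, overshoot and decline. The equations and parameter values below are illustrative assumptions and are not the calibrated model of the authors.

```python
import numpy as np
from scipy.integrate import solve_ivp

def society(t, y, r=0.001, K=12000.0, alpha=1e-6, c=3.3e-6, d=0.02):
    """Toy isolated system: resources R regrow slowly and are harvested in
    proportion to R*P; population P grows with resource-dependent births
    and constant per-capita mortality (all values are illustrative)."""
    R, P = y
    dR = r * R * (1.0 - R / K) - alpha * R * P
    dP = c * R * P - d * P
    return [dR, dP]

sol = solve_ivp(society, (0, 1200), [12000.0, 50.0], dense_output=True)
t = np.linspace(0, 1200, 7)
print(sol.sol(t)[1].round())  # demographic curve sampled every 200 years
```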

  17. Bayesian GGE biplot models applied to maize multi-environments trials.

    Science.gov (United States)

    de Oliveira, L A; da Silva, C P; Nuvunga, J J; da Silva, A Q; Balestre, M

    2016-06-17

    The additive main effects and multiplicative interaction (AMMI) and the genotype main effects and genotype x environment interaction (GGE) models stand out among the linear-bilinear models used in genotype x environment interaction studies. Despite the advantages of their use to describe genotype x environment (AMMI) or genotype and genotype x environment (GGE) interactions, these methods have known limitations that are inherent to fixed effects models, including difficulty in treating variance heterogeneity and missing data. Traditional biplots include no measure of uncertainty regarding the principal components. The present study aimed to apply the Bayesian approach to GGE biplot models and assess the implications for selecting stable and adapted genotypes. Our results demonstrated that the Bayesian approach applied to GGE models with non-informative priors was consistent with the traditional GGE biplot analysis, although the credible region incorporated into the biplot enabled distinguishing, based on probability, the performance of genotypes, and their relationships with the environments in the biplot. Those regions also enabled the identification of groups of genotypes and environments with similar effects in terms of adaptability and stability. The relative position of genotypes and environments in biplots is highly affected by the experimental accuracy. Thus, incorporation of uncertainty in biplots is a key tool for breeders to make decisions regarding stability selection and adaptability and the definition of mega-environments.

  18. Model of an International Environmental Agreement among Asymmetric Nations applied to Debris Mitigation

    CERN Document Server

    Singer, Michael J

    2010-01-01

    We investigate how ideas from the International Environmental Agreement (IEA) literature can be applied to the problem of space debris mitigation. The problem of space debris is similar to other international environmental problems in that there is a potential for a tragedy of the commons effect--individual nations bear all the cost of their mitigation measures but share only a fraction of the benefit. Consequently, nations have a tendency to underinvest in mitigation. Coalitions of nations, brought together by IEAs, have the potential to lessen the tragedy of the commons effect by pooling the costs and benefits of mitigation. This work brings together two recent modeling advances: i) a game theoretic model for studying the potential gains from IEA cooperation between nations with asymmetric costs and benefits, ii) an orbital debris model that gives the societal cost that specific actions, such as failing to deorbit an inactive satellite, have on the environment. We combine these two models with empirical lau...

  19. Evaluation model applied to TRANSPETRO's Marine Terminals Standardization Program

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Maria Fatima Ludovico de; Mueller, Gabriela [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio), Rio de Janeiro, RJ (Brazil). Instituto Tecnologico; Garcia, Luciano Maldonado [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    This paper describes an innovative evaluation model applied to TRANSPETRO's 'Marine Terminals Standardization Program' based on updating approaches of programs evaluation and organizational learning. Since the program was launched in 2004, the need for having an evaluation model able to evaluate its implementation progress, to measure the degree of standards compliance and its potential economic, social and environmental impacts has become evident. Within a vision of safe and environmentally responsible operations of marine terminals, this evaluation model was jointly designed by TRANSPETRO and PUC-Rio to promote continuous improvement and learning in operational practices and in the standardization process itself. TRANSPETRO believes that standardization supports its services and management innovation capability by creating objective and internationally recognized parameters, targets and metrology for its business activities. The conceptual model and application guidelines for this important tool are presented in this paper, as well as the next steps towards its implementation. (author)

  20. Adapted strategic planning model applied to small business: a case study in the fitness area

    Directory of Open Access Journals (Sweden)

    Eduarda Tirelli Hennig

    2012-06-01

    Full Text Available Strategic planning is an important management tool in the corporate scenario and shall not be restricted to big companies. However, this kind of planning process in small businesses may need special adaptations due to their own characteristics. This paper aims to identify and adapt the existing models of strategic planning to the scenario of a small business in the fitness area. Initially, a comparative study among models of different authors is carried out to identify their phases and activities. Then, it is defined which of these phases and activities should be present in a model to be used in a small business. That model was applied to a Pilates studio; it involves the establishment of an organizational identity, an environmental analysis, as well as the definition of strategic goals, strategies and actions to reach them. Finally, benefits to the organization could be identified, as well as hurdles in the implementation of the tool.

  1. Modelling production system architectures in the early phases of product development

    DEFF Research Database (Denmark)

    Guðlaugsson, Tómas Vignir; Martin Ravn, Poul; Mortensen, Niels Henrik;

    2016-01-01

    This article suggests a framework for modelling a production system architecture in the early phases of product development. The challenge in these phases is that the products to be produced are not completely defined and yet decisions need to be made early in the process on what investments... leading to an improved basis for prioritizing activities in the project. Requirements for the contents of the framework are presented, and literature on production and system models is reviewed. The production system architecture modelling framework is founded on methods and approaches in literature... and adjusted to fit the modelling requirements of a production system architecture at an early phase of development. The production system architecture models capture and describe the structure, capabilities and expansions of the production system architecture under development. The production system...

  2. Mathematical Modeling and Analysis of Classified Marketing of Agricultural Products

    Institute of Scientific and Technical Information of China (English)

    Fengying; WANG

    2014-01-01

    Classified marketing of agricultural products was analyzed using the Logistic Regression Model. This method can take full advantage of the information in an agricultural product database to find the factors influencing how well agricultural products sell, and to make a corresponding quantitative analysis. Using this model, it is also possible to predict sales of agricultural products and provide a reference for mapping out individualized sales strategies for popularizing agricultural products.
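
    The record names the Logistic Regression Model but shows neither the explanatory variables nor any code; the scikit-learn sketch below uses entirely synthetic data and assumed features (unit price, freshness score, promotion flag) purely to illustrate how such a classification of "best-selling" products could be set up.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative only: synthetic records of agricultural products.
# Features (all assumed): unit price, freshness score (0-1), promotion flag.
rng = np.random.default_rng(0)
X = np.column_stack([
    rng.uniform(1, 10, 200),    # price
    rng.uniform(0, 1, 200),     # freshness
    rng.integers(0, 2, 200),    # promotion
])
# Synthetic "best-selling" label: cheap, fresh, promoted products sell better.
logit = -0.6 * X[:, 0] + 3.0 * X[:, 1] + 1.5 * X[:, 2] + 1.0
y = (rng.uniform(size=200) < 1 / (1 + np.exp(-logit))).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)
print("coefficients:", model.coef_, "intercept:", model.intercept_)

# Predicted probability that a new product (price 3.5, freshness 0.8, promoted)
# falls into the best-selling class:
print(model.predict_proba([[3.5, 0.8, 1]])[0, 1])
```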

  3. Applying downscaled global climate model data to a hydrodynamic surface-water and groundwater model

    Science.gov (United States)

    Swain, Eric; Stefanova, Lydia; Smith, Thomas

    2014-01-01

    Precipitation data from Global Climate Models have been downscaled to smaller regions. Adapting this downscaled precipitation data to a coupled hydrodynamic surface-water/groundwater model of southern Florida allows an examination of future conditions and their effect on groundwater levels, inundation patterns, surface-water stage and flows, and salinity. The downscaled rainfall data include the 1996-2001 time series from the European Center for Medium-Range Weather Forecasting ERA-40 simulation and both the 1996-1999 and 2038-2057 time series from two global climate models: the Community Climate System Model (CCSM) and the Geophysical Fluid Dynamics Laboratory (GFDL). Synthesized surface-water inflow datasets were developed for the 2038-2057 simulations. The resulting hydrologic simulations, with and without a 30-cm sea-level rise, were compared with each other and with field data to analyze a range of projected conditions. Simulations predicted generally higher future stage and groundwater levels and surface-water flows, with sea-level rise inducing higher coastal salinities. A coincident rise in sea level, precipitation and surface-water flows resulted in a narrower inland saline/fresh transition zone. The inland areas were affected more by the rainfall differences than by the sea-level rise; the rainfall differences had little effect on coastal inundation but a larger effect on coastal salinities.

  4. The IT Advantage Assessment Model: Applying an Expanded Value Chain Model to Academia

    Science.gov (United States)

    Turner, Walter L.; Stylianou, Antonis C.

    2004-01-01

    Academia faces an uncertain future as the 21st century unfolds. New demands, discerning students, and increased competition from non-traditional competitors are just a few of the forces demanding a response. The use of information technology (IT) in academia has not kept pace with its use in industry. What has been lacking is a model for the strategic…

  5. Integrated modelling of nitrate loads to coastal waters and land rent applied to catchment-scale water management.

    Science.gov (United States)

    Refsgaard, A; Jacobsen, T; Jacobsen, B; Ørum, J-E

    2007-01-01

    The EU Water Framework Directive (WFD) requires an integrated approach to river basin management in order to meet environmental and ecological objectives. This paper presents concepts and a full-scale application of an integrated modelling framework. The Ringkoebing Fjord basin is characterized by intensive agricultural production, and nitrate leakage constitutes a major pollution problem with respect to groundwater aquifers (drinking water), fresh surface-water systems (water quality of lakes) and coastal receiving waters (eutrophication). The case study presented illustrates an advanced modelling approach applied in river basin management. Point sources (e.g. sewage treatment plant discharges) and distributed diffuse sources (nitrate leakage) are included to provide a modelling tool capable of simulating pollution transport from source to recipient, in order to analyse the effects of specific, localized basin water management plans. The paper also includes a land rent modelling approach which can be used to choose the most cost-effective measures and the locations of these measures. As a forerunner to the use of basin-scale models in WFD basin water management plans, this project demonstrates the potential and limitations of comprehensive, integrated modelling tools.
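
    The record describes coupling catchment and land rent models to choose cost-effective measures without giving the selection mechanics; the toy screening below, with wholly assumed measures, costs and load reductions, only illustrates the kind of cost-per-kg-N ranking that such a coupling supports.

```python
# Toy cost-effectiveness screening of nitrate mitigation measures
# (all measures, costs and load reductions are assumed for illustration;
# the actual study derives them from coupled hydrological and land rent models).
measures = [
    # (name, annual cost in EUR, nitrate-load reduction in kg N/year)
    ("Catch crops on sandy soils", 40_000, 20_000),
    ("Reduced fertilizer norm", 90_000, 30_000),
    ("Wetland restoration", 150_000, 35_000),
    ("Set-aside near streams", 60_000, 12_000),
]

target_reduction = 50_000  # kg N/year, assumed policy target

# Rank by cost per kg N removed and select greedily until the target is met.
selected, achieved, total_cost = [], 0, 0
for name, cost, reduction in sorted(measures, key=lambda m: m[1] / m[2]):
    if achieved >= target_reduction:
        break
    selected.append(name)
    achieved += reduction
    total_cost += cost

print("Selected measures:", selected)
print(f"Reduction achieved: {achieved} kg N/year at EUR {total_cost}")
```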

  6. A dynamic model of oceanic sulfur (DMOS) applied to the Sargasso Sea: Simulating the dimethylsulfide (DMS) summer paradox

    Science.gov (United States)

    Vallina, S. M.; Simó, R.; Anderson, T. R.; Gabric, A.; Cropp, R.; Pacheco, J. M.

    2008-03-01

    A new one-dimensional model of DMSP/DMS dynamics (DMOS) is developed and applied to the Sargasso Sea in order to explain what drives the observed dimethylsulfide (DMS) summer paradox: a summer DMS concentration maximum concurrent with a minimum in the biomass of phytoplankton, the producers of the DMS precursor dimethylsulfoniopropionate (DMSP). Several mechanisms have been postulated to explain this mismatch: a succession in phytoplankton species composition towards higher relative abundances of DMSP producers in summer; inhibition of bacterial DMS consumption by ultraviolet radiation (UVR); and direct DMS production by phytoplankton due to UVR-induced oxidative stress. None of these hypothetical mechanisms, except for the first one, has been tested with a dynamic model. We have coupled a new sulfur cycle model that incorporates the latest knowledge on DMSP/DMS dynamics to a preexisting nitrogen/carbon-based ecological model that explicitly simulates the microbial loop. This allows the role of bacteria in DMS production and consumption to be represented and quantified. The main improvements of DMOS with respect to previous DMSP/DMS models are the explicit inclusion of: solar-radiation inhibition of bacterial sulfur uptake; DMS exudation by phytoplankton caused by solar-radiation-induced stress; and uptake of dissolved DMSP by phytoplankton. We have conducted a series of modeling experiments where some of the DMOS sulfur paths are turned "off" or "on," and the results on chlorophyll-a, bacteria, DMS, and DMSP (particulate and dissolved) concentrations have been compared with climatological data of these same variables. The simulated rates of sulfur cycling processes are also compared with the scarce data available from previous works. All processes seem to play a role in driving DMS seasonality. Among them, however, solar-radiation-induced DMS exudation by phytoplankton stands out as the process without which the model is unable to produce realistic DMS simulations.

  7. Modeling Sustainable Bioenergy Feedstock Production in the Alps

    Science.gov (United States)

    Kraxner, Florian; Leduc, Sylvain; Kindermann, Georg; Fuss, Sabine; Pietsch, Stephan; Lakyda, Ivan; Serrano Leon, Hernan; Shchepashchenko, Dmitry; Shvidenko, Anatoly

    2016-04-01

    Sustainability of bioenergy is often indicated by the neutrality of emissions at the conversion site, while the feedstock production site is assumed to be carbon neutral. Recent research shows that the sustainability of bioenergy systems starts with feedstock management. Even if sustainable forest management is applied, different management types can impact ecosystem services substantially. This study examines different sustainable forest management systems together with the optimal planning of green-field bioenergy plants in the Alps. Two models, the biophysical global forest model (G4M) and a techno-economic engineering model for optimizing renewable energy systems (BeWhere), are implemented. G4M is applied in a forward-looking manner in order to provide information on the forest under different management scenarios: (1) managing the forest to maximize carbon sequestration; or (2) managing the forest to maximize the harvestable wood amount for bioenergy production. The results from the forest modelling are then picked up by the engineering model BeWhere, which optimizes the bioenergy production in terms of energy demand (power and heat demand by the population) and supply (wood harvesting potentials), feedstock harvesting and transport costs, the location and capacity of the bioenergy plant, as well as the energy distribution logistics with respect to heat and electricity (e.g. considering existing grids for electricity or district heating, etc.). First results highlight the importance of considering ecosystem services under different scenarios and in a geographically explicit manner. While aiming at producing the same amount of bioenergy under both forest management scenarios, it turns out that in scenario (1) a substantially larger area (distributed across the Alps) will need to be used for producing (and harvesting) the necessary amount of feedstock than under scenario (2). This result clearly shows that scenario (2) has to be seen as an "intensification

  8. A methodology for estimating the uncertainty in model parameters applying the robust Bayesian inferences

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Joo Yeon; Lee, Seung Hyun; Park, Tai Jin [Korean Association for Radiation Application, Seoul (Korea, Republic of)

    2016-06-15

    Any real application of Bayesian inference must acknowledge that both the prior distribution and the likelihood function have only been specified as more or less convenient approximations to whatever the analyzer's true beliefs might be. If the inferences from the Bayesian analysis are to be trusted, it is important to determine that they are robust to such variations of prior and likelihood as might also be consistent with the analyzer's stated beliefs. Robust Bayesian inference was applied to an atmospheric dispersion assessment using a Gaussian plume model. The classes of contamination were specified in terms of uncertainty in the distribution type and parametric variability. The probability distributions of the model parameters were assumed to be contaminated within the classes of symmetric unimodal and unimodal distributions. The distribution of the sector-averaged relative concentrations was then calculated by applying the contaminated priors to the model parameters. The sector-averaged concentrations for each stability class were compared by applying the symmetric unimodal and unimodal priors, respectively, as the contaminating classes in an ε-contamination setting. With ε assumed to be 10%, the medians obtained with the symmetric unimodal priors agreed within about 10% of those obtained with the plausible priors. The medians obtained with the unimodal priors, however, agreed only within about 20% at a few downwind distances. Robustness was assessed by estimating how sensitive the results of the Bayesian inferences are to reasonable variations of the plausible priors. From these robust inferences, it is reasonable to apply the symmetric unimodal priors when analyzing the robustness of the Bayesian inferences.
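
    The abstract describes ε-contamination of the priors and propagation through a Gaussian plume model but provides no formulas; the Monte Carlo sketch below uses an assumed lognormal "plausible" prior on a dispersion parameter, a wider assumed contaminant, and a simplified sector-averaged concentration expression, solely to illustrate the median comparison described.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
eps = 0.10                      # contamination fraction, as in the abstract

# Assumed "plausible" prior on the vertical dispersion parameter sigma_z (m)
# at a fixed downwind distance, and an assumed wider unimodal contaminant.
sigma_plausible   = rng.lognormal(mean=np.log(50.0), sigma=0.2, size=n)
sigma_contaminant = rng.lognormal(mean=np.log(50.0), sigma=0.6, size=n)

# epsilon-contaminated prior: with probability eps draw from the contaminant.
mix = rng.uniform(size=n) < eps
sigma_contaminated = np.where(mix, sigma_contaminant, sigma_plausible)

def rel_conc(sigma_z, u=3.0, x=1000.0, H=30.0):
    """Simplified sector-averaged relative concentration chi/Q (illustrative)."""
    return (np.sqrt(2.0 / np.pi) * np.exp(-H**2 / (2.0 * sigma_z**2))
            / (sigma_z * u * (2.0 * np.pi / 16.0) * x))

m_plausible    = np.median(rel_conc(sigma_plausible))
m_contaminated = np.median(rel_conc(sigma_contaminated))
print(f"median chi/Q, plausible prior:    {m_plausible:.3e}")
print(f"median chi/Q, contaminated prior: {m_contaminated:.3e}")
print(f"relative difference: {abs(m_contaminated - m_plausible) / m_plausible:.1%}")
```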

  9. Maturity Models 101: A Primer for Applying Maturity Models to Smart Grid Security, Resilience, and Interoperability

    Science.gov (United States)

    2012-11-01

    needed to meet challenge problems. These models have been sponsored by governments, individual organizations, and consortia (including industry-specific...). ...in technology and its application in the electric power industry, leading to the introduction of many new systems, business processes, markets, and enterprise integration approaches. How do you manage the...

  10. A mixed integer linear programming model applied in barge planning for Omya

    Directory of Open Access Journals (Sweden)

    David Bredström

    2015-12-01

    Full Text Available This article presents a mathematical model for barge transport planning on the river Rhine, which is part of a decision support system (DSS) recently taken into use by the Swiss company Omya. The system is operated by Omya’s regional office in Cologne, Germany, responsible for distribution planning at the regional distribution center (RDC) in Moerdijk, the Netherlands. The distribution planning is a vital part of the supply chain management of Omya’s production of Norwegian high-quality calcium carbonate slurry, supplied to European paper manufacturers. The DSS operates within a vendor managed inventory (VMI) setting, where the customer inventories are monitored by Omya, who decides upon the refilling days and the quantities delivered by barges. The barge planning problem falls into the category of inventory routing problems (IRP) and is further characterized by multiple products, a heterogeneous fleet with availability restrictions (the fleet is owned by a third party), vehicle compartments, dependency of barge capacity on the water level, multiple customer visits, bounded customer inventories and a rolling planning horizon. There are additional modelling details which had to be considered to make it possible to employ the model in practice at a sufficient level of detail. To the best of our knowledge, we have not been able to find similar models covering all these aspects of barge planning. This article presents the developed mixed-integer programming model and discusses practical experience with its solution. Briefly, it also puts the model into the context of the entire business case of value chain optimization at Omya.
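
    The full inventory routing formulation (compartments, water-level-dependent capacity, rolling horizon) is not reproduced in the record; the deliberately small PuLP sketch below, a single-product, single-customer refilling model with all data assumed, is meant only to convey the flavour of such a mixed-integer model.

```python
from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpBinary, value

# Tiny illustrative refilling model (not the Omya formulation): one customer,
# one product, a 5-day horizon. All data below are assumed.
days = range(5)
demand = [80, 80, 80, 80, 80]          # tonnes consumed per day
barge_capacity = 300                   # tonnes per barge call
inv0, inv_min, inv_max = 150, 50, 500  # customer tank levels (tonnes)
fixed_trip_cost, holding_cost = 2000, 1.0

prob = LpProblem("barge_refilling", LpMinimize)
deliver = {t: LpVariable(f"deliver_{t}", lowBound=0, upBound=barge_capacity) for t in days}
trip    = {t: LpVariable(f"trip_{t}", cat=LpBinary) for t in days}
inv     = {t: LpVariable(f"inv_{t}", lowBound=inv_min, upBound=inv_max) for t in days}

# Objective: fixed cost per barge trip plus inventory holding cost.
prob += lpSum(fixed_trip_cost * trip[t] + holding_cost * inv[t] for t in days)

for t in days:
    prev = inv0 if t == 0 else inv[t - 1]
    prob += inv[t] == prev + deliver[t] - demand[t]   # inventory balance
    prob += deliver[t] <= barge_capacity * trip[t]    # deliver only if a barge sails

prob.solve()
for t in days:
    print(f"day {t}: trip={int(value(trip[t]))}, delivered={value(deliver[t]):.0f}, "
          f"inventory={value(inv[t]):.0f}")
```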

  11. Integrated modelling of crop production and nitrate leaching with the Daisy model.

    Science.gov (United States)

    Manevski, Kiril; Børgesen, Christen D; Li, Xiaoxin; Andersen, Mathias N; Abrahamsen, Per; Hu, Chunsheng; Hansen, Søren

    2016-01-01

    An integrated modelling strategy was designed and applied to the Soil-Vegetation-Atmosphere Transfer model Daisy for simulation of crop production and nitrate leaching under a pedo-climatic and agronomic environment different from that of the model's original parameterisation. The points of significance and caution in the strategy are: • Model preparation should include detailed field data, due to the high complexity of the soil and crop processes simulated with a process-based model, and should reflect the study objectives. Including interactions between parameters in a sensitivity analysis gives a better account of impacts on the outputs of measured variables. • Model evaluation on several independent data sets increases robustness, at least on coarser time scales such as month or year. It produces a valuable platform for adapting the model to new crops or for improving the existing parameter set. On a daily time scale, validation for highly dynamic variables such as soil water transport remains challenging. • Model application is demonstrated with relevance for scientists and regional managers. The integrated modelling strategy is applicable to other process-based models similar to Daisy. It is envisaged that the strategy establishes the model's capability as a useful research/decision-making tool, and it increases knowledge transferability, reproducibility and traceability.

  12. A continuous time delay-difference type model (CTDDM) applied to stock assessment of the southern Atlantic albacore Thunnus alalunga

    Science.gov (United States)

    Liao, Baochao; Liu, Qun; Zhang, Kui; Baset, Abdul; Memon, Aamir Mahmood; Memon, Khadim Hussain; Han, Yanan

    2016-09-01

    A continuous time delay-difference model (CTDDM) has been established that considers continuous time delays of biological processes. The southern Atlantic albacore (Thunnus alalunga) stock is one of the commercially important tuna populations worldwide. The age-structured production model (ASPM) and the surplus production model (SPM) have already been used to assess the albacore stock. However, the ASPM requires detailed biological information and the SPM lacks biological realism. In this study, we focus on applying a CTDDM to the southern Atlantic albacore (T. alalunga) stock, which provides an alternative method to assess this fishery. This is the first time that a CTDDM has been applied to assess the Atlantic albacore (T. alalunga) fishery. The CTDDM yielded an 80% confidence interval for MSY (maximum sustainable yield) of (21 510 t, 23 118 t). The catch in 2011 (24 100 t) is higher than the MSY values and the relative fishing mortality ratio (F2011/FMSY) is higher than 1.0. The results of the CTDDM were analyzed to verify the proposed methodology and provide reference information for the sustainable management of the southern Atlantic albacore stock. The CTDDM treats the recruitment, growth, and mortality rates as all varying continuously over time and fills the gap between the ASPM and the SPM in this stock assessment.
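
    The CTDDM equations are not given in the record; as a much simpler point of reference, the sketch below computes MSY and F_MSY from a classical Schaefer surplus production model, one of the simpler model families the paper contrasts with, using assumed parameter values rather than albacore estimates.

```python
# Schaefer surplus production model: dB/dt = r*B*(1 - B/K) - F*B
# Under this model MSY = r*K/4, B_MSY = K/2 and F_MSY = r/2.
# Parameter values below are assumed for illustration, not estimates for albacore.
r = 0.25          # intrinsic growth rate (1/year)
K = 350_000.0     # carrying capacity (tonnes)

MSY   = r * K / 4.0
B_MSY = K / 2.0
F_MSY = r / 2.0
print(f"MSY = {MSY:,.0f} t, B_MSY = {B_MSY:,.0f} t, F_MSY = {F_MSY:.3f} /year")

# Overfishing check in the style of the abstract: compare a year's fishing
# mortality with F_MSY (the value of F_current is assumed).
F_current = 0.14
print("F_current / F_MSY =", round(F_current / F_MSY, 2),
      "-> a ratio above 1.0 would indicate overfishing under this simple model")
```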

  13. Goal Model Integration for Tailoring Product Line Development Processes

    Directory of Open Access Journals (Sweden)

    Arfan Mansoor

    2016-07-01

    Full Text Available Many companies rely on the promised benefits of product lines, targeting systems between fully custom-made software and mass products. Such customized mass products account for a large number of applications automatically derived from a product line. Product lines are therefore of special importance for companies with a large part of their product portfolio based on them. The success of product line development efforts is highly dependent on tailoring the development process. This paper presents an integrative model of influence factors to tailor product line development processes according to different project needs, organizational goals, individual goals of the developers, or constraints of the environment. This model integrates goal models, SPEM models and requirements to tailor development processes.

  14. Applying Mathematical Optimization Methods to an ACT-R Instance-Based Learning Model.

    Directory of Open Access Journals (Sweden)

    Nadia Said

    Full Text Available Computational models of cognition provide an interface to connect advanced mathematical tools and methods to empirically supported theories of behavior in psychology, cognitive science, and neuroscience. In this article, we consider a computational model of instance-based learning, implemented in the ACT-R cognitive architecture. We propose an approach for obtaining mathematical reformulations of such cognitive models that improve their computational tractability. For the well-established Sugar Factory dynamic decision making task, we conduct a simulation study to analyze central model parameters. We show how mathematical optimization techniques can be applied to efficiently identify optimal parameter values with respect to different optimization goals. Beyond these methodological contributions, our analysis reveals the sensitivity of this particular task with respect to initial settings and yields new insights into how average human performance deviates from potential optimal performance. We conclude by discussing possible extensions of our approach as well as future steps towards applying more powerful derivative-based optimization methods.
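
    Neither the ACT-R model nor the optimization set-up is reproduced in the record; the sketch below only illustrates the general idea of tuning a single instance-based-learning parameter (an assumed power-law recency-decay exponent, loosely analogous to ACT-R's activation decay) with scipy.optimize against synthetic data.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)

# Synthetic task: an outcome that drifts over 200 trials, observed with noise.
T = 200
truth = np.cumsum(rng.normal(0, 0.1, T)) + 5.0
obs = truth + rng.normal(0, 0.5, T)

def ibl_predictions(d):
    """Instance-based predictions: recency-weighted mean of past observations,
    with power-law weights (age)**(-d), loosely analogous to ACT-R activation."""
    preds = np.empty(T)
    preds[0] = obs[0]
    for t in range(1, T):
        ages = (t - np.arange(t)).astype(float)  # age of each stored instance
        w = ages ** (-d)
        preds[t] = np.dot(w, obs[:t]) / w.sum()
    return preds

def loss(d):
    return np.mean((ibl_predictions(d) - truth) ** 2)

# Bounded scalar optimization of the decay parameter.
res = minimize_scalar(loss, bounds=(0.05, 3.0), method="bounded")
print(f"optimal decay d = {res.x:.3f}, mean squared error = {res.fun:.4f}")
```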

  15. Multi products single machine EPQ model with immediate rework process

    Directory of Open Access Journals (Sweden)

    Jahangir Biabani

    2012-01-01

    Full Text Available This paper develops an economic production quantity (EPQ) inventory model with a rework process for a single-stage production system with one machine. The existence of a single machine results in limited production capacity. The aim of this research is to determine both the optimal cycle length and the optimal production quantity for each product to minimize the expected total cost (holding, production, setup and rework costs). The convexity of the inventory model is derived and the objective function is proved to be convex. The proposed inventory model is validated with illustrative numerical examples, and the optimal cycle length and the total system cost are analyzed.
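
    The multi-product rework model itself is not spelled out in the record; as a baseline for the quantities it optimizes, the sketch below computes the classical common-cycle solution for several products on one machine without rework, with all product data assumed.

```python
import math

# Classical common-cycle solution for several products on one machine
# (no rework; the cited model additionally accounts for immediate rework costs).
# All product data below are assumed for illustration.
products = [
    # (demand rate d, production rate p, setup cost A, holding cost h per unit/period)
    (400.0, 2000.0, 150.0, 0.6),
    (250.0, 1500.0, 120.0, 0.8),
    (600.0, 2500.0, 200.0, 0.5),
]

# Capacity check: total utilization of the single machine must stay below 1.
rho = sum(d / p for d, p, _, _ in products)
assert rho < 1.0, "single machine lacks capacity for this product mix"

# Optimal common cycle length T* = sqrt(2*sum(A_i) / sum(h_i*d_i*(1 - d_i/p_i))).
num = 2.0 * sum(A for _, _, A, _ in products)
den = sum(h * d * (1.0 - d / p) for d, p, _, h in products)
T_star = math.sqrt(num / den)

print(f"utilization = {rho:.2f}, optimal cycle length T* = {T_star:.3f} periods")
for i, (d, p, A, h) in enumerate(products, 1):
    print(f"product {i}: lot size Q = d*T* = {d * T_star:.1f} units")
```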

  16. A Simple Economic Model of Cocaine Production

    Science.gov (United States)

    1994-01-01

    the production chain that are vulnerable to ... as well as providing a clearer picture of the path that cocaine takes before reaching retail markets in... In fact, given the volatility of cocaine product prices in PBC, it is probably not uncommon for segments of production to be unprofitable at times

  17. Sawtooth mitigation in 3D MHD tokamak modelling with applied magnetic perturbations

    Science.gov (United States)

    Bonfiglio, D.; Veranda, M.; Cappello, S.; Chacón, L.; Escande, D. F.

    2017-01-01

    The effect of magnetic perturbations (MPs) on the sawtoothing dynamics of the internal kink mode in the tokamak is discussed in the framework of nonlinear 3D MHD modelling. Numerical simulations are performed with the pixie3d code (Chacón 2008 Phys. Plasmas 15 056103) based on a D-shaped configuration in toroidal geometry. MPs are applied as produced by two sets of coils distributed along the toroidal direction, one set located above and the other set below the outboard midplane, as in experimental devices such as DIII-D and ASDEX Upgrade. The capability of n = 1 MPs to affect quasi-periodic sawteeth is shown to depend on the toroidal phase difference Δφ between the perturbations produced by the two sets of coils. In particular, sawtooth mitigation is obtained for the Δφ = π phasing, whereas no significant effect is observed for Δφ = 0. Numerical findings are explained by the interplay between different poloidal harmonics in the spectrum of applied MPs, and appear to be consistent with experiments performed in the DIII-D device. Sawtooth mitigation and stimulation of self-organized helical states by applied MPs have been previously demonstrated in both circular tokamak and reversed-field pinch (RFP) experiments in the RFX-mod device, and in related 3D MHD modelling.

  18. Optimal control policies for continuous review production-inventory models

    OpenAIRE

    Germs, Remco; Foreest, Nicky D. van

    2012-01-01

    In this paper, we consider a stochastic version of a single-item production-inventory system in which the demand process is a mixture of a compound Poisson process and a constant demand rate. This model generalizes classical continuous-review single product inventory models with infinite planning horizon such as the EOQ model or production-inventory models with compound Poisson demand. We establish for the first time conditions on the inventory costs and the demand distribution such that the ...

  19. An Online Gravity Modeling Method Applied for High Precision Free-INS.

    Science.gov (United States)

    Wang, Jing; Yang, Gongliu; Li, Jing; Zhou, Xiao

    2016-09-23

    For the real-time solution of an inertial navigation system (INS), the high-degree spherical harmonic gravity model (SHM) is not applicable because of its time and space complexity, so the traditional normal gravity model (NGM) has been the dominant technique for gravity compensation. In this paper, a two-dimensional second-order polynomial model is derived from the SHM according to the approximately linear character of the regional disturbing potential. First, deflections of the vertical (DOVs) on dense grids are calculated with the SHM in an external computer. The polynomial coefficients are then obtained using these DOVs. To achieve global navigation, the coefficients and the applicable region of the polynomial model are both updated synchronously in the above computer. Compared with the high-degree SHM, the polynomial model takes less storage and computational time at the expense of a minor loss of precision. Meanwhile, the model is more accurate than the NGM. Finally, a numerical test and an INS experiment show that the proposed method outperforms traditional gravity models applied for high-precision free-INS.
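
    The abstract describes fitting a two-dimensional second-order polynomial to SHM-derived deflections of the vertical on a dense grid but gives no equations; in the least-squares sketch below, synthetic DOV values stand in for SHM output, and the coordinates, region and noise level are all assumptions.

```python
import numpy as np

# Synthetic stand-in for SHM-derived deflections of the vertical (DOVs) on a
# regional grid; in the cited method these values would come from a high-degree SHM.
rng = np.random.default_rng(3)
lat, lon = np.meshgrid(np.linspace(40.0, 41.0, 21), np.linspace(116.0, 117.0, 21))
x = (lat - lat.mean()).ravel()       # local coordinates, degrees
y = (lon - lon.mean()).ravel()
dov = (3.0 + 1.2 * x - 0.8 * y + 0.5 * x**2 - 0.3 * x * y + 0.2 * y**2
       + rng.normal(0, 0.05, x.size))  # arcsec, synthetic

# Two-dimensional second-order polynomial design matrix:
# dov ~ a0 + a1*x + a2*y + a3*x^2 + a4*x*y + a5*y^2
A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
coeffs, *_ = np.linalg.lstsq(A, dov, rcond=None)
print("fitted coefficients a0..a5:", np.round(coeffs, 3))

# Evaluate the compact model at an arbitrary point inside the region:
px, py = 0.1, -0.2
est = np.dot([1.0, px, py, px**2, px * py, py**2], coeffs)
print("DOV estimate at (0.1, -0.2):", round(float(est), 3))
```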

  20. Information Technology Model for Product Lifecycle Engineering

    Directory of Open Access Journals (Sweden)

    Bhanumathi KS

    2013-02-01

    Full Text Available An aircraft is a complex, multi-disciplinary, system-engineered product that requires real-time global technical collaboration throughout its life-cycle. Engineering data and processes, which form the backbone of the aircraft, should be under strict Configuration Control (CC). They should be model-based and allow for 3D visualization and manipulation. This requires accurate, real-time collaboration and concurrent engineering-based business processes operating in an Integrated Digital Environment (IDE). The IDE uses a lightweight, neutral Computer Aided Design (CAD) Digital Mock-Up (DMU). The DMU deals with complex structural assemblies and systems of more than a hundred thousand parts created by engineers across the globe, each using diverse CAD, Computer Aided Engineering (CAE), Computer Aided Manufacturing (CAM), Computer Integrated Manufacturing (CIM), Enterprise Resource Planning (ERP), Supply Chain Management (SCM), Customer Relationship Management (CRM) and Computer Aided Maintenance Management System (CAMMS) systems. In this paper, a comprehensive approach to making such an environment a reality is presented.