WorldWideScience

Sample records for application models based

  1. Modelling Gesture Based Ubiquitous Applications

    CERN Document Server

    Zacharia, Kurien; Varghese, Surekha Mariam

    2011-01-01

    A cost-effective, gesture-based modelling technique called Virtual Interactive Prototyping (VIP) is described in this paper. Prototyping is implemented by projecting a virtual model of the equipment to be prototyped. Users can interact with the virtual model as if it were the original working equipment. Image and sound processing techniques are used for capturing and tracking the user interactions with the model. VIP is a flexible and interactive prototyping method that has many applications in ubiquitous computing environments. Different commercial as well as socio-economic applications of VIP, and its extension to interactive advertising, are also discussed.

  2. Agent Based Modeling Applications for Geosciences

    Science.gov (United States)

    Stein, J. S.

    2004-12-01

    Agent-based modeling techniques have successfully been applied to systems in which complex behaviors or outcomes arise from varied interactions between individuals in the system. Each individual interacts with its environment, as well as with other individuals, by following a set of relatively simple rules. Traditionally this "bottom-up" modeling approach has been applied to problems in the fields of economics and sociology, but more recently it has been introduced to various disciplines in the geosciences. This technique can help explain the origin of complex processes from a relatively simple set of rules, incorporate large and detailed datasets when they exist, and simulate the effects of extreme events on system-wide behavior. The challenges associated with this modeling method include the significant computational requirements needed to keep track of thousands to millions of agents, and the lack of established methods and strategies for model validation and of a formal methodology for evaluating model uncertainty. Challenges specific to the geosciences include how to define agents that control water, contaminant fluxes, climate forcing and other physical processes, and how to link these "geo-agents" into larger agent-based simulations that include social systems such as demographics, economics and regulations. Effective management of limited natural resources (such as water, hydrocarbons, or land) requires an understanding of what factors influence the demand for these resources on regional and temporal scales. Agent-based models can be used to simulate this demand across a variety of sectors under a range of conditions and to determine effective and robust management policies and monitoring strategies. The recent focus on the role of biological processes in the geosciences is another example of an area that could benefit from agent-based applications. 
A typical approach to modeling the effect of biological processes in geologic media has been to represent these processes in
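
    The resource-demand use case described in this record can be illustrated with a minimal agent-based sketch. All rules, names and parameter values here are hypothetical, not taken from the cited work: agents demand water according to a private need and cut back as a shared stock becomes scarce, so the aggregate usage pattern emerges from local rules.

```python
import random

def run_water_market(n_agents=100, steps=50, supply=60.0, seed=1):
    """Minimal agent-based sketch: each agent demands water based on a
    private need and cuts back when the shared resource is scarce."""
    rng = random.Random(seed)
    needs = [rng.uniform(0.2, 1.0) for _ in range(n_agents)]
    stock = supply
    history = []
    for _ in range(steps):
        scarcity = max(0.0, 1.0 - stock / supply)  # 0 = plentiful, 1 = empty
        demand = sum(need * (1.0 - scarcity) for need in needs)
        used = min(demand, stock)                  # cannot use more than exists
        stock = min(stock - used + supply * 0.5, supply)  # partial recharge
        history.append(used)
    return history

usage = run_water_market()
```

    Even this toy version shows the "bottom-up" property the abstract names: no equation for total demand is written down, yet a system-level usage trajectory emerges from the individual rules.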

  3. Python-Based Applications for Hydrogeological Modeling

    Science.gov (United States)

    Khambhammettu, P.

    2013-12-01

    Python is a general-purpose, high-level programming language whose design philosophy emphasizes code readability. Add-on packages supporting fast array computation (numpy), plotting (matplotlib), and scientific/mathematical functions (scipy) have resulted in a powerful ecosystem for scientists interested in exploratory data analysis, high-performance computing and data visualization. Three examples are provided to demonstrate the applicability of the Python environment in hydrogeological applications. Python programs were used to model an aquifer test and estimate aquifer parameters at a Superfund site. The aquifer test conducted at a Groundwater Circulation Well was modeled with the Python/FORTRAN-based TTIM Analytic Element Code. The aquifer parameters were estimated with PEST such that a good match was produced between the simulated and observed drawdowns. Python scripts were written to interface with PEST and visualize the results. A convolution-based approach was used to estimate source concentration histories based on observed concentrations at receptor locations. Unit Response Functions (URFs) that relate the receptor concentrations to a unit release at the source were derived with the ATRANS code. The impact of any release at the source could then be estimated by convolving the source release history with the URFs. Python scripts were written to compute and visualize receptor concentrations for user-specified source histories. The framework provided a simple and elegant way to test various hypotheses about the site. A Python/FORTRAN-based program, TYPECURVEGRID-Py, was developed to compute and visualize groundwater elevations and drawdown through time in response to a regional uniform hydraulic gradient and the influence of pumping wells, using either the Theis solution for a fully-confined aquifer or the Hantush-Jacob solution for a leaky confined aquifer. The program supports an arbitrary number of wells that can operate according to arbitrary schedules. 
The
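
    The convolution step this record describes can be sketched in a few lines. The URF and release history below are made-up illustrative values, not ATRANS output; the point is only the superposition mechanics.

```python
def receptor_concentration(source_history, urf):
    """Discrete convolution of a source release history with a unit
    response function (URF) to get receptor concentrations over time."""
    n = len(source_history)
    out = []
    for t in range(n):
        c = sum(source_history[t - k] * urf[k]
                for k in range(min(t + 1, len(urf))))
        out.append(c)
    return out

# Hypothetical URF and a unit-impulse release, for illustration only.
urf = [0.0, 0.5, 0.3, 0.1]
history = [1.0, 0.0, 0.0, 0.0, 0.0]
conc = receptor_concentration(history, urf)
```

    With a unit-impulse release the receptor simply replays the URF, which is why deriving URFs once lets arbitrary release histories be evaluated cheaply afterwards.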

  4. Measurement-based load modeling: Theory and application

    Institute of Scientific and Technical Information of China (English)

    MA; Jin; HAN; Dong; HE; RenMu

    2007-01-01

    The load model is one of the most important elements in power system operation and control. However, owing to its complexity, load modeling is still an open and very difficult problem. Summarizing our work on measurement-based load modeling in China over more than twenty years, this paper systematically introduces the mathematical theory and applications of load modeling. The flow chart and algorithms for measurement-based load modeling are presented. A composite load model structure with 13 parameters is also proposed. Analysis results based on trajectory sensitivity theory indicate the importance of the load model parameters for identification. Case studies show the accuracy of the presented measurement-based load model. The load models thus built have been validated by field measurements all over China. Future working directions on measurement-based load modeling are also discussed.

  5. Model Checking-Based Testing of Web Applications

    Institute of Scientific and Technical Information of China (English)

    ZENG Hongwei; MIAO Huaikou

    2007-01-01

    A formal model representing the navigation behavior of a Web application as a Kripke structure is proposed, and an approach that applies model checking to test case generation is presented. The Object Relation Diagram, as the object model, is employed to describe the object structure of a Web application design and can be translated into the behavior model. A key problem of model checking-based test generation for a Web application is how to construct a set of trap properties intended to cause violations during model checking against the behavior model, so that the output counterexamples can be used to construct test sequences. We give an algorithm that derives trap properties from the object model with respect to node and edge coverage criteria.
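
    The trap-property idea can be sketched as follows. The property template and page names below are hypothetical, and the paper's exact temporal-logic syntax may differ: for each navigation edge, a property claims the edge is never taken, so a model checker's counterexample to it is a path, i.e. a test sequence, that covers the edge.

```python
def trap_properties(edges):
    """For each navigation edge (src, dst), emit a 'trap' property
    asserting the edge is never taken; a counterexample returned by a
    model checker is then a test sequence covering that edge."""
    return ["G !((page = %s) & X (page = %s))" % (s, d) for s, d in edges]

props = trap_properties([("login", "home"), ("home", "cart")])
```

    Node coverage works the same way with properties of the form `G !(page = n)`, one per node of the behavior model.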

  6. Cascaded process model based control: packed absorption column application.

    Science.gov (United States)

    Govindarajan, Anand; Jayaraman, Suresh Kumar; Sethuraman, Vijayalakshmi; Raul, Pramod R; Rhinehart, R Russell

    2014-03-01

    Nonlinear, adaptive, process-model based control is demonstrated in a cascaded single-input-single-output mode for pressure drop control in a pilot-scale packed absorption column. The process is shown to be nonlinear. Control is demonstrated in both servo and regulatory modes, for no wind-up in a constrained situation, and for bumpless transfer. Model adaptation is demonstrated and shown to provide process insight. The application procedure is revealed as a design guide to aid others in implementing process-model based control.

  7. WWW Business Applications Based on the Cellular Model

    Institute of Scientific and Technical Information of China (English)

    Toshio Kodama; Tosiyasu L. Kunii; Yoichi Seki

    2008-01-01

    A cellular model based on the Incrementally Modular Abstraction Hierarchy (IMAH) is a novel model that can represent the architecture of and changes in cyberworlds, preserving invariants from a general level to a specific one. We have developed a data processing system called the Cellular Data System (CDS). In the development of business applications, you can prevent combinatorial explosion in the process of business design and testing by using CDS. In this paper, we have first designed and implemented wide-use algebra on the presentation level. Next, we have developed and verified the effectiveness of two general business applications using CDS: 1) a customer information management system, and 2) an estimate system.

  8. Intelligent control based on intelligent characteristic model and its application

    Institute of Scientific and Technical Information of China (English)

    吴宏鑫; 王迎春; 邢琰

    2003-01-01

    This paper presents a new intelligent control method based on an intelligent characteristic model for a kind of complicated plant with nonlinearities and uncertainties, whose controlled output variables cannot be measured online continuously. The basic idea of this method is to utilize intelligent techniques to form the characteristic model of the controlled plant according to the principle of combining the characteristics of the plant with the control requirements, and then to present a new design method for an intelligent controller based on this characteristic model. First, the modeling principles and expression of the intelligent characteristic model are presented. Then, based on the description of the intelligent characteristic model, the design principles and methods of the intelligent controller, composed of several open-loop and closed-loop sub-controllers with qualitative and quantitative information, are given. Finally, the application of this method to alumina concentration control in a real aluminum electrolytic process is introduced. It is proved in practice that the above methods not only are easy to implement in engineering design but also avoid the trial-and-error of general intelligent controllers. The method has performed well in this application, achieving long-term stable control of low alumina concentration and greatly increasing the controlled ratio of anode effect from 60% to 80%.

  9. Application of model based control to robotic manipulators

    Science.gov (United States)

    Petrosky, Lyman J.; Oppenheim, Irving J.

    1988-01-01

    A robot that can duplicate human motion capabilities in such activities as balancing, reaching, lifting, and moving has been built and tested. These capabilities are achieved through the use of real-time Model-Based Control (MBC) techniques which have recently been demonstrated. MBC accounts for all manipulator inertial forces and provides stable manipulator motion control even at high speeds. To effectively demonstrate the unique capabilities of MBC, an experimental robotic manipulator was constructed, which stands upright, balancing on a two-wheel base. The mathematical modeling of dynamics inherent in MBC permits the control system to perform functions that are impossible with conventional non-model-based methods. These capabilities include: (1) stable control at all speeds of operation; (2) operations requiring dynamic stability, such as balancing; (3) detection and monitoring of applied forces without the use of load sensors; (4) manipulator safing via detection of abnormal loads. The full potential of MBC has yet to be realized. The experiments performed for this research are only an indication of the potential applications. MBC has no inherent stability limitations, and its range of applicability is limited only by the attainable sampling rate, modeling accuracy, and sensor resolution. Manipulators could be designed to operate at the highest speed mechanically attainable without being limited by control inadequacies. Manipulators capable of operating many times faster than current machines would certainly increase productivity for many tasks.
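
    The computed-torque flavour of model-based control can be sketched on a one-joint pendulum. This is a generic textbook formulation with illustrative gains, not the controller of the balancing robot described above: the dynamic model cancels gravity in the control law, leaving linear, tunable error dynamics.

```python
import math

def simulate_computed_torque(q0=1.0, q_des=0.0, steps=2000, dt=0.002,
                             I=0.5, m=1.0, g=9.81, l=0.3, kp=100.0, kd=20.0):
    """Computed-torque (inverse dynamics) control of a 1-DOF pendulum:
    the model-based term cancels gravity, so the closed loop behaves
    like a linear second-order system driven by PD gains."""
    q, qd = q0, 0.0
    for _ in range(steps):
        e, ed = q_des - q, -qd
        # model-based feedforward + PD on the tracking error
        tau = I * (kp * e + kd * ed) + m * g * l * math.sin(q)
        qdd = (tau - m * g * l * math.sin(q)) / I   # plant dynamics
        qd += qdd * dt                              # explicit Euler step
        q += qd * dt
    return q

final = simulate_computed_torque()  # angle after 4 s of simulated time
```

    Because the gravity term is cancelled exactly when the model is exact, the gains alone set the response; model error would reappear as a disturbance, which is what bounds MBC by modeling accuracy, as noted above.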

  10. Application of Z-Number Based Modeling in Psychological Research

    Directory of Open Access Journals (Sweden)

    Rafik Aliev

    2015-01-01

    Full Text Available Pilates exercises have been shown to have a beneficial impact on the physical, physiological, and mental characteristics of human beings. In this paper, a Z-number based fuzzy approach is applied to model the effect of Pilates exercises on motivation, attention, anxiety, and educational achievement. The psychological parameters are measured using internationally recognized instruments: the Academic Motivation Scale (AMS), the Test of Attention (D2 Test), and Spielberger's Anxiety Test, completed by students. The GPA of students was used as the measure of educational achievement. Application of Z-information modeling allows us to increase the precision and reliability of data processing results in the presence of uncertainty in the input data created from completed questionnaires. The basic steps of Z-number based modeling with numerical solutions are presented.

  11. DATA MODEL CUSTOMIZATION FOR YII BASED ERP APPLICATION

    Directory of Open Access Journals (Sweden)

    Andre Leander

    2014-01-01

    Full Text Available As UD. Logam Utama's business grows, the need arises for fast and accurate information in order to improve performance, efficiency, control and the company's value. The company needs a system that can integrate each functional area. ERP has a centralized database and is able to be configured according to the company's business processes. The first phase of application development is the analysis and design of the company's business processes. The design phase produces a number of models that are used to create the application. The final result is an ERP application that can be configured to the company's business processes, consisting of a warehouse/production module, a purchasing module, a sales module, and an accounting module.

  12. CAD-model-based vision for space applications

    Science.gov (United States)

    Shapiro, Linda G.

    1988-01-01

    A pose acquisition system operating in space must be able to perform well in a variety of different applications, including automated guidance and inspection tasks with many different, but known, objects. Since the space station is being designed with automation in mind, there will be CAD models of all the objects, including the station itself. The construction of vision models and procedures directly from the CAD models is the goal of this project. The system being designed and implemented must convert CAD models to vision models, predict visible features from a given viewpoint from the vision models, construct view classes representing views of the objects, and use the view class model thus derived to rapidly determine the pose of the object from single images and/or stereo pairs.

  13. Model Based Fault Detection in a Centrifugal Pump Application

    DEFF Research Database (Denmark)

    Kallesøe, Carsten; Cocquempot, Vincent; Izadi-Zamanabadi, Roozbeh

    2006-01-01

    A model based approach for fault detection in a centrifugal pump, driven by an induction motor, is proposed in this paper. The fault detection algorithm is derived using a combination of structural analysis, observer design and Analytical Redundancy Relation (ARR) design. The resulting algorithm is capable of detecting four different faults in the mechanical and hydraulic parts of the pump.
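
    In its simplest form, an Analytical Redundancy Relation reduces to a residual between a measurement and a model prediction that should be zero in the fault-free case. The pressure model below is purely hypothetical, standing in for whatever relation structural analysis would yield for the actual pump:

```python
def residual(pressure_meas, speed, flow, a=1.0, b=0.5):
    """ARR sketch: compare measured pump pressure against a hypothetical
    model prediction p = a*speed^2 - b*flow^2. A residual significantly
    different from zero flags a fault in the monitored components."""
    return pressure_meas - (a * speed ** 2 - b * flow ** 2)

r_ok = residual(7.0, 3.0, 2.0)     # measurement matches the model
r_fault = residual(8.0, 3.0, 2.0)  # measurement deviates from the model
```

    In practice several such residuals are designed so that each fault produces a distinct signature across them, which is how a small residual set can isolate four different faults.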

  14. Physical based Schottky barrier diode modeling for THz applications

    DEFF Research Database (Denmark)

    Yan, Lei; Krozer, Viktor; Michaelsen, Rasmus Schandorph;

    2013-01-01

    In this work, a physical Schottky barrier diode model is presented. The model is based on physical parameters such as anode area, Ohmic contact area, doping profile of the epitaxial (EPI) and substrate (SUB) layers, layer thicknesses, barrier height, specific contact resistance, and device temperature. The effects of barrier height lowering, nonlinear resistance from the EPI layer, and hot electron noise are all included for accurate characterization of the Schottky diode. To verify the diode model, measured I-V and C-V characteristics are compared with the simulation results. Due to the lack...

  15. An application-semantics-based relaxed transaction model for internetware

    Institute of Scientific and Technical Information of China (English)

    HUANG Tao; DING Xiaoning; WEI Jun

    2006-01-01

    An internetware application is composed of existing individual services, and transaction processing is a key mechanism for making the composition reliable. Existing research on transactional composite services (TCS) depends on analysis of the composition structure and on exception handling mechanisms in order to guarantee relaxed atomicity. However, this approach cannot handle some application-specific requirements and causes many unnecessary failure recoveries or even aborts. In this paper, we propose a relaxed transaction model, including a system model, a relaxed atomicity criterion, a static checking algorithm and a dynamic enforcement algorithm. Users are able to define different relaxed atomicity constraints for different TCSs according to application-specific requirements, including acceptable configurations and their preference order. The checking algorithm determines whether the constraint can be guaranteed to be satisfied. The enforcement algorithm monitors the execution and performs the transaction management work according to the constraint. Compared to existing work, our approach can handle complex application requirements, avoid unnecessary failure recoveries and perform the transaction management work automatically.

  16. GIS application on spatial landslide analysis using statistical based models

    Science.gov (United States)

    Pradhan, Biswajeet; Lee, Saro; Buchroithner, Manfred F.

    2009-09-01

    This paper presents the assessment results of three spatially based probabilistic models using Geoinformation Techniques (GIT) for landslide susceptibility analysis on Penang Island, Malaysia. Landslide locations within the study area were identified by interpreting aerial photographs and satellite images, supported by field surveys. Maps of the topography, soil type, lineaments and land cover were constructed from the spatial datasets. Ten landslide-related factors were extracted from the spatial database, and the frequency ratio, fuzzy logic, and bivariate logistic regression coefficients of each factor were computed. Finally, landslide susceptibility maps were drawn for the study area using the frequency ratio, fuzzy logic and bivariate logistic regression models. For verification, the results of the analyses were compared with actual landslide locations in the study area. The verification results show that the bivariate logistic regression model provides slightly higher prediction accuracy than the frequency ratio and fuzzy logic models.
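
    The frequency ratio used here has a simple, directly computable definition: the share of landslide occurrences falling in a factor class divided by the share of total area that class occupies, so values above 1 mark classes where landslides are over-represented. The cell counts below are illustrative.

```python
def frequency_ratio(landslide_cells, class_cells, total_landslides, total_cells):
    """Frequency ratio of one factor class: landslide share in the class
    divided by the areal share of the class."""
    return (landslide_cells / total_landslides) / (class_cells / total_cells)

# A class holding 10% of the area but 50% of the landslides scores FR = 5.
fr = frequency_ratio(landslide_cells=30, class_cells=100,
                     total_landslides=60, total_cells=1000)
```

    A susceptibility map is then obtained by summing, per map cell, the FR values of the classes that cell falls into across all ten factors.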

  17. A Systematic Review of Agent-Based Modelling and Simulation Applications in the Higher Education Domain

    Science.gov (United States)

    Gu, X.; Blackmore, K. L.

    2015-01-01

    This paper presents the results of a systematic review of agent-based modelling and simulation (ABMS) applications in the higher education (HE) domain. Agent-based modelling is a "bottom-up" modelling paradigm in which system-level behaviour (macro) is modelled through the behaviour of individual local-level agent interactions (micro).…

  18. A Model of Cloud Based Application Environment for Software Testing

    CERN Document Server

    Vengattaraman, T; Baskaran, R

    2010-01-01

    Cloud computing is an emerging platform of service computing designed for swift and dynamic delivery of assured computing resources. Cloud computing provides Service-Level Agreements (SLAs) with guaranteed uptime availability, enabling convenient and on-demand network access to distributed and shared computing resources. Though the cloud computing paradigm holds great potential in the field of distributed computing, cloud platforms have not yet come to the attention of the majority of researchers and practitioners. More specifically, the researcher and practitioner community still has fragmented and imperfect knowledge of cloud computing principles and techniques. In this context, one of the primary motivations of the work presented in this paper is to reveal the versatile merits of the cloud computing paradigm, and hence the objective of this work is to bring out the remarkable significance of the cloud computing paradigm through an application environment. In this work, a cloud computing model for sof...

  19. A Kano Model Based Linguistic Application for Customer Needs Analysis

    Directory of Open Access Journals (Sweden)

    Md Mamunur Rashid

    2011-05-01

    Full Text Available Linguistics is the systematic study of language. Quality no longer always means the "tangible attributes" of a product or service; it may also be linguistic. Thus, linguistics has been applied to product design through capturing the voice of customers. Capturing the voice of customers for product design has been done in different ways, such as Quality Function Deployment (QFD), Kansei Engineering and the Kano model. The Kano model takes a two-dimensional linguistic approach, which has more voice-capturing capacity than the other methods. For reliable product design, the reverse attribute is more important to study than the other Kano attributes, i.e. attractive, must-be, one-dimensional and indifferent. Thus, this paper is an exclusive study of the reverse attribute. For this purpose, this paper describes a reverse-attribute-based linguistic approach, run on a computer system, for product design in the Kano model aspect: threshold numbers of real consumers' opinions are converted into probabilities through fuzzy concepts and used as input to a Monte Carlo simulation that determines virtual customers.
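
    The Monte Carlo step this record describes can be sketched as follows. The probability value is illustrative, and the fuzzification that would produce it from survey answers is not reproduced here: virtual customers are drawn so that their answers follow the probability derived from the real respondents.

```python
import random

def simulate_virtual_customers(p_reverse, n=10000, seed=7):
    """Monte Carlo sketch: draw n virtual customers whose answers follow
    p_reverse (the fuzzified share of 'reverse' responses in the survey)
    and estimate how often the feature would be rated reverse."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n) if rng.random() < p_reverse)
    return hits / n

share = simulate_virtual_customers(0.3)
```

    With enough virtual customers the estimate concentrates around the input probability; the value of the simulation lies in combining several such attribute probabilities per product feature.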

  20. Middleware Based Model of Heterogeneous Systems for SCADA Distributed Applications

    Directory of Open Access Journals (Sweden)

    UNGUREAN, I.

    2010-05-01

    Full Text Available The infrastructure underlying distributed information systems is heterogeneous and very complex. Middleware allows the development of distributed information systems without knowledge of the functioning details of the infrastructure, by abstracting it. An essential issue in designing such systems is choosing the middleware technologies. An architectural model of a SCADA system based on middleware is proposed in this paper. The system is formed of servers, which centralize data, and clients, which receive information from a server and display it in charts. All these components have specific functionality and can exchange information by means of a middleware bus. A middleware bus signifies a software bus on which several middleware technologies can coexist.

  1. A Model-Based Method to Design an Application Common Platform for Enterprise Information Systems

    Science.gov (United States)

    Ishihara, Akira; Furuta, Hirohisa; Yamaoka, Takayuki; Seo, Kazuo; Nishida, Shogo

    This paper presents a model-based method to design a software platform, called an application common platform, for the development of enterprise information systems. This application common platform (ACP) wraps existing reusable software assets to hide their details from application developers and provides domain-level application programming interfaces, so that the reusability of software assets and the productivity of application development improve. In this paper, we present a software architecture which organizes applications, the ACP, and software assets, and we illustrate a development process for the ACP. In particular, we show design rules to derive ACP design models from application design models and software asset design models. We also define metrics of reusability and productivity and evaluate the proposed method through real developments of enterprise information systems. As a result, the proposed method reduced development cost by 20% compared to the estimated cost.

  2. Erosion risk assessment in the southern Amazon - Data Preprocessing, data base application and process based modelling

    Science.gov (United States)

    Schindewolf, Marcus; Herrmann, Marie-Kristin; Herrmann, Anne-Katrin; Schultze, Nico; Amorim, Ricardo S. S.; Schmidt, Jürgen

    2015-04-01

    The study region along the BR 16 highway belongs to the "Deforestation Arc" at the southern border of the Amazon rainforest. At the same time, it incorporates a land use gradient, as colonization started in 1975-1990 in central Mato Grosso, in 1990 in northern Mato Grosso, and most recently in 2004-2005 in southern Pará. Based on present knowledge, soil erosion is one of the key drivers of soil degradation. Hence, there is a strong need to implement soil erosion control measures in eroding landscapes. Planning and dimensioning such measures require reliable and detailed information on the temporal and spatial distribution of soil loss, sediment transport and deposition. Soil erosion models are increasingly used in order to simulate the physical processes involved and to predict the effects of soil erosion control measures. The process based EROSION 3D simulation model is used for surveying soil erosion and deposition in regional catchments. Although EROSION 3D is a widespread, extensively validated model, applying it on a regional scale remains challenging due to the enormous data requirements and complex data processing operations. In this context, the study includes the compilation, validation and generalisation of existing land use and soil data in order to generate consistent EROSION 3D input datasets. As part of this process, a GIS-linked database application transfers the original soil and land use data into model-specific parameter files. This combined methodology provides risk assessment maps for different demands on a regional scale. Besides soil loss and sediment transport, sediment pass-over points into surface water bodies and particle enrichment can be simulated with the EROSION 3D model, making it possible to estimate particle-bound nutrient and pollutant inputs into surface water bodies. The study resulted in a user-friendly, timesaving and improved software package for the simulation of soil loss and

  3. Model-based application development for massively parallel embedded systems

    NARCIS (Netherlands)

    Jacobs, Johannes Wilhelmus Maria

    2008-01-01

    The development of embedded systems in information-rich contexts is governed by some intertwined trends. The increase of both volume of data to be processed and the related processing functionality feeds the growing complexity of applications. Independently, the processing hardware that is needed to

  4. Physiologically Based Pharmacokinetic (PBPK) Modeling and Simulation Approaches: A Systematic Review of Published Models, Applications, and Model Verification.

    Science.gov (United States)

    Sager, Jennifer E; Yu, Jingjing; Ragueneau-Majlessi, Isabelle; Isoherranen, Nina

    2015-11-01

    Modeling and simulation of drug disposition has emerged as an important tool in drug development, clinical study design and regulatory review, and the number of physiologically based pharmacokinetic (PBPK) modeling related publications and regulatory submissions has risen dramatically in recent years. However, the extent of use of PBPK modeling by researchers, and the public availability of models, has not been systematically evaluated. This review evaluates PBPK-related publications to 1) identify the common applications of PBPK modeling; 2) determine ways in which models are developed; 3) establish how model quality is assessed; and 4) provide a list of publicly available PBPK models for sensitive P450 and transporter substrates as well as selective inhibitors and inducers. PubMed searches were conducted using the terms "PBPK" and "physiologically based pharmacokinetic model" to collect published models. Only papers on PBPK modeling of pharmaceutical agents in humans published in English between 2008 and May 2015 were reviewed. A total of 366 PBPK-related articles met the search criteria, with the number of articles published per year rising steadily. Published models were most commonly used for drug-drug interaction predictions (28%), followed by interindividual variability and general clinical pharmacokinetic predictions (23%), formulation or absorption modeling (12%), and predicting age-related changes in pharmacokinetics and disposition (10%). In total, 106 models of sensitive substrates, inhibitors, and inducers were identified. An in-depth analysis of the model development and verification revealed a lack of consistency in model development and quality assessment practices, demonstrating a need for development of best-practice guidelines.
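
    The flow-limited compartmental structure underlying most of the surveyed models can be sketched with just blood and a single clearing tissue. All parameter values below are illustrative (nominal L, L/h, h units), not drawn from the review or any published model:

```python
def simulate_pbpk(dose=100.0, steps=5000, dt=0.01,
                  v_blood=5.0, v_liver=1.5, q_liver=90.0,
                  kp=2.0, cl_int=30.0):
    """Minimal flow-limited PBPK sketch: drug circulates between blood
    and a liver compartment (partition coefficient kp) and is eliminated
    by hepatic intrinsic clearance; amounts integrated by explicit Euler."""
    a_blood, a_liver = dose, 0.0
    for _ in range(steps):
        c_blood = a_blood / v_blood
        c_liver = a_liver / v_liver
        flux_in = q_liver * c_blood            # blood -> liver (perfusion)
        flux_out = q_liver * c_liver / kp      # liver -> blood
        elim = cl_int * c_liver / kp           # hepatic elimination
        a_blood += (flux_out - flux_in) * dt
        a_liver += (flux_in - flux_out - elim) * dt
    return a_blood + a_liver                   # total drug left after 50 h

remaining = simulate_pbpk()
```

    Full PBPK models differ mainly in scale, with a compartment per organ plus absorption and enzyme-specific clearance terms, not in kind; which is why verification practices, the review's focus, matter so much.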

  5. Modeling Component-based Bragg gratings Application: tunable lasers

    Directory of Open Access Journals (Sweden)

    Hedara Rachida

    2011-09-01

    Full Text Available The principal function of a Bragg grating is filtering, which can be used in optical-fiber-based components and in active or passive semiconductor-based components, as well as in telecommunication systems. Their ideal use is with fiber lasers, fiber amplifiers or laser diodes. In this work, we show the principal results obtained from the analysis of various types of Bragg gratings by the coupled-mode method. We then present the operation of tunable DBRs. The use of Bragg gratings in a laser provides single-mode, wavelength-agile sources. The use of sampled gratings increases the tuning range.

  6. A COM-GIS Based Decision Tree Model in Agricultural Application

    Science.gov (United States)

    Cheng, Wei; Wang, Ke; Zhang, Xiuying

    The problem of agricultural soil pollution by heavy metals has been receiving increasing attention in the last few decades. The Geostatistics module in ArcGIS cannot, however, efficiently simulate the spatial distribution of heavy metals with satisfactory accuracy when the spatial autocorrelation of the study area has been severely destroyed by human activities. In this study, classification and regression trees (CART) have been integrated into ArcGIS using ArcObjects and Visual Basic for Applications (VBA) to predict the spatial distribution of soil heavy metal contents in a severely polluted area. This is a great improvement over the ordinary Kriging method in ArcGIS. The integrated approach allows relatively easy, fast, and cost-effective estimation of spatially distributed soil heavy metal pollution.
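
    The core CART operation, choosing the split that minimizes squared error, can be sketched for a single 1-D predictor. The data values are illustrative; the production model recursively applies such splits over many covariates (terrain, land use, distance to pollution sources, and so on).

```python
def best_split(x, y):
    """Single CART-style split for 1-D regression: choose the threshold
    that minimizes the summed squared error of the two half-means."""
    best_sse, best_thr = float("inf"), None
    order = sorted(range(len(x)), key=lambda i: x[i])
    xs = [x[i] for i in order]
    ys = [y[i] for i in order]
    for i in range(1, len(xs)):
        left, right = ys[:i], ys[i:]
        ml = sum(left) / len(left)
        mr = sum(right) / len(right)
        sse = (sum((v - ml) ** 2 for v in left)
               + sum((v - mr) ** 2 for v in right))
        if sse < best_sse:
            best_sse, best_thr = sse, (xs[i - 1] + xs[i]) / 2.0
    return best_thr

# Two clearly separated concentration regimes -> split lands between them.
t = best_split([1, 2, 3, 10, 11, 12], [5, 5, 5, 9, 9, 9])
```

    Unlike Kriging, nothing here assumes spatial autocorrelation, which is why a tree can keep predicting in areas where human activity has destroyed that structure.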

  7. Computational model for refrigerators based on Peltier effect application

    Energy Technology Data Exchange (ETDEWEB)

    Astrain, D.; Vian, J.G.; Albizua, J. [Departamento de Ingenieria Mecanica, Energetica y de Materiales, Universidad Publica de Navarra, UPNa. Pamplona (Spain)

    2005-12-01

    A computational model, which simulates the thermal and electric performance of thermoelectric refrigerators, has been developed. This model solves the non-linear system made up of the thermoelectric equations and the heat conduction equations, providing values for temperature, electric consumption, heat flow and coefficient of performance of the refrigerator. The finite differences method is used to solve the system, along with semi-empirical expressions for convection coefficients. Subsequently, a thermoelectric refrigerator with an inner volume of 55×10⁻³ m³ has been designed and tested, whose cooling system is composed of a Peltier pellet (50 W of maximum power) and a fan of 2 W. An experimental analysis of its performance under different conditions has been carried out with this prototype, which in turn has been useful for assessing the accuracy of the developed model. The thermoelectric refrigerator prototype offers advantages with respect to classical vapour-compression technology: it is more ecological, quieter, more robust, and more precise in the control of temperatures, which makes it suitable for camping vehicles, buses, special transports for electromedicine, etc. (author)
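
    The steady-state core of the thermoelectric equations such a model solves can be sketched directly: Peltier cooling at the cold face, minus half the Joule heating and the conductive back-flow, over the electrical power drawn. The Seebeck coefficient, resistance and thermal conductance below are illustrative, not the paper's pellet parameters.

```python
def peltier_performance(current, t_cold=278.0, t_hot=308.0,
                        alpha=0.05, resistance=2.0, conductance=0.5):
    """Steady-state Peltier pellet balance (SI units, illustrative values):
    returns cooling power at the cold face and the resulting COP."""
    dt = t_hot - t_cold
    q_cold = (alpha * t_cold * current          # Peltier cooling
              - 0.5 * resistance * current ** 2 # half of Joule heating
              - conductance * dt)               # conduction hot -> cold
    power = alpha * dt * current + resistance * current ** 2
    return q_cold, (q_cold / power if power > 0 else 0.0)

q, cop = peltier_performance(2.0)
```

    The full model couples this balance to heat-conduction equations and convection correlations, which is what makes the complete system non-linear and motivates the finite-difference solution.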

  8. Spatial Decision Support Applications Based on Three-Dimensional City Models

    Institute of Scientific and Technical Information of China (English)

    LI Chaokui; ZHU Qing; ZHANG Yeting; HUANG Duo; ZHAO Jie; CHEN Songlin

    2004-01-01

    The basic mathematical models, such as the statistical model, the time-series model and the spatial dynamic model, and some typical analysis methods based on 3DCM are proposed and discussed. A few typical spatial decision-making methods integrating spatial analysis and the basic mathematical models are also introduced, e.g. visual impact assessment, dispersion of noise immissions, and base station planning for wireless communication. In addition, a new idea for further applications and added-value services of 3DCM is promoted. As an example, sunshine analysis is studied and some helpful conclusions are drawn.

  9. Application for managing model-based material properties for simulation-based engineering

    Science.gov (United States)

    Hoffman, Edward L.

    2009-03-03

    An application for generating a property set associated with a constitutive model of a material includes a first program module adapted to receive test data associated with the material and to extract loading conditions from the test data. A material model driver is adapted to receive the loading conditions and a property set and operable in response to the loading conditions and the property set to generate a model response for the material. A numerical optimization module is adapted to receive the test data and the model response and operable in response to the test data and the model response to generate the property set.
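    The architecture described (a driver that turns loading conditions and a property set into a model response, plus an optimization module that matches the response to test data) can be sketched as follows; the linear-elastic driver and the candidate property sets are hypothetical stand-ins, not the patented implementation:

```python
def model_driver(loading, props):
    """Hypothetical constitutive driver: linear elastic response sigma = E * eps."""
    E = props["E"]
    return [E * eps for eps in loading]

def sse(test_data, response):
    """Sum of squared errors between measured and simulated response."""
    return sum((t - r) ** 2 for t, r in zip(test_data, response))

def optimize_property(loading, test_data, candidates):
    """Toy 'numerical optimization module': pick the property set whose
    driver response best matches the test data."""
    return min(candidates, key=lambda p: sse(test_data, model_driver(loading, p)))
```

    A real implementation would replace the candidate list with a continuous optimizer and the driver with the full material model, but the data flow (test data in, calibrated property set out) is the same.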

  10. The Model and Design for COM-Based e-Commerce Application System

    Institute of Scientific and Technical Information of China (English)

    TANG Xiao-mei; SUN Li

    2002-01-01

    From the point of view of constructing an e-commerce application system, and based on structured analysis and object-oriented design methods, a combined modeling method, Business-Process Driven (BPD), is proposed. The method focuses on the business process throughout the development of the system. First, the business model of the system is established; then the commercial object model is introduced according to the business model; finally, the COM model of the system is established. The system is implemented in an iterative and incremental way. The design and analysis results of each stage are illustrated by a series of views using the modeling tool UML.

  11. Application of a data base management system to a finite element model

    Science.gov (United States)

    Rogers, J. L., Jr.

    1980-01-01

    In today's software market, much effort is being expended on the development of data base management systems (DBMS). Most commercially available DBMS were designed for business use; however, the need for such systems within the engineering and scientific communities is becoming apparent. One potential DBMS application that appears attractive is the handling of data for finite element engineering models. The application of a commercially available, business-oriented DBMS to a structural engineering finite element model is explored. The model, the DBMS, an approach to using the DBMS, and the advantages and disadvantages are described. Plans for research on a scientific and engineering DBMS are discussed.

  12. SOFT SENSING MODEL BASED ON SUPPORT VECTOR MACHINE AND ITS APPLICATION

    Institute of Scientific and Technical Information of China (English)

    Yan Weiwu; Shao Huihe; Wang Xiaofan

    2004-01-01

    Soft sensors are widely used in industrial process control. They play an important role in improving product quality and assuring production safety. The core of a soft sensor is the soft sensing model. A new soft sensing modeling method based on the support vector machine (SVM) is proposed. SVM is a machine learning method based on statistical learning theory that is powerful for problems characterized by small samples, nonlinearity, high dimension and local minima. The proposed method is applied to the estimation of the freezing point of light diesel oil in a distillation column. The estimated outputs of the SVM-based soft sensing model match the real values of the freezing point and follow its varying trend very well. Experimental results show that SVM provides a new and effective method for soft sensing modeling and has promising prospects in industrial process applications.
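    The idea of a kernel-based soft sensor can be sketched in pure Python with kernel ridge regression, a close cousin of SVM regression (this is not a full SVM solver, and the process data below are synthetic):

```python
import math

def rbf(x, y, gamma=1.0):
    """Gaussian (RBF) kernel, the kernel commonly used in SVM soft sensors."""
    return math.exp(-gamma * (x - y) ** 2)

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def fit_soft_sensor(xs, ys, gamma=1.0, lam=1e-6):
    """Fit a kernel model mapping an easy-to-measure variable to the
    hard-to-measure quality variable (e.g. freezing point)."""
    K = [[rbf(a, b, gamma) + (lam if i == j else 0.0)
          for j, b in enumerate(xs)] for i, a in enumerate(xs)]
    alpha = solve(K, ys)
    return lambda x: sum(a * rbf(x, xi, gamma) for a, xi in zip(alpha, xs))
```

    In practice one would use a dedicated SVM library and multivariate process inputs; the sketch only shows the kernel-expansion structure shared by both methods.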

  13. A Comparative Study of Relational and Non-Relational Database Models in a Web- Based Application

    OpenAIRE

    Cornelia Gyorödi; Robert Gyorödi; Roxana Sotoc

    2015-01-01

    The purpose of this paper is to present a comparative study between relational and non-relational database models in a web-based application, by executing various operations on both relational and on non-relational databases thus highlighting the results obtained during performance comparison tests. The study was based on the implementation of a web-based application for population records. For the non-relational database, we used MongoDB and for the relational database, we used MSSQL 2014. W...

  14. Security Model for Microsoft Based Mobile Sales Management Application in Private Cloud Computing

    Directory of Open Access Journals (Sweden)

    Kuan Chee Houng

    2013-05-01

    Full Text Available The Microsoft-based mobile sales management application is a sales force management application currently running on Windows Mobile 6.5. It handles sales-related activities and cuts down the administrative tasks of sales representatives. Microsoft then launched a new mobile operating system, Windows Phone, and stopped providing support for Windows Mobile, which has become an obstacle to Windows Mobile development. Over time, Windows Mobile will be eliminated from the market because Microsoft no longer supports it. Besides that, Windows Mobile applications cannot run on the Windows Phone operating system due to lack of compatibility. Applications that run on Windows Mobile therefore need a solution to this problem, and the rise of cloud computing technology for delivering software as a service provides one. The Microsoft-based mobile sales management application is delivered as a service that runs in a web browser, rather than being limited to mobile devices running the Windows Mobile operating system. However, some security issues must be addressed in order to deliver the Microsoft-based mobile application as a service in private cloud computing, so a security model is needed to answer them. This research proposes a security model for the Microsoft-based mobile sales management application in private cloud computing. Lastly, a User Acceptance Test (UAT) is carried out to test the compatibility between the proposed security model of the Microsoft-based mobile sales management application in a private cloud and tablet computers.

  15. Application of numerical methods for diffusion-based modeling of skin permeation.

    Science.gov (United States)

    Frasch, H Frederick; Barbero, Ana M

    2013-02-01

    The application of numerical methods for mechanistic, diffusion-based modeling of skin permeation is reviewed. The methods considered here are finite difference, method of lines, finite element, finite volume, random walk, cellular automata, and smoothed particle hydrodynamics. The methods are first explained briefly, with rudimentary mathematical underpinnings. Current state-of-the-art numerical models are described, followed by a chronological overview of published models. Key findings and insights of the reviewed models are highlighted. Model results support a primarily transcellular pathway with anisotropic lipid transport. Future endeavors would benefit from a fundamental analysis of drug/vehicle/skin interactions.
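    The first method listed, finite differences, can be demonstrated on Fick's second law across a single membrane layer, with the donor side held at constant concentration and the receptor side acting as a perfect sink (a sketch with assumed diffusivity and thickness, not values from any reviewed model):

```python
def skin_permeation_profile(D=1e-9, h=1e-4, c_donor=1.0, n=21, t_end=50.0):
    """Explicit finite-difference solution of Fick's second law dc/dt = D d2c/dx2
    across a membrane of thickness h (m); returns the concentration profile."""
    dx = h / (n - 1)
    dt = 0.4 * dx * dx / D            # stability requires D*dt/dx^2 <= 0.5
    c = [0.0] * n
    c[0] = c_donor                     # donor boundary: constant concentration
    for _ in range(int(t_end / dt)):
        new = c[:]
        for i in range(1, n - 1):     # interior nodes: central difference in x
            new[i] = c[i] + D * dt / dx**2 * (c[i + 1] - 2 * c[i] + c[i - 1])
        new[-1] = 0.0                  # receptor boundary: perfect sink
        c = new
    return c
```

    Run long enough (here 50 s, several diffusion time constants h²/D), the profile relaxes to the linear steady state expected for a homogeneous membrane, a standard sanity check for such solvers.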

  16. Application-Oriented Confidentiality and Integrity Dynamic Union Security Model Based on MLS Policy

    Science.gov (United States)

    Xue, Mingfu; Hu, Aiqun; He, Chunlong

    We propose a new security model based on MLS policy to achieve better confidentiality, integrity and availability. First, it combines the BLP model and the Biba model through a two-dimensional independent adjustment of integrity and confidentiality, and the subject's access range is adjusted dynamically according to the security labels of related objects and the subject's access history. Second, the security level of the trusted subject is extended to write and read privilege ranges respectively, following the principle of least privilege. Third, it adjusts objects' security levels after confidential information is added, to prevent information disclosure. Fourth, it uses application-oriented logic to protect specific applications and avoid the degradation of security levels, so that those applications can operate smoothly. Lastly, examples are presented to show the effectiveness and usability of the proposed model.
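    The combination of BLP ("no read up, no write down") and Biba ("no read down, no write up") at the core of such models can be sketched as a static two-dimensional label check; the paper's dynamic adjustment of access ranges and trusted-subject extensions are not modeled here:

```python
def can_read(subj, obj):
    """BLP simple security (no read up on confidentiality) combined with
    Biba simple integrity (no read down on integrity)."""
    return subj["conf"] >= obj["conf"] and subj["integ"] <= obj["integ"]

def can_write(subj, obj):
    """BLP star property (no write down) combined with
    Biba star property (no write up)."""
    return subj["conf"] <= obj["conf"] and subj["integ"] >= obj["integ"]
```

    The two checks pull in opposite directions, which is exactly why the paper needs a dynamic adjustment mechanism: a purely static combination is very restrictive.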

  17. Application of soft computing based hybrid models in hydrological variables modeling: a comprehensive review

    Science.gov (United States)

    Fahimi, Farzad; Yaseen, Zaher Mundher; El-shafie, Ahmed

    2016-02-01

    Since the middle of the twentieth century, artificial intelligence (AI) models have been used widely in engineering and science problems. Water resource variable modeling and prediction are among the most challenging issues in water engineering. The artificial neural network (ANN) is a common approach for tackling this problem with viable and efficient models, and numerous ANN models have been successfully developed to achieve more accurate results. In the current review, different ANN models in water resource applications and hydrological variable predictions are reviewed and outlined. In addition, recent hybrid models and their structures, input preprocessing, and optimization techniques are discussed, and the results are compared with similar previous studies. Moreover, to achieve a comprehensive view of the literature, many articles that applied ANN models together with other techniques are included. Consequently, coupling procedures, model evaluation, and the performance of hybrid models compared with conventional ANN models are assessed, as well as the taxonomy and structures of hybrid ANN models. Finally, current challenges and recommendations for future research are indicated and new hybrid approaches are proposed.

  18. A Comparative Study of Relational and Non-Relational Database Models in a Web- Based Application

    Directory of Open Access Journals (Sweden)

    Cornelia Gyorödi

    2015-11-01

    Full Text Available The purpose of this paper is to present a comparative study between relational and non-relational database models in a web-based application, by executing various operations on both relational and on non-relational databases thus highlighting the results obtained during performance comparison tests. The study was based on the implementation of a web-based application for population records. For the non-relational database, we used MongoDB and for the relational database, we used MSSQL 2014. We will also present the advantages of using a non-relational database compared to a relational database integrated in a web-based application, which needs to manipulate a big amount of data.

  19. CACM: A New Coordination Model in Mobile Agent-Based Information Retrieval Applications

    Institute of Scientific and Technical Information of China (English)

    TANGXinhuai; ZHANGYaying; YAOYinxiong; YOUJinyuan

    2005-01-01

    In mobile agent systems, an application may be composed of several mobile agents that cooperatively perform a task. Multiple mobile agents need to communicate and interact with each other to accomplish their cooperative goal. A coordination model aims to provide solutions for interactions between concurrent activities, hiding the computing details and focusing on the interaction between activities. A context-aware coordination model (CACM), which combines mobility and coordination, is proposed for mobile agent applications, i.e. mobile agent based information retrieval applications. The context-aware coordination model transfers interactions between agents from globally coupled interactions to locally uncoupled tuple space interactions. In addition, a programmable tuple space is adopted to solve the problems of context-aware coordination introduced by mobility and data heterogeneity in mobile agent systems. Furthermore, environment-specific and application-specific coordination policies can be integrated into the programmable tuple space for customized requirements. Finally, a sample application system, information retrieval with mobile agents, is implemented to test the performance of the proposed model.
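    The locally uncoupled tuple space interaction underlying CACM can be sketched as a minimal Linda-style tuple space (the programmable, context-aware extensions described in the paper are omitted):

```python
class TupleSpace:
    """Minimal Linda-style tuple space: out() deposits a tuple, rd() reads a
    matching tuple without removing it, inp() reads and removes one.
    None in a template acts as a wildcard field."""

    def __init__(self):
        self._tuples = []

    def out(self, tup):
        self._tuples.append(tup)

    def _match(self, template, tup):
        return len(template) == len(tup) and all(
            t is None or t == v for t, v in zip(template, tup))

    def rd(self, template):
        return next((t for t in self._tuples if self._match(template, t)), None)

    def inp(self, template):
        t = self.rd(template)
        if t is not None:
            self._tuples.remove(t)
        return t
```

    Agents never address each other directly: a retrieval agent deposits `("result", agent_id, payload)` and any consumer later matches `("result", None, None)`, which is the decoupling in space and time that makes the model attractive for mobile agents.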

  20. Compact modeling of CRS devices based on ECM cells for memory, logic and neuromorphic applications

    Science.gov (United States)

    Linn, E.; Menzel, S.; Ferch, S.; Waser, R.

    2013-09-01

    Dynamic physics-based models of resistive switching devices are of great interest for the realization of complex circuits required for memory, logic and neuromorphic applications. Here, we apply such a model of an electrochemical metallization (ECM) cell to complementary resistive switches (CRSs), which are favorable devices to realize ultra-dense passive crossbar arrays. Since a CRS consists of two resistive switching devices, it is straightforward to apply the dynamic ECM model for CRS simulation with MATLAB and SPICE, enabling study of the device behavior in terms of sweep rate and series resistance variations. Furthermore, typical memory access operations as well as basic implication logic operations can be analyzed, revealing requirements for proper spike and level read operations. This basic understanding facilitates applications of massively parallel computing paradigms required for neuromorphic applications.

  1. Design and implementation of space physics multi-model application integration based on web

    Science.gov (United States)

    Jiang, Wenping; Zou, Ziming

    With the development of research on the space environment and space science, providing a networked online computing environment for space weather, space environment and space physics models for the Chinese scientific community has become more and more important in recent years. Currently, there are two software modes for a space physics multi-model application integrated system (SPMAIS): C/S and B/S. The traditional, stand-alone C/S mode demands that a team or workshop drawn from many disciplines and specialties build its own multi-model application integrated system, and requires the client to be deployed in different physical regions when users visit the integrated system. This requirement brings two shortcomings: it reduces the efficiency of the researchers who use the models to compute, and it makes accessing the data inconvenient. Therefore, it is necessary to create a shared network resource access environment that helps users quickly reach the computing resources of space physics models through a terminal, for conducting space science research and forecasting the space environment. The SPMAIS develops high-performance, first-principles computational models of the space environment in B/S mode and uses these models to predict "space weather", to understand space mission data and to further our understanding of the solar system. The main goal of the SPMAIS is to provide an easy and convenient user-driven online model operating environment. Up to now, the SPMAIS has contained dozens of space environment models, including the international AP8/AE8, IGRF and T96 models, as well as a solar proton prediction model, a geomagnetic transmission model, etc., developed by Chinese scientists. Another function of the SPMAIS is to integrate space observation data sets, which provide input data for high-speed online model computation. In this paper, the service-oriented architecture (SOA) concept that divides

  2. A bootstrap based space-time surveillance model with an application to crime occurrences

    Science.gov (United States)

    Kim, Youngho; O'Kelly, Morton

    2008-06-01

    This study proposes a bootstrap-based space-time surveillance model. Designed to find emerging hotspots in near-real time, the bootstrap-based model is characterized by its use of past occurrence information and bootstrap permutations. Many existing space-time surveillance methods use population-at-risk data to generate expected values; their resulting hotspots are bounded by administrative area units, and they are of limited use for near-real-time applications because of the population data needed. This study instead generates expected values for local hotspots from past occurrences rather than from the population at risk, and bootstrap permutations of previous occurrences are used for significance tests. Consequently, the bootstrap-based model, without requiring population-at-risk data, (1) is free from administrative area restrictions, (2) enables more frequent surveillance of continuously updated registry databases, and (3) is readily applicable to surveillance in criminology and epidemiology. The bootstrap-based model performs better for space-time surveillance than the space-time scan statistic, as shown by means of simulations and an application to residential crime occurrences in Columbus, OH, in the year 2000.
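    The permutation idea, testing recent occurrence counts against resampled past occurrences with no population-at-risk data, can be sketched as follows for a single location; the window length and counts are illustrative, and the paper's spatial component is omitted:

```python
import random

def surveillance_pvalue(daily_counts, window=3, n_perm=999, seed=0):
    """Permutation test: is the count in the most recent `window` days unusually
    high relative to windows drawn at random from the whole series?"""
    rng = random.Random(seed)
    observed = sum(daily_counts[-window:])
    exceed = 0
    for _ in range(n_perm):
        sample = rng.sample(daily_counts, window)  # permuted window, no replacement
        if sum(sample) >= observed:
            exceed += 1
    return (exceed + 1) / (n_perm + 1)             # conservative p-value
```

    A flat series yields a p-value of 1, while a sudden burst in the last few days yields a small p-value, flagging an emerging hotspot.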

  3. Integrated knowledge-based modeling and its application for classification problems

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Knowledge discovery directly from data can hardly avoid being biased towards the collected experimental data, whereas expert systems are always baffled by the manual knowledge acquisition bottleneck. It is therefore plausible that integrating the knowledge embedded in data with that possessed by experts can lead to a superior modeling approach. Aiming at classification problems, a novel integrated knowledge-based modeling methodology, oriented by experts and driven by data, is proposed. It starts with experts identifying the modeling parameters; the input space is then partitioned, followed by fuzzification. Afterwards, single rules are generated and aggregated to form a rule base, on which a fuzzy inference mechanism is proposed. The experts are allowed to make necessary changes to the rule base to improve the model accuracy. A real-world application, welding fault diagnosis, is presented to demonstrate the effectiveness of the methodology.

  4. Application of Holdridge life-zone model based on the terrain factor in Xinjiang Automous Region

    Institute of Scientific and Technical Information of China (English)

    NI Yong-ming; OUYANG Zhi-yun; WANG Xiao-ke

    2005-01-01

    This study improved the application of the Holdridge life-zone model for simulating the distribution of desert vegetation in China, providing statistics to support eco-recovery and ecosystem reconstruction in desert areas. The desert vegetation was classified into four types: (1) LAD: little arbor desert; (2) SD: shrub desert; (3) HLHSD: half-shrub, little half-shrub desert; (4) LHSCD: little half-shrub cushion desert. Based on this classification of Xinjiang desert vegetation, the classical Holdridge life-zone model was used to simulate the distribution of Xinjiang desert vegetation, and the resulting Kappa coefficient was compared with the standard table of accuracy represented by Kappa values. The Kappa value of the model was only 0.19, meaning the simulation result was poor. To improve the application of the life-zone model to Xinjiang desert vegetation types, a set of plot standards for terrain factors was developed, with the plot standard used as the reclassification criterion for the climate sub-regimes, and the desert vegetation in Xinjiang was simulated again. The average Kappa value of the second simulation for the respective climate regimes was 0.45, and the Kappa value of the final modeling result was 0.64, a clearly better value. The modification makes the model applicable in more regions. Finally, the model's ecological relevance to the Xinjiang desert vegetation types was studied.
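    The Kappa coefficient used to score the simulations can be computed as Cohen's kappa between observed and simulated class maps (a generic sketch of the statistic, not the paper's accuracy table):

```python
def cohens_kappa(observed, simulated):
    """Cohen's kappa agreement between two equal-length label sequences:
    observed accuracy corrected for chance agreement."""
    n = len(observed)
    cats = set(observed) | set(simulated)
    p_o = sum(o == s for o, s in zip(observed, simulated)) / n
    p_e = sum((observed.count(c) / n) * (simulated.count(c) / n) for c in cats)
    return (p_o - p_e) / (1 - p_e)
```

    Kappa is 1 for perfect agreement and 0 when agreement is no better than chance, which is why the jump from 0.19 to 0.64 reported above marks a substantial improvement.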

  5. [Study on the Application of NAS-Based Algorithm in the NIR Model Optimization].

    Science.gov (United States)

    Geng, Ying; Xiang, Bing-ren; He, Lan

    2015-10-01

    In this paper, the net analyte signal (NAS) concept was introduced into the analysis of multi-component Ginkgo biloba leaf extracts. A NAS algorithm was utilized for the preprocessing of spectra, and NAS-based two-dimensional correlation analysis was used to optimize NIR model building. Simultaneous quantitative models for three flavonol aglycones, quercetin, kaempferol and isorhamnetin, were established. The NAS vectors calculated using two algorithms, introduced from Lorber and from Goicoechea and Olivieri (HLA/GO), were applied in the development of calibration models, and the reconstructed spectra were used as input for PLS modeling. For the first time, NAS-based two-dimensional correlation spectroscopy was used for wavenumber selection: the regions appearing on the main diagonal were selected as useful regions for model building. The results implied that the two NAS-based preprocessing methods were successfully used for the analysis of quercetin, kaempferol and isorhamnetin, with a decrease in the number of factors and an improvement in model robustness. The NAS-based algorithm proved to be a useful tool for the preprocessing of spectra and for the optimization of model calibration. This research shows the practical application value of NIRS in the analysis of complex multi-component phytochemical medicines with unknown interferences.

  6. Analysis and Application of Mechanical System Reliability Model Based on Copula Function

    Directory of Open Access Journals (Sweden)

    An Hai

    2016-10-01

    Full Text Available There are complicated correlations in mechanical systems. Using the advantages of the copula function in handling such correlations, this paper proposes a mechanical system reliability model based on the copula function, makes a detailed study of the serial and parallel mechanical system models, and obtains their reliability functions respectively. Finally, application research is carried out on the serial mechanical system reliability model, and its validity is proved by example. Using copula theory for mechanical system reliability modeling, and studying the distributions of the random variables (the marginal distributions of the mechanical products' lifetimes) and the associated structure of the variables separately, can reduce the difficulty of multivariate probabilistic modeling and analysis and make the modeling and analysis process clearer.
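    For a serial system, the joint survival probability obtained by applying a copula to the component survival probabilities can be sketched with the Gumbel family; theta = 1 recovers independence, theta > 1 models increasing positive dependence (the parameter values are assumptions, not the paper's):

```python
import math

def series_reliability_gumbel(r1, r2, theta=1.0):
    """Series-system reliability P(both components survive) with the component
    survival probabilities r1, r2 joined by a Gumbel copula:
    C(u, v) = exp(-((-ln u)^theta + (-ln v)^theta)^(1/theta))."""
    u, v = -math.log(r1), -math.log(r2)
    return math.exp(-(u**theta + v**theta) ** (1.0 / theta))
```

    Positive dependence raises series-system reliability above the independent product r1*r2 (toward the upper bound min(r1, r2)), which is precisely the effect an independence assumption would miss.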

  7. A unified charge-based model for SOI MOSFETs applicable from intrinsic to heavily doped channel

    Institute of Scientific and Technical Information of China (English)

    Zhang Jian; Han Yu; Chan Mansun; He Jin; Zhou Xing-Ye; Zhang Li-Ning; Ma Yu-Tao; Chen Qin; Zhang Xu-Kai; Yang Zhang; Wang Rui-Fei

    2012-01-01

    A unified charge-based model for fully depleted silicon-on-insulator (SOI) metal-oxide semiconductor field-effect transistors (MOSFETs) is presented. The proposed model is accurate and applicable from intrinsic to heavily doped channels with various structure parameters. The framework starts from the one-dimensional Poisson-Boltzmann equation, and based on the full depletion approximation, an accurate inversion charge density equation is obtained. With the inversion charge density solution, the unified drain current expression is derived, and a unified terminal charge and intrinsic capacitance model is also derived in the quasi-static case. The validity and accuracy of the presented analytic model are proved by numerical simulations.

  8. GRace: a MATLAB-based application for fitting the discrimination-association model.

    Science.gov (United States)

    Stefanutti, Luca; Vianello, Michelangelo; Anselmi, Pasquale; Robusto, Egidio

    2014-10-28

    The Implicit Association Test (IAT) is a computerized two-choice discrimination task in which stimuli have to be categorized as belonging to target categories or attribute categories by pressing, as quickly and accurately as possible, one of two response keys. The discrimination association model has been recently proposed for the analysis of reaction time and accuracy of an individual respondent to the IAT. The model disentangles the influences of three qualitatively different components on the responses to the IAT: stimuli discrimination, automatic association, and termination criterion. The article presents General Race (GRace), a MATLAB-based application for fitting the discrimination association model to IAT data. GRace has been developed for Windows as a standalone application. It is user-friendly and does not require any programming experience. The use of GRace is illustrated on the data of a Coca Cola-Pepsi Cola IAT, and the results of the analysis are interpreted and discussed.

  9. An emission source inversion model based on satellite data and its application in air quality forecasts

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    This paper aims at constructing an emission source inversion model using a variational processing method and an adaptive nudging scheme for the Community Multiscale Air Quality Model (CMAQ) based on satellite data, to investigate the applicability of high-resolution OMI (Ozone Monitoring Instrument) column concentration data for air quality forecasts over North China. The results show reasonable consistency and good correlation between the spatial distributions of NO2 from surface and OMI satellite measurements in both winter and summer. Such OMI products may be used to implement integrated variational analysis based on ground observation data. With linear and variational corrections made, the spatial distribution of OMI NO2 clearly revealed more localized characteristics of the NO2 concentration. With this information, emission sources in the southwest and southeast of North China were found to have greater impacts on air quality in Beijing. When the retrieved emission source inventory based on high-resolution OMI NO2 data was used, the coupled Weather Research and Forecasting CMAQ model (WRF-CMAQ) performed significantly better in forecasting the NO2 concentration level and its tendency, as reflected by the greater consistency between the NO2 concentrations from surface observations and the model results. In conclusion, satellite data are particularly important for simulating NO2 concentrations on urban and street-block scales. High-resolution OMI NO2 data are applicable for inverting the NOx emission source inventory, assessing regional pollution status and pollution control strategies, and improving model forecasts on the urban scale.

  10. A Reaction-Based River/Stream Water Quality Model: Reaction Network Decomposition and Model Application

    OpenAIRE

    2012-01-01

    This paper describes details of an automatic matrix decomposition approach for a reaction-based stream water quality model. The method yields a set of equilibrium equations, a set of kinetic-variable transport equations involving kinetic reactions only, and a set of component transport equations involving no reactions. Partial decomposition of the system of water quality constituent transport equations is performed via Gauss-Jordan column reduction of the reaction network by pivoting on equil...

  11. Model Test Based Soil Spring Model and Application in Pipeline Thermal Buckling Analysis

    Institute of Scientific and Technical Information of China (English)

    GAO Xi-feng; LIU Run; YAN Shu-wang

    2011-01-01

    Buckling of submarine pipelines may occur due to the axial soil frictional force caused by relative movement of the soil and pipeline, which is induced by thermal loading and internal pressure. The likelihood of this buckling phenomenon is largely determined by the soil resistance. A series of large-scale model tests were carried out to facilitate the establishment of a substantial database for a variety of buried pipeline relationships. Based on the test data, nonlinear soil springs can be adopted to simulate the soil behavior during pipeline movement. For uplift resistance, an ideal elastic-plastic model is recommended in the case of H/D (depth-to-diameter ratio) > 5 and an elastic softening model is recommended in the case of H/D ≤ 5. The soil resistance along the pipeline axial direction can be simulated by an ideal elastic-plastic model. The numerical results show that the capacity of the pipeline against thermal buckling decreases as its initial imperfection grows and increases with burial depth.
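    The two recommended spring laws can be sketched as simple force-displacement functions: an ideal elastic-plastic spring for axial resistance, and a crudely simplified elastic-softening spring for shallow uplift (H/D ≤ 5). The stiffness and resistance values are assumed, and the post-peak drop to a residual fraction stands in for the gradual softening curve fitted from the test data:

```python
def axial_soil_spring(u, k=1.0e6, f_ult=5.0e3):
    """Ideal elastic-plastic soil spring: force grows linearly with
    displacement u (m) at stiffness k (N/m) until the ultimate
    resistance f_ult (N), then stays constant (sign-preserving)."""
    return max(-f_ult, min(f_ult, k * u))

def uplift_spring_softened(u, k=1.0e6, f_peak=5.0e3, residual=0.5):
    """Simplified elastic-softening uplift spring for shallow burial:
    linear up to the peak, then reduced to a residual fraction of the peak."""
    u_peak = f_peak / k
    if abs(u) <= u_peak:
        return k * u
    return residual * f_peak * (1 if u > 0 else -1)
```

    In a finite element buckling analysis these functions would populate the nonlinear spring elements distributed along the pipe.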

  12. Application of geometry based hysteresis modelling in compensation of hysteresis of piezo bender actuator

    Science.gov (United States)

    Milecki, Andrzej; Pelic, Marcin

    2016-10-01

    This paper presents the results of a study of a new method for modelling piezo bender actuators. A special hysteresis simulation model was developed and is presented; the model is based on geometrical deformation of the main hysteresis loop. The piezoelectric effect is described and the history of hysteresis modelling is briefly reviewed. First, a simple model of the main loop is proposed. Then, a geometrical description of the non-saturated hysteresis is presented and its modelling method is introduced. The modelling makes use of functions describing the geometrical shape of the two main hysteresis curves, which can be defined theoretically or obtained by measurement. These main curves are stored in memory and transformed geometrically in order to obtain the minor curves. The model was prepared in Matlab-Simulink, but it can easily be implemented in any programming language and applied in an on-line controller. In comparison with other known simulation methods, the one presented here is easy to understand and uses simple arithmetic equations, allowing the inverse model of the hysteresis to be obtained quickly. The inverse model was then used to compensate the non-saturated hysteresis of a piezo bender actuator, and those results are also presented in the paper.
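    One simple geometric transformation in the spirit of the method, storing the two main curves and blending them so that a minor branch passes through the last reversal point, can be sketched as follows (the tanh-shaped main curves and the blending rule are assumptions for illustration, not the paper's exact mapping):

```python
import math

def f_up(x):
    """Ascending main hysteresis curve (would be measured or defined theoretically)."""
    return math.tanh(2 * (x - 0.3))

def f_down(x):
    """Descending main hysteresis curve (lies above the ascending one)."""
    return math.tanh(2 * (x + 0.3))

def minor_curve(x, x_rev, y_rev):
    """Minor branch through the reversal point (x_rev, y_rev), obtained by
    geometrically blending the two stored main curves; a point on a main
    curve reproduces that main curve exactly."""
    w = (y_rev - f_down(x_rev)) / (f_up(x_rev) - f_down(x_rev))
    return f_down(x) + w * (f_up(x) - f_down(x))
```

    Because only the stored main curves and simple arithmetic are involved, such a mapping (and its inverse, for compensation) is cheap enough for an on-line controller, which matches the motivation stated above.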

  13. Dynamic Modeling and Performance Analysis of PMSG based Wind Generation System for Residential Applications

    Directory of Open Access Journals (Sweden)

    Rashmi S

    2014-03-01

    Full Text Available This paper proposes dynamic modeling and performance analysis of a permanent magnet synchronous generator (PMSG) based wind generation system (WGS). The system consists of a wind turbine, PMSG, diode rectifier, buck-boost converter, and voltage source inverter (VSI). The PMSG and buck-boost converter are employed in the WGS to obtain efficient output according to the load requirement without damaging the system. The output of the VSI is injected into the grid and used for home applications. The dynamic simulation results of the proposed model are tested in MATLAB Simulink.
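    The first block of such a model, the power captured by the wind turbine rotor, follows the standard equation P = 0.5·rho·A·v³·Cp; the rotor size and power coefficient below are assumed small-residential values, not the paper's parameters:

```python
import math

def turbine_power(v, rho=1.225, radius=1.5, cp=0.40):
    """Mechanical power (W) captured by a wind turbine rotor:
    P = 0.5 * rho * A * v^3 * Cp, with swept area A = pi * r^2.
    rho: air density (kg/m^3), v: wind speed (m/s), cp: power coefficient."""
    area = math.pi * radius ** 2
    return 0.5 * rho * area * v ** 3 * cp
```

    The cubic dependence on wind speed is what makes converter control (here the buck-boost stage) essential for extracting usable power across the residential wind-speed range.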

  14. Constructing a raster-based spatio-temporal hierarchical data model for marine fisheries application

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Marine information has been increasing quickly, and traditional database technologies have disadvantages in manipulating the large amounts of marine information that relate position in 3-D to time. Recently, greater emphasis has been placed on GIS (geographical information systems) to deal with marine information. GIS has shown great success in terrestrial applications over the last decades, but its use in marine fields has been far more restricted. One of the main reasons is that most GIS systems, or their data models, are designed for land applications and cannot cope well with the nature of the marine environment and marine information; this has become a fundamental challenge to traditional GIS and its data structures. This work designed a data model, the raster-based spatio-temporal hierarchical data model (RSHDM), for marine information systems and for knowledge discovery from spatio-temporal data, which bases itself on the nature of marine data and overcomes the shortcomings of current spatio-temporal models when they are used in this field. As an experiment, a marine fishery data warehouse (FDW) for marine fishery management was set up based on the RSHDM. The experiment proved that the RSHDM handles the data well and can easily extract the aggregations that management needs at different levels.

  15. Using rule-based shot dose assignment in model-based MPC applications

    Science.gov (United States)

    Bork, Ingo; Buck, Peter; Wang, Lin; Müller, Uwe

    2014-10-01

    Shrinking feature sizes and the need for tighter CD (critical dimension) control require the introduction of new technologies in mask making processes. One of these methods is the dose assignment of individual shots on VSB (variable shaped beam) mask writers to compensate for CD non-linearity effects and to improve dose edge slope. By using increased dose levels only for the most critical features, generally the smallest CDs on a mask, the change in mask write time is minimal while the increase in image quality can be significant. This paper describes a method combining rule-based shot dose assignment with model-based shot size correction. This combination proves to be very efficient in correcting mask linearity errors while also improving the dose edge slope of small features. Shot dose assignment is based on tables assigning certain dose levels to a range of feature sizes. The dose-to-feature-size assignment is derived from mask measurements in such a way that shape corrections are kept to a minimum. For example, if a 50nm drawn line on mask results in a 45nm chrome line at nominal dose, a dose level is chosen that comes closest to getting the line back on target. Since CD non-linearity differs for lines, line-ends and contacts, different tables are generated for the different shape categories. The actual dose assignment is done via DRC rules in a pre-processing step before executing the shape correction in the MPC engine. Dose assignment to line ends can be restricted to critical line/space dimensions, since it might not be required for all line ends; in addition, applying dose assignment to a wide range of line ends might increase shot count, which is undesirable. The dose assignment algorithm is very flexible and can be adjusted based on the type of layer and the best balance between accuracy and shot count. These methods can be optimized for the number of dose levels available for specific mask writers. The MPC engine now needs to be able to handle different dose
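
The table-driven rule step described above can be sketched as a simple lookup keyed on drawn feature size. The size ranges and dose levels here are invented for illustration; in practice they would be derived from mask measurements per shape category (lines, line-ends, contacts), as the abstract notes.

```python
# Hypothetical dose-assignment table for one shape category (lines):
# (min_cd_nm, max_cd_nm) -> relative dose level. Values are invented.
DOSE_TABLE_LINES = [
    ((0, 60), 1.10),               # smallest CDs get the largest dose boost
    ((60, 80), 1.05),
    ((80, float("inf")), 1.00),    # nominal dose for non-critical features
]

def assign_dose(cd_nm, table):
    """Rule-based pre-processing step: pick the dose level whose size range
    contains the drawn CD, before model-based shape correction runs."""
    for (lo, hi), dose in table:
        if lo <= cd_nm < hi:
            return dose
    return 1.00  # fall back to nominal dose

print(assign_dose(50, DOSE_TABLE_LINES))   # 1.1
print(assign_dose(100, DOSE_TABLE_LINES))  # 1.0
```

Separate tables per shape category mirror the abstract's point that CD non-linearity differs for lines, line-ends and contacts.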

  16. Model Based Optimization of Integrated Low Voltage DC-DC Converter for Energy Harvesting Applications

    Science.gov (United States)

    Jayaweera, H. M. P. C.; Muhtaroğlu, Ali

    2016-11-01

    A novel model-based methodology is presented to determine optimal device parameters for a fully integrated ultra-low-voltage DC-DC converter for energy harvesting applications. The proposed model helps determine the most efficient number of charge pump stages to fulfill the voltage requirement of the energy harvester application. The proposed DC-DC converter power consumption model enables the analytical derivation of the charge pump efficiency when used together with the known LC tank oscillator behavior under resonant conditions and the voltage step-up characteristics of the cross-coupled charge pump topology. The model was verified using a circuit simulator. The system optimized through the established model achieves more than 40% maximum efficiency, yielding 0.45 V output with a single stage, 0.75 V with two stages, and 0.9 V with three stages for 2.5 kΩ, 3.5 kΩ and 5 kΩ loads, respectively, with a 0.2 V input.
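
The stage-count optimization can be illustrated with a toy search: pick the smallest number of charge-pump stages whose derated output meets the target voltage. The ideal (N+1)·V_in step-up rule and the derating constants below are simplifications invented for illustration; the paper derives efficiency analytically from its power consumption model instead.

```python
# Toy stage-count search for a cross-coupled charge pump.
# stage_gain and v_loss are illustrative derating constants, NOT from the paper.
def min_stages(v_in, v_target, stage_gain=0.9, v_loss=0.05, max_stages=10):
    """Return the smallest N such that the derated ideal output
    (N+1)*v_in*stage_gain - N*v_loss reaches v_target, or None."""
    for n in range(1, max_stages + 1):
        v_out = (n + 1) * v_in * stage_gain - n * v_loss
        if v_out >= v_target:
            return n
    return None  # target unreachable within max_stages

print(min_stages(0.2, 0.5))  # 3
```

A real design would also weigh efficiency against stage count, since each extra stage adds switching and parasitic losses.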

  17. Flexibility Support for Homecare Applications Based on Models and Multi-Agent Technology

    Directory of Open Access Journals (Sweden)

    Aintzane Armentia

    2015-12-01

    Full Text Available In developed countries, public health systems are under pressure due to the increasing percentage of population over 65. In this context, homecare based on ambient intelligence technology seems to be a suitable solution to allow elderly people to continue to enjoy the comforts of home and help optimize medical resources. Thus, current technological developments make it possible to build complex homecare applications that demand, among others, flexibility mechanisms for being able to evolve as context does (adaptability), as well as avoiding service disruptions in the case of node failure (availability). The solution proposed in this paper copes with these flexibility requirements through the whole life-cycle of the target applications: from design phase to runtime. The proposed domain modeling approach allows medical staff to design customized applications, taking into account the adaptability needs. It also guides software developers during system implementation. The application execution is managed by a multi-agent based middleware, making it possible to meet adaptation requirements, assuring at the same time the availability of the system even for stateful applications.

  18. Flexibility Support for Homecare Applications Based on Models and Multi-Agent Technology.

    Science.gov (United States)

    Armentia, Aintzane; Gangoiti, Unai; Priego, Rafael; Estévez, Elisabet; Marcos, Marga

    2015-12-17

    In developed countries, public health systems are under pressure due to the increasing percentage of population over 65. In this context, homecare based on ambient intelligence technology seems to be a suitable solution to allow elderly people to continue to enjoy the comforts of home and help optimize medical resources. Thus, current technological developments make it possible to build complex homecare applications that demand, among others, flexibility mechanisms for being able to evolve as context does (adaptability), as well as avoiding service disruptions in the case of node failure (availability). The solution proposed in this paper copes with these flexibility requirements through the whole life-cycle of the target applications: from design phase to runtime. The proposed domain modeling approach allows medical staff to design customized applications, taking into account the adaptability needs. It also guides software developers during system implementation. The application execution is managed by a multi-agent based middleware, making it possible to meet adaptation requirements, assuring at the same time the availability of the system even for stateful applications.

  19. Environmental Comparison of Straw Applications Based on a Life Cycle Assessment Model and Emergy Evaluation

    Directory of Open Access Journals (Sweden)

    Juan Gao

    2014-11-01

    Full Text Available Straw is considered to be a renewable resource for bioenergy and biomaterials. However, about 70% of straw is burned in fields, which causes serious air pollution in China. In this study, a life cycle assessment (LCA) model, together with emergy evaluation, was built to compare four straw applications after harvest vs. direct burning: bioethanol (BE), combined heat and power plant (CHP), corrugated base paper (CP), and medium-density fiberboard (MDF). The results showed that BE and MDF would avoid greenhouse gas (GHG) emissions by 82% and 36%, respectively, while CHP and CP would emit 57% and 152% more GHG, respectively, compared with direct straw burning. Bioethanol had the highest renewability indicator (RI) of 47.7%, and MDF obtained the greatest profit of 657 Yuan·bale⁻¹. The CHP and CP applications had low RI (< 10.3%) and profit (< 180 Yuan·bale⁻¹). Due to water recycling and electrical power as a coproduct, BE had the lowest value (3 × 10¹¹ sej·Yuan⁻¹) of EmPM (emergy per unit money profit); the EmPM value of CP was 18.6 times higher than that of BE. The four straw applications would also greatly reduce particulate emissions to air (by 57% to 98%). BE was judged to be the most environmentally friendly of the four straw applications. Imposing a carbon tax would encourage investment in BE but discourage CHP and CP.

  20. A Multitarget Land Use Change Simulation Model Based on Cellular Automata and Its Application

    Directory of Open Access Journals (Sweden)

    Jun Yang

    2014-01-01

    Full Text Available Based on an analysis of existing land use change simulation models, combining macro-level land use change driving factors with micro-level local land use competition, and applying the Python language to integrate technical approaches such as CA, GIS, AHP, and Markov chains, a multitarget land use change simulation model based on cellular automata (CA) is established. This model was applied to scenario simulation of land use/cover change in the Jinzhou New District, based on 1:10000-scale land use, planning, topography, statistics, and other data collected in 1988, 2003, and 2012. The simulation results indicate the following: (1) the model can simulate the mutual transformation of multiple land use types in a relatively satisfactory way; it treats the land use system as a whole and simultaneously takes into account land use demand at the macro level and land use suitability at the local scale; and (2) the simulation accuracy of the model reaches 72%, indicating high credibility. The model is capable of providing auxiliary decision-making support for coastal regions through analysis of the land use change driving mechanism, prediction of land use change tendencies, and establishment of sustainable land resource utilization policies.
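
The core CA mechanism, macro-level demand expressed through micro-level neighborhood competition, can be sketched as a single transition step: a cell converts to urban use when enough of its eight neighbors are urban and its local suitability passes a threshold. The grid, suitability surface, and thresholds below are invented for illustration; the paper's model handles multiple land use types and couples in AHP and Markov components.

```python
# Toy one-type cellular-automata land-use transition step. All values invented.
def ca_step(grid, suitability, k=3, threshold=0.5):
    """Convert a non-urban cell (0) to urban (1) when >= k of its 8
    neighbours are urban and its suitability >= threshold."""
    rows, cols = len(grid), len(grid[0])
    new = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1:
                continue  # already urban
            urban = sum(
                grid[rr][cc]
                for rr in range(max(0, r - 1), min(rows, r + 2))
                for cc in range(max(0, c - 1), min(cols, c + 2))
                if (rr, cc) != (r, c)
            )
            if urban >= k and suitability[r][c] >= threshold:
                new[r][c] = 1
    return new

grid = [[1, 1, 0],
        [1, 0, 0],
        [0, 0, 0]]
suit = [[0.9] * 3 for _ in range(3)]
print(ca_step(grid, suit))  # [[1, 1, 0], [1, 1, 0], [0, 0, 0]]
```

Iterating the step under different suitability surfaces is what makes scenario-based projection possible: each policy scenario becomes a different suitability layer.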

  1. A Stress Vector-Based Constitutive Model for Cohesionless Soil (Ⅱ): Application

    Institute of Scientific and Technical Information of China (English)

    史宏彦; 谢定义; 白琳

    2002-01-01

    The stress vector-based constitutive model for cohesionless soil, proposed by SHI Hong-yan et al., was applied to analyze the deformation behaviors of materials subjected to various stress paths. The analysis shows that the constitutive model captures well the main deformation behaviors of cohesionless soil, such as stress-strain nonlinearity, hardening, dilatancy, stress path dependency, non-coaxiality between the principal stress and principal strain increment directions, and the coupling of mean effective and deviatoric stress with deformation. In addition, the model can simultaneously take into account the rotation of the principal stress axes and the influence of the intermediate principal stress on the deformation and strength of soil. The excellent agreement between predicted and measured behavior indicates the broad applicability of the model.

  2. Multiple Linear Regression Model Based on Neural Network and Its Application in the MBR Simulation

    Directory of Open Access Journals (Sweden)

    Chunqing Li

    2012-01-01

    Full Text Available Computer simulation of the membrane bioreactor (MBR) has become a research focus of MBR studies. To compensate for drawbacks such as long test periods, high cost, and sealed equipment that cannot be observed directly, and on the basis of an in-depth study of the mathematical model of the MBR combined with neural network theory, this paper proposes a three-dimensional simulation system for MBR wastewater treatment with fast speed, high efficiency, and good visualization. The system is developed with hybrid programming in the VC++ programming language and OpenGL, with a multifactor linear regression model of the factors affecting MBR membrane flux based on a neural network, applying a modeling method that uses integers instead of floats and quadtree recursion. The experiments show that the three-dimensional simulation system, using the above models and methods, provides inspiration and a reference for future research and application of MBR simulation technology.
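
The multifactor linear regression at the heart of the flux model can be sketched with an ordinary least-squares fit. The two predictors (nominally transmembrane pressure and mixed-liquor concentration) and all observations below are invented for illustration; the paper additionally couples the regression with a neural network.

```python
import numpy as np

# Illustrative multifactor linear regression of membrane flux on two
# hypothetical predictors. All numbers are made up.
X = np.array([[1.0, 10.0, 3.0],    # column of ones = intercept term
              [1.0, 20.0, 3.5],
              [1.0, 30.0, 4.2],
              [1.0, 40.0, 4.4]])
y = np.array([5.5, 8.75, 12.1, 15.2])  # flux observations (synthetic)

# Ordinary least squares: minimise ||X @ coef - y||
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
predicted = X @ coef
print(np.round(predicted, 2))
```

Because the synthetic data are exactly linear in the predictors, the fitted model reproduces the observations; real flux data would leave residuals that the neural-network component is meant to capture.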

  3. Physiologically Based Toxicokinetic Models of Tebuconazole and Application in Human Risk Assessment

    DEFF Research Database (Denmark)

    Jonsdottir, Svava Osk; Reffstrup, Trine Klein; Petersen, Annette

    2016-01-01

    A series of physiologically based toxicokinetic (PBTK) models for tebuconazole were developed in four species, rat, rabbit, rhesus monkey, and human. The developed models were analyzed with respect to the application of the models in higher tier human risk assessment, and the prospect of using...... (ADME) of tebuconazole. The developed models were validated on in vivo half-life data for rabbit with good results, and on plasma and tissue concentration-time course data of tebuconazole after i.v. administration in rabbit. In most cases, the predicted concentration levels were seen to be within...... a factor of 2 compared to the experimental data, which is the threshold set for the use of PBTK simulation results in risk assessment. An exception to this was seen for one of the target organs, namely, the liver, for which tebuconazole concentration was significantly underestimated, a trend also seen...

  4. Generalized poroviscoelastic model based on effective Biot theory and its application to borehole guided wave analysis

    Science.gov (United States)

    Liu, Xu; Greenhalgh, Stewart; Zhou, Bing; Heinson, Graham

    2016-12-01

    A method using a modified attenuation factor function is suggested to determine the parameters of the generalized Zener model approximating the attenuation factor function. This method is applied to construct a poroviscoelastic model based on the effective Biot theory, which accounts for the attenuative solid frame of the reservoir. In the poroviscoelastic model, the frequency-dependent bulk and shear moduli of the solid frame are represented by generalized Zener models. As an application, the borehole logging dispersion equations from Biot theory are extended to include the effects of intrinsic body attenuation in the formation media over the full frequency range. The velocity dispersions of borehole guided waves are calculated to investigate the influence of attenuative bore fluid, the attenuative solid frame of the formation, and an impermeable bore wall.

  5. Web-based Services for Earth Observing and Model Data in National Applications and Hazards

    Science.gov (United States)

    Kafatos, M.; Boybeyi, Z.; Cervone, G.; di, L.; Sun, D.; Yang, C.; Yang, R.

    2005-12-01

    The ever-growing large volumes of Earth system science data, collected by Earth observing platforms, in situ stations, and as model output data, are increasingly being used by discipline scientists and by wider classes of users. In particular, applying Earth system science data to environmental and hazards applications, as well as to other national applications, requires tailored or specialized data together with web-based tools and infrastructure. The latter are driven by applications and usage drivers, which include ease of access, visualization of complex data, ease of producing value-added data, GIS and open source analysis usage, metadata, etc. Here we present different aspects of such web-based services and access, and discuss several applications in the hazards and environmental areas, including earthquake signatures and observations and model runs of hurricanes. Examples and lessons learned from the Mid-Atlantic Geospatial Information Consortium will be presented. We discuss a NASA-funded, open source on-line data analysis system that is being applied to climate studies for the ESIP Federation. Since being enhanced, this project and the next-generation Metadata Integrated Data Analysis System allow users not only to identify data but also to generate new data products on the fly. The functionalities extend from limited predefined functions to sophisticated functions described by general-purpose GrADS (Grid Analysis and Display System) commands. The Federation system also allows third-party data products to be combined with local data. Software components are available for converting the output from MIDAS (OPeNDAP) into OGC-compatible software. The on-going Grid efforts at CEOSR and LAITS in the School of Computational Sciences (SCS) include enhancing the functions of Globus to provide support for a geospatial system so the system can share computing power to handle problems with different peak access times and improve the stability and flexibility of a rapid

  6. Models and frameworks: a synergistic association for developing component-based applications.

    Science.gov (United States)

    Alonso, Diego; Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A; Álvarez, Bárbara

    2014-01-01

    The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability, either of designs or of component-based implementations. This paper, based on the model-driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications at a higher level of abstraction than objects with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies, using an existing framework and an ad hoc framework respectively, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development.

  7. Extended QoS modelling based on multi-application environment in network on chip

    Science.gov (United States)

    Saadaoui, Abdelkader; Nasri, Salem

    2015-01-01

    Until now there has been no standard method of quality of service (QoS) measurement, and few techniques have been used to provide its definition. Researchers are therefore looking for a projection of QoS onto a quantifiable space, since it is qualitative, subjective and not directly measurable, and only a few attempts have studied QoS parameter estimation. Many applications in network on chip (NoC) present variable QoS parameters such as packet loss rate (PLR), end-to-end delay (EED) and throughput (Thp), yet few papers have developed methods to model QoS in NoC, and their QoS representations do not provide a multi-application parameter arbiter. Independently of the approach used, an important challenge associated with QoS provision is the development of an efficient and flexible way to monitor QoS. The originality of our approach lies in proposing a QoS intellectual property module in the NoC architecture to improve network performance. We implement an extended approach to QoS metrics modelling for NoC in a multi-parameter, multi-application environment. The QoS metrics model is based on QoS parameters such as PLR, EED and Thp for different applications. To validate this work, a dynamic routing simulation of 4 × 4 mesh NoC behaviour under three different applications, namely transmission control protocol, variable bit rate and constant bit rate, is considered. To achieve ideal network behaviour, load balancing on NoC with multiple concurrent applications is improved using QoS metrics measurement based on dynamic routing. The results show that the extended QoS modelling approach is easy and cheap to implement in a hardware-software quantifiable representation. Thus, a quantifiable representation of QoS can be used to provide a NoC services arbiter, which interacts with the routers to ensure flit flow and uses QoS modelling to provide a QoS value.
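
The three QoS parameters named above have standard quantifiable definitions, which is exactly the "projection onto a quantifiable space" the abstract asks for. A minimal per-flow computation (counts and delays below are invented; units in clock cycles are an assumption):

```python
# Per-application QoS metrics for one NoC flow. All sample values invented.
def qos_metrics(sent, received, delays_cycles, elapsed_cycles):
    plr = 1.0 - received / sent                    # packet loss rate
    eed = sum(delays_cycles) / len(delays_cycles)  # mean end-to-end delay
    thp = received / elapsed_cycles                # throughput (packets/cycle)
    return plr, eed, thp

plr, eed, thp = qos_metrics(1000, 950, [12, 14, 10], 2000)
print(round(plr, 3), eed, thp)  # 0.05 12.0 0.475
```

A multi-application arbiter would compute these per flow and weight them per traffic class (e.g. delay-sensitive TCP vs. rate-sensitive CBR).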

  8. AN APPLICATION OF HYBRID CLUSTERING AND NEURAL BASED PREDICTION MODELLING FOR DELINEATION OF MANAGEMENT ZONES

    Directory of Open Access Journals (Sweden)

    Babankumar S. Bansod

    2011-02-01

    Full Text Available Starting from descriptive data on crop yield and various other properties, the aim of this study is to reveal trends in soil behaviour, such as crop yield. The study was carried out by developing a web application that uses the well-known technique of cluster analysis. The cluster analysis revealed linkages between soil classes within the same field as well as between different fields, which can be partly attributed to crop rotation and the determination of variable soil input rates. A hybrid clustering algorithm has been developed that combines the traits of two clustering technologies: (i) hierarchical clustering and (ii) k-means clustering. This hybrid clustering algorithm is applied to sensor-gathered data about soil and analysed, resulting in the formation of well-delineated management zones based on various soil properties, such as ECa and crop yield. One purpose of the study was to identify the main factors affecting crop yield, and the results obtained were validated against existing techniques. To accomplish this, geo-referenced soil information was examined, and on the basis of these data a statistical method was used to classify and characterize soil behaviour. This is done using a prediction model, developed to predict the unknown behaviour of clusters based on the known behaviour of other clusters. In predictive modeling, data are collected for the relevant predictors, a statistical model is formulated, predictions are made, and the model can be validated (or revised) as additional data become available. The model used in the web application was formed using a neural-network-based minimum Hamming distance criterion.
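
One common way to hybridize the two clustering families named above is to let agglomerative (hierarchical) merging choose the initial centroids and then let k-means refine them; this is a sketch of that general scheme under my own assumptions, not necessarily the paper's exact algorithm. The 1-D "ECa" readings are invented.

```python
# Hybrid clustering sketch: hierarchical phase seeds k-means. Data invented.
def hybrid_cluster(points, k, iters=10):
    # --- hierarchical phase: merge the closest pair until k clusters remain ---
    clusters = [[p] for p in points]
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                ci = sum(clusters[i]) / len(clusters[i])
                cj = sum(clusters[j]) / len(clusters[j])
                d = abs(ci - cj)  # centroid distance
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)
    centers = sorted(sum(c) / len(c) for c in clusters)
    # --- k-means phase: refine the hierarchical centroids ---
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k), key=lambda m: abs(p - centers[m]))
            groups[idx].append(p)
        centers = [sum(g) / len(g) if g else centers[m]
                   for m, g in enumerate(groups)]
    return sorted(centers)

eca = [0.9, 1.0, 1.1, 4.8, 5.0, 5.2, 9.9, 10.1]  # hypothetical ECa readings
print([round(c, 3) for c in hybrid_cluster(eca, 3)])  # [1.0, 5.0, 10.0]
```

The hierarchical seeding avoids k-means' sensitivity to random initialization, which matters when cluster boundaries become management-zone boundaries.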

  9. Criterion of applicable models for planar type Cherenkov laser based on quantum mechanical treatments

    Energy Technology Data Exchange (ETDEWEB)

    Yamada, Minoru [Faculty of Electrical and Computer Engineering, Institute of Science and Engineering Kanazawa University, Kakuma-machi, Kanazawa 920-1192 (Japan); Fares, Hesham, E-mail: fares_fares4@yahoo.com [Faculty of Electrical and Computer Engineering, Institute of Science and Engineering Kanazawa University, Kakuma-machi, Kanazawa 920-1192 (Japan); Department of Physics, Faculty of Science, Assiut University, Assiut 71516 (Egypt)

    2013-05-01

    A generalized theoretical analysis of the amplification mechanism in the planar-type Cherenkov laser is given. An electron is represented as a material wave having temporally and spatially varying phases with a finite spreading length. The interaction between the electrons and the electromagnetic (EM) wave is analyzed by taking the quantum statistical properties into account. The interaction mechanism is classified into the Velocity and Density Modulation (VDM) model and the Energy Level Transition (ELT) model based on the relation between the wavelength of the EM wave and the electron spreading length. The VDM model is applicable when the wavelength of the EM wave is longer than the electron spreading length, as in the microwave region. The dynamic equation of the electron, which is popularly used in classical Newtonian mechanics, has been derived from the quantum mechanical Schrödinger equation. The amplification of the EM wave can be explained on the basis of the bunching effect of the electron density in the electron beam. The amplification gain and its dispersion relation with respect to the electron velocity are given in this paper. On the other hand, the ELT model is applicable when the wavelength of the EM wave is shorter than the electron spreading length, as in the optical region. The dynamics of the electron are explained by electron transitions between different energy levels. The amplification gain and its dispersion relation with respect to the electron acceleration voltage were derived on the basis of the quantum mechanical density matrix.
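
The criterion in the abstract reduces to a single comparison: VDM applies when the EM wavelength exceeds the electron spreading length, ELT when it is shorter. A trivial encoding (the example lengths are illustrative orders of magnitude, not values from the paper):

```python
# Model-selection criterion from the abstract: compare EM wavelength with
# the electron spreading length. Example lengths are illustrative only.
def applicable_model(wavelength_m, spreading_length_m):
    return "VDM" if wavelength_m > spreading_length_m else "ELT"

print(applicable_model(1e-2, 1e-6))  # microwave regime -> VDM
print(applicable_model(5e-7, 1e-6))  # optical regime   -> ELT
```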

  10. Analysis of Applications to Improve the Energy Savings in Residential Buildings Based on Systemic Quality Model

    Directory of Open Access Journals (Sweden)

    Antoni Fonseca i Casas

    2016-10-01

    Full Text Available Defining the features and architecture of a new Energy Management Software (EMS) package is complex because different professionals are involved in creating that definition and in using the tool. To simplify this definition and aid in the eventual selection of an existing EMS to fit a specific need, a set of metrics that considers the primary issues and drawbacks of EMS is essential. This study proposes a set of metrics to evaluate and compare EMS applications. Using these metrics allows professionals to highlight trends, detect the drawbacks of current EMS applications, and eventually develop new EMS applications based on the results of the analysis. The study presents a list of the applications examined and describes the primary issues to be considered in the development of a new application. It follows the Systemic Quality Model (SQMO), which can be used both as a starting point to develop a new EMS and to select an existing EMS that fits the goals of a company. Using this type of analysis, we were able to detect the primary features desired in EMS software. These features are numerically scaled, allowing professionals to select the EMS most appropriate for their purposes and enabling the development of EMS using an iterative, user-centric approach. This methodology can guide the development of future EMS and define the priorities desired in this type of software.
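
Numerically scaled feature metrics of the kind described above are typically combined into a weighted score per candidate application. This sketch is a generic weighted-sum comparison; the feature names, scores, and weights are invented and are not SQMO's actual metric set.

```python
# Generic weighted scoring of EMS candidates. All names and numbers invented.
def ems_score(scores, weights):
    """Weighted mean of per-feature scores (higher is better)."""
    assert set(scores) == set(weights), "score and weight keys must match"
    total_w = sum(weights.values())
    return sum(scores[f] * weights[f] for f in scores) / total_w

weights = {"data_import": 3, "reporting": 2, "simulation": 5}
ems_a = {"data_import": 4, "reporting": 5, "simulation": 2}
print(round(ems_score(ems_a, weights), 2))  # 3.2
```

Scoring every candidate on the same scale is what makes the "select the most appropriate EMS" step reproducible rather than a matter of opinion.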

  11. A Reaction-Based River/Stream Water Quality Model: Reaction Network Decomposition and Model Application

    Directory of Open Access Journals (Sweden)

    Fan Zhang

    2012-01-01

    Full Text Available This paper describes details of an automatic matrix decomposition approach for a reaction-based stream water quality model. The method yields a set of equilibrium equations, a set of kinetic-variable transport equations involving kinetic reactions only, and a set of component transport equations involving no reactions. Partial decomposition of the system of water quality constituent transport equations is performed via Gauss-Jordan column reduction of the reaction network by pivoting on equilibrium reactions to decouple equilibrium and kinetic reactions. This approach minimizes the number of partial differential advective-dispersive transport equations and enables robust numerical integration. Complete matrix decomposition by further pivoting on linearly independent kinetic reactions allows some rate equations to be formulated individually and explicitly enforces conservation of component species when component transport equations are solved. The methodology is demonstrated for a case study involving eutrophication reactions in the Des Moines River in Iowa, USA and for two hypothetical examples to illustrate the ability of the model to simulate sediment and chemical transport with both mobile and immobile water phases and with complex reaction networks involving both kinetic and equilibrium reactions.
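
The pivoting idea above, Gauss-Jordan column reduction of the reaction network, pivoting on equilibrium-reaction columns first to decouple them from kinetic reactions, can be shown on a tiny stoichiometric matrix. The 3-species network below (equilibrium A ⇌ B, kinetic B → C) is invented for illustration and is far simpler than the paper's eutrophication network.

```python
from fractions import Fraction

# Gauss-Jordan column reduction of a stoichiometric matrix
# (rows = species, columns = reactions), pivoting on chosen columns first.
def gauss_jordan(matrix, pivot_cols):
    m = [[Fraction(x) for x in row] for row in matrix]
    pivot_row = 0
    for col in pivot_cols:
        # find a row (at or below pivot_row) with a nonzero entry in this column
        for r in range(pivot_row, len(m)):
            if m[r][col] != 0:
                m[pivot_row], m[r] = m[r], m[pivot_row]
                break
        else:
            continue  # column has no usable pivot
        piv = m[pivot_row][col]
        m[pivot_row] = [x / piv for x in m[pivot_row]]
        for r in range(len(m)):
            if r != pivot_row and m[r][col] != 0:
                factor = m[r][col]
                m[r] = [a - factor * b for a, b in zip(m[r], m[pivot_row])]
        pivot_row += 1
    return m

# species A, B, C; column 0 (equilibrium): A <-> B; column 1 (kinetic): B -> C
S = [[-1, 0],
     [1, -1],
     [0, 1]]
reduced = gauss_jordan(S, pivot_cols=[0])
print([[int(x) for x in row] for row in reduced])  # [[1, 0], [0, -1], [0, 1]]
```

After pivoting, the rows below the pivot carry no equilibrium-reaction term, which is the decoupling that lets kinetic-variable transport equations be written without equilibrium reactions.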

  12. Model-based geostatistics

    CERN Document Server

    Diggle, Peter J

    2007-01-01

    Model-based geostatistics refers to the application of general statistical principles of modeling and inference to geostatistical problems. This volume provides a treatment of model-based geostatistics and emphasizes on statistical methods and applications. It also features analyses of datasets from a range of scientific contexts.

  13. Nonlinear combined forecasting model based on fuzzy adaptive variable weight and its application

    Institute of Scientific and Technical Information of China (English)

    JIANG Ai-hua; MEI Chi; E Jia-qiang; SHI Zhang-ming

    2010-01-01

    In order to enhance the forecasting precision for nonlinear time series in complex industrial systems, a new nonlinear fuzzy adaptive variable-weight combined forecasting model was established using the concepts of relative error, the change tendency of the forecasted object, gray basic weight, and an adaptive control coefficient, on the basis of the fuzzy variable-weight method. On the Visual Basic 6.0 platform, a fuzzy adaptive variable-weight combined forecasting and management system was developed. The application results reveal that the forecasting precision of the new nonlinear combined model is higher than that of other single and combined forecasting models, and that the forecasting and management system is a very powerful tool for decision-making in complex industrial systems.
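
The essence of a variable-weight combination is that each component model's weight is recomputed from its recent performance rather than fixed. A minimal stand-in for the paper's fuzzy adaptive scheme (the inverse-error weighting rule and all numbers below are my own simplification):

```python
# Minimal variable-weight combined forecast: weights inversely proportional
# to each model's recent relative error. A simplification, not the paper's
# fuzzy adaptive scheme; numbers invented.
def combine(forecasts, recent_errors):
    inv = [1.0 / max(e, 1e-9) for e in recent_errors]  # guard against /0
    total = sum(inv)
    weights = [v / total for v in inv]
    return sum(w * f for w, f in zip(weights, forecasts))

# model A (2% recent error) forecasts 102, model B (8% error) forecasts 98
print(round(combine([102.0, 98.0], [0.02, 0.08]), 2))  # 101.2
```

The better-performing model dominates (weight 0.8 vs 0.2 here), which is the behavior a fixed-weight combination cannot reproduce as conditions drift.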

  14. Agent Based Fuzzy T-S Multi-Model System and Its Applications

    Directory of Open Access Journals (Sweden)

    Xiaopeng Zhao

    2015-11-01

    Full Text Available Based on the basic concepts of agents and the fuzzy T-S model, an agent-based fuzzy T-S multi-model (ABFT-SMM) system is proposed in this paper. Unlike the traditional method, the parameters and the membership value of each agent can be adjusted as the process evolves. In this system, each agent can be described by a dynamic equation, which can be seen as the local part of the multi-model, and it can execute its task alone or collaborate with other agents to accomplish a fixed goal. It is proved in this paper that the agent-based fuzzy T-S multi-model system can approximate any linear or nonlinear system to arbitrary accuracy. Applications to the benchmark problem of chaotic time series prediction, a water heater system, and a waste-heat utilization process illustrate the viability and efficiency of the approach. The method can also be readily applied to a number of engineering fields, including identification, nonlinear control, fault diagnostics, and performance analysis.
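
The T-S idea underlying each agent's "local part" is a membership-weighted blend of local linear models. This sketch shows the standard T-S output computation with Gaussian memberships; the rule parameters are invented and the agent-adjustment machinery of the paper is not modeled.

```python
import math

# Standard Takagi-Sugeno output: blend local linear models y_i = a_i*x + b_i
# by normalised membership values. Rule parameters invented.
def ts_output(x, rules):
    num = den = 0.0
    for center, width, a, b in rules:
        mu = math.exp(-((x - center) / width) ** 2)  # Gaussian membership
        num += mu * (a * x + b)
        den += mu
    return num / den

rules = [(0.0, 1.0, 1.0, 0.0),    # local model near x=0: y = x
         (5.0, 1.0, -1.0, 10.0)]  # local model near x=5: y = 10 - x
print(round(ts_output(0.0, rules), 3))  # 0.0  (first rule dominates)
print(round(ts_output(5.0, rules), 3))  # 5.0  (second rule dominates)
```

The universal-approximation claim in the abstract rests on exactly this structure: enough local linear pieces, blended smoothly, can match any continuous map to arbitrary accuracy.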

  15. Establishment of Winter Wheat Regional Simulation Model Based on Remote Sensing Data and Its Application

    Institute of Scientific and Technical Information of China (English)

    MA Yuping; WANG Shili; ZHANG Li; HOU Yingyu; ZHUANG Liwei; WANG Futang

    2006-01-01

    Accurate crop growth monitoring and yield forecasting are significant for food security and the sustainable development of agriculture. Crop yield estimation by remote sensing and crop growth simulation models has great potential for crop growth monitoring and yield forecasting, but the two approaches have limitations in mechanism and in regional application, respectively. Therefore, approaches and methodologies for combining remote sensing data with crop growth simulation models are of interest to many researchers. In this paper, WOFOST (World Food Study), adjusted and regionalized for North China, and SAIL-PROSPECT (Scattering by Arbitrarily Inclined Leaves coupled with a model of leaf optical PROperties SPECTra) were coupled through LAI to simulate the Soil Adjusted Vegetation Index (SAVI) of the crop canopy, by which the crop model was re-initialized by minimizing differences between simulated SAVI and SAVI synthesized from remote sensing data using optimization software (FSEOPT). Thus, a regional remote-sensing crop-simulation-framework model (WSPFRS) was established under the potential production level (optimal soil water conditions). The results were as follows: after re-initializing the regional emergence date using remote sensing data, the anthesis and maturity dates simulated by the WSPFRS model were closer to measured values than the results of WOFOST; by re-initializing the regional biomass weight at the turn-green stage, the spatial distribution of simulated storage organ weight was more consistent with measured yields, and the area with high values was nearly consistent with the actual high-yield area. This research is a basis for developing a regional crop model under the water-stress production level based on remote sensing data.

  16. Should big cities grow? Scenario-based cellular automata urban growth modeling and policy applications

    Directory of Open Access Journals (Sweden)

    ChengHe Guan

    2016-12-01

    Full Text Available The formation of 'urban networks' has become a widespread phenomenon around the world. In the study of metropolitan regions, there are competing or diverging views about the management and control of environmental and land-use factors, as well as about the scales and arrangements of settlements. These matters, alongside regulatory aspects, infrastructure applications, and resource allocations, are especially important in China because of population concentrations and the overlapping of urban areas with other land resources. On the other hand, the increasing sophistication of models operating on iterative computational power and widely available spatial information and analytical techniques makes it possible to simulate and investigate the spatial distribution of urban territories at a regional scale. This research applies a scenario-based cellular automata model to a case study of the Changjiang Delta Region, producing useful and predictive scenario-based projections within the region, using quantitative methods and baseline conditions that address issues of regional urban development. The contributions of the research include the improvement of computer simulation of urban growth, the application of urban form and other indices to evaluate complex urban conditions, and a heightened understanding of the performance of an urban network in the Changjiang Delta Region composed of big, medium, and small-sized cities and towns.

  17. Application of uncertainty reasoning based on cloud model in time series prediction

    Institute of Scientific and Technical Information of China (English)

    张锦春; 胡谷雨

    2003-01-01

    Time series prediction has been successfully used in several application areas, such as meteorological forecasting, market prediction, network traffic forecasting, etc., and a number of techniques have been developed for modeling and predicting time series. In the traditional exponential smoothing method, a fixed weight is assigned to data history, and the trend changes of time series are ignored. In this paper, an uncertainty reasoning method, based on cloud model, is employed in time series prediction, which uses cloud logic controller to adjust the smoothing coefficient of the simple exponential smoothing method dynamically to fit the current trend of the time series. The validity of this solution was proved by experiments on various data sets.
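
The idea of dynamically adjusting the smoothing coefficient can be illustrated with a minimal sketch; here a simple sign-of-error heuristic stands in for the paper's cloud logic controller, and the data and coefficient values are illustrative assumptions:

```python
def adaptive_ses(series, alpha_low=0.1, alpha_high=0.8):
    """Simple exponential smoothing whose coefficient is raised when the
    series is trending (consecutive forecast errors agree in sign) and
    lowered when it is stable. A crude stand-in for a cloud-logic
    controller that does this adjustment with uncertainty reasoning."""
    level = series[0]
    prev_err = 0.0
    forecasts = [level]
    for x in series[1:]:
        err = x - level
        # Same-sign consecutive errors suggest a trend: track faster.
        alpha = alpha_high if err * prev_err > 0 else alpha_low
        level = level + alpha * err
        prev_err = err
        forecasts.append(level)
    return forecasts

# A series that is flat, then starts trending upward.
data = [10, 10.2, 9.9, 10.1, 12, 14, 16, 18]
preds = adaptive_ses(data)
```

With a fixed small coefficient the forecast would lag the upward trend badly; the adaptive version switches to the high coefficient once errors align.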

  19. Physiologically based pharmacokinetic modeling using microsoft excel and visual basic for applications.

    Science.gov (United States)

    Marino, Dale J

    2005-01-01

    Physiologically based pharmacokinetic (PBPK) models are mathematical descriptions depicting the relationship between external exposure and internal dose. These models have found great utility for interspecies extrapolation. However, specialized computer software packages, which are not widely distributed, have typically been used for model development and utilization. A few physiological models have been reported using more widely available software packages (e.g., Microsoft Excel), but these tend to include less complex processes and dose metrics. To ascertain the capability of Microsoft Excel and Visual Basic for Applications (VBA) for PBPK modeling, models for styrene, vinyl chloride, and methylene chloride were coded in Advanced Continuous Simulation Language (ACSL), Excel, and VBA, and simulation results were compared. For styrene, differences between ACSL and Excel or VBA compartment concentrations and rates of change were less than +/-7.5E-10 using the same numerical integration technique and time step. Differences using the VBA fixed-step or ACSL Gear's methods were generally <1.00E-03, although larger differences involving very small values were noted after exposure transitions. For vinyl chloride and methylene chloride, Excel and VBA PBPK model dose metrics differed by no more than -0.013% or -0.23%, respectively, from ACSL results. These differences are likely attributable to different step sizes rather than different numerical integration techniques. These results indicate that Microsoft Excel and VBA can be useful tools for utilizing PBPK models, and given the availability of these software programs, it is hoped that this effort will help facilitate the use and investigation of PBPK modeling.
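
The step-size sensitivity the authors describe can be reproduced on the simplest possible compartment, dC/dt = -k*C, integrated with fixed-step Euler. The parameter values here are illustrative and not taken from the styrene or vinyl chloride models:

```python
import math

def euler_one_compartment(c0, k, t_end, dt):
    """Fixed-step Euler integration of dC/dt = -k*C, the simplest
    building block of a PBPK compartment model."""
    c, t = c0, 0.0
    while t < t_end - 1e-12:
        c += dt * (-k * c)
        t += dt
    return c

c0, k, t_end = 100.0, 0.5, 4.0          # illustrative values
exact = c0 * math.exp(-k * t_end)       # analytic solution
coarse = euler_one_compartment(c0, k, t_end, dt=0.5)
fine = euler_one_compartment(c0, k, t_end, dt=0.001)
```

The coarse step visibly underestimates the remaining concentration while the fine step tracks the analytic solution closely, mirroring the paper's attribution of cross-platform differences to step size rather than integration method.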

  20. Model-Based Speech Signal Coding Using Optimized Temporal Decomposition for Storage and Broadcasting Applications

    Directory of Open Access Journals (Sweden)

    Chandranath R. N. Athaudage

    2003-09-01

    Full Text Available A dynamic programming-based optimization strategy for a temporal decomposition (TD model of speech and its application to low-rate speech coding in storage and broadcasting is presented. In previous work with the spectral stability-based event localizing (SBEL TD algorithm, the event localization was performed based on a spectral stability criterion. Although this approach gave reasonably good results, there was no assurance on the optimality of the event locations. In the present work, we have optimized the event localizing task using a dynamic programming-based optimization strategy. Simulation results show that an improved TD model accuracy can be achieved. A methodology of incorporating the optimized TD algorithm within the standard MELP speech coder for the efficient compression of speech spectral information is also presented. The performance evaluation results revealed that the proposed speech coding scheme achieves 50%–60% compression of speech spectral information with negligible degradation in the decoded speech quality.

  1. Research on application of intelligent computation based LUCC model in urbanization process

    Science.gov (United States)

    Chen, Zemin

    2007-06-01

    Global change study is an interdisciplinary and comprehensive research activity with international cooperation, arising in the 1980s, with one of the largest scopes. The interaction between land use and cover change, as a research field at the crossing of natural science and social science, has become one of the core subjects of global change study as well as one of its research frontiers and hot points. It is necessary to develop research on land use and cover change in the urbanization process and build an analog model of urbanization to carry out description, simulation and analysis of dynamic behaviors in urban development change, as well as to understand basic characteristics and rules of the urbanization process. This has positive practical and theoretical significance for formulating urban and regional sustainable development strategies. The effect of urbanization on land use and cover change is mainly embodied in the change of the quantity structure and space structure of urban space, and the LUCC model in the urbanization process has been an important research subject of urban geography and urban planning. In this paper, based upon previous research achievements, the author systematically analyzes the research on land use/cover change in the urbanization process with the theories of complexity science research and intelligent computation; builds a model for simulating and forecasting the dynamic evolution of urban land use and cover change, on the basis of the cellular automaton model of complexity science research methods and multi-agent theory; expands the Markov model, traditional CA model and Agent model, introduces complexity science research theory and intelligent computation theory into the LUCC research model to build an intelligent computation-based LUCC model for analog research on land use and cover change in urbanization research, and performs case research. The concrete contents are as follows: 1. Complexity of LUCC research in urbanization process.
Analyze urbanization process in combination with the contents

  2. A Novel Application of Agent-based Modeling: Projecting Water Access and Availability Using a Coupled Hydrologic Agent-based Model in the Nzoia Basin, Kenya

    Science.gov (United States)

    Le, A.; Pricope, N. G.

    2015-12-01

    Projections indicate that increasing population density, food production, and urbanization in conjunction with changing climate conditions will place stress on water resource availability. As a result, a holistic understanding of current and future water resource distribution is necessary for creating strategies to identify the most sustainable means of accessing this resource. Currently, most water resource management strategies rely on the application of global climate predictions to physically based hydrologic models to understand potential changes in water availability. However, the need to focus on understanding community-level social behaviors that determine individual water usage is becoming increasingly evident, as predictions derived only from hydrologic models cannot accurately represent the coevolution of basin hydrology and human water and land usage. Models that are better equipped to represent the complexity and heterogeneity of human systems and satellite-derived products in place of or in conjunction with historic data significantly improve preexisting hydrologic model accuracy and application outcomes. We used a novel agent-based sociotechnical model that combines the Soil and Water Assessment Tool (SWAT) and Agent Analyst and applied it in the Nzoia Basin, an area in western Kenya that is becoming rapidly urbanized and industrialized. Informed by a combination of satellite-derived products and over 150 household surveys, the combined sociotechnical model provided unique insight into how populations self-organize and make decisions based on water availability. In addition, the model depicted how population organization and current management alter water availability currently and in the future.

  3. A Decision Model for Supplier Selection based on Business System Management and Safety Criteria and Application of the Model

    Directory of Open Access Journals (Sweden)

    Semih Coşkun

    2015-08-01

    Full Text Available In modern market conditions, sustainable and effective management of main manufacturers, suppliers and customer relationships is a necessity for competitiveness. Suppliers must satisfy customers’ expectations such as cost minimization, quality maximization, improved flexibility and met deadlines, which also requires systematic management of product, work and environmental safety. The supplier selection process is becoming ever more complicated as the number of suppliers and of supplier selection criteria increases. Supplier selection decisions, which play an important role in efficient supplier management, will be more consistent with the application of decision making models that integrate quantitative and qualitative evaluation factors. In this study, a dynamic process is designed and modeled for supplier selection. For this purpose, evaluation criteria are established according to the Balanced Scorecard perspectives, system sustainability and safety requirements. The Fuzzy Analytic Hierarchy Process method is used for evaluating the importance of supplier selection criteria. A utility range-based interactive group decision making method is used for the selection of the best supplier. In order to test the proposed model, a representative company from the airport operation sector is selected. Finally, it is revealed that the application of the proposed model generates consistent results for supplier selection decisions.
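
For orientation, the criteria-weighting step of a (crisp, non-fuzzy) Analytic Hierarchy Process can be sketched with the row geometric-mean approximation of the priority vector; the pairwise comparison matrix below is hypothetical and not the study's fuzzy AHP data:

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority vector via the row geometric-mean method.
    pairwise[i][j] states how much more important criterion i is than j."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical 3-criterion comparison: cost vs quality vs delivery.
matrix = [
    [1,     3,   5],   # cost moderately to strongly preferred
    [1 / 3, 1,   2],
    [1 / 5, 1 / 2, 1],
]
weights = ahp_weights(matrix)
```

Fuzzy AHP replaces the crisp ratios with fuzzy numbers to capture judgment uncertainty, but the weighting logic follows the same shape.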

  4. Kernel based model parametrization and adaptation with applications to battery management systems

    Science.gov (United States)

    Weng, Caihao

    With the widespread use of energy storage systems, battery state of health (SOH) monitoring has become one of the most crucial challenges in power and energy research, as SOH significantly affects the performance and life cycle of batteries as well as the systems they interact with. Identifying the SOH and adapting the battery energy/power management system accordingly are thus two important challenges for applications such as electric vehicles, smart buildings and hybrid power systems. This dissertation focuses on the identification of lithium ion battery capacity fading, and proposes an on-board implementable model parametrization and adaptation framework for SOH monitoring. Both parametric and non-parametric approaches that are based on kernel functions are explored for the modeling of battery charging data and aging signature extraction. A unified parametric open circuit voltage model is first developed to improve the accuracy of battery state estimation. Several analytical and numerical methods are then investigated for the non-parametric modeling of battery data, among which the support vector regression (SVR) algorithm is shown to be the most robust and consistent approach with respect to data sizes and ranges. For data collected on LiFePO4 cells, it is shown that the model developed with the SVR approach is able to predict the battery capacity fading with less than 2% error. Moreover, motivated by the initial success of applying kernel based modeling methods for battery SOH monitoring, this dissertation further exploits the parametric SVR representation for real-time battery characterization supported by test data. Through the study of the invariant properties of the support vectors, a kernel based model parametrization and adaptation framework is developed.
The high dimensional optimization problem in the learning algorithm could be reformulated as a parameter estimation problem, that can be solved by standard estimation algorithms such as the
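
As a rough illustration of non-parametric kernel modeling of capacity data, the sketch below uses Nadaraya-Watson kernel regression with an RBF kernel; the dissertation itself uses support vector regression, and the fade data and bandwidth here are invented for illustration:

```python
import math

def rbf_kernel(x1, x2, gamma=5.0):
    """Radial basis function (Gaussian) kernel."""
    return math.exp(-gamma * (x1 - x2) ** 2)

def kernel_regress(x_train, y_train, x, gamma=5.0):
    """Nadaraya-Watson kernel regression: predict as a kernel-weighted
    average of the training targets (a simple non-parametric stand-in
    for the SVR approach used in the dissertation)."""
    w = [rbf_kernel(xt, x, gamma) for xt in x_train]
    return sum(wi * yi for wi, yi in zip(w, y_train)) / sum(w)

# Hypothetical capacity-fade data: capacity (Ah) vs cycle count / 1000.
cycles = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
capacity = [2.00, 1.96, 1.91, 1.85, 1.78, 1.70]
pred = kernel_regress(cycles, capacity, 0.5)
```

SVR differs in that it fits sparse coefficients on support vectors rather than weighting every training point, which is what makes the support-vector invariance properties exploitable for on-board adaptation.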

  5. Modeling expectations in agent-based models: an application to central bank's communication and monetary policy

    NARCIS (Netherlands)

    Salle, I.L.

    2015-01-01

    Expectations play a major role in macroeconomic dynamics, especially regarding the conduct of monetary policy. Yet, modeling the interplay between communication, expectations and aggregate outcomes remains a challenging task, mainly because this requires deviation from the paradigm of rational expectations

  6. A web-based federated neuroinformatics model for surgical planning and clinical research applications in epilepsy.

    Science.gov (United States)

    Cao, Xinhua; Wong, Stephen T C; Hoo, Kent Soo; Tjandra, Donny; Fu, J C; Lowenstein, Daniel H

    2004-01-01

    There is an increasing need to efficiently share diverse clinical and image data among different clinics, labs, and departments of a medical center enterprise to facilitate better quality care and more effective clinical research. In this paper, we describe a web-based, federated information model as a viable technical solution with applications in medically refractory epilepsy and other neurological disorders. We describe four such online applications developed in a federated system prototype: surgical planning, image analysis, statistical data analysis, and dynamic extraction, transforming, and loading (ETL) of data from a heterogeneous collection of data sources into an epilepsy multimedia data warehouse (EMDW). The federated information system adopts a three-tiered architecture, consisting of a user-interface layer, an application logic layer, and a data service layer. We implemented two complementary federated information technologies, i.e., XML (eXtensible Markup Language) and CORBA (Common Object Request Broker Architecture), in the prototype to enable multimedia data exchange and brain image transmission. The preliminary results show that the federated prototype system provides a uniform interface, heterogeneous information integration and efficient data sharing for users in our institution who are concerned with the care of patients with epilepsy and who pursue research in this area.

  7. Modelling space-based integral-field spectrographs and their application to Type Ia supernova cosmology

    Science.gov (United States)

    Shukla, Hemant; Bonissent, Alain

    2017-04-01

    We present the parameterized simulation of an integral-field unit (IFU) slicer spectrograph and its applications in spectroscopic studies, namely, for probing dark energy with Type Ia supernovae. The simulation suite is called the fast-slicer IFU simulator (FISim). The data flow of FISim realistically models the optics of the IFU along with the propagation effects, including cosmological, zodiacal, instrumentation and detector effects. FISim simulates the spectrum extraction by computing the error matrix on the extracted spectrum. The applications for Type Ia supernova spectroscopy are used to establish the efficacy of the simulator in exploring the wider parametric space, in order to optimize the science and mission requirements. The input spectral models utilize observables such as the optical depth and velocity of the Si II absorption feature in the supernova spectrum as the measured parameters for various studies. Using FISim, we introduce a mechanism for preserving the complete state of a system, called the ∂p/∂f matrix, which allows for compression, reconstruction and spectrum extraction; we introduce a novel and efficient method for spectrum extraction, called super-optimal spectrum extraction; and we conduct various studies such as the optimal point spread function, optimal resolution, parameter estimation, etc. We demonstrate that for space-based telescopes, the optimal resolution lies in the region near R ∼ 117 for read noise of 1 e- and 7 e-, using a 400 km s-1 error threshold on the Si II velocity.

  8. Anisotropic Sheet Forming Simulations Based on the ALAMEL Model: Application on Cup Deep Drawing and Ironing

    Science.gov (United States)

    Eyckens, P.; Gawad, J.; Xie, Q.; Van Bael, A.; Roose, D.; Samaey, G.; Moerman, J.; Vegter, H.; Van Houtte, P.

    2011-08-01

    The grain interaction ALAMEL model [1] allows predicting the evolution of the crystallographic texture and the accompanying evolution in plastic anisotropy. A FE constitutive law, based on this multilevel model, is presented and assessed for a cup deep drawing process followed by an ironing process. A Numisheet2011 benchmark (BM-1) is used for the application. The FE material model makes use of the Facet plastic potential [2] for a relatively fast evaluation of the yield locus. A multi-scale approach [3] has been recently developed in order to adaptively update the constitutive law by accommodating it to the evolution of the crystallographic texture. The identification procedure of the Facet coefficients, which describe instantaneous plastic anisotropy, is accomplished through virtual testing by means of the ALAMEL model, as described in more detail in the accompanying conference paper [4]. Texture evolution during deformation is included explicitly by re-identification of Facet coefficients in the course of the FE simulation. The focus of this paper lies on the texture-induced anisotropy and the resulting earing profile during both stages of the forming process. For the considered AKDQ steel material, it is seen that texture evolution during deep drawing is such that the anisotropic plastic flow evolves towards a more isotropic flow in the course of deformation. Texture evolution only slightly influences the obtained cup height for this material. The ironing step enlarges the earing height.

  9. Generic vehicle speed models based on traffic simulation: Development and application

    Energy Technology Data Exchange (ETDEWEB)

    Margiotta, R.; Cohen, H.; Elkins, G.; Rathi, A.; Venigalla, M.

    1994-12-15

    This paper summarizes the findings of a research project to develop new methods of estimating speeds for inclusion in the Highway Performance Monitoring System (HPMS) Analytical Process. The paper focuses on the effects of traffic conditions excluding incidents (recurring congestion) on daily average speed and excess fuel consumption. A review of the literature revealed that many techniques have been used to predict speeds as a function of congestion, but most fail to address the effects of queuing. However, the method of Dowling and Skabardonis avoids this limitation and was adapted to the research. The methodology used the FRESIM and NETSIM microscopic traffic simulation models to develop uncongested speed functions and as a calibration base for the congested flow functions. The chief contributions of the new speed models are the simplicity of application and their explicit accounting for the effects of queuing. Specific enhancements include: (1) the inclusion of a queue discharge rate for freeways; (2) use of newly defined uncongested flow speed functions; (3) use of generic temporal distributions that account for peak spreading; and (4) a final model form that allows incorporation of other factors that influence speed, such as grades and curves. The main limitation of the new speed models is the fact that they are based on simulation results and not on field observations. They also do not account for the effect of incidents on speed. While appropriate for estimating average national conditions, the use of fixed temporal distributions may not be suitable for analyzing specific facilities, depending on observed traffic patterns. Finally, it is recommended that these and all future speed models be validated against field data where incidents can be adequately identified in the data.
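
The central idea of accounting explicitly for queuing can be sketched with a deterministic queuing diagram: when demand exceeds capacity over an analysis period, average delay grows with the excess. The functional form and parameter values below are a textbook-style simplification, not the HPMS speed functions developed in the paper:

```python
def average_speed(demand_vph, capacity_vph, free_speed_mph=60.0,
                  segment_mi=1.0, period_hr=1.0):
    """Deterministic queuing sketch: below capacity, travel is at
    free-flow speed; above capacity, a queue builds at rate
    (demand - capacity) and the average vehicle incurs half the
    end-of-period queuing delay."""
    base_time = segment_mi / free_speed_mph          # hours
    if demand_vph <= capacity_vph:
        return segment_mi / base_time                # no queue forms
    # Average delay per vehicle over the period for a queue growing
    # linearly at rate (demand - capacity), served at capacity.
    avg_delay = (demand_vph - capacity_vph) * period_hr / (2.0 * capacity_vph)
    return segment_mi / (base_time + avg_delay)

uncongested = average_speed(1000, 2000)   # demand well below capacity
congested = average_speed(2200, 2000)     # 10% over capacity
```

Speed-flow curves that ignore this queuing term behave unrealistically once demand passes capacity, which is the limitation the paper's models address.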

  10. Model-Based Evaluation Of System Scalability: Bandwidth Analysis For Smartphone-Based Biosensing Applications

    DEFF Research Database (Denmark)

    Patou, François; Madsen, Jan; Dimaki, Maria

    2016-01-01

    -engineering efforts for scaling a system specification efficaciously. We demonstrate the value of our methodology by investigating a smartphone-based biosensing instrumentation platform. Specifically, we carry out scalability analysis for the system’s bandwidth specification: the maximum analog voltage waveform...

  11. Genetic Modeling of GIS-Based Cell Clusters and Its Application in Mineral Resources Prediction

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    This paper presents a synthetic analysis method for multi-sourced geological data from geographic information systems (GIS). In previous practice in mineral resources prediction, the usual methodology has been statistical analysis of cells delimited based on the idea of random sampling. That might lead to insufficient utilization of local spatial information, for a cell is treated as a point without internal structure. We now take "cell clusters", i.e., spatial associations of cells, as the basic units of statistics; thus the spatial configuration information of geological variables is easier to detect and utilize, and the accuracy and reliability of prediction are improved. We build a linear multi-discriminating model for the clusters via a genetic algorithm. Both the right-judgment rates and the in-class vs. between-class distance ratios are considered to form the evolutionary adaptive values of the population. An application of the method in gold mineral resources prediction in east Xinjiang, China is presented.

  12. Modeling dependence based on mixture copulas and its application in risk management

    Institute of Scientific and Technical Information of China (English)

    OUYANG Zi-sheng; LIAO Hui; YANG Xiang-qun

    2009-01-01

    This paper is concerned with the statistical modeling of the dependence structure of multivariate financial data using the copula, and the application of copula functions in VaR valuation. After the introduction of the pure copula method and the maximum and minimum mixture copula method, the authors present a new algorithm based on more generalized mixture copula functions and the dependence measure, and apply the method to a portfolio of the Shanghai stock composite index and the Shenzhen stock component index. Comparing the results from the various methods, one can find that the mixture copula method is better than the pure Gaussian copula method and the maximum and minimum mixture copula method at different VaR levels.
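
The VaR-valuation step can be illustrated with a plain Monte Carlo sketch; a two-regime Gaussian mixture of losses stands in here for the paper's mixture-copula dependence structure, and all parameters are invented:

```python
import random

def empirical_var(losses, level=0.99):
    """Value-at-Risk as the empirical quantile of simulated losses."""
    ordered = sorted(losses)
    idx = min(len(ordered) - 1, int(level * len(ordered)))
    return ordered[idx]

rng = random.Random(7)
# Hypothetical loss model: a calm regime most days, a volatile regime
# about 10% of the time (a crude stand-in for the heavier tails that
# mixture copulas capture in portfolio returns).
losses = [rng.gauss(0, 1) if rng.random() < 0.9 else rng.gauss(0, 4)
          for _ in range(100_000)]
var_99 = empirical_var(losses, 0.99)
```

A single Gaussian fitted to the same data would report a materially lower 99% VaR, which is the kind of underestimation the mixture approach is meant to correct.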

  13. Physiologically-based toxicokinetic modeling of zearalenone and its metabolites: application to the Jersey girl study.

    Directory of Open Access Journals (Sweden)

    Dwaipayan Mukherjee

    Full Text Available Zearalenone (ZEA, a fungal mycotoxin, and its metabolite zeranol (ZAL are known estrogen agonists in mammals, and are found as contaminants in food. Zeranol, which is more potent than ZEA and comparable in potency to estradiol, is also added as a growth additive in beef in the US and Canada. This article presents the development and application of a Physiologically-Based Toxicokinetic (PBTK model for ZEA and ZAL and their primary metabolites, zearalenol, zearalanone, and their conjugated glucuronides, for rats and for human subjects. The PBTK modeling study explicitly simulates critical metabolic pathways in the gastrointestinal and hepatic systems. Metabolic events such as dehydrogenation and glucuronidation of the chemicals, which have direct effects on the accumulation and elimination of the toxic compounds, have been quantified. The PBTK model considers urinary and fecal excretion and biliary recirculation and compares the predicted biomarkers of blood, urinary and fecal concentrations with published in vivo measurements in rats and human subjects. Additionally, the toxicokinetic model has been coupled with a novel probabilistic dietary exposure model and applied to the Jersey Girl Study (JGS, which involved measurement of mycoestrogens as urinary biomarkers, in a cohort of young girls in New Jersey, USA. A probabilistic exposure characterization for the study population has been conducted and the predicted urinary concentrations have been compared to measurements considering inter-individual physiological and dietary variability. The in vivo measurements from the JGS fall within the high and low predicted distributions of biomarker values corresponding to dietary exposure estimates calculated by the probabilistic modeling system. The work described here is the first of its kind to present a comprehensive framework developing estimates of potential exposures to mycotoxins and linking them with biologically relevant doses and biomarker

  14. Development, fabrication, and modeling of highly sensitive conjugated polymer based piezoresistive sensors in electronic skin applications

    Science.gov (United States)

    Khalili, Nazanin; Naguib, Hani E.; Kwon, Roy H.

    2016-04-01

    Human intervention can be replaced by tools built on sensing devices, which possess a wide range of applications including humanoid robots and remote, minimally invasive surgery. Similar to the five human senses, sensors interface with their surroundings to stimulate a suitable response or action. The sense of touch, which arises in human skin, is among the most challenging senses to emulate due to its ultra-high sensitivity. This has brought forth novel challenging issues to consider in the field of biomimetic robotics. In this work, using a multiphase reaction, a polypyrrole (PPy) based hydrogel is developed as a resistive-type pressure sensor with an intrinsically elastic microstructure stemming from three-dimensional hollow spheres. Furthermore, a semi-analytical constriction resistance model is developed, accounting for the real contact area between the PPy hydrogel sensor and the electrode along with the dependency of the contact resistance change on the applied load. The model is then solved using a Monte Carlo technique and the sensitivity of the sensor is obtained. The experimental results showed the good tracking ability of the proposed model.

  15. An MDA-based approach for behaviour modelling of context-aware mobile applications

    NARCIS (Netherlands)

    Daniele, Laura M.; Ferreira Pires, Luis; Sinderen, van Marten

    2009-01-01

    Most reported MDA approaches give much attention to structural aspects in PSMs and in generated code, and less attention to the PIM level and the behaviour of the modelled applications. Consequently, application behaviour is generally not (well) defined at the PIM level. This paper presents an MDA-based approach

  16. Central Puget Sound Ecopath/Ecosim model biological parameters - Developing food web models for ecosystem-based management applications in Puget Sound

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This project is developing food web models for ecosystem-based management applications in Puget Sound. It is primarily being done by NMFS FTEs and contractors, in...

  17. Central Puget Sound Ecopath/Ecosim model outputs - Developing food web models for ecosystem-based management applications in Puget Sound

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This project is developing food web models for ecosystem-based management applications in Puget Sound. It is primarily being done by NMFS FTEs and contractors, in...

  18. Novel Component-Based Development Model for SIP-Based Mobile Application (1202)

    CERN Document Server

    Barnawi, Ahmed; Qureshi, M Rizwan Jameel; Khan, Asif Irshad

    2012-01-01

    Universities and institutions these days deal with issues related to the assessment of large numbers of students. Various evaluation methods have been adopted by examiners in different institutions to examine the ability of an individual, ranging from manual means using paper and pencil to electronic, from oral to written, and from practical to theoretical. There is a need to expedite the process of examination in order to meet the increasing enrolment of students at universities and institutes. The SIP Based Mass Mobile Examination System (SiBMMES) expedites the examination process by automating various activities in an examination, such as exam paper setting, scheduling and allocating examination time, and evaluation (auto-grading for objective questions). SiBMMES uses the IP Multimedia Subsystem (IMS), an IP communications framework providing an environment for the rapid development of innovative and reusable services. Session Initiation Protocol (SIP) is a signalling (request-response)...

  20. The applications of model-based geostatistics in helminth epidemiology and control.

    Science.gov (United States)

    Magalhães, Ricardo J Soares; Clements, Archie C A; Patil, Anand P; Gething, Peter W; Brooker, Simon

    2011-01-01

    Funding agencies are dedicating substantial resources to tackle helminth infections. Reliable maps of the distribution of helminth infection can assist these efforts by targeting control resources to areas of greatest need. The ability to define the distribution of infection at regional, national and subnational levels has been enhanced greatly by the increased availability of good quality survey data and the use of model-based geostatistics (MBG), enabling spatial prediction in unsampled locations. A major advantage of MBG risk mapping approaches is that they provide a flexible statistical platform for handling and representing different sources of uncertainty, providing plausible and robust information on the spatial distribution of infections to inform the design and implementation of control programmes. Focussing on schistosomiasis and soil-transmitted helminthiasis, with additional examples for lymphatic filariasis and onchocerciasis, we review the progress made to date with the application of MBG tools in large-scale, real-world control programmes and propose a general framework for their application to inform integrative spatial planning of helminth disease control programmes.

  1. Finite mathematics models and applications

    CERN Document Server

    Morris, Carla C

    2015-01-01

Features step-by-step examples based on actual data and connects fundamental mathematical modeling skills and decision-making concepts to everyday applicability. Featuring key linear programming, matrix, and probability concepts, Finite Mathematics: Models and Applications emphasizes cross-disciplinary applications that relate mathematics to everyday life. The book provides a unique combination of practical mathematical applications illustrating the wide use of mathematics in fields ranging from business, economics, finance, management, and operations research to the life and social sciences.

  2. Multi-Agent Application System Model Based on UML%UML与多Agent应用系统建模

    Institute of Scientific and Technical Information of China (English)

    孙华志

    2003-01-01

In order to guarantee quality and to raise the reliability and maintainability of the system, support is needed for designing Agent-based software systems. In view of the consistency of the Agent concept with that of the Object, we analyse the modelling approach of UML in this paper. The paper makes a helpful attempt to build a multi-Agent application system model based on UML, covering descriptions of both static structure and dynamic behaviour. It also lists the major steps and methods of system modelling based on extending UML.

  3. Enhancement Factors in Ozone Absorption Based on the Surface Renewal Model and its Application

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

Based on the Danckwerts surface renewal model, a simple explicit expression of the enhancement factor in ozone absorption with a first order ozone self-decomposition and parallel second order ozonation reactions has been derived. The results are compared with our previous work based on the film theory. The 2,4-dichlorophenol destruction rate by ozonation is predicted using the enhancement factor model in this paper.

  4. A novel transport based model for wire media and its application to scattering problems

    Science.gov (United States)

    Forati, Ebrahim

Artificially engineered materials, known as metamaterials, have attracted the interest of researchers because of the potential for novel applications. Effective modeling of metamaterials is a crucial step for analyzing and synthesizing devices. In this thesis, we focus on wire media (both isotropic and uniaxial) and validate a novel transport-based model for them. Scattering problems involving wire media are computationally intensive due to the spatially dispersive nature of homogenized wire media. However, it will be shown that using the new model to solve scattering problems can simplify the calculations a great deal. For scattering problems, an integro-differential equation based on a transport formulation is proposed instead of the convolution-form integral equation that follows directly from spatial dispersion. The integro-differential equation is much faster to solve than the convolution equation form, and its effectiveness is confirmed by solving several examples in one, two, and three dimensions. Both the integro-differential equation formulation and the homogenized wire medium parameters are experimentally confirmed. To do so, several isotropic connected wire medium spheres have been fabricated using a rapid-prototyping machine, and their measured extinction cross sections are compared with simulation results. Wire parameters (period and diameter) are varied to the point where homogenization theory breaks down, which is observed in the measurements. The same process is done for three-dimensional cubical objects made of a uniaxial wire medium, and their measured results are compared with the numerical results based on the new model. The new method is extremely fast compared to brute-force numerical methods such as FDTD, and provides more physical insight (within the limits of homogenization), including the idea of a Debye length for wire media. The limits of homogenization are examined by comparing homogenization results and measurements. Then, a novel...

  5. APPLICATION OF ARCHITECTURE-BASED NEURAL NETWORKS IN MODELING AND PARAMETER OPTIMIZATION OF HYDRAULIC BUMPER

    Institute of Scientific and Technical Information of China (English)

    Yang Haiwei; Zhan Yongqi; Qiao Junwei; Shi Guanglin

    2003-01-01

The dynamic working process of the 52SFZ-140-207B type of hydraulic bumper is analyzed. A modeling method using architecture-based neural networks is introduced. Using this modeling method, the dynamic model of the hydraulic bumper is established. Based on this model, the structural parameters of the hydraulic bumper are optimized with a genetic algorithm. The result shows that the performance of the dynamic model is close to that of the hydraulic bumper, and that the dynamic performance of the hydraulic bumper is improved through parameter optimization.

  6. Gradient-based Kriging approximate model and its application research to optimization design

    Institute of Scientific and Technical Information of China (English)

    XUAN Ying; XIANG JunHua; ZHANG WeiHua; ZHANG YuLin

    2009-01-01

In the process of multidisciplinary design optimization, there exists a computational complexity problem due to frequent calls to high-fidelity system analysis models. The high-fidelity system analysis models can be surrogated by approximate models. Sensitivity analysis and numerical noise filtering can be done easily by coupling approximate models to optimization. Approximate models can reduce the number of executions of the problem's simulation code during optimization, so the solution efficiency of the multidisciplinary design optimization problem can be improved. Most optimization methods are gradient-based, and the gradients of the objective and constraint functions are easily obtained. The gradient-based Kriging (GBK) approximate model can be constructed by using system response values and their gradients. The gradients can greatly improve the prediction precision of the system response. A hybrid optimization method is constructed by coupling GBK approximate models to gradient-based optimization methods. An aircraft aerodynamic shape optimization design example indicates that the methods of this paper achieve good feasibility and validity.
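The surrogate idea in this abstract can be illustrated with a minimal simple-Kriging-style predictor (a sketch only: the Gaussian kernel, length-scale, jitter and sample data below are illustrative, and the gradient-enhanced terms of the GBK model are omitted):

```python
import numpy as np

def kernel(a, b, length=0.2):
    """Gaussian (squared-exponential) correlation between 1-D point sets."""
    d = a[:, None] - b[None, :]
    return np.exp(-(d / length) ** 2)

def kriging_predict(x_train, y_train, x_new, length=0.2):
    """Predict y at x_new as a kernel-weighted combination of the samples."""
    K = kernel(x_train, x_train, length) + 1e-8 * np.eye(len(x_train))
    k = kernel(x_train, x_new, length)
    weights = np.linalg.solve(K, k)        # (n_train, n_new) weight matrix
    return weights.T @ y_train

# Surrogate of an "expensive" response; a sine curve stands in for a
# high-fidelity simulation output.
x = np.linspace(0.0, 1.0, 8)
y = np.sin(2 * np.pi * x)
print(kriging_predict(x, y, np.array([0.25])))   # close to sin(pi/2) = 1
```

In a GBK model the observation vector would additionally contain gradient samples, which tightens the fit without extra simulation runs.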

  7. Proceedings First Workshop on Applications of Membrane computing, Concurrency and Agent-based modelling in POPulation biology

    CERN Document Server

    Milazzo, Paolo; 10.4204/EPTCS.33

    2010-01-01

    This volume contains the papers presented at the first International Workshop on Applications of Membrane Computing, Concurrency and Agent-based Modelling in Population Biology (AMCA-POP 2010) held in Jena, Germany on August 25th, 2010 as a satellite event of the 11th Conference on Membrane Computing (CMC11). The aim of the workshop is to investigate whether formal modelling and analysis techniques could be applied with profit to systems of interest for population biology and ecology. The considered modelling notations include membrane systems, Petri nets, agent-based notations, process calculi, automata-based notations, rewriting systems and cellular automata. Such notations enable the application of analysis techniques such as simulation, model checking, abstract interpretation and type systems to study systems of interest in disciplines such as population biology, ecosystem science, epidemiology, genetics, sustainability science, evolution and other disciplines in which population dynamics and interactions...

  8. Methodology and applications in non-linear model-based geostatistics

    DEFF Research Database (Denmark)

    Christensen, Ole Fredslund

Today geostatistics is used in a number of research areas, among others agricultural and environmental sciences. This thesis concerns data and applications where the classical Gaussian spatial model is not appropriate. A transformation could be used in an attempt to obtain data that are approximately...

  9. Security Model for Microsoft Based Mobile Sales Management Application in Private Cloud Computing

    OpenAIRE

    Kuan Chee Houng; Bharanidharan Shanmugam; Ganthan Narayana Samy; Sameer Hasan Albakri; Azuan Ahmad

    2013-01-01

The Microsoft-based mobile sales management application is a sales force management application currently running on Windows Mobile 6.5. It handles sales-related activity and cuts down the administrative tasks of sales representatives. Microsoft then launched a new mobile operating system, Windows Phone, and stopped providing support for Windows Mobile. This has become an obstacle for Windows Mobile development. Over time, Windows Mobile will be eliminated from the market due to lack of support...

  10. Development and application of a model for analysis and design phases of Web-based system development

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

Despite the short history of Web development, Web-related technologies are rapidly developing. However, Web application quality is improving slowly, which calls for efficient methods for developing Web systems. This study presents a model for the analysis and design phases of Web-based software development, based on the ISO/IEC 12207 standard. It describes the methods used to define processes and entities in order to reflect the contents of Web applications. It applies the Web-Road Map methodology of KCC Information and Technology, using this model, to a public project. As a result, Web-Road Map is shown to be an efficient model for analyzing and designing Web applications.

  11. Tensor Product Model Transformation-based Controller Design for Gantry Crane Control System – An Application Approach

    Directory of Open Access Journals (Sweden)

    Fetah Kolonic

    2006-10-01

The Tensor Product (TP) model transformation is a recently proposed technique for transforming given Linear Parameter Varying (LPV) state-space models into polytopic model form, namely, into a parameter-varying convex combination of Linear Time Invariant (LTI) systems. The main advantage of the TP model transformation is that it is executable in a few minutes, and Linear Matrix Inequality (LMI)-based control design frameworks can immediately be applied to the resulting polytopic models to yield controllers with tractable and guaranteed performance. Various applications of TP model transformation-based design were studied via complex academic and benchmark problems, but no study based on a real experimental environment has been published. Thus, the main objective of this paper is to study how the TP model transformation performs in a real-world problem and control setup. A laboratory concept for TP model-based controller design, simulation and real-time execution on an electromechanical system is presented. The development system for the TP model-based controller, with one hardware/software platform, and the target system, with real-time hardware/software support, are connected in a unified system. The proposed system is based on the microprocessor of a personal computer (PC) for simulation and software development as well as for real-time control. The control algorithm, designed and simulated in the MATLAB/SIMULINK environment, uses a graphically oriented software interface for real-time code generation. Some specific conflicting industrial tasks in a real industrial crane application, such as fast load positioning control and load swing angle minimization, are considered and compared with other controller types.
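The polytopic form that the TP transformation produces can be sketched numerically (the two vertex systems and the linear weighting function below are invented for illustration; a real TP model would obtain both from the given LPV model):

```python
import numpy as np

# A polytopic LPV model: A(p) = sum_i w_i(p) * A_i, with w_i(p) >= 0 and
# sum_i w_i(p) = 1, so A(p) is a convex combination of LTI vertex systems.
A1 = np.array([[0.0, 1.0], [-1.0, -0.5]])   # illustrative vertex LTI system
A2 = np.array([[0.0, 1.0], [-2.0, -1.0]])   # illustrative vertex LTI system

def weights(p):
    """Linear interpolation weights for a scalar parameter p in [0, 1]."""
    return np.array([1.0 - p, p])

def A_of_p(p):
    """Evaluate the parameter-dependent system matrix A(p)."""
    w = weights(p)
    return w[0] * A1 + w[1] * A2

print(A_of_p(0.0))   # equals A1 at the first vertex
print(A_of_p(0.5))   # midpoint convex combination of A1 and A2
```

LMI-based design then synthesises one feedback gain per vertex and blends them with the same weights at run time.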

  12. Inverse Modeling of Human Knee Joint Based on Geometry and Vision Systems for Exoskeleton Applications

    Directory of Open Access Journals (Sweden)

    Eduardo Piña-Martínez

    2015-01-01

Current trends in robotics aim to close the gap that separates technology and humans, bringing novel robotic devices to improve human performance. Although robotic exoskeletons represent a breakthrough in mobility enhancement, there are design challenges related to the forces exerted on the users' joints, which can result in severe injuries. This occurs because most current developments consider the joints as invariant rotational axes. This paper proposes the use of commercial vision systems to perform biomimetic joint design for robotic exoskeletons. This work proposes a kinematic model based on irregularly shaped cams as the joint mechanism that emulates the bone-to-bone joints in the human body. The paper follows a geometric approach for determining the location of the instantaneous center of rotation in order to design the cam contours. Furthermore, the use of a commercial vision system is proposed as the main measurement tool due to its noninvasive nature and because it allows subjects under measurement to move freely. The application of this method resulted in relevant information about the displacements of the instantaneous center of rotation at the human knee joint.
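The geometric idea of locating an instantaneous center of rotation from tracked markers can be sketched as follows (synthetic marker data; the paper's cam-contour design step is not reproduced). Each marker's displacement between two frames is a chord of a circular arc, so its perpendicular bisector passes through the rotation center; intersecting two bisectors recovers that center:

```python
import numpy as np

def perp_bisector(p0, p1):
    """Return (point, direction) of the perpendicular bisector of p0->p1."""
    mid = (p0 + p1) / 2.0
    d = p1 - p0
    return mid, np.array([-d[1], d[0]])   # displacement rotated by 90 deg

def icr(a0, a1, b0, b1):
    """Intersect the two bisectors: solve mid_a + t*u_a = mid_b + s*u_b."""
    m_a, u_a = perp_bisector(a0, a1)
    m_b, u_b = perp_bisector(b0, b1)
    A = np.column_stack([u_a, -u_b])
    t, _ = np.linalg.solve(A, m_b - m_a)
    return m_a + t * u_a

# Synthetic check: rotate two markers 10 degrees about the point (1.0, 2.0).
c = np.array([1.0, 2.0])
th = np.radians(10.0)
R = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
a0, b0 = np.array([3.0, 2.0]), np.array([1.0, 4.0])
a1, b1 = c + R @ (a0 - c), c + R @ (b0 - c)
print(icr(a0, a1, b0, b1))   # recovers the centre (1.0, 2.0)
```

With real vision data, repeating this over successive frame pairs traces the displacement of the center across the flexion range.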

  13. Convergence rates for rank-based models with applications to portfolio theory

    CERN Document Server

    Ichiba, Tomoyuki; Shkolnikov, Mykhaylo

    2011-01-01

    We determine rates of convergence of rank-based interacting diffusions and semimartingale reflecting Brownian motions to equilibrium. Convergence rate for the total variation metric is derived using Lyapunov functions. Sharp fluctuations of additive functionals are obtained using Transportation Cost-Information inequalities for Markov processes. We work out various applications to the rank-based abstract equity markets used in Stochastic Portfolio Theory. For example, we produce quantitative bounds, including constants, for fluctuations of market weights and occupation times of various ranks for individual coordinates. Another important application is the comparison of performance between symmetric functionally generated portfolios and the market portfolio. This produces estimates of probabilities of "beating the market".

  14. Space vector-based modeling and control of a modular multilevel converter in HVDC applications

    DEFF Research Database (Denmark)

    Bonavoglia, M.; Casadei, G.; Zarri, L.;

    2013-01-01

    Modular multilevel converter (MMC) is an emerging multilevel topology for high-voltage applications that has been developed in recent years. In this paper, the modeling and the control of MMCs are restated in terms of space vectors, which may allow a deeper understanding of the converter behavior...

  15. Recent approaches to quadrupole collectivity: models, solutions and applications based on the Bohr hamiltonian

    Science.gov (United States)

    Buganu, Petricǎ; Fortunato, Lorenzo

    2016-09-01

    We review and discuss several recent approaches to quadrupole collectivity and developments of collective models and their solutions with many applications, examples and references. We focus in particular on analytic and approximate solutions of the Bohr hamiltonian of the last decade, because most of the previously published material has been already reviewed in other publications.

  16. Supporting Seamful Development of Positioning Applications through Model Based Translucent Middleware

    DEFF Research Database (Denmark)

    Jensen, Jakob Langdal

Positioning technologies are becoming ever more pervasive, and they are used for a growing number of applications in a broad range of fields. We aim to support software developers who create position-based applications; more specifically, such support can be provided through the use of specialized... Design for context-aware applications in general advocates that seams (problem areas caused by interconnecting technologies) can be exploited by end-users if they are made available to them. A system allowing this kind of interaction is said to be seamfully designed, as opposed to the traditional goal of ubiquitous computing, where seamlessness is advocated. Another challenge is to provide middleware designers with a set of tools that allow them to build translucent middleware, i.e., middleware where the level of openness can be differentiated. Such middleware should provide application developers...

  17. Development and application of compact models of packages based on DELPHI methodology

    CERN Document Server

    Parry, J; Shidore, S

    1997-01-01

The accurate prediction of the temperatures of critical electronic parts at the package-, board- and system-level is seriously hampered by the lack of reliable, standardised input data for the characterisation of the thermal behaviour of these parts. The recently completed collaborative European project DELPHI has been concerned with the creation and experimental validation of thermal models (both detailed and compact) of a range of electronic parts, including mono-chip packages. This paper demonstrates the reliable performance of thermal compact models in a range of applications, by comparison with the detailed models from which they were derived. (31 refs).

  18. Reliability Modeling Development and Its Applications for Ceramic Capacitors with Base-Metal Electrodes (BMEs)

    Science.gov (United States)

    Liu, Donhang

    2014-01-01

    This presentation includes a summary of NEPP-funded deliverables for the Base-Metal Electrodes (BMEs) capacitor task, development of a general reliability model for BME capacitors, and a summary and future work.

  19. Engine Modelling for Control Applications

    DEFF Research Database (Denmark)

    Hendricks, Elbert

    1997-01-01

In earlier work published by the author and co-authors, a dynamic engine model called a Mean Value Engine Model (MVEM) was developed. This model is physically based and is intended mainly for control applications. In its newer form, it is easy to fit to many different engines and requires little engine data for this purpose. It is especially well suited to embedded model applications in engine controllers, such as nonlinear observer-based air/fuel ratio and advanced idle speed control. After a brief review of this model, it will be compared with other similar models which can be found...

  20. Modeller subjectivity and calibration impacts on hydrological model applications: an event-based comparison for a road-adjacent catchment in south-east Norway.

    Science.gov (United States)

    Kalantari, Zahra; Lyon, Steve W; Jansson, Per-Erik; Stolte, Jannes; French, Helen K; Folkeson, Lennart; Sassner, Mona

    2015-01-01

Identifying a 'best' performing hydrologic model in a practical sense is difficult due to the potential influences of modeller subjectivity on, for example, calibration procedure and parameter selection. This is especially true for model applications at the event scale, where the prevailing catchment conditions can have a strong impact on apparent model performance and suitability. In this study, two lumped models (CoupModel and HBV) and two physically-based distributed models (LISEM and MIKE SHE) were applied to a small catchment upstream of a road in south-eastern Norway. All models were calibrated to a single event representing typical winter conditions in the region and then applied to various other winter events to investigate the potential impact of calibration period and methodology on model performance. Peak flow and event-based hydrographs were simulated differently by all models, leading to differences in apparent model performance under this application. In this case study, the lumped models appeared to be better suited for hydrological events that differed from the calibration event (i.e., events when runoff was generated from rain on non-frozen soils rather than from rain and snowmelt on frozen soil), while the more physically-based approaches appeared better suited during snowmelt and frozen soil conditions more consistent with the event-specific calibration. This was due to the combination of variations in subsurface conditions over the eight events considered, the subsequent ability of the models to represent the impact of the conditions (particularly when subsurface conditions varied greatly from the calibration event), and the different approaches adopted to calibrate the models. These results indicate that hydrologic models may not only need to be selected on a case-by-case basis but also have their performance evaluated on an application-by-application basis, since how a model is applied can be equally important as inherent model structure.

  1. Enhancement Factors in Ozone Absorption Based on the Surface Renewal Model and its Application

    Institute of Scientific and Technical Information of China (English)

    程江; 杨卓如; 陈焕钦; C.H.Kuo; M.E.Zappi

    2000-01-01

    Based on the Danckwerts surface renewal model, a simple explicit expression of the enhancement factor in ozone absorption with a first order ozone self-decomposition and parallel second order ozonation reactions has been derived. The results are compared with our previous work based on the film theory. The 2,4-dichlorophenol destruction rate by ozonation is predicted using the enhancement factor model in this paper.
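For the simpler pseudo-first-order case (not the paper's full expression with self-decomposition and parallel second-order reactions), the Danckwerts surface renewal model yields the classical enhancement factor E = sqrt(1 + Ha^2), which can be computed directly (the parameter values below are illustrative, not measured ozone data):

```python
import math

def enhancement_factor(D, k, kL):
    """Danckwerts enhancement factor for a pseudo-first-order reaction.
    D:  liquid-phase diffusivity of the absorbed gas (m^2/s)
    k:  pseudo-first-order reaction rate constant (1/s)
    kL: physical liquid-side mass-transfer coefficient (m/s)
    Ha^2 = D * k / kL^2 is the (squared) Hatta number."""
    Ha2 = D * k / kL ** 2
    return math.sqrt(1.0 + Ha2)

print(enhancement_factor(D=2e-9, k=50.0, kL=1e-4))  # ~3.32 (sqrt(11))
```

The paper's explicit expression generalises this form to account for ozone self-decomposition and the parallel ozonation reactions.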

  2. Measuring Model-Based High School Science Instruction: Development and Application of a Student Survey

    Science.gov (United States)

    Fulmer, Gavin W.; Liang, Ling L.

    2013-02-01

    This study tested a student survey to detect differences in instruction between teachers in a modeling-based science program and comparison group teachers. The Instructional Activities Survey measured teachers' frequency of modeling, inquiry, and lecture instruction. Factor analysis and Rasch modeling identified three subscales, Modeling and Reflecting, Communicating and Relating, and Investigative Inquiry. As predicted, treatment group teachers engaged in modeling and inquiry instruction more than comparison teachers, with effect sizes between 0.55 and 1.25. This study demonstrates the utility of student report data in measuring teachers' classroom practices and in evaluating outcomes of a professional development program.

  3. An efficient binomial model-based measure for sequence comparison and its application.

    Science.gov (United States)

    Liu, Xiaoqing; Dai, Qi; Li, Lihua; He, Zerong

    2011-04-01

Sequence comparison is one of the major tasks in bioinformatics; it can serve as evidence of structural and functional conservation, as well as of evolutionary relations. There are several similarity/dissimilarity measures for sequence comparison, but challenges remain. This paper presented a binomial model-based measure to analyze biological sequences. With the help of a random indicator, the occurrence of a word at any position of a sequence can be regarded as a random Bernoulli variable, and the distribution of the sum of the word occurrences is well known to be binomial. By using a recursive formula, we computed the binomial probability of the word count and proposed a binomial model-based measure based on the relative entropy. The proposed measure was tested by extensive experiments, including classification of HEV genotypes and phylogenetic analysis, and was further compared with alignment-based and alignment-free measures. The results demonstrate that the proposed measure based on the binomial model is more efficient.
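The flavour of such a measure can be sketched as follows (a loose illustration only, not the paper's recursive formula or exact weighting: a uniform background word probability and a symmetrised relative entropy are assumed here):

```python
import math
from itertools import product

def kmer_profile(seq, k=2):
    """Binomial probability of each k-mer's observed count, normalised.
    Each of the n = len(seq)-k+1 positions is treated as a Bernoulli trial
    with uniform background probability p = 1/4^k."""
    n = len(seq) - k + 1
    p = 1.0 / 4 ** k
    probs = []
    for kmer in ("".join(t) for t in product("ACGT", repeat=k)):
        c = sum(seq[i:i + k] == kmer for i in range(n))
        probs.append(math.comb(n, c) * p ** c * (1 - p) ** (n - c))
    total = sum(probs)
    return [q / total for q in probs]

def rel_entropy_distance(s1, s2, k=2):
    """Symmetrised Kullback-Leibler divergence between the two profiles."""
    P, Q = kmer_profile(s1, k), kmer_profile(s2, k)
    eps = 1e-12
    def kl(A, B):
        return sum(a * math.log((a + eps) / (b + eps)) for a, b in zip(A, B))
    return 0.5 * (kl(P, Q) + kl(Q, P))

print(rel_entropy_distance("ACGTACGTAC", "ACGTACGTAC"))        # 0.0
print(rel_entropy_distance("ACGTACGTAC", "AAAAAAAAAA") > 0.0)  # True
```

Identical sequences get distance zero, and dissimilar word-usage patterns score higher, which is the property exploited for alignment-free phylogeny.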

  4. Applicability of an exposure model for the determination of emissions from mobile phone base stations

    DEFF Research Database (Denmark)

    Breckenkamp, J; Neitzke, H P; Bornkessel, C

    2008-01-01

...of interpolated geo-coordinates were used to calculate the distance between households and base stations, which is one important parameter in modelling exposure. During the development of the exposure model, more precise input data were available for its internal validation, which yielded kappa values between 0...

  5. An Empirically Based Method of Q-Matrix Validation for the DINA Model: Development and Applications

    Science.gov (United States)

    de la Torre, Jimmy

    2008-01-01

    Most model fit analyses in cognitive diagnosis assume that a Q matrix is correct after it has been constructed, without verifying its appropriateness. Consequently, any model misfit attributable to the Q matrix cannot be addressed and remedied. To address this concern, this paper proposes an empirically based method of validating a Q matrix used…

  6. Variable Selection and Updating In Model-Based Discriminant Analysis for High Dimensional Data with Food Authenticity Applications*

    OpenAIRE

    Murphy, Thomas Brendan; Dean, Nema; Raftery, Adrian E.

    2010-01-01

    Food authenticity studies are concerned with determining if food samples have been correctly labeled or not. Discriminant analysis methods are an integral part of the methodology for food authentication. Motivated by food authenticity applications, a model-based discriminant analysis method that includes variable selection is presented. The discriminant analysis model is fitted in a semi-supervised manner using both labeled and unlabeled data. The method is shown to give ...

  7. Model Theory and Applications

    CERN Document Server

    Mangani, P

    2011-01-01

    This title includes: Lectures - G.E. Sacks - Model theory and applications, and H.J. Keisler - Constructions in model theory; and, Seminars - M. Servi - SH formulas and generalized exponential, and J.A. Makowski - Topological model theory.

  8. A temperature-precipitation based leafing model and its application in Northeast China.

    Directory of Open Access Journals (Sweden)

    Rong-Ping Li

Plant phenology models, especially leafing models, play critical roles in evaluating the impact of climate change on the primary production of temperate plants. Existing models based on temperature alone could not accurately simulate plant leafing in arid and semi-arid regions. The objective of the present study was to test the suitability of the existing temperature-based leafing models in arid and semi-arid regions, and to develop a temperature-precipitation based leafing model (TP), based on long-term (i.e., 12-27 years) ground leafing observation data and meteorological data in Northeast China. A better simulation of leafing for all the plant species in Northeast China was given by TP with the fixed starting date (TPn) than with the parameterized starting date (TPm); TPn gave the smallest average root mean square error (RMSE) of 4.21 days. Tree leafing models were validated with independent data, and the coefficient of determination (R2) was greater than 0.60 in 75% of the estimates by TP and by the spring warming model (SW) with the fixed starting date. The average RMSE of herb leafing simulated by TPn was 5.03 days, much lower than that of the other models (>9.51 days), while the average R2 values of TPn and TPm were 0.68 and 0.57, respectively, much higher than those of the other models (<0.22). This indicates that TPn is a universal model, more suitable for simulating the leafing of trees and herbs than the prior models. Furthermore, water is an important factor determining herb leafing in arid and semi-arid temperate regions.
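A minimal spring-warming-style forcing model conveys the general mechanism such leafing models share (the TP model's precipitation term and fitted parameters are not given in the abstract, so the base temperature, threshold and temperature series below are purely illustrative):

```python
def predict_leafing_day(daily_mean_temps, start_day=0, t_base=5.0, f_crit=50.0):
    """Return the first day index at which accumulated forcing
    (degree-days above t_base, summed from start_day) reaches f_crit,
    or None if the threshold is never reached."""
    forcing = 0.0
    for day in range(start_day, len(daily_mean_temps)):
        forcing += max(0.0, daily_mean_temps[day] - t_base)
        if forcing >= f_crit:
            return day
    return None

# Hypothetical spring: daily mean temperature rises 0.2 deg C per day from -5.
temps = [-5.0 + 0.2 * d for d in range(365)]
print(predict_leafing_day(temps))   # day 72 for these illustrative numbers
```

The TP model additionally conditions forcing on accumulated precipitation, which is what lets it track herb leafing in water-limited regions.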

  9. Object-oriented modelling with unified modelling language 2.0 for simple software application based on agile methodology

    CERN Document Server

    Warnars, Spits

    2010-01-01

Unified Modelling Language (UML) 2.0, introduced in 2002, has been developing and influencing object-oriented software engineering, and has become a standard and reference for information system analysis and design modelling. There are many concepts and theories for modelling an information system or software application with UML 2.0, which can create ambiguities and inconsistencies for a novice learning how to model a system with UML, especially UML 2.0. This article discusses how to model a simple software application by using some of the diagrams of UML 2.0, rather than the whole set of diagrams, as suggested by agile methodology. Agile methodology is considered convenient for novices because it can deliver the information technology environment to the end-user quickly and adaptively, with minimal documentation. It also has the ability to deliver the best-performing software application according to the customer's needs. Agile methodology makes a simple model with simple documentation, a simple team and si...

  10. Development of a Model-Based Systems Engineering Application for the Ground Vehicle Robotics Sustainment Industrial Base

    Science.gov (United States)

    2013-02-04

...symbols, human perceptual processing, human eye properties, visual attention, the Gestalt laws of pattern perception, visual object perception... Visual grammars, like the Systems Modeling Language (SysML), are considered to be applications of Gestalt laws. Specialized software available in the market... psychological inertia (PI) associated with engineers stepping outside their background to observe useful patterns. The authors deemed that aligning the...

  11. Modeling and Deployment of Model-Based Decentralized Embedded Diagnosis inside Vehicles: Application to Smart Distance Keeping Function

    Directory of Open Access Journals (Sweden)

    Othman Nasri

    2012-01-01

The deployment of a fault diagnosis strategy in the Smart Distance Keeping (SDK) system with a decentralized architecture is presented. The SDK system is an advanced Adaptive Cruise Control (ACC) system implemented in a Renault-Volvo Trucks vehicle to increase safety by overcoming some ACC limitations. One of the main differences between this new system and classical ACC is the choice of the safe distance, i.e., the distance between the vehicle equipped with the ACC or SDK system and the obstacle in front (which may be another vehicle). This distance is fixed in the case of ACC, but variable in the case of SDK; its variation depends essentially on the relative velocity between the vehicle and the obstacle in front. The main goal of this work is to analyze measurements issued from the SDK elements in order to detect, localize, and identify faults that may occur. Our main contribution is the proposition of a decentralized approach that permits on-line diagnosis without computing the global model, achieving most of the work locally and avoiding heavy extra diagnostic information traffic between components. After a detailed description of the SDK system, this paper explains the model-based decentralized solution and its application to the embedded diagnosis of the SDK system inside a Renault-Volvo truck with five control units connected via a CAN bus, using the "Hardware in the Loop" (HIL) technique. We also discuss the constraints that must be fulfilled.

  12. A Danger Theory Based Mobile Virus Detection Model and Its Application in Inhibiting Virus

    Directory of Open Access Journals (Sweden)

    Tianliang Lu

    2012-08-01

According to the propagation and destruction characteristics of mobile phone viruses, a virus detection model based on the Danger Theory is proposed. This model includes four phases: danger capture, antigen presentation, antibody generation and antibody distribution. In this model, local knowledge of mobile phones is exploited by agents running on the phones to discover danger caused by viruses. The Antigen Presenting Cells (APCs) present the antigen from mobile phones in the danger zone, and the Decision Center confirms the virus infection. After the antibody is generated by self-tolerance using the negative selection algorithm, the Decision Center distributes the antibody to mobile phones. Due to the distributed and cooperative mechanism of the artificial immune system, the proposed model lowers the storage and computing consumption of mobile phones. The simulation results show that, based on this mobile phone virus detection model, the proposed virus immunization strategy can effectively inhibit the propagation of mobile phone viruses.
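The negative selection step mentioned above can be sketched as follows (the binary detector encoding, r-contiguous matching rule and all parameters are generic textbook choices, not the paper's): randomly generated detectors that match any "self" (normal-behaviour) pattern are discarded, so the survivors only fire on non-self activity.

```python
import random

def matches(detector, pattern, r=3):
    """r-contiguous rule: match if r consecutive positions agree."""
    run = 0
    for d, p in zip(detector, pattern):
        run = run + 1 if d == p else 0
        if run >= r:
            return True
    return False

def negative_selection(self_set, n_detectors=50, length=8, r=3, seed=1):
    """Keep random detectors only if they match no self pattern."""
    rng = random.Random(seed)
    detectors = []
    while len(detectors) < n_detectors:
        cand = tuple(rng.randint(0, 1) for _ in range(length))
        if not any(matches(cand, s, r) for s in self_set):
            detectors.append(cand)
    return detectors

self_set = [(0,) * 8, (1,) * 8]          # "normal" behaviour patterns
antibodies = negative_selection(self_set, n_detectors=10)
# By construction, no surviving detector matches any self pattern.
print(all(not matches(d, s) for d in antibodies for s in self_set))  # True
```

In the model above, the resulting "antibodies" are what the Decision Center distributes to the phones for local danger detection.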

  13. FANN-Based Surface Water Quality Evaluation Model and Its Application in the Shaoguan Area

    Institute of Scientific and Technical Information of China (English)

    YANG Meini; LI Dingfang; YANG Jinbo; XIONG Wei

    2007-01-01

    A fuzzy neural network model is proposed to evaluate water quality. The model contains two parts: first, fuzzy mathematics theory is used to standardize the samples; second, the RBF neural network and the BP neural network are used to train the standardized samples. The proposed model was applied to assess the water quality of 16 sections in 9 rivers in the Shaoguan area in 2005. The evaluation result was compared with that of the RBF neural network method and with the reported results for the Shaoguan area in 2005. The comparison indicated that the proposed fuzzy neural network model is practically feasible for water quality assessment and simple to operate.
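As a generic illustration of the RBF-network component (the paper's own network structure, training data and fuzzy standardization are not reproduced here), an RBF network with one Gaussian unit per standardized sample can be fitted by solving a linear system:

```python
import numpy as np

def rbf_fit(X, y, gamma=1.0):
    """Fit an RBF network with one Gaussian unit per training sample:
    solve G w = y, where G[i, j] = exp(-gamma * ||x_i - x_j||^2)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    G = np.exp(-gamma * d2)
    return np.linalg.solve(G, y)

def rbf_predict(X_train, w, X_new, gamma=1.0):
    """Evaluate the fitted network at new points."""
    d2 = ((X_new[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2) @ w

# Toy 'water quality' scores from two standardized indicators (invented).
X = np.array([[0.1, 0.9], [0.5, 0.5], [0.9, 0.1], [0.3, 0.7]])
y = np.array([1.0, 2.0, 3.0, 1.5])
w = rbf_fit(X, y)
print(np.allclose(rbf_predict(X, w, X), y))  # → True (exact at training points)
```

The Gaussian kernel matrix is symmetric positive definite for distinct samples, so the solve always succeeds and the network interpolates the training data exactly; real applications would add regularization or fewer centers to avoid overfitting.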

  14. Scenario-based, closed-loop model predictive control with application to emergency vehicle scheduling

    Science.gov (United States)

    Goodwin, Graham. C.; Medioli, Adrian. M.

    2013-08-01

    Model predictive control has been a major success story in process control. More recently, the methodology has been used in other contexts, including automotive engine control, power electronics and telecommunications. Most applications focus on set-point tracking and use single-sequence optimisation. Here we consider an alternative class of problems, motivated by the scheduling of emergency vehicles, in which disturbances are the dominant feature. We develop a novel closed-loop model predictive control strategy aimed at this class of problems. We motivate, and illustrate, the ideas via the problem of fluid deployment of ambulance resources.
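The disturbance-dominated, closed-loop setting can be sketched with a toy scenario-based receding-horizon controller: at each step, choose the action sequence minimizing the worst-case cost over a set of disturbance scenarios, apply only the first action, then re-solve. The scalar dynamics, action set and scenarios below are invented for illustration and are not the authors' formulation.

```python
import itertools

def scenario_mpc_step(x, horizon, actions, scenarios, alpha=0.9):
    """Choose the first action of the open-loop sequence minimizing the
    worst-case (over disturbance scenarios) sum of squared states.
    Dynamics (illustrative): x_next = alpha * x + u + d."""
    best_u, best_cost = None, float("inf")
    for seq in itertools.product(actions, repeat=horizon):
        worst = 0.0
        for dist in scenarios:              # one disturbance sequence per scenario
            xi, cost = x, 0.0
            for u, d in zip(seq, dist):
                xi = alpha * xi + u + d
                cost += xi * xi
            worst = max(worst, cost)
        if worst < best_cost:
            best_cost, best_u = worst, seq[0]
    return best_u

# Closed loop: re-solve at every step, apply only the first move.
x = 5.0
actions = [-1.0, 0.0, 1.0]
scenarios = [(0.0, 0.0, 0.0), (0.5, 0.5, 0.5), (-0.5, -0.5, -0.5)]
for _ in range(10):
    u = scenario_mpc_step(x, horizon=3, actions=actions, scenarios=scenarios)
    x = 0.9 * x + u                          # apply nominal dynamics
print(abs(x) < 1.0)  # → True (state regulated near zero)
```

Re-solving at every step is what makes this closed-loop rather than single-sequence: feedback enters through the measured state, which is the key point when disturbances, not set-point changes, dominate.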

  15. A new model for the grid size optimization of the finite element method --Based on its application to the water quality modeling of the topographically complicated river

    Institute of Scientific and Technical Information of China (English)

    ZENG Guangming; SU Xiaokang; HUANG Guohe; XIE Gengxin

    2003-01-01

    The finite element method is one of the typical methods used for numerical water quality modeling of topographically complicated rivers. In this paper, based on probability theory, the probability density of pollutants is introduced. A new model for grid size optimization based on the finite element method is developed, incorporating the maximum information entropy theory, for the case when the length of the grid is given. Combined with the empirical evaluation approach of the flow discharge per unit river width, this model can be used to determine the grid size of the finite element method applied to water quality modeling of a topographically complicated river when the velocity field of the river is not given. Application of the model to an idealized river verified its correctness. In a practical case, the application of the model to the Xingjian River (the Hengyang section of the Xiangjiang River) yielded the optimized grid width, and the influence of parameters was studied. This demonstrated that the model reflects the real situation of the pollutants in the river, and that the model has many excellent characteristics such as stability, credibility and high applicability in practical applications.

  16. Evolving MCDM applications using hybrid expert-based ISM and DEMATEL models: an example of sustainable ecotourism.

    Science.gov (United States)

    Chuang, Huan-Ming; Lin, Chien-Ku; Chen, Da-Ren; Chen, You-Shyang

    2013-01-01

    Ecological degradation is an escalating global threat. Increasingly, people are aware of and concerned about the environmental problems surrounding them, and environmental protection issues are gaining prominence. An appropriate information technology tool, the increasingly popular social network system (virtual community, VC), effectively facilitates public education and engagement with existing problems. In particular, the involvement behavior underlying VC member engagement is an interesting topic. Nevertheless, member engagement processes comprise interrelated sub-processes that reflect an interactive experience within VCs as well as the value co-creation model. Addressing the top-focused ecotourism VCs, this study presents an application of hybrid expert-based ISM and DEMATEL models, built on multi-criteria decision-making tools, to investigate the complex multidimensional and dynamic nature of member engagement. Our research findings provide insightful managerial implications and suggest that the viral marketing of ecotourism protection is of concern to practitioners and academicians alike.

  17. Evolving MCDM Applications Using Hybrid Expert-Based ISM and DEMATEL Models: An Example of Sustainable Ecotourism

    Directory of Open Access Journals (Sweden)

    Huan-Ming Chuang

    2013-01-01

    Full Text Available Ecological degradation is an escalating global threat. Increasingly, people are aware of and concerned about the environmental problems surrounding them, and environmental protection issues are gaining prominence. An appropriate information technology tool, the increasingly popular social network system (virtual community, VC), effectively facilitates public education and engagement with existing problems. In particular, the involvement behavior underlying VC member engagement is an interesting topic. Nevertheless, member engagement processes comprise interrelated sub-processes that reflect an interactive experience within VCs as well as the value co-creation model. Addressing the top-focused ecotourism VCs, this study presents an application of hybrid expert-based ISM and DEMATEL models, built on multi-criteria decision-making tools, to investigate the complex multidimensional and dynamic nature of member engagement. Our research findings provide insightful managerial implications and suggest that the viral marketing of ecotourism protection is of concern to practitioners and academicians alike.

  18. Gradient-based Kriging approximate model and its application research to optimization design

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    In the process of multidisciplinary design optimization, there exists a computational complexity problem due to frequently calling high-fidelity system analysis models. The high-fidelity system analysis models can be surrogated by approximate models. Sensitivity analysis and numerical noise filtering can be done easily by coupling approximate models to optimization. Approximate models reduce the number of executions of the problem's simulation code during optimization, so the solution efficiency of the multidisciplinary design optimization problem can be improved. Most optimization methods are gradient-based, and the gradients of the objective and constraint functions are obtained easily. The gradient-based Kriging (GBK) approximate model can be constructed using system response values and their gradients. The gradients can greatly improve the prediction precision of the system response. A hybrid optimization method is constructed by coupling GBK approximate models to gradient-based optimization methods. An aircraft aerodynamic shape optimization design example indicates that the methods of this paper can achieve good feasibility and validity.

  19. Reaction invariant-based reduction of the activated sludge model ASM1 for batch applications

    DEFF Research Database (Denmark)

    Santa Cruz, Judith A.; Mussati, Sergio F.; Scenna, Nicolás J.

    2016-01-01

    that are unaffected by the reaction progress, i.e. so-called reaction invariants. The reaction invariant concept can be used to reduce the number of ordinary differential equations (ODEs) involved in batch bioreactor models. In this paper, a systematic methodology of model reduction based on this concept is applied...... to batch activated sludge processes described by the Activated Sludge Model No. 1 (ASM1) for carbon and nitrogen removal. The objective of the model reduction is to describe the exact dynamics of the states predicted by the original model with a lower number of ODEs. This leads to a reduction...... of the numerical complexity as nonlinear ODEs are replaced by linear algebraic relationships predicting the exact dynamics of the original model....

  20. Mathematical modeling and simulation in animal health - Part II: principles, methods, applications, and value of physiologically based pharmacokinetic modeling in veterinary medicine and food safety assessment.

    Science.gov (United States)

    Lin, Z; Gehring, R; Mochel, J P; Lavé, T; Riviere, J E

    2016-10-01

    This review provides a tutorial for individuals interested in quantitative veterinary pharmacology and toxicology and offers a basis for establishing guidelines for physiologically based pharmacokinetic (PBPK) model development and application in veterinary medicine. This is important as the application of PBPK modeling in veterinary medicine has evolved over the past two decades. PBPK models can be used to predict drug tissue residues and withdrawal times in food-producing animals, to estimate chemical concentrations at the site of action and target organ toxicity to aid risk assessment of environmental contaminants and/or drugs in both domestic animals and wildlife, as well as to help design therapeutic regimens for veterinary drugs. This review provides a comprehensive summary of PBPK modeling principles, model development methodology, and the current applications in veterinary medicine, with a focus on predictions of drug tissue residues and withdrawal times in food-producing animals. The advantages and disadvantages of PBPK modeling compared to other pharmacokinetic modeling approaches (i.e., classical compartmental/noncompartmental modeling, nonlinear mixed-effects modeling, and interspecies allometric scaling) are further presented. The review finally discusses contemporary challenges and our perspectives on model documentation, evaluation criteria, quality improvement, and offers solutions to increase model acceptance and applications in veterinary pharmacology and toxicology.
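At its core, a PBPK model is a set of mass-balance ODEs over physiologically meaningful compartments. A minimal perfusion-limited sketch (plasma plus one tissue and clearance; all parameter values invented, not from the review) illustrates the bookkeeping, including the mass balance that every PBPK implementation should satisfy:

```python
def pbpk_step(state, p, dt):
    """One Euler step of a minimal perfusion-limited PBPK model:
    plasma + one tissue + clearance. Amounts in mg, flows in L/h."""
    A_p, A_t = state                        # drug amounts in plasma, tissue
    C_p = A_p / p["V_p"]                    # concentrations (mg/L)
    C_t = A_t / p["V_t"]
    flux = p["Q"] * (C_p - C_t / p["Kp"])   # perfusion-limited exchange
    elim = p["CL"] * C_p                    # hepatic/renal clearance
    return (A_p + dt * (-flux - elim),
            A_t + dt * flux)

# Illustrative parameters: volumes (L), tissue blood flow Q (L/h),
# partition coefficient Kp, clearance CL (L/h).
p = {"V_p": 3.0, "V_t": 35.0, "Q": 10.0, "Kp": 4.0, "CL": 1.0}
state = (100.0, 0.0)                        # 100 mg IV bolus into plasma
eliminated = 0.0
dt = 0.001
for _ in range(int(24 / dt)):               # simulate ~24 h
    eliminated += dt * p["CL"] * state[0] / p["V_p"]
    state = pbpk_step(state, p, dt)
total = state[0] + state[1] + eliminated
print(abs(total - 100.0) < 1e-6)            # → True (mass balance holds)
```

Real PBPK models differ only in scale: a dozen or more tissues, body-weight-scaled volumes and flows, and species-specific parameters, which is what enables the tissue-residue and withdrawal-time predictions the review discusses.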

  1. Developments in model-based optimization and control distributed control and industrial applications

    CERN Document Server

    Grancharova, Alexandra; Pereira, Fernando

    2015-01-01

    This book deals with optimization methods as tools for decision making and control in the presence of model uncertainty. It is oriented to the use of these tools in engineering, specifically in automatic control design with all its components: analysis of dynamical systems, identification problems, and feedback control design. Developments in Model-Based Optimization and Control takes advantage of optimization-based formulations for such classical feedback design objectives as stability, performance and feasibility, afforded by the established body of results and methodologies constituting optimal control theory. It makes particular use of the popular formulation known as predictive control or receding-horizon optimization. The individual contributions in this volume are wide-ranging in subject matter but coordinated within a five-part structure covering material on: · complexity and structure in model predictive control (MPC); · collaborative MPC; · distributed MPC; · optimization-based analysis and desi...

  2. Parameter estimation based synchronization for an epidemic model with application to tuberculosis in Cameroon

    Energy Technology Data Exchange (ETDEWEB)

    Bowong, Samuel, E-mail: sbowong@gmail.co [Laboratory of Applied Mathematics, Department of Mathematics and Computer Science, Faculty of Science, University of Douala, P.O. Box 24157 Douala (Cameroon); Potsdam Institute for Climate Impact Research (PIK), Telegraphenberg A 31, 14412 Potsdam (Germany); Kurths, Jurgen [Potsdam Institute for Climate Impact Research (PIK), Telegraphenberg A 31, 14412 Potsdam (Germany); Department of Physics, Humboldt Universitat zu Berlin, 12489 Berlin (Germany)

    2010-10-04

    We propose a method based on synchronization to identify the parameters and estimate the underlying variables of an epidemic model from real data. We suggest an adaptive synchronization method based on an observer approach, with an effective guidance parameter in the update-rule design, using only real data. In order to validate the identifiability and estimation results, numerical simulations of a tuberculosis (TB) model using real data from the Centre Region of Cameroon are performed to estimate the parameters and variables. This study shows that tools from the synchronization of nonlinear systems can help to deal with the parameter and state estimation problem in the field of epidemiology. We exploit the close link between mathematical modelling, structural identifiability analysis, synchronization, and parameter estimation to obtain biological insights into the system modelled.

  3. Parameter estimation based synchronization for an epidemic model with application to tuberculosis in Cameroon

    Science.gov (United States)

    Bowong, Samuel; Kurths, Jurgen

    2010-10-01

    We propose a method based on synchronization to identify the parameters and estimate the underlying variables of an epidemic model from real data. We suggest an adaptive synchronization method based on an observer approach, with an effective guidance parameter in the update-rule design, using only real data. In order to validate the identifiability and estimation results, numerical simulations of a tuberculosis (TB) model using real data from the Centre Region of Cameroon are performed to estimate the parameters and variables. This study shows that tools from the synchronization of nonlinear systems can help to deal with the parameter and state estimation problem in the field of epidemiology. We exploit the close link between mathematical modelling, structural identifiability analysis, synchronization, and parameter estimation to obtain biological insights into the system modelled.
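The synchronization idea can be illustrated on a scalar system rather than the TB model: an observer is driven by the measured trajectory, and a Lyapunov-based update adapts the unknown parameter until the synchronization error vanishes. The gains, the sinusoidal input (which provides persistent excitation) and the adaptation law below are standard textbook choices, not the authors' design.

```python
import math

def estimate_decay_rate(a_true=0.5, k=2.0, gamma=2.0, dt=1e-3, T=200.0):
    """Synchronize an observer to the 'measured' system and adapt the
    parameter estimate ah until the coupling error vanishes.
    True system:  dx/dt  = -a*x + sin(t)          (a unknown)
    Observer:     dxh/dt = -ah*xh + k*(x - xh) + sin(t)
    Adaptation:   dah/dt = -gamma * (x - xh) * x  (Lyapunov-based)"""
    x, xh, ah = 1.0, 0.0, 0.0
    t = 0.0
    for _ in range(int(T / dt)):                  # explicit Euler integration
        u = math.sin(t)
        e = x - xh
        x_new = x + dt * (-a_true * x + u)        # 'measured' plant
        xh += dt * (-ah * xh + k * e + u)         # adaptive observer
        ah += dt * (-gamma * e * x)               # parameter update
        x = x_new
        t += dt
    return ah

ah = estimate_decay_rate()
print(abs(ah - 0.5) < 0.05)  # → True (estimate converges to the true rate)
```

The update law makes the Lyapunov function V = e²/2 + (ah − a)²/(2γ) non-increasing, so the synchronization error decays, and persistent excitation forces the parameter estimate itself to converge.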

  4. Application of Transfer Matrix Approach to Modeling and Decentralized Control of Lattice-Based Structures

    Science.gov (United States)

    Cramer, Nick; Swei, Sean Shan-Min; Cheung, Kenny; Teodorescu, Mircea

    2015-01-01

    This paper presents the modeling and control of an aerostructure developed from lattice-based cellular materials/components. The proposed aerostructure concept leverages a building-block strategy for lattice-based components, which provides great adaptability to varying flight scenarios, essential for in-flight wing shaping control. A decentralized structural control design is proposed that utilizes the discrete-time lumped-mass transfer matrix method (DT-LM-TMM). The objective is to develop an effective reduced-order model through DT-LM-TMM that can be used to design a decentralized controller for the structural control of a wing. The approach developed in this paper shows that, as far as the performance of the overall structural system is concerned, the reduced-order model can be as effective as the full-order model in designing an optimal stabilizing controller.

  5. Physiologically Based Pharmacokinetic Modeling: Methodology, Applications, and Limitations with a Focus on Its Role in Pediatric Drug Development

    Directory of Open Access Journals (Sweden)

    Feras Khalil

    2011-01-01

    Full Text Available The concept of physiologically based pharmacokinetic (PBPK) modeling was introduced years ago, but for a long time it was not practiced significantly. However, interest in and implementation of this modeling technique have grown, as evidenced by the increased number of publications in this field. This paper briefly demonstrates the methodology, applications, and limitations of PBPK modeling, with special attention given to the use of PBPK models in pediatric drug development, and some examples are described in detail. Although PBPK models do have some limitations, the potential benefit of the PBPK modeling technique is huge. PBPK models can be applied to investigate drug pharmacokinetics under different physiological and pathological conditions or in different age groups, to support decision-making during drug discovery, to provide, perhaps most importantly, data that can save time and resources, especially in early drug development phases and in pediatric clinical trials, and potentially to help clinical trials become more “confirmatory” rather than “exploratory”.

  6. Process-based modelling of fluvial system response to rapid climate change: 1. model formulation and generic applications

    NARCIS (Netherlands)

    Bogaart, P.W.; Balen, van R.T.; Kasse, C.; Vandenberghe, J.

    2003-01-01

    A comprehensive model strategy is presented which enables the prediction of catchment hydrology and the dynamics of sediment transport within the alluvial river systems draining these catchments. The model is driven by AGCM-based weather predictions, generalised by using a stochastic weather generator.

  7. Spatial Rule-Based Modeling: A Method and Its Application to the Human Mitotic Kinetochore

    Directory of Open Access Journals (Sweden)

    Jan Huwald

    2013-07-01

    Full Text Available A common problem in the analysis of biological systems is the combinatorial explosion that emerges from the complexity of multi-protein assemblies. Conventional formalisms, like differential equations, Boolean networks and Bayesian networks, are unsuitable for dealing with the combinatorial explosion, because they are designed for a restricted state space with fixed dimensionality. To overcome this problem, the rule-based modeling language, BioNetGen, and the spatial extension, SRSim, have been developed. Here, we describe how to apply rule-based modeling to integrate experimental data from different sources into a single spatial simulation model and how to analyze the output of that model. The starting point for this approach can be a combination of molecular interaction data, reaction network data, proximities, binding and diffusion kinetics and molecular geometries at different levels of detail. We describe the technique and then use it to construct a model of the human mitotic inner and outer kinetochore, including the spindle assembly checkpoint signaling pathway. This allows us to demonstrate the utility of the procedure, show how a novel perspective for understanding such complex systems becomes accessible and elaborate on challenges that arise in the formulation, simulation and analysis of spatial rule-based models.

  8. A microcantilever-based alcohol vapor sensor-application and response model

    DEFF Research Database (Denmark)

    Jensenius, Henriette; Thaysen, Jacob; Rasmussen, Anette Alsted

    2000-01-01

    is a direct measure of the molecular concentration of alcohol vapor. On the basis of the model the detection limit of this cantilever-based sensor is determined to be below 10 ppm for alcohol vapor measurements. Furthermore, the time response of the cantilever can be used to distinguish between different...... reference cantilever background noise is subtracted directly in the measurement. A polymer coated cantilever has been exposed to vapors of various alcohols and the resulting cantilever response has been interpreted using a simple evaporation model. The model indicates that the cantilever response...... alcohols due to a difference in the evaporation rates. (C) 2000 American Institute of Physics....

  9. The transient observation-based particle (TOP) model and its potential application in radiation effects evaluation

    Directory of Open Access Journals (Sweden)

    Benck Sylvie

    2013-01-01

    Full Text Available The evaluation of the radiation hazards on components used in the space environment is based on knowledge of the radiation level encountered on orbit. The models that are widely used to assess the near-Earth environment for a given mission are empirical trapped radiation models derived from a compilation of spacecraft measurements. However, these models are static and hence are not suited for describing the short timescale variations of geomagnetic conditions. The transient observation-based particle (TOP) model tends to break with this classical approach by introducing dynamic features based on the observation and characterization of transient particle flux events, in addition to classical mapping of steady-state flux levels. In order to get a preliminary version of an operational model (actually only available for electrons at low Earth orbit, LEO), (i) the steady-state flux level, (ii) the flux enhancement probability distribution functions, and (iii) the flux decay-time constants (at given energies and positions in space) were determined, and an original dynamic model skeleton with these input parameters has been developed. The methodology is fully described and first flux predictions from the model are presented. In order to evaluate the net effects of radiation on a component, it is important to have an efficient tool that calculates the transfer of the outer radiation environment through the spacecraft material toward the location of the component under investigation. Using the TOP-model space radiation fluxes and the transmitted radiation environment characteristics derived through GEANT4 calculations, a case study of electron flux/dose variations in a small silicon volume is performed. Potential cases are assessed where the dynamics of the spacecraft radiation environment may have an impact on the observed radiation effects.

  10. Information exchange in global logistics chains : an application for model-based auditing,

    NARCIS (Netherlands)

    Veenstra, A.W.; Hulstijn, J.; Christiaanse, R.M.J.; Tan, Y.

    2013-01-01

    An integrated data pipeline has been proposed to meet requirements for visibility, supervision and control in global supply chains. How can data integration be used for risk assessment, monitoring and control in global supply chains? We argue that concepts from model-based auditing can be used to mo

  11. Developing and Evaluating Creativity Gamification Rehabilitation System: The Application of PCA-ANFIS Based Emotions Model

    Science.gov (United States)

    Su, Chung-Ho; Cheng, Ching-Hsue

    2016-01-01

    This study aims to explore the factors in a patient's rehabilitation achievement after a total knee replacement (TKR) patient exercises, using a PCA-ANFIS emotion model-based game rehabilitation system, which combines virtual reality (VR) and motion capture technology. The researchers combine a principal component analysis (PCA) and an adaptive…

  12. ARX-NNPLS Model Based Optimization Strategy and Its Application in Polymer Grade Transition Process

    Institute of Scientific and Technical Information of China (English)

    费正顺; 胡斌; 叶鲁彬; 梁军

    2012-01-01

    Since it is often difficult to build differential algebraic equations (DAEs) for chemical processes, a new data-based modeling approach is proposed using ARX (AutoRegressive with eXogenous inputs) models combined with neural networks under a partial least squares framework (ARX-NNPLS), which requires little specific knowledge of the process beyond input and output data. To represent the dynamic and nonlinear behavior of the process, the ARX model combined with a neural network is used in the partial least squares (PLS) inner model between input and output latent variables. In the proposed dynamic optimization strategy based on the ARX-NNPLS model, neither parameterization nor an iterative solving process for DAEs is needed, as the ARX-NNPLS model gives a proper representation of the dynamic behavior of the process, and the computing time is greatly reduced compared to the conventional control vector parameterization method. To demonstrate the ARX-NNPLS model based optimization strategy, the polyethylene grade transition in a gas-phase fluidized-bed reactor is taken into account. The optimization results show that the final optimal trajectory of the quality index determined by the new approach moves faster to the target values and the computing time is much less.

  13. Function Based Nonlinear Least Squares and Application to Jelinski--Moranda Software Reliability Model

    CERN Document Server

    Liu, Jingwei

    2011-01-01

    A function-based nonlinear least squares estimation (FNLSE) method is proposed and investigated for parameter estimation in the Jelinski-Moranda software reliability model. FNLSE extends the potential fitting functions of traditional least squares estimation (LSE), and takes the logarithm-transformed nonlinear least squares estimation (LogLSE) as a special case. A novel power-transformation-function-based nonlinear least squares estimation (powLSE) is proposed and applied to the parameter estimation of the Jelinski-Moranda model. Solved with the Newton-Raphson method, both LogLSE and powLSE of the Jelinski-Moranda model are applied to mean-time-between-failures (MTBF) predictions on six standard software failure-time data sets. The experimental results demonstrate the effectiveness of powLSE with optimal power index compared to classical least-squares estimation (LSE), maximum likelihood estimation (MLE) and LogLSE in terms of the recursively relative error (RE) index and the Braun statistic.
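In the Jelinski-Moranda model the i-th interfailure time is exponentially distributed with rate φ(N − i + 1), where N is the initial number of faults and φ the per-fault hazard. A power-transformed least-squares fit in the spirit of powLSE can be sketched with a simple grid search; the paper's Newton-Raphson solver and its data sets are not reproduced here, and the grid ranges are illustrative:

```python
import numpy as np

def jm_mean_times(N, phi, n):
    """Jelinski-Moranda expected interfailure times:
    E[T_i] = 1 / (phi * (N - i + 1)), i = 1..n."""
    i = np.arange(1, n + 1)
    return 1.0 / (phi * (N - i + 1))

def fit_jm_powlse(times, power=1.0, N_grid=range(5, 101), phi_grid=None):
    """Power-transformed least squares (cf. powLSE): minimize
    sum((t_i^p - E[T_i]^p)^2) over a grid of (N, phi) values."""
    if phi_grid is None:
        phi_grid = np.linspace(0.01, 1.0, 100)
    n = len(times)
    t_p = np.asarray(times) ** power
    best = (None, None, np.inf)
    for N in N_grid:
        if N < n:                     # need at least n faults in the model
            continue
        for phi in phi_grid:
            r = t_p - jm_mean_times(N, phi, n) ** power
            sse = float(r @ r)
            if sse < best[2]:
                best = (N, phi, sse)
    return best

# Noise-free check: data equal to the model means must be recovered exactly.
true_N, true_phi = 30, 0.05
times = jm_mean_times(true_N, true_phi, n=20)
N_hat, phi_hat, _ = fit_jm_powlse(times, power=0.5)
print(N_hat, round(phi_hat, 2))  # → 30 0.05
```

The power transform reweights the residuals (power 1 is plain LSE; a log transform plays the analogous role in LogLSE), which is the mechanism the abstract credits for the improved MTBF predictions.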

  14. Density-based Monte Carlo filter and its applications in nonlinear stochastic differential equation models.

    Science.gov (United States)

    Huang, Guanghui; Wan, Jianping; Chen, Hui

    2013-02-01

    Nonlinear stochastic differential equation models with unobservable state variables are now widely used in analysis of PK/PD data. Unobservable state variables are usually estimated with extended Kalman filter (EKF), and the unknown pharmacokinetic parameters are usually estimated by maximum likelihood estimator. However, EKF is inadequate for nonlinear PK/PD models, and MLE is known to be biased downwards. A density-based Monte Carlo filter (DMF) is proposed to estimate the unobservable state variables, and a simulation-based M estimator is proposed to estimate the unknown parameters in this paper, where a genetic algorithm is designed to search the optimal values of pharmacokinetic parameters. The performances of EKF and DMF are compared through simulations for discrete time and continuous time systems respectively, and it is found that the results based on DMF are more accurate than those given by EKF with respect to mean absolute error.
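A bootstrap (sampling-importance-resampling) filter is one common density-based Monte Carlo filter. The sketch below applies it to an invented, mildly nonlinear state-space model rather than the paper's PK/PD system, and checks that the filtered estimates beat the raw observations:

```python
import numpy as np

def simulate(T, rng, q=0.1, r=0.5):
    """Illustrative nonlinear state-space model:
    x_t = 0.9*x_{t-1} + 0.5*sin(x_{t-1}) + N(0, q^2),  y_t = x_t + N(0, r^2)."""
    xs, ys = [], []
    x = 0.0
    for _ in range(T):
        x = 0.9 * x + 0.5 * np.sin(x) + q * rng.standard_normal()
        xs.append(x)
        ys.append(x + r * rng.standard_normal())
    return np.array(xs), np.array(ys)

def bootstrap_filter(ys, n=2000, q=0.1, r=0.5, rng=None):
    """Bootstrap particle filter: propagate particles through the state
    equation, weight by the observation density, resample each step."""
    rng = rng or np.random.default_rng(0)
    particles = np.zeros(n)
    estimates = []
    for y in ys:
        particles = (0.9 * particles + 0.5 * np.sin(particles)
                     + q * rng.standard_normal(n))
        w = np.exp(-0.5 * ((y - particles) / r) ** 2)   # Gaussian likelihood
        w /= w.sum()
        estimates.append(np.dot(w, particles))          # posterior mean
        idx = rng.choice(n, size=n, p=w)                # multinomial resampling
        particles = particles[idx]
    return np.array(estimates)

xs, ys = simulate(200, np.random.default_rng(42))
xhat = bootstrap_filter(ys, rng=np.random.default_rng(1))
print(np.mean((xhat - xs) ** 2) < np.mean((ys - xs) ** 2))
```

Unlike the EKF, no linearization is needed: the particle cloud represents the full filtering density, which is why such filters remain adequate where the EKF's Gaussian approximation breaks down.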

  15. A new crossover sine model based on trigonometric model and its application to the crossover lattice equation of state

    Science.gov (United States)

    Lee, Yongjin; Shin, Moon Sam; Kim, Hwayong

    2008-12-01

    In this study, a new crossover sine model (NSM) was developed from a trigonometric model [M. E. Fisher, S. Zinn, and P. J. Upton, Phys. Rev. B 59, 14533 (1999)]. The trigonometric model is a parametric formulation used to represent the thermodynamic variables near a critical point. Although there are other crossover models based on this trigonometric model, such as the crossover sine model (CSM) and the analytical sine model, which is an analytic formulation of the CSM, the NSM employs a different approach from these two models in terms of the connections between the parametric variables of the trigonometric model and the thermodynamic variables. In order to test the performance of the NSM, the crossover lattice equation of state [M. S. Shin, Y. Lee, and H. Kim, J. Chem. Thermodyn. 40, 174 (2008)] was applied using the NSM for correlations of various pure fluids and fluid mixtures. The results showed that, over a wide range of states, the crossover lattice fluid (xLF)/NSM yields the saturated properties of pure fluids and the phase behavior of binary mixtures more accurately than the original lattice equation of state. Moreover, a comparison with the crossover lattice equation of state using the CSM (xLF/CSM) showed that the new model gives correlation results comparable to those of the xLF/CSM.

  16. Is equine colic seasonal? Novel application of a model based approach

    Directory of Open Access Journals (Sweden)

    Proudman Christopher J

    2006-08-01

    Full Text Available Abstract Background Colic is an important cause of mortality and morbidity in domesticated horses, yet many questions about this condition remain to be answered. One such question is: does season have an effect on the occurrence of colic? Time-series analysis provides a rigorous statistical approach to this question but until now, to our knowledge, it has not been used in this context. Traditional time-series modelling approaches have limited applicability in the case of relatively rare diseases, such as specific types of equine colic. In this paper we present a modelling approach that respects the discrete nature of the count data and, using a regression model with a correlated latent variable and one with a linear trend, we explored the seasonality of specific types of colic occurring at a UK referral hospital between January 1995 and December 2004. Results Six- and twelve-month cyclical patterns were identified for all colics, all medical colics, epiploic foramen entrapment (EFE), equine grass sickness (EGS), surgically treated and large colon displacement/torsion colic groups. A twelve-month cyclical pattern only was seen in the large colon impaction colic group. There was no evidence of any cyclical pattern in the pedunculated lipoma group. These results were consistent irrespective of whether we used a model including latent correlation or trend. Problems were encountered when attempting to include both trend and latent serial dependence in models simultaneously; this is likely to be a consequence of a lack of power to separate these two effects in the presence of small counts, yet in reality the underlying physical effect is likely to be a combination of both. Conclusion The use of a regression model with either an autocorrelated latent variable or a linear trend has allowed us to establish formally a seasonal component to certain types of colic presented to a UK referral hospital over a 10 year period. These patterns appeared to coincide
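A simplified version of the seasonal analysis: regress monthly counts on sine/cosine pairs at 12- and 6-month periods and read the cycle amplitudes off the coefficients. This ordinary least-squares sketch on noise-free synthetic counts stands in for the authors' latent-variable count regression, which additionally handles small counts and serial dependence.

```python
import numpy as np

def harmonic_design(months, periods=(12, 6)):
    """Design matrix: intercept plus a cosine/sine pair per seasonal period."""
    cols = [np.ones_like(months, dtype=float)]
    for p in periods:
        cols.append(np.cos(2 * np.pi * months / p))
        cols.append(np.sin(2 * np.pi * months / p))
    return np.column_stack(cols)

# Synthetic monthly colic counts with an annual cycle only (illustrative).
months = np.arange(120)                        # ten years of months
rate = 10 + 3 * np.cos(2 * np.pi * months / 12)
X = harmonic_design(months)
beta, *_ = np.linalg.lstsq(X, rate, rcond=None)
amp12 = np.hypot(beta[1], beta[2])             # 12-month cycle amplitude
amp6 = np.hypot(beta[3], beta[4])              # 6-month cycle amplitude
print(round(amp12, 2), round(amp6, 2))         # → 3.0 0.0
```

A nonzero amplitude at a period is the analogue of the cyclical patterns reported in the abstract; a formal analysis would test the harmonic coefficients against a null model rather than just read off amplitudes.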

  17. [The application of cybernetic modeling methods for the forensic medical personality identification based on the voice and sounding speech characteristics].

    Science.gov (United States)

    Kaganov, A Sh; Kir'yanov, P A

    2015-01-01

    The objective of the present publication was to discuss the possibility of applying cybernetic modeling methods to overcome the apparent discrepancy between two kinds of speech records, viz. the initial ones (e.g. obtained in the course of special investigative activities) and the voice prints obtained from the persons subjected to criminalistic examination. The paper is based on literature sources and the materials of original criminalistic examinations performed by the authors.

  18. Risk Evaluation Approach and Application Research on Fuzzy-FMECA Method Based on Cloud Model

    Directory of Open Access Journals (Sweden)

    Zhengjie Xu

    2013-09-01

    Full Text Available In order to safeguard the safety of passengers and reduce maintenance costs, it is necessary to analyze and evaluate the security risk of the Railway Signal System. However, the conventional Fuzzy Analytical Hierarchy Process (FAHP) cannot describe the fuzziness and randomness of the judgment accurately, and once the fuzzy sets are described using a subjection degree function, the concept of fuzziness is no longer fuzzy. Thus, a Fuzzy-FMECA method based on the cloud model is put forward. The Failure Modes, Effects and Criticality Analysis (FMECA) method is used to identify the risk, and FAHP based on the cloud model is used for determining the subjection degree function in the fuzzy method; finally, the group decision can be gained with the synthetically aggregated cloud model. The method's feasibility and effectiveness are shown in practical examples. Finally, Fuzzy-FMECA based on the cloud model and the conventional FAHP are used to assess the risk respectively; the evaluation results show that the cloud model, introduced into the risk assessment of the Railway Signal System, can realize the transition between precise value and quality value by combining fuzziness and randomness, and provides more abundant information than the subjection degree function of the conventional FAHP.

  19. A modular perspective of protein structures; application to fragment based loop modeling

    Science.gov (United States)

    Fernandez-Fuentes, Narcis; Fiser, Andras

    2013-01-01

Summary Proteins can be decomposed into supersecondary structure modules. We used a generic definition of supersecondary structure elements, so-called Smotifs, which are composed of two flanking regular secondary structures connected by a loop, to explore the evolution and current variety of structure building blocks. Here, we discuss recent observations about the saturation of Smotif geometries in protein structures and how it opens new avenues in protein structure modeling and design. As a first application of these observations we describe our loop conformation modeling algorithm, ArchPred, which takes advantage of the Smotif classification. In this application, instead of focusing on specific loop properties, the method narrows down possible template conformations in other, often non-homologous structures by identifying the most likely supersecondary structure environment that cradles the loop. Beyond identifying the correct starting supersecondary structure geometry, it takes into account the fit of anchor residues, steric clashes, the match between predicted and observed dihedral angle preferences, and the local sequence signal. PMID:22987351

  20. A Reference Model for Monitoring IoT WSN-Based Applications.

    Science.gov (United States)

    Capella, Juan Vicente; Campelo, José Carlos; Bonastre, Alberto; Ors, Rafael

    2016-10-30

The Internet of Things (IoT) is, at this moment, one of the most promising technologies to have arisen in decades. Wireless Sensor Networks (WSNs) are one of the main pillars of many IoT applications, insofar as these applications require context-awareness information. The bibliography shows many difficulties in their real implementation that have prevented their massive deployment. Additionally, in IoT environments where data producers and data consumers are not directly related, compatibility and certification issues become fundamental. Both problems would profit from accurate knowledge of the internal behavior of WSNs, which must be obtained through the use of appropriate tools. There are many ad hoc proposals, with no common structure or methodology, each intended to monitor a particular WSN. To overcome this problem, this paper proposes a structured three-layer reference model for WSN Monitoring Platforms (WSN-MP), which offers a standard environment for the design of new monitoring platforms to debug, verify and certify a WSN's behavior and performance, and is applicable to every WSN. This model also allows the comparative analysis of current proposals for monitoring the operation of WSNs. Following this methodology, it is possible to achieve a standardization of WSN-MP, promoting new research areas in order to solve the problems of each layer.

  1. The Application of PPE Model Based on RAGA in Benefit Evaluating of Rice Water Saving

    Institute of Scientific and Technical Information of China (English)

    FU Qiang; YANG Guang-lin; FU Hong

    2003-01-01

Through applying the PPE model based on RAGA to evaluate the benefit of rice water saving, the author turns multi-dimensional data into a low-dimensional space, so that the optimum projection direction can stand for the strongest influence on the collectivity. The value of the projection function can thus be used to evaluate how good each item is. The PPE model avoids the interference of the weight matrix found in the fuzzy synthetic judgement method, and obtains a better result. The author aims to provide a new method and line of thought for readers engaged in investment decision-making for water-saving irrigation and other related studies.
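
Projection pursuit of the kind described in this record projects standardized indicator data onto a direction chosen to maximize a projection index, and then ranks items by their one-dimensional projection values. The sketch below is a generic illustration, not the paper's method: it uses a Friedman-style spread-times-density index and plain random search in place of the RAGA optimizer; all names and parameter values are illustrative assumptions.

```python
import numpy as np

def projection_index(a, X, R=0.1):
    """Friedman-style index: spread S of the projections times local density D."""
    z = X @ (a / np.linalg.norm(a))
    s = z.std()
    d = np.abs(z[:, None] - z[None, :])          # pairwise projection distances
    dens = np.sum(np.maximum(R - d, 0.0))        # points within window R count
    return s * dens

def best_direction(X, trials=2000, seed=0):
    """Random-search stand-in for the genetic-algorithm (RAGA) optimizer."""
    rng = np.random.default_rng(seed)
    # standardize each evaluation indicator to [0, 1] before projecting
    X = (X - X.min(0)) / (X.max(0) - X.min(0) + 1e-12)
    best, best_q = None, -np.inf
    for _ in range(trials):
        a = rng.standard_normal(X.shape[1])
        q = projection_index(a, X)
        if q > best_q:
            best, best_q = a / np.linalg.norm(a), q
    return best, X @ best   # direction and 1-D projection values for ranking

# three hypothetical schemes rated on three indicators
X = np.array([[0.9, 0.8, 0.7], [0.4, 0.5, 0.6], [0.1, 0.2, 0.3]])
a, scores = best_direction(X)
```

The `scores` vector is the one-dimensional projection used to rank the alternatives.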

  2. An algorithm of multi-model spatial overlay based on three-dimensional terrain model TIN and its application

    Institute of Scientific and Technical Information of China (English)

    WANG Shao-an; ZHANG Zi-ping; GONG Jian-ya

    2001-01-01

3D-GIS spatial overlay analysis is of broad concern in international academe and is a research focus. It is one of the important functions of spatial analysis using GIS technology. An algorithm of multi-model spatial overlay based on the three-dimensional terrain model TIN is introduced in this paper, which can be used to solve the TIN-based three-dimensional overlay operation in spatial analysis. The feasibility and validity of this algorithm are identified. This algorithm is used successfully in three-dimensional overlay and region variation overlay analysis.

  3. An algorithm of multi-model spatial overlay based on three-dimensional terrain model TIN and its application

    Institute of Scientific and Technical Information of China (English)

    王少安; 张子平; 龚健雅

    2001-01-01

3D-GIS spatial overlay analysis is of broad concern in international academe and is a research focus. It is one of the important functions of spatial analysis using GIS technology. An algorithm of multi-model spatial overlay based on the three-dimensional terrain model TIN is introduced in this paper, which can be used to solve the TIN-based three-dimensional overlay operation in spatial analysis. The feasibility and validity of this algorithm are identified. This algorithm is used successfully in three-dimensional overlay and region variation overlay analysis.

  4. REST based mobile applications

    Science.gov (United States)

    Rambow, Mark; Preuss, Thomas; Berdux, Jörg; Conrad, Marc

    2008-02-01

Simplicity is the major advantage of REST based webservices. Whereas SOAP is widespread in complex, security-sensitive business-to-business applications, REST is widely used for mashups and end-user centric applications. In that context we give an overview of REST and compare it to SOAP. Furthermore we present the GeoDrawing application as an example of REST based mobile applications and emphasize the pros and cons of using REST in mobile application scenarios.

  5. Research and application of mineral resources assessment by weights of evidence model based on SIG

    Institute of Scientific and Technical Information of China (English)

    Yuanyuan Chuai; Keyan Xiao; Yihua Xuan; Shaobin Zhan

    2006-01-01

Geological data are usually characterized by multiple sources, large volume and multiple scales. The construction of a Spatial Information Grid overcomes the shortcomings of personal computers when dealing with such geological data. The authors introduce the definition, architecture and workflow of mineral resources assessment by the weights of evidence model based on Spatial Information Grid (SIG). Meanwhile, a case study on the prediction of copper mineral occurrence in the Middle-Lower Yangtze metallogenic belt is given. The results show that mineral resources assessment based on SIG is an effective new method which provides a way of sharing and integrating distributed geospatial information and greatly improves efficiency.
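
The weights of evidence model referenced here combines, for each binary evidence layer, a positive weight (evidence present) and a negative weight (evidence absent) derived from conditional probabilities of deposit occurrence. A minimal sketch of the standard formulation follows; the cell counts and function name are illustrative, not taken from the paper.

```python
import math

def weights_of_evidence(n_bd, n_b, n_d, n_total):
    """Weights for a binary evidence layer B versus deposit cells D.
    n_bd: cells with evidence AND a deposit; n_b: cells with evidence;
    n_d: deposit cells; n_total: all cells in the study area."""
    p_b_d  = n_bd / n_d                       # P(B | deposit)
    p_b_nd = (n_b - n_bd) / (n_total - n_d)   # P(B | no deposit)
    w_plus  = math.log(p_b_d / p_b_nd)        # weight where evidence is present
    w_minus = math.log((1 - p_b_d) / (1 - p_b_nd))  # weight where it is absent
    return w_plus, w_minus, w_plus - w_minus  # contrast C measures association

# hypothetical counts: 8 of 10 deposits fall on the 100 evidence cells
wp, wm, c = weights_of_evidence(n_bd=8, n_b=100, n_d=10, n_total=10000)
```

A strongly positive contrast `c` indicates the evidence layer is a useful predictor; the per-layer weights are summed (under conditional independence) to map posterior favourability.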

  6. Alternatives to accuracy and bias metrics based on percentage errors for radiation belt modeling applications

    Energy Technology Data Exchange (ETDEWEB)

    Morley, Steven Karl [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-01

    This report reviews existing literature describing forecast accuracy metrics, concentrating on those based on relative errors and percentage errors. We then review how the most common of these metrics, the mean absolute percentage error (MAPE), has been applied in recent radiation belt modeling literature. Finally, we describe metrics based on the ratios of predicted to observed values (the accuracy ratio) that address the drawbacks inherent in using MAPE. Specifically, we define and recommend the median log accuracy ratio as a measure of bias and the median symmetric accuracy as a measure of accuracy.
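
The two recommended metrics are straightforward to compute from the ratio of predicted to observed values. The sketch below follows the definitions as summarized in this record (median log accuracy ratio as a bias measure, median symmetric accuracy as an accuracy measure); the function names are illustrative.

```python
import numpy as np

def median_log_accuracy_ratio(pred, obs):
    """Median of log(pred/obs); zero indicates no multiplicative bias."""
    q = np.log(np.asarray(pred, float) / np.asarray(obs, float))
    return float(np.median(q))

def median_symmetric_accuracy(pred, obs):
    """100*(exp(median |log(pred/obs)|) - 1): a symmetric percentage-style
    error that treats over- and under-prediction by a factor k identically."""
    q = np.abs(np.log(np.asarray(pred, float) / np.asarray(obs, float)))
    return 100.0 * (np.exp(np.median(q)) - 1.0)

obs  = np.array([1.0, 2.0, 4.0, 8.0])
pred = np.array([2.0, 1.0, 8.0, 4.0])  # every prediction off by a factor of 2
print(median_log_accuracy_ratio(pred, obs))   # 0.0: over/under cancel (no bias)
print(median_symmetric_accuracy(pred, obs))   # 100.0: median factor-of-2 error
```

Note how MAPE would score the over- and under-predictions here asymmetrically, which is exactly the drawback these ratio-based metrics avoid.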

  7. Radioactive Threat Detection with Scattering Physics: A Model-Based Application

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J V; Chambers, D H; Breitfeller, E F; Guidry, B L; Verbeke, J M; Axelrod, M A; Sale, K E; Meyer, A M

    2010-01-21

The detection of radioactive contraband is a critical problem in maintaining national security for any country. Emissions from threat materials challenge both detection and measurement technologies, especially when concealed by various types of shielding, which complicates the transport physics significantly. The development of a model-based sequential Bayesian processor that captures the underlying transport physics, including scattering, offers a physics-based approach to attack this challenging problem. It is shown that this processor can be used to develop an effective detection technique.

  8. Model based fault diagnosis in a centrifugal pump application using structural analysis

    DEFF Research Database (Denmark)

    Kallesøe, C. S.; Izadi-Zamanabadi, Roozbeh; Rasmussen, Henrik;

    2004-01-01

A model based approach for fault detection and isolation in a centrifugal pump is proposed in this paper. The fault detection algorithm is derived using a combination of structural analysis, Analytical Redundant Relations (ARR) and observer designs. Structural considerations on the system are used ... it to an industrial benchmark. The benchmark tests have shown that the algorithm is capable of detection and isolation of five different faults in the mechanical and hydraulic parts of the pump.

  9. Model Based Fault Diagnosis in a Centrifugal Pump Application using Structural Analysis

    DEFF Research Database (Denmark)

    Kallesøe, C. S.; Izadi-Zamanabadi, Roozbeh; Rasmussen, Henrik;

    2004-01-01

A model based approach for fault detection and isolation in a centrifugal pump is proposed in this paper. The fault detection algorithm is derived using a combination of structural analysis, Analytical Redundant Relations (ARR) and observer designs. Structural considerations on the system are used ... it to an industrial benchmark. The benchmark tests have shown that the algorithm is capable of detection and isolation of five different faults in the mechanical and hydraulic parts of the pump.

  10. Modelling of the Relaxation Least Squares-Based Neural Networks and Its Application

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

A relaxation least squares-based learning algorithm for neural networks is proposed. Not only does it have a fast convergence rate, but it also requires less computation. Therefore, it is suitable for the case when a network is large in scale but the number of training data is very limited. It has been used in converting furnace process modelling, and an impressive result has been obtained.

  11. Application of Attribute Based Access Control Model for Industrial Control Systems

    Directory of Open Access Journals (Sweden)

    Erkan Yalcinkaya

    2017-02-01

Full Text Available The number of reported security vulnerabilities and incidents related to industrial control systems (ICS) has increased in recent years. As argued by several researchers, authorization issues and poor access control are key incident vectors. The majority of ICS are not designed with security in mind and usually lack strong and granular access control mechanisms. The attribute based access control (ABAC) model offers high authorization granularity, central administration of access policies, and centrally consolidated and monitored logging. This research proposes to harness the ABAC model to address present and future ICS access control challenges. The proposed solution is also implemented and rigorously tested to demonstrate the feasibility and viability of the ABAC model for ICS.
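
At its core, an ABAC decision evaluates attribute predicates over the subject, resource, action and environment, rather than consulting a role-permission table. The following is a hypothetical minimal sketch of that idea, not the paper's implementation; the policy, attribute names and values are invented for illustration, and the engine denies by default.

```python
# A policy is a list of rules; a rule is a list of attribute predicates that
# must ALL hold for the rule to grant access. No matching rule => deny.

def permits(policy, subject, resource, action, env):
    for rule in policy:
        if all(pred(subject, resource, action, env) for pred in rule):
            return True
    return False

policy = [
    [   # engineers may write PLC configurations from the control-room network
        lambda s, r, a, e: s.get("role") == "engineer",
        lambda s, r, a, e: r.get("type") == "plc_config",
        lambda s, r, a, e: a == "write",
        lambda s, r, a, e: e.get("network") == "control_room",
    ],
]

ok = permits(policy, {"role": "engineer"}, {"type": "plc_config"},
             "write", {"network": "control_room"})   # granted
bad = permits(policy, {"role": "engineer"}, {"type": "plc_config"},
              "write", {"network": "corporate"})     # denied: wrong environment
```

The environment attribute in the second call is what a pure role-based model could not express, which is the granularity argument the abstract makes.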

  12. Model-based analysis for qualitative data: an application in Drosophila germline stem cell regulation.

    Directory of Open Access Journals (Sweden)

    Michael Pargett

    2014-03-01

Full Text Available Discovery in developmental biology is often driven by intuition that relies on the integration of multiple types of data such as fluorescent images, phenotypes, and the outcomes of biochemical assays. Mathematical modeling helps elucidate the biological mechanisms at play as the networks become increasingly large and complex. However, the available data is frequently under-utilized due to incompatibility with quantitative model tuning techniques. This is the case for stem cell regulation mechanisms explored in the Drosophila germarium through fluorescent immunohistochemistry. To enable better integration of biological data with modeling in this and similar situations, we have developed a general parameter estimation process to quantitatively optimize models with qualitative data. The process employs a modified version of the Optimal Scaling method from social and behavioral sciences, and multi-objective optimization to evaluate the trade-off between fitting different datasets (e.g. wild type vs. mutant). Using only published imaging data in the germarium, we first evaluated support for a published intracellular regulatory network by considering alternative connections of the same regulatory players. Simply screening networks against wild type data identified hundreds of feasible alternatives. Of these, five parsimonious variants were found and compared by multi-objective analysis including mutant data and dynamic constraints. With these data, the current model is supported over the alternatives, but support for a biochemically observed feedback element is weak (i.e. these data do not measure the feedback effect well). When also comparing new hypothetical models, the available data do not discriminate. To begin addressing the limitations in data, we performed a model-based experiment design and provide recommendations for experiments to refine model parameters and discriminate increasingly complex hypotheses.

  13. Downsizer - A Graphical User Interface-Based Application for Browsing, Acquiring, and Formatting Time-Series Data for Hydrologic Modeling

    Science.gov (United States)

    Ward-Garrison, Christian; Markstrom, Steven L.; Hay, Lauren E.

    2009-01-01

    The U.S. Geological Survey Downsizer is a computer application that selects, downloads, verifies, and formats station-based time-series data for environmental-resource models, particularly the Precipitation-Runoff Modeling System. Downsizer implements the client-server software architecture. The client presents a map-based, graphical user interface that is intuitive to modelers; the server provides streamflow and climate time-series data from over 40,000 measurement stations across the United States. This report is the Downsizer user's manual and provides (1) an overview of the software design, (2) installation instructions, (3) a description of the graphical user interface, (4) a description of selected output files, and (5) troubleshooting information.

  14. Application of model-based spectral analysis to wind-profiler radar observations

    Directory of Open Access Journals (Sweden)

    E. Boyer

Full Text Available A classical way to reduce a radar's data is to compute the spectrum using the FFT and then to identify the different peak contributions. But when overlap exists between the different echoes (atmospheric echo, clutter, hydrometeor echo, etc.), Fourier-like techniques provide poor frequency resolution, and even sophisticated peak-identification may not be able to detect the different echoes. In order to improve the number of reduced data and their quality relative to Fourier spectrum analysis, three different methods are presented in this paper and applied to actual data. Their approach consists of predicting the main frequency components, which avoids the development of very sophisticated peak-identification algorithms. The first method is based on cepstrum properties, generally used to determine the shift between two close identical echoes. We will see in this paper that this method cannot provide a better estimate than Fourier-like techniques in operational use. The second method consists of an autoregressive estimation of the spectrum. Since the tests were promising, this method was applied to reduce the radar data obtained during two thunderstorms. The autoregressive method, which is very simple to implement, improved the Doppler-frequency data reduction relative to the FFT spectrum analysis. The third method exploits the MUSIC algorithm, one of the numerous subspace-based methods, which is well adapted to estimating spectra composed of pure lines. A statistical study of the performance of this method is presented, and points out the very good resolution of this estimator in comparison with Fourier-like techniques. Application to actual data confirms the good qualities of this estimator for reducing radar data.

Key words. Meteorology and atmospheric dynamics (tropical meteorology) - Radio science (signal processing) - General (techniques applicable in three or more fields)
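
The autoregressive estimator favoured in the abstract can be sketched with a Yule-Walker AR fit: solve the normal equations on the sample autocorrelation, then evaluate the all-pole spectrum, whose peak locates the dominant Doppler line. This is a generic illustration under assumed parameters, not the authors' processing chain.

```python
import numpy as np

def ar_spectrum(x, order, nfreq=256):
    """Yule-Walker AR(p) fit and its power spectrum on [0, 0.5) cycles/sample."""
    x = np.asarray(x, float) - np.mean(x)
    r = np.correlate(x, x, "full")[len(x) - 1:] / len(x)      # autocorrelation
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:order + 1])                    # AR coefficients
    sigma2 = r[0] - a @ r[1:order + 1]                        # innovation variance
    f = np.arange(nfreq) / (2 * nfreq)
    z = np.exp(-2j * np.pi * np.outer(f, np.arange(1, order + 1)))
    return f, sigma2 / np.abs(1 - z @ a) ** 2                 # all-pole spectrum

# synthetic "echo": one spectral line at 0.1 cycles/sample plus white noise
rng = np.random.default_rng(0)
t = np.arange(512)
x = np.sin(2 * np.pi * 0.1 * t) + 0.5 * rng.standard_normal(512)
f, p = ar_spectrum(x, order=8)
print(f[np.argmax(p)])   # peak lies near the 0.1 cycles/sample line
```

With a low model order the AR spectrum concentrates the line into a sharp pole, which is the resolution advantage over the raw FFT periodogram that the abstract reports.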

  15. Variable Selection and Updating In Model-Based Discriminant Analysis for High Dimensional Data with Food Authenticity Applications.

    Science.gov (United States)

    Murphy, Thomas Brendan; Dean, Nema; Raftery, Adrian E

    2010-03-01

    Food authenticity studies are concerned with determining if food samples have been correctly labelled or not. Discriminant analysis methods are an integral part of the methodology for food authentication. Motivated by food authenticity applications, a model-based discriminant analysis method that includes variable selection is presented. The discriminant analysis model is fitted in a semi-supervised manner using both labeled and unlabeled data. The method is shown to give excellent classification performance on several high-dimensional multiclass food authenticity datasets with more variables than observations. The variables selected by the proposed method provide information about which variables are meaningful for classification purposes. A headlong search strategy for variable selection is shown to be efficient in terms of computation and achieves excellent classification performance. In applications to several food authenticity datasets, our proposed method outperformed default implementations of Random Forests, AdaBoost, transductive SVMs and Bayesian Multinomial Regression by substantial margins.

  16. Modeling non-saturated ferrite-based devices: Application to twin toroid ferrite phase shifters

    Science.gov (United States)

    Le Gouellec, A.; Vérissimo, G.; Laur, V.; Queffelec, P.; Albert, I.; Girard, T.

    2016-08-01

This article describes a new set of tools developed to improve the design and modeling of non-saturated ferrite-based devices such as twin toroid phase shifters. These new simulation tools benefit from a generalized permeability tensor model able to describe the permeability tensor of a ferrite sample whatever its magnetization state. This model is coupled to a homemade 3D multi-scale magnetostatic analysis program, which describes the evolution of the magnetization through the definition of a hysteresis loop in every mesh cell. The computed spectra are then integrated into 3D electromagnetic simulation software that retains the spatial variations of the ferrite properties by using newly developed macro programming functions. This new approach allows designers to accurately model complex ferrite devices such as twin toroid phase shifters. In particular, we demonstrated good agreement between simulated and measured phase shifts as a function of applied current, with a predicted maximum phase shift of 0.96 times the measured value.

  17. A Common Reasoning Model and Its Application in Knowledge—Based System

    Institute of Scientific and Technical Information of China (English)

    郑方青

    1991-01-01

To use reasoning knowledge accurately and efficiently, many reasoning methods have been proposed. However, the differences in form among the methods may obstruct their systematic analysis and harmonious integration. In this paper, a common reasoning model, JUM (Judgement Model), is introduced. According to JUM, a common knowledge representation form is abstracted from different reasoning methods and its limitations are reduced. We also propose an algorithm for transforming one type of JUM into another. In some cases, the algorithm can be used to resolve the key problem of integrating different types of JUM in one system. It is possible that a new architecture of knowledge-based systems can be realized under JUM.

  18. RECURRENT NEURAL NETWORK MODEL BASED ON PROJECTIVE OPERATOR AND ITS APPLICATION TO OPTIMIZATION PROBLEMS

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

The recurrent neural network (RNN) model based on a projective operator was studied. Different from former studies, the value region of the projective operator in the neural network in this paper is a general closed convex subset of n-dimensional Euclidean space and not, in general, a compact convex set; that is, the value region of the projective operator may be unbounded. It was proved that the network has a global solution and that its solution trajectory converges to some equilibrium set whenever the objective function satisfies certain conditions. After that, the model was applied to continuously differentiable optimization and to nonlinear or implicit complementarity problems. In addition, simulation experiments confirm the efficiency of the RNN.
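
A discretized sketch of a projective-operator network of this general family, applied to a differentiable objective over a closed convex set, can look as follows. The box constraint, step sizes and Euler discretization are illustrative assumptions, not the paper's construction; the point is only that the trajectory settles at the projection-defined equilibrium.

```python
import numpy as np

def project_box(x, lo, hi):
    """Projection operator onto the closed convex set [lo, hi]^n."""
    return np.clip(x, lo, hi)

def rnn_trajectory(grad, x0, lo, hi, alpha=0.5, h=0.2, steps=500):
    """Euler discretization of dx/dt = P(x - alpha*grad(x)) - x.
    An equilibrium x* satisfies x* = P(x* - alpha*grad(x*)), the standard
    optimality condition for minimizing over the convex set."""
    x = np.asarray(x0, float)
    for _ in range(steps):
        x = x + h * (project_box(x - alpha * grad(x), lo, hi) - x)
    return x

c = np.array([2.0, -1.0])
grad = lambda x: 2 * (x - c)            # objective ||x - c||^2, c outside the box
x_star = rnn_trajectory(grad, np.zeros(2), 0.0, 1.0)
print(x_star)   # converges to the projection of c onto the box
```

For this quadratic objective the minimizer over the unit box is the componentwise clip of `c`, i.e. `[1, 0]`, and the trajectory approaches it geometrically.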

  19. On the applicability of STDP-based learning mechanisms to spiking neuron network models

    Science.gov (United States)

    Sboev, A.; Vlasov, D.; Serenko, A.; Rybka, R.; Moloshnikov, I.

    2016-11-01

Ways of creating a practically effective method for spiking neuron network learning, one that would be appropriate for implementation in neuromorphic hardware while being based on biologically plausible plasticity rules, namely on STDP, are discussed. The influence of the amount of correlation between input and output spike trains on learnability under different STDP rules is evaluated. The usability of alternative combined learning schemes involving artificial and spiking neuron models is demonstrated on the iris benchmark task and on the practical task of gender recognition.
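
In its common pair-based form, the STDP rule discussed here is a pair of exponential windows over the pre/post spike-time difference: causal pairings (pre before post) potentiate the synapse, anti-causal pairings depress it. A minimal sketch with illustrative parameter values:

```python
import math

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP weight change for dt = t_post - t_pre (ms).
    dt > 0 (pre fires before post): potentiation, decaying with dt.
    dt < 0 (post fires before pre): depression, decaying with |dt|."""
    if dt > 0:
        return a_plus * math.exp(-dt / tau_plus)
    return -a_minus * math.exp(dt / tau_minus)

print(stdp_dw(5.0))    # positive: causal pairing strengthens the synapse
print(stdp_dw(-5.0))   # negative: anti-causal pairing weakens it
```

The learnability question the abstract raises comes down to how much input-output spike-train correlation survives to drive the causal branch of this window more often than the anti-causal one.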

  20. A Parallel Decision Model Based on Support Vector Machines and Its Application to Fault Diagnosis

    Institute of Scientific and Technical Information of China (English)

    Yan Weiwu(阎威武); Shao Huihe

    2004-01-01

Many industrial process systems are becoming more and more complex and are characterized by distributed features. To keep such a system in working order, distributed parameter values are often inspected from subsystems or at different points in order to judge the working condition of the system and make global decisions. In this paper, a parallel decision model based on Support Vector Machines (PDMSVM) is introduced and applied to distributed fault diagnosis in industrial processes. PDMSVM is convenient for information fusion in a distributed system and performs well in fault diagnosis with distributed features. PDMSVM makes decisions based on the synthesized information of the subsystems and takes advantage of the Support Vector Machine. Therefore, decisions made by PDMSVM are highly reliable and accurate.

  1. Model-Based Reinforcement of Kinect Depth Data for Human Motion Capture Applications

    Directory of Open Access Journals (Sweden)

    Andreas Skiadopoulos

    2013-07-01

Full Text Available Motion capture systems have recently experienced a strong evolution. New cheap depth sensors and open source frameworks, such as OpenNI, allow for perceiving human motion on-line without using invasive systems. However, these proposals do not evaluate the validity of the obtained poses. This paper addresses this issue by using a model-based pose generator to complement the OpenNI human tracker. The proposed system enforces kinematics constraints, eliminates odd poses and filters sensor noise, while learning the real dimensions of the performer's body. The system is composed of a PrimeSense sensor, an OpenNI tracker and a kinematics-based filter, and has been extensively tested. Experiments show that the proposed system improves pure OpenNI results at a very low computational cost.

  2. MAIN REGULARITIES OF FAULTING IN LITHOSPHERE AND THEIR APPLICATION (BASED ON PHYSICAL MODELLING RESULTS

    Directory of Open Access Journals (Sweden)

    S. A. Bornyakov

    2015-09-01

Full Text Available Results of long-term experimental studies and modelling of faulting are briefly reviewed, and research methods and state-of-the-art issues are described. The article presents the main results of faulting modelling with the use of non-transparent elasto-viscous plastic and optically active models. An area of active dynamic influence of fault (AADIF) is the term introduced to characterise a fault as a 3D geological body. It is shown that AADIF's width (М) is determined by the thickness of the layer wherein a fault occurs (Н), its viscosity (η) and strain rate (V). Multiple correlation equations are proposed to show relationships between AADIF's width (М), H, η and V for faults of various morphological and genetic types. The irregularity of AADIF in time and space is characterised in view of the staged formation of the internal fault structure of such areas and the geometric and dynamic parameters of AADIF, which are changeable along the fault strike. The authors pioneered the application of the open system conception to explain regularities of structure formation in AADIFs. It is shown that faulting is a synergistic process of continuous changes of structural levels of strain, which differ in the manifestation of specific self-similar fractures of various scales. Such levels are changeable due to self-organization processes of fracture systems. Fracture dissipative structures (FDS) is the term introduced to describe systems of fractures that are subject to self-organization. It is proposed to consider informational entropy and fractal dimensions in order to reveal FDS in AADIF. Studied are relationships between structure formation in AADIF and accompanying processes, such as acoustic emission and terrain development above zones wherein faulting takes place. Optically active elastic models were designed to simulate the stress-and-strain state of AADIF of main standard types of fault jointing zones and their analogues in nature, and modelling results are

  3. Modelling Foundations and Applications

    DEFF Research Database (Denmark)

This book constitutes the refereed proceedings of the 8th European Conference on Modelling Foundations and Applications, held in Kgs. Lyngby, Denmark, in July 2012. The 20 revised full foundations track papers and 10 revised full applications track papers presented were carefully reviewed and selected.

  4. APPLICATION OF A MODIFIED QUICK SCHEME TO A DEPTH-AVERAGED k-ε TURBULENCE MODEL BASED ON UNSTRUCTURED GRIDS

    Institute of Scientific and Technical Information of China (English)

    HUA Zu-lin; XING Ling-hang; GU Li

    2008-01-01

The modified QUICK scheme on an unstructured grid was used to improve the advection flux approximation, and a depth-averaged k-ε turbulence model with this scheme, based on the FVM with a SIMPLE-series algorithm, was established and applied to spur-dike flow computation. In this model, the over-relaxed approach was adopted to estimate the diffusion flux, in view of its advantages in reducing the errors and numerical instability usually encountered in non-orthogonal meshes. Two spur-dike cases with different deflection angles (90° and 135°) were analyzed to validate the model. Computed results show that the predicted velocities and recirculation lengths are in good agreement with the observed data. Moreover, the computations on structured and unstructured grids were compared in terms of approximately equivalent grid numbers. It can be concluded that the precision with unstructured grids is higher than that with structured grids, although the CPU time required is slightly greater with unstructured grids. Thus, it is significant to apply the method to the numerical simulation of practical hydraulic engineering.
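
For reference, the classical QUICK face interpolation on a uniform grid (the paper's modified unstructured-grid variant differs) fits a quadratic through two upstream nodes and one downstream node of the advected scalar:

```python
def quick_face_value(phi_uu, phi_u, phi_d):
    """Classical QUICK interpolation at a cell face on a uniform grid:
    phi_uu = far-upstream node, phi_u = upstream node, phi_d = downstream node.
    Weights 6/8, 3/8, -1/8 come from the upstream-biased quadratic fit."""
    return 0.75 * phi_u + 0.375 * phi_d - 0.125 * phi_uu

# Consistency check: a linear field phi = x at nodes x = 1, 2, 3 must give the
# exact midpoint value 2.5 at the face between the last two nodes.
print(quick_face_value(1.0, 2.0, 3.0))  # 2.5
```

The upstream bias is what gives QUICK better stability than central differencing while retaining higher-order accuracy, which is why the abstract pairs it with the over-relaxed diffusion treatment on non-orthogonal meshes.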

  5. An overview of current applications, challenges, and future trends in distributed process-based models in hydrology

    Science.gov (United States)

Fatichi, Simone; Vivoni, Enrique R.; Ogden, Fred L.; Ivanov, Valeriy Y.; Mirus, Benjamin B.; Gochis, David; Downer, Charles W.; Camporese, Matteo; Davison, Jason H.; Ebel, Brian A.; Jones, Norm; Kim, Jongho; Mascaro, Giuseppe; Niswonger, Richard; Restrepo, Pedro; Rigon, Riccardo; Shen, Chaopeng; Sulis, Mauro; Tarboton, David

    2016-01-01

    Process-based hydrological models have a long history dating back to the 1960s. Criticized by some as over-parameterized, overly complex, and difficult to use, a more nuanced view is that these tools are necessary in many situations and, in a certain class of problems, they are the most appropriate type of hydrological model. This is especially the case in situations where knowledge of flow paths or distributed state variables and/or preservation of physical constraints is important. Examples of this include: spatiotemporal variability of soil moisture, groundwater flow and runoff generation, sediment and contaminant transport, or when feedbacks among various Earth’s system processes or understanding the impacts of climate non-stationarity are of primary concern. These are situations where process-based models excel and other models are unverifiable. This article presents this pragmatic view in the context of existing literature to justify the approach where applicable and necessary. We review how improvements in data availability, computational resources and algorithms have made detailed hydrological simulations a reality. Avenues for the future of process-based hydrological models are presented suggesting their use as virtual laboratories, for design purposes, and with a powerful treatment of uncertainty.

  6. An overview of current applications, challenges, and future trends in distributed process-based models in hydrology

    Science.gov (United States)

    Fatichi, Simone; Vivoni, Enrique R.; Ogden, Fred L.; Ivanov, Valeriy Y.; Mirus, Benjamin; Gochis, David; Downer, Charles W.; Camporese, Matteo; Davison, Jason H.; Ebel, Brian; Jones, Norm; Kim, Jongho; Mascaro, Giuseppe; Niswonger, Richard; Restrepo, Pedro; Rigon, Riccardo; Shen, Chaopeng; Sulis, Mauro; Tarboton, David

    2016-06-01

    Process-based hydrological models have a long history dating back to the 1960s. Criticized by some as over-parameterized, overly complex, and difficult to use, a more nuanced view is that these tools are necessary in many situations and, in a certain class of problems, they are the most appropriate type of hydrological model. This is especially the case in situations where knowledge of flow paths or distributed state variables and/or preservation of physical constraints is important. Examples of this include: spatiotemporal variability of soil moisture, groundwater flow and runoff generation, sediment and contaminant transport, or when feedbacks among various Earth's system processes or understanding the impacts of climate non-stationarity are of primary concern. These are situations where process-based models excel and other models are unverifiable. This article presents this pragmatic view in the context of existing literature to justify the approach where applicable and necessary. We review how improvements in data availability, computational resources and algorithms have made detailed hydrological simulations a reality. Avenues for the future of process-based hydrological models are presented suggesting their use as virtual laboratories, for design purposes, and with a powerful treatment of uncertainty.

  7. An Agent-Based Modeling Framework and Application for the Generic Nuclear Fuel Cycle

    Science.gov (United States)

    Gidden, Matthew J.

    Key components of a novel methodology and implementation of an agent-based, dynamic nuclear fuel cycle simulator, Cyclus , are presented. The nuclear fuel cycle is a complex, physics-dependent supply chain. To date, existing dynamic simulators have not treated constrained fuel supply, time-dependent, isotopic-quality based demand, or fuel fungibility particularly well. Utilizing an agent-based methodology that incorporates sophisticated graph theory and operations research techniques can overcome these deficiencies. This work describes a simulation kernel and agents that interact with it, highlighting the Dynamic Resource Exchange (DRE), the supply-demand framework at the heart of the kernel. The key agent-DRE interaction mechanisms are described, which enable complex entity interaction through the use of physics and socio-economic models. The translation of an exchange instance to a variant of the Multicommodity Transportation Problem, which can be solved feasibly or optimally, follows. An extensive investigation of solution performance and fidelity is then presented. Finally, recommendations for future users of Cyclus and the DRE are provided.

  8. Multimodal, high-dimensional, model-based, Bayesian inverse problems with applications in biomechanics

    Science.gov (United States)

    Franck, I. M.; Koutsourelakis, P. S.

    2017-01-01

    This paper is concerned with the numerical solution of model-based, Bayesian inverse problems. We are particularly interested in cases where the cost of each likelihood evaluation (forward-model call) is expensive and the number of unknown (latent) variables is high. This is the setting in many problems in computational physics where forward models with nonlinear PDEs are used and the parameters to be calibrated involve spatio-temporally varying coefficients, which upon discretization give rise to a high-dimensional vector of unknowns. One of the consequences of the well-documented ill-posedness of inverse problems is the possibility of multiple solutions. While such information is contained in the posterior density in Bayesian formulations, the discovery of a single mode, let alone multiple ones, poses a formidable computational task. The goal of the present paper is two-fold. On the one hand, we propose approximate, adaptive inference strategies using mixture densities to capture multi-modal posteriors. On the other, we extend our work in [1] with regard to effective dimensionality reduction techniques that reveal low-dimensional subspaces where the posterior variance is mostly concentrated. We validate the proposed model by employing importance sampling, which confirms that the bias introduced is small and can be efficiently corrected if the analyst wishes to do so. We demonstrate the performance of the proposed strategy in nonlinear elastography, where the identification of the mechanical properties of biological materials can inform non-invasive medical diagnosis. The discovery of multiple modes (solutions) in such problems is critical to achieving the diagnostic objectives.

  9. A novel Q-based online model updating strategy and its application in statistical process control for rubber mixing

    Institute of Scientific and Technical Information of China (English)

    Chunying Zhang; Sun Chen; Fang Wu; Kai Song

    2015-01-01

    To overcome the large time delay in measuring the hardness of mixed rubber, rheological parameters were used to predict the hardness. A novel Q-based model updating strategy is proposed as a universal platform for tracking time-varying properties. By using a few selected support samples to update the model, the strategy can dramatically reduce storage costs and overcome the adverse influence of low signal-to-noise-ratio samples. Moreover, it can be applied to any statistical process monitoring system without drastic changes, which makes it practical for industrial use. As examples, the Q-based strategy was integrated with three popular algorithms (partial least squares (PLS), recursive PLS (RPLS), and kernel PLS (KPLS)) to form the novel regression algorithms QPLS, QRPLS, and QKPLS, respectively. Applications to predicting mixed-rubber hardness at a large-scale tire plant in east China confirm the theoretical analysis.
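The Q-based selection idea can be sketched as follows. This is a hypothetical illustration, not the paper's exact algorithm: the function names, the thresholds, and the zero-model reconstruction in the usage example are all invented. A sample's Q statistic is its squared prediction error under the current model; only samples whose Q falls between an update threshold and a noise-rejection cutoff are kept as support samples.

```python
# Hypothetical sketch of Q-statistic-based support-sample selection.
# A sample is worth keeping when it is informative enough to warrant an
# update (Q > q_update) but not so poorly predicted that it looks like a
# low signal-to-noise outlier (Q > q_reject).

def q_statistic(x, x_reconstructed):
    """Squared prediction error (Q/SPE) of one sample."""
    return sum((a - b) ** 2 for a, b in zip(x, x_reconstructed))

def select_support_samples(samples, reconstruct, q_update=0.5, q_reject=5.0):
    """Keep only samples whose Q lies in (q_update, q_reject]."""
    return [x for x in samples
            if q_update < q_statistic(x, reconstruct(x)) <= q_reject]

# Toy usage: a "model" that reconstructs every sample as zero.
kept = select_support_samples([[0.1, 0.0], [1.0, 1.0], [3.0, 3.0]],
                              lambda x: [0.0] * len(x))
# kept == [[1.0, 1.0]] -- the first sample carries no new information,
# the last is rejected as a low signal-to-noise outlier.
```

In a real deployment, `reconstruct` would be the projection of a sample onto the current PLS latent space, so the selection can wrap any of the PLS variants mentioned above without modifying them.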

  10. Supply Chain Vulnerability Analysis Using Scenario-Based Input-Output Modeling: Application to Port Operations.

    Science.gov (United States)

    Thekdi, Shital A; Santos, Joost R

    2016-05-01

    Disruptive events such as natural disasters, loss or reduction of resources, work stoppages, and emergent conditions have potential to propagate economic losses across trade networks. In particular, disruptions to the operation of container port activity can be detrimental for international trade and commerce. Risk assessment should anticipate the impact of port operation disruptions with consideration of how priorities change due to uncertain scenarios and guide investments that are effective and feasible for implementation. Priorities for protective measures and continuity of operations planning must consider the economic impact of such disruptions across a variety of scenarios. This article introduces new performance metrics to characterize resiliency in interdependency modeling and also integrates scenario-based methods to measure economic sensitivity to sudden-onset disruptions. The methods will be demonstrated on a U.S. port responsible for handling $36.1 billion of cargo annually. The methods will be useful to port management, private industry supply chain planning, and transportation infrastructure management.
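The interdependency machinery behind this kind of port-disruption analysis is commonly the Leontief-based inoperability input-output model, which solves q = (I - A*)^(-1) c* for the equilibrium inoperability q given an interdependency matrix A* and a direct-disruption vector c*. A minimal two-sector sketch, with sector roles and all numbers invented for illustration:

```python
# Hedged sketch of the inoperability input-output model (IIM).  The
# interdependency matrix A* and direct-disruption vector c* below are
# illustrative, not taken from the study.

def solve_2x2(a, b):
    """Solve a 2x2 linear system a q = b by Cramer's rule."""
    det = a[0][0] * a[1][1] - a[0][1] * a[1][0]
    q0 = (b[0] * a[1][1] - a[0][1] * b[1]) / det
    q1 = (a[0][0] * b[1] - b[0] * a[1][0]) / det
    return [q0, q1]

def inoperability(a_star, c_star):
    """Equilibrium inoperability q = (I - A*)^(-1) c* for two sectors."""
    i_minus_a = [[1 - a_star[0][0], -a_star[0][1]],
                 [-a_star[1][0], 1 - a_star[1][1]]]
    return solve_2x2(i_minus_a, c_star)

# A 10% direct disruption to port operations (sector 0) propagates to a
# dependent transport sector (sector 1) through the interdependencies:
q = inoperability([[0.0, 0.2], [0.5, 0.1]], [0.10, 0.0])
# q ~= [0.1125, 0.0625]: both sectors end up worse off than the direct hit.
```

The scenario-based extension in the article varies c* (and the economic priorities it encodes) across disruption scenarios and reads off resilience metrics from the resulting q.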

  11. The k-nearest neighbour-based GMDH prediction model and its applications

    Science.gov (United States)

    Li, Qiumin; Tian, Yixiang; Zhang, Gaoxun

    2014-11-01

    This paper centres on a new GMDH (group method of data handling) algorithm based on the k-nearest neighbour (k-NN) method. Instead of the transfer function that has been used in traditional GMDH, the k-NN kernel function is adopted in the proposed GMDH to characterise relationships between the input and output variables. The proposed method combines the advantages of the k-nearest neighbour (k-NN) algorithm and GMDH algorithm, and thus improves the predictive capability of the GMDH algorithm. It has been proved that when the bandwidth of the kernel is less than a certain constant C, the predictive capability of the new model is superior to that of the traditional one. As an illustration, it is shown that the new method can accurately forecast consumer price index (CPI).
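The core building block can be sketched as k-NN kernel (Nadaraya-Watson style) regression, where a prediction is the kernel-weighted average of the k nearest training responses; in the full method this replaces the polynomial transfer functions of GMDH's partial descriptions. A minimal one-dimensional sketch, in which the bandwidth h stands in for the constant C mentioned above and everything else is illustrative:

```python
# Sketch of k-NN kernel regression: predict y at a query point x as a
# Gaussian-kernel weighted average over the k nearest neighbours.
import math

def knn_kernel_predict(xs, ys, x, k=3, h=1.0):
    """Kernel-weighted average of the k nearest neighbours' responses."""
    neighbours = sorted(zip(xs, ys), key=lambda p: abs(p[0] - x))[:k]
    weights = [math.exp(-((xi - x) / h) ** 2) for xi, _ in neighbours]
    return sum(w * yi for w, (_, yi) in zip(weights, neighbours)) / sum(weights)
```

In a GMDH layer, each candidate pairing of inputs would get such a predictor, and the best-performing candidates on validation data survive to the next layer.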

  12. Application of Finite Element Modeling Methods in Magnetic Resonance Imaging-Based Research and Clinical Management

    Science.gov (United States)

    Fwu, Peter Tramyeon

    Medical images are by nature very complex. Modeling built upon medical images is challenging due to the lack of analytical solutions. The finite element method (FEM) is a numerical technique for solving partial differential equations; it transforms a continuous domain into solvable discrete sub-domains. In three-dimensional space, FEM can handle complicated structures and heterogeneous interiors, which makes it an ideal tool for medical-image-based modeling problems. In this study, I address three modeling problems: (1) photon transport inside the human breast, by implementing the radiative transfer equation to simulate diffuse optical spectroscopy imaging (DOSI) in order to measure the percent density (PD), which has been proven a cancer risk factor in mammography. Our goal is to use MRI as the ground truth to optimize the DOSI scanning protocol to obtain a consistent measurement of PD. Our results show that the DOSI measurement is position- and depth-dependent and that a proper scanning scheme and body configuration are needed; (2) heat flow in the prostate, by implementing the Pennes bioheat equation to evaluate the cooling performance of regional hypothermia during robot-assisted radical prostatectomy for the individual patient in order to achieve the optimal cooling setting. Four factors are taken into account in the simulation: blood abundance, artery perfusion, cooling-balloon temperature, and anatomical distance. The results show that blood abundance, prostate size, and anatomical distance are significant factors for the equilibrium temperature of the neurovascular bundle; (3) shape analysis of the hippocampus, using radial distance mapping and two registration methods to find correlations between sub-regional change and age and cognitive performance, which might not be revealed by volumetric analysis. The results give fundamental knowledge of the normal distribution in young

  13. Application of Model Based Prognostics to Pneumatic Valves in a Cryogenic Propellant Loading Testbed

    Data.gov (United States)

    National Aeronautics and Space Administration — Pneumatic-actuated valves are critical components in many applications, including cryogenic propellant loading for space operations. For these components, failures...

  14. A novel physical eco-hydrological model concept for preferential flow based on experimental applications.

    Science.gov (United States)

    Jackisch, Conrad; van Schaik, Loes; Graeff, Thomas; Zehe, Erwin

    2014-05-01

    Preferential flow through macropores often determines hydrological characteristics, especially regarding runoff generation and the fast transport of solutes. Macropore settings may nevertheless be very different in nature and dynamics, depending on their origin. While biogenic structures follow activity cycles (e.g. earthworms) and population conditions (e.g. roots), pedogenic and geogenic structures may depend on water stress (e.g. cracks) or large events (e.g. flushed voids between skeleton and soil pipes), or may simply persist (e.g. the bedrock interface). On the one hand, such dynamic site characteristics can be observed as seasonal changes in the reaction to precipitation. On the other hand, sprinkling experiments accompanied by tracers or time-lapse 3D ground-penetrating radar are suitable tools to determine infiltration patterns and macropore configuration. However, model representation of the macropore-matrix system is still problematic, because models either rely on effective parameters (assuming a well-mixed state) or on explicit advection, strongly simplifying or neglecting interaction with the diffusive flow domain. Motivated by the dynamic nature of macropores, we present a novel model approach for interacting diffusive and advective transport of water, solutes, and energy in structured soils. It relies solely on scale- and process-aware observables. A representative set of macropores (data from sprinkling experiments) determines the process-model scale through 1D advective domains. These are connected to a 2D matrix domain defined by pedo-physical retention properties. Water is represented as particles. Diffusive flow is governed by a 2D random walk of these particles, while advection may take place in the macropore domain. Macropore-matrix interaction is computed as dissipation of the advective momentum of a particle by the drag it experiences from the matrix domain. 
Through a representation of matrix and macropores as connected diffusive and advective domains for water

  15. An Effective Security Mechanism for M-Commerce Applications Exploiting Ontology Based Access Control Model for Healthcare System

    Directory of Open Access Journals (Sweden)

    S.M. Roychoudri

    2016-09-01

    Health organizations have begun moving to mobile commerce services in recent years to enhance services and quality without large investments in IT infrastructure. Medical records are very sensitive and private to any individual, so an effective security mechanism is required. The challenges of our research work are to maintain privacy for the users and to provide a smart and secure environment for accessing the application. This is achieved with the help of personalization. The Internet has provided the way for personalization: a term that refers to the delivery of information relevant to an individual or group of individuals in the specified format and layout and within the specified time interval. In this paper we propose an Ontology Based Access Control (OBAC) model that can address permitted access control between service providers and users. Sharing of Personal Health Records is highly expected by users for the acceptance of mobile commerce applications in healthcare systems.

  16. Modeling Evidence-Based Application: Using Team-Based Learning to Increase Higher Order Thinking in Nursing Research

    Directory of Open Access Journals (Sweden)

    Bridget Moore

    2015-06-01

    Nursing practice comprises knowledge, theory, and research [1]. Because of its impact on the profession, the appraisal of research evidence is critically important. Future nursing professionals must be introduced to the purpose and utility of nursing research, as early exposure provides an opportunity to embed evidence-based practice (EBP) into clinical experiences. The AACN requires baccalaureate education to include an understanding of the research process to integrate reliable evidence to inform practice and enhance clinical judgments [1]. Although the importance of these knowledge competencies is evident to healthcare administrators and nursing leaders within the field, undergraduate students at the institution under study sometimes have difficulty understanding the relevance of nursing research to the baccalaureate-prepared nurse, and struggle to grasp advanced concepts of qualitative and quantitative research design and methodologies. As undergraduate nursing students generally have not demonstrated an understanding of the relationship between theoretical concepts found within the undergraduate nursing curriculum and the practical application of these concepts in the clinical setting, the research team decided to adopt an effective pedagogical active learning strategy, team-based learning (TBL). Team-based learning shifts the traditional course design to focus on higher thinking skills to integrate desired knowledge [2]. The purpose of this paper is to discuss the impact of course design with the integration of TBL in an undergraduate nursing research course on increasing higher order thinking. [1] American Association of Colleges of Nursing, The Essentials of Baccalaureate Education for Professional Nursing Practice, Washington, DC: American Association of Colleges of Nursing, 2008. [2] B. Bloom, Taxonomy of Educational Objectives, Handbook I: Cognitive Domain, New York: McKay, 1956.

  17. Behaviour model identification based on inverse modeling and using Optical Full Field Measurements (OFFM): application on rubber and steel

    Science.gov (United States)

    Velay, V.; Robert, L.; Schmidt, F.; Hmida, S.; Vallet, T.

    2007-04-01

    Biaxial properties of materials (polymers or steels) used in many industrial processes are often difficult to measure. However, these properties are useful for the numerical simulation of plastic-processing operations such as blow moulding or thermoforming for polymers and superplastic forming or single point incremental forming for steels. Today, Optical Full Field Measurements (OFFM) are promising tools for the experimental analysis of materials: they provide a very large amount of spatially distributed data (displacement or strain). In this paper, a mixed numerical and experimental investigation is proposed in order to identify multi-axial constitutive behaviour models. The procedure is applied to two different materials commonly used in forming processes: a polymer (rubber, in this first approach) and a steel. Experimental tests are performed on various rubber and steel structural specimens (notched and open-hole plate samples) in order to generate heterogeneous displacement fields. Two different behaviour models are considered. On the one hand, a Mooney-Rivlin hyperelastic law is investigated to describe the high strain levels induced in a tensile test performed on a rubber open-hole specimen. On the other hand, a Ramberg-Osgood law reproduces the elasto-plastic behaviour of steel on a specimen that induces heterogeneous strain fields. Each parameter identification is based on the same Finite Element Model Updating (FEMU) procedure, which consists in comparing results provided by the numerical simulation (ABAQUS™) with full-field measurements obtained by the DISC (Digital Image Stereo-Correlation) technique (Vic-3D®).

  18. Modelling Foundations and Applications

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 8th European Conference on Modelling Foundations and Applications, held in Kgs. Lyngby, Denmark, in July 2012. The 20 revised full foundations track papers and 10 revised full applications track papers presented were carefully reviewed and selected. This, as well as the high quality of the results presented in these accepted papers, demonstrates the maturity and vibrancy of the field.

  19. Application of a process-based shallow landslide hazard model over a broad area in Central Italy

    Science.gov (United States)

    Gioia, Eleonora; Speranza, Gabriella; Ferretti, Maurizio; Godt, Jonathan W.; Baum, Rex L.; Marincioni, Fausto

    2015-01-01

    Process-based models are widely used for rainfall-induced shallow landslide forecasting. Previous studies have successfully applied the U.S. Geological Survey’s Transient Rainfall Infiltration and Grid-Based Regional Slope-Stability (TRIGRS) model (Baum et al. 2002) to compute infiltration-driven changes in the hillslopes’ factor of safety on small scales (i.e., tens of square kilometers). Soil data input for such models are difficult to obtain across larger regions. This work describes a novel methodology for the application of TRIGRS over broad areas with relatively uniform hydrogeological properties. The study area is a 550-km2 region in Central Italy covered by post-orogenic Quaternary sediments. Due to the lack of field data, we assigned mechanical and hydrological property values through a statistical analysis based on literature review of soils matching the local lithologies. We calibrated the model using rainfall data from 25 historical rainfall events that triggered landslides. We compared the variation of pressure head and factor of safety with the landslide occurrence to identify the best fitting input conditions. Using calibrated inputs and a soil depth model, we ran TRIGRS for the study area. Receiver operating characteristic (ROC) analysis, comparing the model’s output with a shallow landslide inventory, shows that TRIGRS effectively simulated the instability conditions in the post-orogenic complex during historical rainfall scenarios. The implication of this work is that rainfall-induced landslides over large regions may be predicted by a deterministic model, even where data on geotechnical and hydraulic properties as well as temporal changes in topography or subsurface conditions are not available.
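At its core, TRIGRS combines a transient infiltration solution with an infinite-slope stability computation: the factor of safety at depth Z takes the form FS = tan(phi')/tan(delta) + [c' - psi(Z,t)*gamma_w*tan(phi')] / (gamma_s*Z*sin(delta)*cos(delta)), where psi is the transient pressure head. A sketch of that stability step; the parameter values below are illustrative defaults, not those calibrated in the study:

```python
# Infinite-slope factor of safety as evaluated per grid cell and depth in
# TRIGRS-style models.  Units: cohesion and unit weights in Pa and N/m^3,
# depth in m, pressure head psi in m of water; all defaults are invented.
import math

def factor_of_safety(slope_deg, depth, psi, cohesion=4000.0,
                     phi_deg=32.0, gamma_s=20000.0, gamma_w=9810.0):
    """FS = tan(phi)/tan(slope)
            + [c - psi*gamma_w*tan(phi)] / (gamma_s*Z*sin(slope)*cos(slope))"""
    slope = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    return (math.tan(phi) / math.tan(slope)
            + (cohesion - psi * gamma_w * math.tan(phi))
            / (gamma_s * depth * math.sin(slope) * math.cos(slope)))

# As the infiltration solution raises the pressure head, FS drops; the
# cell is flagged unstable once FS falls below 1.
fs_dry = factor_of_safety(35.0, 2.0, 0.0)   # dry antecedent conditions
fs_wet = factor_of_safety(35.0, 2.0, 1.0)   # positive pressure head
```

The calibration described above amounts to finding soil-property inputs for which FS crosses 1 at the times and places of the inventoried landslides.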

  20. Application of a mixing-ratios based formulation to model mixing-driven dissolution experiments

    Science.gov (United States)

    Guadagnini, Alberto; Sanchez-Vila, Xavier; Saaltink, Maarten W.; Bussini, Michele; Berkowitz, Brian

    2009-05-01

    We address the question of how one can combine theoretical and numerical modeling approaches with limited measurements from laboratory flow cell experiments to realistically quantify salient features of complex mixing-driven multicomponent reactive transport problems in porous media. Flow cells are commonly used to examine processes affecting reactive transport through porous media, under controlled conditions. An advantage of flow cells is their suitability for relatively fast and reliable experiments, although measuring spatial distributions of a state variable within the cell is often difficult. In general, fluid is sampled only at the flow cell outlet, and concentration measurements are usually interpreted in terms of integrated reaction rates. In reactive transport problems, however, the spatial distribution of the reaction rates within the cell might be more important than the bulk integrated value. Recent advances in theoretical and numerical modeling of complex reactive transport problems [De Simoni M, Carrera J, Sanchez-Vila X, Guadagnini A. A procedure for the solution of multicomponent reactive transport problems. Water Resour Res 2005;41:W11410. doi: 10.1029/2005WR004056, De Simoni M, Sanchez-Vila X, Carrera J, Saaltink MW. A mixing ratios-based formulation for multicomponent reactive transport. Water Resour Res 2007;43:W07419. doi: 10.1029/2006WR005256] result in a methodology conducive to a simple exact expression for the space-time distribution of reaction rates in the presence of homogeneous or heterogeneous reactions in chemical equilibrium. The key points of the methodology are that a general reactive transport problem, involving a relatively high number of chemical species, can be formulated in terms of a set of decoupled partial differential equations, and the amount of reactants evolving into products depends on the rate at which solutions mix. The main objective of the current study is to show how this methodology can be used in conjunction

  1. Development and Application of a Life Cycle-Based Model to Evaluate Greenhouse Gas Emissions of Oil Sands Upgrading Technologies.

    Science.gov (United States)

    Pacheco, Diana M; Bergerson, Joule A; Alvarez-Majmutov, Anton; Chen, Jinwen; MacLean, Heather L

    2016-12-20

    A life cycle-based model, OSTUM (Oil Sands Technologies for Upgrading Model), which evaluates the energy intensity and greenhouse gas (GHG) emissions of current oil sands upgrading technologies, is developed. Upgrading converts oil sands bitumen into high quality synthetic crude oil (SCO), a refinery feedstock. OSTUM's novel attributes include the following: the breadth of technologies and upgrading operations options that can be analyzed, energy intensity and GHG emissions being estimated at the process unit level, it not being dependent on a proprietary process simulator, and use of publicly available data. OSTUM is applied to a hypothetical, but realistic, upgrading operation based on delayed coking, the most common upgrading technology, resulting in emissions of 328 kg CO2e/m³ SCO. The primary contributor to upgrading emissions (45%) is the use of natural gas for hydrogen production through steam methane reforming, followed by the use of natural gas as fuel in the rest of the process units' heaters (39%). OSTUM's results are in agreement with those of a process simulation model developed by CanmetENERGY, other literature, and confidential data of a commercial upgrading operation. For the application of the model, emissions are found to be most sensitive to the amount of natural gas utilized as feedstock by the steam methane reformer. OSTUM is capable of evaluating the impact of different technologies, feedstock qualities, operating conditions, and fuel mixes on upgrading emissions, and its life cycle perspective allows easy incorporation of results into well-to-wheel analyses.

  2. An Optimal Decision Assessment Model Based on the Acceptable Maximum LGD of Commercial Banks and Its Application

    Directory of Open Access Journals (Sweden)

    Baofeng Shi

    2016-01-01

    This paper introduces a novel decision assessment method suitable for customers' credit risk evaluation and credit decisions. First, the paper creates an optimal credit rating model consisting of an objective function and two constraint conditions. The first constraint condition, of strictly increasing LGDs (loss given default), eliminates the unreasonable phenomenon in which a higher credit rating carries a higher LGD. Secondly, on the basis of the credit rating results, a credit decision-making assessment model based on measuring the acceptable maximum LGD of commercial banks is established. Thirdly, empirical results using data on 2817 farmers' microloans at a Chinese commercial bank suggest that the proposed approach can accurately identify the good customers among all the loan applications. Moreover, our approach provides a reference for the decision assessment of customers at other commercial banks in the world.

  3. Application of Nonlinear Predictive Control Based on RBF Network Predictive Model in MCFC Plant

    Institute of Scientific and Technical Information of China (English)

    CHEN Yue-hua; CAO Guang-yi; ZHU Xin-jian

    2007-01-01

    This paper describes a nonlinear model predictive controller for regulating a molten carbonate fuel cell (MCFC). A detailed mechanistic model of the output voltage of an MCFC is presented first. However, this model is too complicated to be used in a control system. Consequently, an offline radial basis function (RBF) network is introduced to build a nonlinear predictive model. The optimal control sequences are then obtained by applying the golden mean method. The models and controller have been realized in the MATLAB environment. Simulation results indicate that the proposed algorithm exhibits a satisfying control effect even when the current densities vary widely.
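The predictive-model part can be sketched as a Gaussian RBF network trained by exact interpolation: centres are placed at the training inputs and the output weights solve the linear system Phi w = y. A hedged, self-contained toy with invented data; the paper's actual network, training procedure, and MCFC data differ:

```python
# Sketch of a Gaussian RBF network fitted by exact interpolation.
import math

def gauss_solve(a, b):
    """Solve a dense linear system by Gaussian elimination with pivoting."""
    n = len(a)
    m = [row[:] + [bi] for row, bi in zip(a, b)]   # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def rbf(r, width=1.0):
    return math.exp(-(r / width) ** 2)

def train_rbf(xs, ys, width=1.0):
    """Weights solving the interpolation system Phi w = y."""
    phi = [[rbf(abs(xi - xj), width) for xj in xs] for xi in xs]
    return gauss_solve(phi, ys)

def predict_rbf(xs, weights, x, width=1.0):
    return sum(w * rbf(abs(x - xi), width) for w, xi in zip(weights, xs))
```

In a predictive-control loop, such a network would supply the one-step-ahead voltage prediction that the optimizer (the golden mean search, in the paper) uses to pick the next control input.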

  4. Fuzzy Identification Based on T-S Fuzzy Model and Its Application for SCR System

    Science.gov (United States)

    Zeng, Fanchun; Zhang, Bin; Zhang, Lu; Ji, Jinfu; Jin, Wenjing

    An improved T-S model was introduced to identify the model of SCR system. Model structure was selected by physical analyzes and mathematics tests. Three different clustering algorithms were introduced to obtain space partitions. Then, space partitions were amended by mathematics methods. At last, model parameters were identified by least square method. Train data was sampled in 1000MW coal-fired unit SCR system. T-S model of it is identified by three cluster methods. Identify results are proved effective. The merit and demerit among them are analyzed in the end.

  5. Dynamic Phasors-Based Modeling and Stability Analysis of Droop-Controlled Inverters for Microgrid Applications

    DEFF Research Database (Denmark)

    Guo, Xiaoqiang; Lu, Zhigang; Wang, Baocheng

    2014-01-01

    System modeling and stability analysis is one of the most important issues for inverter-dominated microgrids; it is used to determine system stability and to optimize the control parameters. Complete small-signal models for inverter-dominated microgrids, which are very accurate, have been developed and can be found in the literature. However, the modeling procedure becomes very complex when the number of inverters in the microgrid is large. One possible solution is to use reduced-order small-signal models for the inverter-dominated microgrids. Unfortunately, the reduced-order small-signal models fail to predict the system instabilities. In order to solve this problem, a new modeling approach for inverter-dominated microgrids using dynamic phasors is presented in this paper. Our findings indicate that the proposed dynamic phasor model is able to accurately predict the stability margins.
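The dynamic phasor approach represents a waveform x(t) by its sliding-window Fourier coefficients X_k(t) = (1/T) * integral over [t-T, t] of x(tau)*exp(-j*k*omega*tau) d(tau), whose slow dynamics replace the fast switching waveforms in the model. A minimal numerical sketch of extracting the fundamental (k = 1) phasor from samples; the sampling rate and 50 Hz fundamental below are illustrative:

```python
# Sketch of dynamic phasor extraction: the k-th phasor over the window
# ending at sample t_index, approximated by a left Riemann sum.
import cmath
import math

def dynamic_phasor(samples, dt, period, t_index, k=1):
    """X_k(t) = (1/T) * sum over the last period of x(tau) e^{-j k w tau} dt."""
    n_window = round(period / dt)
    w = 2 * math.pi / period
    acc = 0 + 0j
    for i in range(t_index - n_window, t_index):
        tau = i * dt
        acc += samples[i] * cmath.exp(-1j * k * w * tau) * dt
    return acc / period

# For a constant-amplitude cosine, the fundamental phasor magnitude is
# half the peak amplitude and stays flat in time -- the slow variable the
# stability analysis works with.
dt, period = 1e-4, 0.02                       # 50 Hz fundamental
wave = [math.cos(2 * math.pi * 50 * i * dt) for i in range(400)]
ph = dynamic_phasor(wave, dt, period, 400)
```

Stability analysis then linearizes the droop-controlled inverter equations written in these phasor variables, rather than in the original switching waveforms.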

  6. Sequential application of ligand and structure based modeling approaches to index chemicals for their hH4R antagonism.

    Directory of Open Access Journals (Sweden)

    Matteo Pappalardo

    The human histamine H4 receptor (hH4R), a member of the G-protein coupled receptor (GPCR) family, is an increasingly attractive drug target. It plays a key role in many cell pathways, and many hH4R ligands are studied for the treatment of several inflammatory, allergic and autoimmune disorders, as well as for analgesic activity. Due to the challenging difficulties in the experimental elucidation of the hH4R structure, virtual screening campaigns are normally run on homology-based models. However, a wealth of information about the chemical properties of GPCR ligands has also accumulated over the last few years, and an appropriate combination of this ligand-based knowledge with structure-based molecular modeling studies emerges as a promising strategy for computer-assisted drug design. Here, two chemoinformatics techniques, the Intelligent Learning Engine (ILE) and the Iterative Stochastic Elimination (ISE) approach, were used to index chemicals for their hH4R bioactivity. An application of the prediction model to an external test set composed of more than 160 hH4R antagonists picked from the ChEMBL database gave an enrichment factor of 16.4. A virtual high-throughput screening of the ZINC database was carried out, picking ∼4000 chemicals highly indexed as hH4R antagonist candidates. Next, a series of 3D models of hH4R were generated by molecular modeling and molecular dynamics simulations performed in fully atomistic lipid membranes. The efficacy of the hH4R 3D models in discriminating between actives and non-actives was checked, and the 3D model with the best performance was chosen for further docking studies performed on the focused library. The output of these docking studies was a consensus library of 11 highly scored active drug candidates. Our findings suggest that a sequential combination of ligand-based chemoinformatics approaches with structure-based ones has the potential to improve the success rate in discovering new biologically active GPCR drugs and

  7. Application and testing of a GIS-based sediment connectivity model in the Venosta valley (Eastern Italian Alps)

    Science.gov (United States)

    Cavalli, Marco; Goldin, Beatrice; Crema, Stefano; Marchi, Lorenzo

    2014-05-01

    Sediment connectivity plays a significant role in geomorphic systems, since it reflects the potential of sediment, deriving from soil erosion and the remobilization of storages, to be transferred within or between landscape compartments. Understanding sediment movement and delivery to given areas of interest or sinks (e.g. the channel network, urbanized areas, the catchment outlet) is an important issue for efficient management strategies. Thanks to the availability of high-resolution Digital Terrain Models (DTMs), different methods for mapping connectivity have been developed, but few examples of their application over large areas are available so far. In this study, a GIS-based model of sediment connectivity, developed following the approach of Borselli et al. (2008) with ad hoc refinements devised to adapt the model to mountain catchments using high-resolution DTMs (Cavalli et al., 2013), has been applied to the upper and middle sectors of the Venosta Valley (1096 km2) in the Eastern Italian Alps. The output of the model is a topography-based index aimed at evaluating the potential connection between hillslopes and features acting as targets (e.g. catchment outlet, roads) or storage areas (e.g. sinks, retention basins) for transported sediment. The index is composed of an upslope and a downslope component. The first represents the forcing for downward routing of the sediment potentially available upslope, and the second considers the flow path length that a sediment particle has to travel to reach the nearest target or sink. In both components, two weighting factors are used: the slope and a proxy of the impedance to sediment fluxes. In the application to the Venosta valley, two different impedance factors were tested: one based on surface roughness and one derived from tabled values of hydraulic roughness (Manning's n). 
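The index in question follows Borselli et al. (2008): IC = log10(Dup/Ddn), with upslope component Dup = Wmean * Smean * sqrt(A) (mean weighting factor, mean slope, contributing area) and downslope component Ddn equal to the sum of d_i / (W_i * S_i) along the flow path to the target. A sketch with invented input values:

```python
# Sketch of the Borselli-style connectivity index for a single cell.
import math

def connectivity_index(w_mean, s_mean, area, path):
    """IC = log10(Dup / Ddn).

    w_mean, s_mean: mean weighting factor and mean slope of the upslope area;
    area: upslope contributing area (m^2);
    path: list of (length_i, weight_i, slope_i) cells down to the target.
    """
    d_up = w_mean * s_mean * math.sqrt(area)
    d_dn = sum(d / (w * s) for d, w, s in path)
    return math.log10(d_up / d_dn)

# A rougher flow path (lower weighting factor W along the path) raises Ddn
# and therefore lowers IC -- the cell is less connected to the target.
ic_smooth = connectivity_index(0.5, 0.3, 10000.0, [(10.0, 0.5, 0.3)])
ic_rough = connectivity_index(0.5, 0.3, 10000.0, [(10.0, 0.1, 0.3)])
```

Swapping the weighting factor W between a roughness-derived and a Manning's n-derived proxy, as in the study, changes only how the (length, weight, slope) triples are populated.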
The main objective of the study is to test the applicability of the model to a regional context which encompasses areas with a large variability in topography and

  8. An open-source Java-based Toolbox for environmental model evaluation: The MOUSE Software Application

    Science.gov (United States)

    A consequence of environmental model complexity is that the task of understanding how environmental models work and identifying their sensitivities/uncertainties, etc. becomes progressively more difficult. Comprehensive numerical and visual evaluation tools have been developed such as the Monte Carl...

  9. Towards a social media-based model of trust and its application

    NARCIS (Netherlands)

    Boertjes, E.M.; Gerrits, B.M.; Kooij, R.E.; Maanen, P.P. van; Raaijmakers, S.A.; Wit, J.J. de

    2012-01-01

    In this paper we describe the development of a model for measuring consumer trust in certain topics on the basis of social media. Specifically, we propose a model for trust that takes into account both textually expressed sentiment and source authority, and illustrate it on a specific case: the iClo

  10. Rigorous model-based uncertainty quantification with application to terminal ballistics, part I: Systems with controllable inputs and small scatter

    Science.gov (United States)

    Kidane, A.; Lashgari, A.; Li, B.; McKerns, M.; Ortiz, M.; Owhadi, H.; Ravichandran, G.; Stalzer, M.; Sullivan, T. J.

    2012-05-01

    This work is concerned with establishing the feasibility of a data-on-demand (DoD) uncertainty quantification (UQ) protocol based on concentration-of-measure inequalities. Specific aims are to establish the feasibility of the protocol and its basic properties, including the tightness of the predictions afforded by the protocol. The assessment is based on an application to terminal ballistics and a specific system configuration consisting of 6061-T6 aluminum plates struck by spherical S-2 tool steel projectiles at ballistic impact speeds. The system's inputs are the plate thickness and impact velocity, and the perforation area is chosen as the sole performance measure of the system. The objective of the UQ analysis is to certify the lethality of the projectile, i.e., that the projectile perforates the plate with high probability over a prespecified range of impact velocities and plate thicknesses. The net outcome of the UQ analysis is an M/U ratio, or confidence factor, of 2.93, indicative of a small probability of no perforation of the plate over its entire operating range. The high confidence (>99.9%) in the successful operation of the system afforded by the analysis, together with the small number of tests (40) required for the determination of the modeling-error diameter, establishes the feasibility of the DoD UQ protocol as a rigorous yet practical approach for model-based certification of complex systems.
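
    The reported M/U ratio translates into a failure-probability bound if one assumes the standard concentration-of-measure form used in this line of work, P[failure] <= exp(-2 (M/U)^2), where M is the design margin and U the uncertainty diameter. A minimal sketch under that assumption:

    ```python
    import math

    def failure_probability_bound(margin, uncertainty):
        """Concentration-of-measure certification bound:
        P[failure] <= exp(-2 * (M/U)^2), CF = M/U is the confidence factor."""
        cf = margin / uncertainty
        return cf, math.exp(-2.0 * cf ** 2)

    # CF = 2.93 as reported in the abstract (M and U normalized so M/U = 2.93):
    cf, p_fail = failure_probability_bound(2.93, 1.0)
    ```

    With CF = 2.93 the bound on the probability of no perforation is far below 0.1%, consistent with the >99.9% confidence quoted above.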

  11. Location Based Application Availability

    Science.gov (United States)

    Naeem Akram, Raja; Markantonakis, Konstantinos; Mayes, Keith

    Smart cards are being integrated into a diverse range of industries: ranging from banking, telecom, transport, home/office access control to health and E-passport. Traditionally, cardholders are required to carry a smart card for each application. However, recent developments in Near Field Communication (NFC) have renewed the interest in multiple applications for different services on a single device. This paper builds on the NFC initiative and avoids the smart card ownership issues that hinder the adoption of such devices. The proposal integrates the Global Positioning System with NFC in mobile phones to provide a ubiquitous and flexible service access model.

  12. Quality-Driven Model-Based Design of MultiProcessor Embedded Systems for Highly Demanding Applications

    DEFF Research Database (Denmark)

    Jozwiak, Lech; Madsen, Jan

    2013-01-01

    The recent spectacular progress in modern nano-dimension semiconductor technology enabled implementation of a complete complex multi-processor system on a single chip (MPSoC), global networking and mobile wireless communication, and facilitated fast progress in these areas. New important opportunities have been created: the traditional applications can be served much better, and numerous new sorts of embedded systems became technologically feasible and economically justified. Various monitoring, control, communication or multi-media systems that can be put on or embedded in (mobile, poorly…) … unusual silicon and system complexity. The combination of the huge complexity with the stringent application requirements results in numerous serious design and development challenges, such as: accounting in design for more aspects and changed relationships among aspects, complex multi-objective MPSo…

  13. Applications for Mission Operations Using Multi-agent Model-based Instructional Systems with Virtual Environments

    Science.gov (United States)

    Clancey, William J.

    2004-01-01

    This viewgraph presentation provides an overview of past and possible future applications for artificial intelligence (AI) in astronaut instruction and training. AI systems have been used in training simulation for the Hubble Space Telescope repair, the International Space Station, and operations simulation for the Mars Exploration Rovers. In the future, robots may work as partners with astronauts on missions such as planetary exploration and extravehicular activities.

  14. Application of Novel Rotation Angular Model for 3D Mouse System Based on MEMS Accelerometers

    Institute of Scientific and Technical Information of China (English)

    QIAN Li; CHEN Wen-yuan; XU Guo-ping

    2009-01-01

    A new scheme is proposed to model 3D angular motion of a revolving regular object with miniature, low-cost micro electro mechanical systems (MEMS) accelerometers (instead of gyroscope), which is employed in 3D mouse system. To sense 3D angular motion, the static property of MEMS accelerometer, sensitive to gravity acceleration, is exploited. With the three outputs of configured accelerometers, the proposed model is implemented to get the rotary motion of the rigid object. In order to validate the effectiveness of the proposed model, an input device is developed with the configuration of the scheme. Experimental results show that a simulated 3D cube can accurately track the rotation of the input device. The result indicates the feasibility and effectiveness of the proposed model in the 3D mouse system.
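
    The static property exploited above is that a resting accelerometer measures only gravity, so two of the three attitude angles can be recovered from its three outputs. The sketch below is not the paper's full 3D rotation model; it shows only the standard static-tilt relation (yaw about the gravity vector is unobservable from gravity alone).

    ```python
    import math

    def tilt_from_gravity(ax, ay, az):
        """Recover roll and pitch of a static body from the gravity vector
        measured by a 3-axis accelerometer (units cancel; g-normalized)."""
        roll = math.atan2(ay, az)
        pitch = math.atan2(-ax, math.hypot(ay, az))
        return roll, pitch

    # Device lying flat: gravity only on the z axis -> zero roll and pitch.
    roll, pitch = tilt_from_gravity(0.0, 0.0, 1.0)
    ```

    A 3D mouse built this way must still fuse successive tilt estimates (and handle the unobservable yaw), which is where the paper's rotation angular model comes in.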

  15. A Finite Element Cable Model and Its Applications Based on the Cubic Spline Curve

    Institute of Scientific and Technical Information of China (English)

    方子帆; 贺青松; 向兵飞; 肖化攀; 何孔德; 杜义贤

    2013-01-01

    For accurate prediction of the deformation of the cable in a towed system, a new finite element model is presented that represents both bending and torsional effects. In this paper, the cubic spline interpolation function is applied as the trial solution. By using a weighted residual approach, the discretized motion equations for the new finite element model are developed. The model is computed with a program written in Matlab. Several numerical examples are presented to illustrate the numerical schemes. The results of the numerical simulation are stable and valid, and consistent with the mechanical properties of the cable. The model can be applied to kinematics analysis and the design of ocean cables, such as mooring lines, towing cables, and ROV umbilical cables.
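
    The building block of the trial solution above is the cubic spline itself. The sketch below implements only that piece, a natural cubic spline (second derivatives zero at the ends, solved with the Thomas algorithm); the paper's weighted-residual finite element assembly is not reproduced.

    ```python
    def natural_cubic_spline(xs, ys):
        """Return an evaluator S(x) for the natural cubic spline through
        the knots (xs, ys). Interior second derivatives M_i solve the
        standard tridiagonal system with M_0 = M_n = 0."""
        n = len(xs) - 1
        h = [xs[i + 1] - xs[i] for i in range(n)]
        a = [0.0] * (n + 1); b = [1.0] * (n + 1)
        c = [0.0] * (n + 1); d = [0.0] * (n + 1)
        for i in range(1, n):
            a[i], b[i], c[i] = h[i - 1], 2.0 * (h[i - 1] + h[i]), h[i]
            d[i] = 6.0 * ((ys[i + 1] - ys[i]) / h[i]
                          - (ys[i] - ys[i - 1]) / h[i - 1])
        for i in range(1, n + 1):          # forward elimination (Thomas)
            m = a[i] / b[i - 1]
            b[i] -= m * c[i - 1]
            d[i] -= m * d[i - 1]
        M = [0.0] * (n + 1)
        for i in range(n - 1, 0, -1):      # back substitution
            M[i] = (d[i] - c[i] * M[i + 1]) / b[i]

        def S(x):
            i = n - 1
            for j in range(n):             # locate the interval containing x
                if x <= xs[j + 1]:
                    i = j
                    break
            t0, t1 = xs[i + 1] - x, x - xs[i]
            return (M[i] * t0 ** 3 + M[i + 1] * t1 ** 3) / (6.0 * h[i]) \
                 + (ys[i] / h[i] - M[i] * h[i] / 6.0) * t0 \
                 + (ys[i + 1] / h[i] - M[i + 1] * h[i] / 6.0) * t1
        return S

    # Spline through four sample points of a cable centerline (illustrative):
    S = natural_cubic_spline([0.0, 1.0, 2.0, 3.0], [0.0, 1.0, 0.0, 1.0])
    ```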

  16. Quality Model and Artificial Intelligence Based Fuel Ratio Management with Applications to Automotive Engine

    Directory of Open Access Journals (Sweden)

    Mojdeh Piran

    2014-01-01

    In this research, Internal Combustion (IC) engine modeling is addressed, and a multi-input multi-output artificial-intelligence-based chattering-free sliding mode control scheme is developed with guaranteed stability to simultaneously control fuel ratios to desired levels under various air flow disturbances by regulating the mass flow rates of the engine PFI and DI injection systems. Modeling an entire IC engine is a very important and complicated process because engines are nonlinear, multi-input multi-output and time-variant. One purpose of accurate modeling is to save development costs of real engines and to minimize the risks of damaging an engine when validating controller designs. Nevertheless, a small model can be developed for specific controller design purposes and then validated on a larger, more complicated model. Analytical dynamic nonlinear modeling of the internal combustion engine is carried out using the elegant Euler-Lagrange method, compromising between accuracy and complexity. A baseline estimator with varying parameter gain is designed with guaranteed stability to allow implementation of the proposed state-feedback sliding mode methodology in a MATLAB simulation environment, where the sliding mode strategy is implemented in a model engine control module ("software"). To estimate the dynamic model of the IC engine, a fuzzy inference engine is applied to the baseline sliding mode methodology. The performance of the fuzzy inference baseline sliding mode methodology was compared with a well-tuned baseline multi-loop PID controller through MATLAB simulations and showed improvements; the simulations validate the feasibility of utilizing the developed controller and state estimator for automotive engines. The proposed tracking method is designed to optimally track the desired FR by minimizing the error between the trapped in-cylinder mass and the product of the desired FR and fuel mass over a given time interval.
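
    The "chattering-free" qualifier above usually means replacing the discontinuous sign() term of classical sliding mode control with a saturation (boundary-layer) function. The sketch below shows that idea on a toy first-order plant with a bounded disturbance; it is not the engine model or controller of the paper, and all gains are illustrative.

    ```python
    import math

    def simulate_sliding_mode(x0, x_ref, k=5.0, phi=0.05, dt=1e-3, steps=4000):
        """First-order plant x' = u + d(t), driven to x_ref by the
        boundary-layer sliding law u = -k * sat(s / phi), s = x - x_ref.
        Choosing k larger than the disturbance bound guarantees reaching."""
        x = x0
        for n in range(steps):
            s = x - x_ref
            sat = max(-1.0, min(1.0, s / phi))        # smooth stand-in for sign()
            d = 0.5 * math.sin(2 * math.pi * n * dt)  # disturbance, |d| <= 0.5 < k
            x += (-k * sat + d) * dt
        return x

    x_final = simulate_sliding_mode(x0=1.0, x_ref=0.2)
    ```

    Inside the boundary layer the control becomes high-gain linear feedback, which trades a small steady-state band (of order phi) for the elimination of chattering.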

  17. An Application of an IDEF0 Model to Improve the Process of Base Closure: A Case Study

    Science.gov (United States)

    1993-12-01

    …since the facilities will be turned over to another government organization, NASA (Henderson, 1992). Evaluating the process as a closure is still valid…over the base and used it for an Air Corps training facility for seven years. In 1942 the Navy regained control of the base as the west coast

  18. Nonlinear modeling, strength-based design, and testing of flexible piezoelectric energy harvesters under large dynamic loads for rotorcraft applications

    Science.gov (United States)

    Leadenham, Stephen; Erturk, Alper

    2014-04-01

    There has been growing interest in enabling wireless health and usage monitoring for rotorcraft applications, such as helicopter rotor systems. Large dynamic loads and acceleration fluctuations available in these environments make the implementation of vibration-based piezoelectric energy harvesters a very promising choice. However, such extreme loads transmitted to the harvester can also be detrimental to piezoelectric laminates and overall system reliability. Particularly flexible resonant cantilever configurations tuned to match the dominant excitation frequency can be subject to very large deformations and failure of brittle piezoelectric laminates due to excessive bending stresses at the root of the harvester. Design of resonant piezoelectric energy harvesters for use in these environments requires nonlinear electroelastic dynamic modeling and strength-based analysis to maximize the power output while ensuring that the harvester is still functional. This paper presents a mathematical framework to design and analyze the dynamics of nonlinear flexible piezoelectric energy harvesters under large base acceleration levels. A strength-based limit is imposed to design the piezoelectric energy harvester with a proof mass while accounting for material, geometric, and dissipative nonlinearities, with a focus on two demonstrative case studies having the same linear fundamental resonance frequency but different overhang length and proof mass values. Experiments are conducted at different excitation levels for validation of the nonlinear design approach proposed in this work. The case studies in this work reveal that harvesters exhibiting similar behavior and power generation performance at low excitation levels (e.g. less than 0.1g) can have totally different strength-imposed performance limitations under high excitations (e.g. above 1g). 
Nonlinear modeling and strength-based design is necessary for such excitation levels especially when using resonant cantilevers with no
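
    The simplest form of a strength-imposed limit like the one above comes from linear Euler-Bernoulli beam theory: for an end-loaded cantilever, the root bending stress scales with tip deflection as sigma = 3*E*t*delta / (2*L^2). The paper's model additionally includes material, geometric and dissipative nonlinearities; the sketch below shows only this linear strength check, with illustrative numbers.

    ```python
    def cantilever_root_stress(E, t, L, tip_deflection):
        """Max bending stress at the root of an end-loaded cantilever of
        elastic modulus E, thickness t, length L, tip deflection delta:
        sigma = 3 * E * t * delta / (2 * L**2)."""
        return 3.0 * E * t * tip_deflection / (2.0 * L ** 2)

    def allowable_tip_deflection(E, t, L, sigma_limit):
        """Invert the relation: the strength-imposed deflection limit."""
        return 2.0 * sigma_limit * L ** 2 / (3.0 * E * t)

    # Illustrative substrate: 0.5 mm thick, 60 mm long, E = 100 GPa,
    # 180 MPa allowable stress (not values from the paper):
    d_max = allowable_tip_deflection(E=100e9, t=0.5e-3, L=60e-3,
                                     sigma_limit=180e6)
    ```

    The longer-overhang harvester in the paper tolerates a larger tip deflection for the same root stress, which is exactly what the L**2 dependence above suggests.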

  19. Soft Sensing Modelling Based on Optimal Selection of Secondary Variables and Its Application

    Institute of Scientific and Technical Information of China (English)

    Qi Li; Cheng Shao

    2009-01-01

    The composition of the distillation column is a very important quality variable in refineries; unfortunately, few hardware sensors are available to measure distillation compositions on-line. In this paper, a novel method using sensitivity matrix analysis and kernel ridge regression (KRR) to implement on-line soft sensing of distillation compositions is proposed. In this approach, sensitivity matrix analysis is used to select the most suitable secondary variables as the soft sensor's inputs. KRR is used to build the composition soft sensor. Application to a simulated distillation column demonstrates the effectiveness of the method.
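
    Kernel ridge regression fits the soft sensor in closed form: alpha = (K + lambda*I)^-1 y, where K is the kernel Gram matrix over the training data. The sketch below assumes an RBF kernel and tiny illustrative data (two hypothetical secondary variables, e.g. tray temperatures, mapped to a composition); it is not the paper's sensitivity-matrix selection step.

    ```python
    import math

    def rbf(u, v, gamma=1.0):
        return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(u, v)))

    def solve(A, b):
        """Gaussian elimination with partial pivoting (small dense system)."""
        n = len(A)
        M = [row[:] + [b[i]] for i, row in enumerate(A)]
        for col in range(n):
            p = max(range(col, n), key=lambda r: abs(M[r][col]))
            M[col], M[p] = M[p], M[col]
            for r in range(col + 1, n):
                f = M[r][col] / M[col][col]
                for c in range(col, n + 1):
                    M[r][c] -= f * M[col][c]
        x = [0.0] * n
        for r in range(n - 1, -1, -1):
            x[r] = (M[r][n] - sum(M[r][c] * x[c]
                                  for c in range(r + 1, n))) / M[r][r]
        return x

    def krr_fit(X, y, lam=1e-3, gamma=1.0):
        """alpha = (K + lam*I)^-1 y with K the RBF Gram matrix."""
        K = [[rbf(a, b, gamma) + (lam if i == j else 0.0)
              for j, b in enumerate(X)] for i, a in enumerate(X)]
        return solve(K, y)

    def krr_predict(X, alpha, x, gamma=1.0):
        return sum(a * rbf(xi, x, gamma) for a, xi in zip(alpha, X))

    # Hypothetical (temperature1, temperature2) -> composition samples:
    X = [(0.0, 0.0), (0.5, 0.2), (1.0, 0.4), (1.5, 0.9)]
    y = [0.10, 0.35, 0.60, 0.85]
    alpha = krr_fit(X, y)
    ```

    The ridge term lambda keeps the Gram matrix well conditioned, trading a small training-set residual for robustness to measurement noise.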

  20. Modelling, Analysis, and Design of a Frequency-Droop-Based Virtual Synchronous Generator for Microgrid Applications

    DEFF Research Database (Denmark)

    Du, Yan; Guerrero, Josep M.; Chang, Liuchen;

    2013-01-01

    …of a synchronous generator (SG) by implementing the swing equation of the SG with a primary frequency controller. In addition, a generalized model of the active power generation dynamics is developed in order to analyze the stability and to design the main control parameters. In contrast with the conventional droop…

  1. Cloud computing models and their application in LTE based cellular systems

    NARCIS (Netherlands)

    Staring, A.J.; Karagiannis, G.

    2013-01-01

    As cloud computing emerges as the next novel concept in computer science, it becomes clear that the model applied in large data storage systems used to resolve issues coming forth from an increasing demand, could also be used to resolve the very high bandwidth requirements on access network, core ne

  2. Application of the probability-based covering algorithm model in text classification

    Institute of Scientific and Technical Information of China (English)

    ZHOU; Ying

    2009-01-01

    The probability-based covering algorithm (PBCA) is a new algorithm based on probability distribution. It decides, by voting, the class of the tested samples on the border of the coverage area, based on the probability of the training samples. When using the original covering algorithm (CA), many tested samples located on the border of the coverage cannot be classified by the spherical neighborhood gained. The network structure of the PBCA is a mixed structure composed of both a feed-forward network and a feedback network. By adding some heterogeneous samples and enlarging the coverage radius, it is possible to decrease the number of rejected samples and improve the recognition accuracy. Relevant computer experiments indicate that the algorithm improves the study precision and achieves reasonably good results in text classification.

  3. Overview of Dioxin Kinetics and Application of Dioxin Physiologically Based Pharmacokinetic (PBPK) Models to Risk Assessment

    Science.gov (United States)

    The available data on the pharmacokinetics of 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) in animals and humans have been thoroughly reviewed in literature. It is evident based on these reviews and other analyses that three distinctive features of TCDD play important roles in dete...

  4. Application of physiologically based pharmacokinetic (PBPK) model of trichloroethylene in rats for estimation of internal dose

    Science.gov (United States)

    Potential human health risk from chemical exposure must often be assessed for conditions for which suitable human or animal data are not available, requiring extrapolation across duration and concentration. The default method for exposure-duration adjustment is based on Haber's r...

  5. Identifying plausible genetic models based on association and linkage results: application to type 2 diabetes.

    Science.gov (United States)

    Guan, Weihua; Boehnke, Michael; Pluzhnikov, Anna; Cox, Nancy J; Scott, Laura J

    2012-12-01

    When planning resequencing studies for complex diseases, previous association and linkage studies can constrain the range of plausible genetic models for a given locus. Here, we explore the combinations of causal risk allele frequency (RAFC) and genotype relative risk (GRRC) consistent with no or limited evidence for affected sibling pair (ASP) linkage and strong evidence for case-control association. We find that significant evidence for case-control association combined with no or moderate evidence for ASP linkage can define a lower bound for the plausible RAFC. Using data from large type 2 diabetes (T2D) linkage and genome-wide association study meta-analyses, we find that under reasonable model assumptions, 23 of 36 autosomal T2D risk loci are unlikely to be due to causal variants with combined RAFC < 0.005, and four of the 23 are unlikely to be due to causal variants with combined RAFC < 0.05.

  6. DARPA Ensemble-Based Modeling Large Graphs & Applications to Social Networks

    Science.gov (United States)

    2015-07-29

    …find an MDS efficiently. Our implementation of a greedy MDS search algorithm is designed with this priority in mind. We use bucket sort with hashed…constructs simple MOAs with constraint M, strength M-d for d>1, basically with rounding, instead of algebraic techniques. In another work [41] Czabarka…

  7. Scenario Based Municipal Wastewater Estimation: Development and Application of a Dynamic Simulation Model

    Directory of Open Access Journals (Sweden)

    Yong Zhang

    2016-01-01

    This paper develops causal loop diagrams and a system dynamics model for estimating wastewater quantity changes as a function of future socioeconomic development and the municipal water environment of the city under the influence of several key factors. Using Wuhan (a city with a population of more than 10 million in China) as a case study, the variability of Wuhan's wastewater quantity and water environment is modeled under different development patterns by the year 2030. Nine future scenarios are designed by assigning different values to those key factors, including GDP growth rate, water consumption per ten thousand yuan of GDP, and wastewater treatment fee. The results show that (1) GDP growth leads to an increase in municipal wastewater quantity, but an increase in the wastewater treatment fee can help reduce urban water pollution, and (2) the impact of water consumption per ten thousand yuan of GDP on the amount of municipal wastewater is larger in the near future, while the impact of GDP growth rate is much larger in the long term. The dynamic model has proven to be reliable for simulating municipal wastewater changes, and it could help decision makers make scientific and reasonable decisions.
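
    The core of such a system dynamics model is a stock-and-flow loop: GDP grows at one rate while water intensity (consumption per unit GDP) declines at another, and their product drives the wastewater flow. The sketch below is a deliberately simplified annual-step version with invented parameters, not the calibrated Wuhan model.

    ```python
    def simulate_wastewater(years=15, gdp0=1.0e4, g=0.06,
                            intensity0=60.0, intensity_decline=0.03,
                            wastewater_ratio=0.8):
        """Annual wastewater = GDP * water intensity * discharge ratio.
        GDP grows at rate g; intensity declines by intensity_decline per year
        (a stand-in for efficiency gains driven by e.g. treatment fees)."""
        gdp, intensity, series = gdp0, intensity0, []
        for _ in range(years):
            water_use = gdp * intensity           # total water consumption
            series.append(water_use * wastewater_ratio)
            gdp *= 1.0 + g                        # economic growth update
            intensity *= 1.0 - intensity_decline  # efficiency improvement
        return series

    ww = simulate_wastewater()
    ```

    Scenarios correspond to different (g, intensity_decline, ...) combinations: when the intensity decline outpaces GDP growth, the wastewater trajectory turns downward, mirroring finding (2) above.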

  8. Time-frequency representation based on time-varying autoregressive model with applications to non-stationary rotor vibration analysis

    Indian Academy of Sciences (India)

    Long Zhang; Guoliang Xiong; Hesheng Liu; Huijun Zou; Weizhong Guo

    2010-04-01

    A parametric time-frequency representation is presented based on the time-varying autoregressive model (TVAR), followed by applications to non-stationary vibration signal processing. The identification of time-varying model coefficients and the determination of model order are addressed by means of neural networks and genetic algorithms, respectively. Firstly, a simulated signal which mimics the rotor vibration during run-up stages was processed for a comparative study on TVAR and other non-parametric time-frequency representations such as the Short Time Fourier Transform, Continuous Wavelet Transform, Empirical Mode Decomposition, Wigner–Ville Distribution and Choi–Williams Distribution, in terms of their resolution, accuracy, cross-term suppression as well as noise resistance. Secondly, TVAR was applied to analyse non-stationary vibration signals collected from a rotor test rig during run-up stages, with the aim of extracting fault symptoms under non-stationary operating conditions. Simulation and experimental results demonstrate that TVAR is an effective solution to non-stationary signal analysis and has strong capability in signal time-frequency feature extraction.
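
    The paper identifies the TVAR coefficients with neural networks and genetic algorithms; the sketch below substitutes the simplest possible identification, sliding-window least squares for an AR(2) model, to show how AR coefficients encode instantaneous frequency (for an undamped sinusoid, a1 = 2*cos(w)).

    ```python
    import math

    def ar2_instantaneous_freq(x, win=64):
        """Fit x_t = a1*x_{t-1} + a2*x_{t-2} by least squares in each
        window and convert a1 to a frequency via a1 = 2*cos(w)."""
        freqs = []
        for start in range(0, len(x) - win, win):
            seg = x[start:start + win]
            # Normal equations for (a1, a2); only a1 is needed here.
            s11 = sum(v * v for v in seg[1:-1])
            s22 = sum(v * v for v in seg[:-2])
            s12 = sum(seg[t - 1] * seg[t - 2] for t in range(2, win))
            b1 = sum(seg[t] * seg[t - 1] for t in range(2, win))
            b2 = sum(seg[t] * seg[t - 2] for t in range(2, win))
            det = s11 * s22 - s12 * s12
            a1 = (b1 * s22 - b2 * s12) / det
            freqs.append(math.acos(max(-1.0, min(1.0, a1 / 2.0))))
        return freqs

    # Pure sinusoid at w = 0.3 rad/sample: every window should report ~0.3.
    sig = [math.sin(0.3 * t) for t in range(512)]
    est = ar2_instantaneous_freq(sig)
    ```

    A genuinely time-varying signal (e.g. a run-up chirp) would produce a rising sequence of window frequencies; the basis-function TVAR of the paper refines this by letting the coefficients evolve continuously within the record.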

  9. LINKING SATELLITE REMOTE SENSING BASED ENVIRONMENTAL PREDICTORS TO DISEASE: AN APPLICATION TO THE SPATIOTEMPORAL MODELLING OF SCHISTOSOMIASIS IN GHANA

    Directory of Open Access Journals (Sweden)

    M. Wrable

    2016-06-01

    90% of the worldwide schistosomiasis burden falls on sub-Saharan Africa. Control efforts are often based on infrequent, small-scale health surveys, which are expensive and logistically difficult to conduct. Use of satellite imagery to predictively model infectious disease transmission has great potential for public health applications. Transmission of schistosomiasis requires specific environmental conditions to sustain freshwater snails, however has unknown seasonality, and is difficult to study due to a long lag between infection and clinical symptoms. To overcome this, we employed a comprehensive 8-year time-series built from remote sensing feeds. The purely environmental predictor variables: accumulated precipitation, land surface temperature, vegetative growth indices, and climate zones created from a novel climate regionalization technique, were regressed against 8 years of national surveillance data in Ghana. All data were aggregated temporally into monthly observations, and spatially at the level of administrative districts. The result of an initial mixed effects model had 41% explained variance overall. Stratification by climate zone brought the R2 as high as 50% for major zones and as high as 59% for minor zones. This can lead to a predictive risk model used to develop a decision support framework to design treatment schemes and direct scarce resources to areas with the highest risk of infection. This framework can be applied to diseases sensitive to climate or to locations where remote sensing would be better suited than health surveys.

  10. Linking Satellite Remote Sensing Based Environmental Predictors to Disease: AN Application to the Spatiotemporal Modelling of Schistosomiasis in Ghana

    Science.gov (United States)

    Wrable, M.; Liss, A.; Kulinkina, A.; Koch, M.; Biritwum, N. K.; Ofosu, A.; Kosinski, K. C.; Gute, D. M.; Naumova, E. N.

    2016-06-01

    90% of the worldwide schistosomiasis burden falls on sub-Saharan Africa. Control efforts are often based on infrequent, small-scale health surveys, which are expensive and logistically difficult to conduct. Use of satellite imagery to predictively model infectious disease transmission has great potential for public health applications. Transmission of schistosomiasis requires specific environmental conditions to sustain freshwater snails, however has unknown seasonality, and is difficult to study due to a long lag between infection and clinical symptoms. To overcome this, we employed a comprehensive 8-year time-series built from remote sensing feeds. The purely environmental predictor variables: accumulated precipitation, land surface temperature, vegetative growth indices, and climate zones created from a novel climate regionalization technique, were regressed against 8 years of national surveillance data in Ghana. All data were aggregated temporally into monthly observations, and spatially at the level of administrative districts. The result of an initial mixed effects model had 41% explained variance overall. Stratification by climate zone brought the R2 as high as 50% for major zones and as high as 59% for minor zones. This can lead to a predictive risk model used to develop a decision support framework to design treatment schemes and direct scarce resources to areas with the highest risk of infection. This framework can be applied to diseases sensitive to climate or to locations where remote sensing would be better suited than health surveys.

  11. Numerical modelling based on the multiscale homogenization theory. Application in composite materials and structures

    OpenAIRE

    Badillo Almaraz, Hiram

    2012-01-01

    A multi-domain homogenization method is proposed and developed in this thesis based on a two-scale technique. The method is capable of analyzing composite structures with several periodic distributions by partitioning the entire domain of the composite into substructures making use of the classical homogenization theory following a first-order standard continuum mechanics formulation. The need to develop the multi-domain homogenization method arose because current homogenization methods are b...

  12. B-SPLINE-BASED SVM MODEL AND ITS APPLICATIONS TO OIL WATER-FLOODED STATUS IDENTIFICATION

    Institute of Scientific and Technical Information of China (English)

    Shang Fuhua; Zhao Tiejun; Yi Xiongying

    2007-01-01

    A method of B-spline transform for signal feature extraction is developed. With the B-spline, the log-signal space is mapped into the vector space. An efficient algorithm based on the Support Vector Machine (SVM) to automatically identify the water-flooded status of oil-saturated strata is described. The experiments show that this algorithm can improve the performance of identification and generalization in the case of a limited set of samples.
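
    The B-spline transform above rests on evaluating B-spline basis functions, for which the Cox-de Boor recursion is the standard tool. The sketch below shows only that building block (the paper's log-signal mapping and the SVM classifier are not reproduced); a useful sanity check is that the basis functions form a partition of unity inside the valid knot span.

    ```python
    def bspline_basis(i, k, t, knots):
        """Cox-de Boor recursion: value at t of the i-th B-spline basis
        function of degree k on the given knot vector."""
        if k == 0:
            return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
        left = right = 0.0
        if knots[i + k] != knots[i]:
            left = ((t - knots[i]) / (knots[i + k] - knots[i])
                    * bspline_basis(i, k - 1, t, knots))
        if knots[i + k + 1] != knots[i + 1]:
            right = ((knots[i + k + 1] - t) / (knots[i + k + 1] - knots[i + 1])
                     * bspline_basis(i + 1, k - 1, t, knots))
        return left + right

    # Uniform knots 0..11; the 8 cubic basis functions sum to 1 on [3, 8).
    knots = [float(v) for v in range(12)]
    total = sum(bspline_basis(i, 3, 5.5, knots) for i in range(len(knots) - 4))
    ```

    Projecting a signal onto such a basis yields a compact coefficient vector, which is the kind of feature vector an SVM can then classify.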

  13. Application of PPE Model in Land Adaptability Appraisal in Small Basin Based on RAGA

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    The paper applies the projection pursuit evaluation (PPE) model, which reduces multi-dimensional data to a lower-dimensional sub-space, in the soil and water conservation domain. It optimizes the projection direction using the improved accelerating genetic algorithm (RAGA), transforms the multi-dimensional data targets to a lower sub-space, and appraises the land adaptability of the Dongdagou basin in Keshan County by searching for the optimal projection direction and the projection function values. The paper provides a new notion and method for soil and water conservation in small basins.

  14. APPLICATION OF GRAY EVALUATION MODEL BASED ON AHP IN ATM SYSTEM

    Institute of Scientific and Technical Information of China (English)

    Wu Zhijun; Pan Wen

    2008-01-01

    This paper presents a hierarchy model of Air Traffic Management (ATM) according to the security requirements of the ATM system, analyzes it by grey assessment and the Analytic Hierarchy Process (AHP), and evaluates it in detail. It also provides theoretical support for building an effective evaluation system. The basic idea is to use AHP and grey assessment to obtain the weights of the indicators, and to compute grey evaluation coefficients with a whitening function. The composite clustering coefficients are obtained by combining the weights and the grey evaluation coefficients. The evaluation result can be obtained from the composite clustering coefficients.
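
    The AHP weighting step above derives indicator weights from the principal eigenvector of a pairwise comparison matrix. A minimal sketch using power iteration (the grey/whitening-function part of the method is not reproduced; the comparison matrix is an invented, perfectly consistent example):

    ```python
    def ahp_weights(pairwise, iters=100):
        """Principal-eigenvector weights of a pairwise comparison matrix,
        computed by power iteration and normalized to sum to 1."""
        n = len(pairwise)
        w = [1.0 / n] * n
        for _ in range(iters):
            w = [sum(pairwise[i][j] * w[j] for j in range(n)) for i in range(n)]
            s = sum(w)
            w = [v / s for v in w]
        return w

    # Perfectly consistent judgments a_ij = w_i / w_j for weights 0.6/0.3/0.1:
    A = [[1.0,     2.0,     6.0],
         [0.5,     1.0,     3.0],
         [1.0/6.0, 1.0/3.0, 1.0]]
    weights = ahp_weights(A)
    ```

    Real expert judgments are rarely perfectly consistent, which is why AHP practice also computes a consistency ratio before accepting the weights.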

  15. Application of T2 Control Charts and Hidden Markov Models in Condition-Based Maintenance at Thermoelectric Power Plants

    Directory of Open Access Journals (Sweden)

    Emilija Kisić

    2015-01-01

    An innovative approach to condition-based maintenance of coal grinding subsystems at thermoelectric power plants is proposed in the paper. Coal mill grinding tables become worn over time and need to be replaced through time-based maintenance, after a certain number of service hours. At times such replacement is necessary earlier or later than prescribed, depending on the quality of the coal and of the grinding table itself. Considerable financial losses are incurred when the entire coal grinding subsystem is shut down and the grinding table found to not actually require replacement. The only way to determine whether replacement is necessary is to shut down and open the entire subsystem for visual inspection. The proposed algorithm supports condition-based maintenance and involves the application of T2 control charts to distinct acoustic signal parameters in the frequency domain and the construction of Hidden Markov Models whose observations are coded samples from the control charts. In the present research, the acoustic signals were collected by coal mill monitoring at the thermoelectric power plant “Kostolac” in Serbia. The proposed approach provides information about the current condition of the grinding table.
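
    The statistic behind a T2 control chart is Hotelling's T^2 = (x - mu)' S^-1 (x - mu), which measures how far a multivariate observation lies from the in-control mean relative to the covariance. The sketch below computes it for two-dimensional feature vectors (the chart limits and the HMM coding of the paper are not reproduced; all numbers are illustrative):

    ```python
    def hotelling_t2(x, mean, cov):
        """T^2 = (x - mu)^T S^-1 (x - mu) for a single 2-D observation,
        inverting the 2x2 covariance matrix directly."""
        dx = [x[0] - mean[0], x[1] - mean[1]]
        (a, b), (c, d) = cov
        det = a * d - b * c
        inv = [[d / det, -b / det], [-c / det, a / det]]
        return sum(dx[i] * inv[i][j] * dx[j]
                   for i in range(2) for j in range(2))

    # In-control acoustic features vs. a drifted (worn-table-like) observation:
    cov = [[1.0, 0.2], [0.2, 1.0]]
    t2_ok = hotelling_t2((0.1, -0.1), (0.0, 0.0), cov)
    t2_bad = hotelling_t2((3.0, 3.0), (0.0, 0.0), cov)
    ```

    Points whose T^2 exceeds the chart's control limit are flagged; in the paper, sequences of such coded chart samples feed the Hidden Markov Model that tracks the grinding table's condition.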

  16. Eye gaze in intelligent user interfaces gaze-based analyses, models and applications

    CERN Document Server

    Nakano, Yukiko I; Bader, Thomas

    2013-01-01

    Remarkable progress in eye-tracking technologies opened the way to design novel attention-based intelligent user interfaces, and highlighted the importance of better understanding of eye-gaze in human-computer interaction and human-human communication. For instance, a user's focus of attention is useful in interpreting the user's intentions, their understanding of the conversation, and their attitude towards the conversation. In human face-to-face communication, eye gaze plays an important role in floor management, grounding, and engagement in conversation.Eye Gaze in Intelligent User Interfac

  17. Towards quantum-based modeling of enzymatic reaction pathways: Application to the acetylcholinesterase catalysis

    Science.gov (United States)

    Polyakov, Igor V.; Grigorenko, Bella L.; Moskovsky, Alexander A.; Pentkovski, Vladimir M.; Nemukhin, Alexander V.

    2013-01-01

    We apply computational methods aiming to approach a full quantum mechanical treatment of chemical reactions in proteins. A combination of the quantum mechanical - molecular mechanical methodology for geometry optimization and the fragment molecular orbital approach for energy calculations is examined for an example of acetylcholinesterase catalysis. The codes based on the GAMESS(US) package operational on the 'RSC Tornado' computational cluster are applied to determine that the energy of the reaction intermediate upon hydrolysis of acetylcholine is lower than that of the enzyme-substrate complex. This conclusion is consistent with the experiments and it is free from the empirical force field contributions.

  18. An extended car-following model based on intelligent transportation system application

    Science.gov (United States)

    Ge, H. X.; Dai, S. Q.; Dong, L. Y.

    2006-06-01

    The jams in congested traffic reveal various density waves, some of which are described by nonlinear wave equations: the Korteweg-de-Vries (KdV) equation, the Burgers equation and the modified KdV (mKdV) equation. An extended car-following model was proposed in previous work, and the kink-antikink solution has been obtained from the mKdV equation. We continue to derive the KdV equation near the neutral stability line by applying the reductive perturbation method. The traffic jam can thus be described by the soliton solution, and the analysis result is consistent with the previous one. In the numerical simulation results, soliton waves are found, and the traffic jam is suppressed efficiently even when large disturbances are encountered.
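
    Car-following models of this family build on the classic optimal velocity form dv_i/dt = a*(V(headway_i) - v_i) with V(h) = tanh(h - 2) + tanh(2). The sketch below simulates that baseline model (not the paper's extended version) on a ring road in a stable parameter regime, where an initial disturbance decays back to uniform flow; all parameters are illustrative.

    ```python
    import math

    def simulate_ovm(n_cars=20, road=60.0, a=2.0, dt=0.05, steps=8000):
        """Optimal velocity model on a ring road:
        dv_i/dt = a * (V(h_i) - v_i), V(h) = tanh(h - 2) + tanh(2).
        With a > 2*V'(h0) the uniform flow is linearly stable."""
        V = lambda h: math.tanh(h - 2.0) + math.tanh(2.0)
        h0 = road / n_cars
        x = [i * h0 for i in range(n_cars)]
        x[0] += 0.5                      # initial disturbance on one vehicle
        v = [V(h0)] * n_cars
        for _ in range(steps):
            head = [(x[(i + 1) % n_cars] - x[i]) % road for i in range(n_cars)]
            v = [vi + a * (V(hi) - vi) * dt for vi, hi in zip(v, head)]
            x = [(xi + vi * dt) % road for xi, vi in zip(x, v)]
        return v

    v_final = simulate_ovm()
    ```

    Below the neutral stability line the same code develops the travelling density waves that the KdV/mKdV analysis above describes analytically.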

  19. Application of a Collaborative Filtering Recommendation Algorithm Based on Cloud Model in Intrusion Detection

    Directory of Open Access Journals (Sweden)

    Deguang Wang

    2011-02-01

    Full Text Available Intrusion detection is a computer network technique that collects information at several key points of a system and inspects it, drawing on security auditing, monitoring, attack recognition and response, to check for behaviors and signs that violate the network security policy. The classification of the acquired data is a key part of intrusion detection. In this article, we use the cloud model to classify intrusion data, which effectively preserves the qualitative ambiguity of concepts over continuous data, and in the evaluation phase we apply a collaborative filtering recommendation algorithm, which greatly improves the efficiency of the intrusion detection system in processing massive amounts of suspicious data.
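    The normal cloud model referenced here is usually defined by three numerical characteristics: expectation Ex, entropy En and hyper-entropy He. A minimal droplet-generation sketch follows; the parameter values are illustrative, not taken from the paper.

    ```python
    import numpy as np

    def cloud_droplets(ex, en, he, n=10000, seed=42):
        """Generate droplets from a normal cloud model (Ex, En, He).

        Ex: expectation, En: entropy (spread), He: hyper-entropy
        (uncertainty of the spread itself).  Illustrative parameters.
        """
        rng = np.random.default_rng(seed)
        en_i = rng.normal(en, he, n)            # per-droplet standard deviation
        x = rng.normal(ex, np.abs(en_i))        # droplet positions
        mu = np.exp(-(x - ex) ** 2 / (2 * en_i ** 2 + 1e-12))  # membership degree
        return x, mu

    x, mu = cloud_droplets(ex=0.0, en=1.0, he=0.1)
    ```

    Each droplet carries a membership degree in [0, 1], which is what lets the model express the qualitative ambiguity of a concept over continuous data.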

  20. MODEL OF MOBILE TRANSLATOR APPLICATION OF ENGLISH TO BAHASA INDONESIA WITH RULE-BASED AND J2ME

    Directory of Open Access Journals (Sweden)

    Dian Puspita Tedjosurya

    2014-05-01

    Full Text Available Along with the development of information technology in the recent era, a number of new applications have emerged, especially on mobile phones. Mobile phones are used not only as communication media but also as learning media, for example through translator applications. A translator application can be a tool for learning a language, such as an English to Bahasa Indonesia translator. The purpose of this research is to allow users to translate English to Bahasa Indonesia on a mobile phone easily. The translator application in this research was developed using the Java programming language (specifically J2ME) because it can run on various operating systems and is open source, so it can be easily developed and distributed. In this research, data collection was done through literature study, observation, and review of similar applications. Development of the system used object-oriented analysis and design, described using use case diagrams, class diagrams, sequence diagrams, and activity diagrams. The translation process used a rule-based method. The result of this research is a Java-based translator application that can translate English sentences into Indonesian sentences. The application can be accessed using a mobile phone with an Internet connection. It has a spelling-check feature that detects misspelled words and suggests alternative words close to the input. The conclusion of this research is that the application can translate everyday conversational sentences quite well, with sentence structures that correspond to and stay close to the original meaning.
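    The rule-based approach can be illustrated with a minimal word-for-word translator plus one reordering rule (English puts the adjective before the noun, Indonesian after it). The tiny lexicon and the single rule are illustrative, not the paper's actual rule set.

    ```python
    # Minimal rule-based English-to-Indonesian sketch.
    LEXICON = {
        "i": "saya", "read": "membaca", "a": "sebuah",
        "red": "merah", "book": "buku", "big": "besar", "house": "rumah",
    }
    ADJECTIVES = {"red", "big"}

    def translate(sentence):
        words = sentence.lower().split()
        out, i = [], 0
        while i < len(words):
            # rule: swap adjective-noun pairs to noun-adjective order
            if i + 1 < len(words) and words[i] in ADJECTIVES:
                out.extend([words[i + 1], words[i]])
                i += 2
            else:
                out.append(words[i])
                i += 1
        # word-for-word lookup; unknown words are kept verbatim, which is
        # where a spelling-check suggestion step could hook in
        return " ".join(LEXICON.get(w, w) for w in out)

    print(translate("I read a red book"))  # saya membaca sebuah buku merah
    ```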

  1. Principles of models based engineering

    Energy Technology Data Exchange (ETDEWEB)

    Dolin, R.M.; Hefele, J.

    1996-11-01

    This report describes a Models Based Engineering (MBE) philosophy and implementation strategy that has been developed at Los Alamos National Laboratory's Center for Advanced Engineering Technology. A major theme in this discussion is that models based engineering is an information management technology enabling the development of information driven engineering. Unlike other information management technologies, models based engineering encompasses the breadth of engineering information, from design intent through product definition to consumer application.

  2. Application of damage mechanics modeling to strain based design with respect to ductile crack initiation

    Energy Technology Data Exchange (ETDEWEB)

    Ishikawa, Nobuyuki; Sueyoshi, Hitoshi; Igi, Satoshi [Steel Research Laboratory, JFE Steel Corporation (Japan)

    2010-07-01

    In the oil and gas sector, with the increase in demand, more and more pipelines are now constructed in permafrost and seismic regions. When installed in such harsh environments, pipelines must be resistant to buckling and weld fracture and the strain based design methodology is preferably used. The aim of this paper is to study the critical condition for ductile crack initiation. Both notched round bar and wide plate tests were carried out on X80 and X100 steel pipes and welds; the equivalent plastic strain criterion and Gurson Tvergaard mechanical damage analysis were used. It was found that to determine ductile crack initiation that is not affected by specimen geometry, the critical equivalent plastic strain can be used as the local criterion. In addition, when ductile crack initiation is independent of specimen geometry, the void volume fraction can be used as a criterion. This paper provided useful information on which criterion to use for ductile crack initiation.

  3. Developing a Robotic Service System by Applying a User Model-Based Application for Supporting Daily Life

    Directory of Open Access Journals (Sweden)

    Yihsin Ho

    2012-10-01

    Full Text Available We developed a robotic service system by applying a user model‐based application for supporting daily life. Our robotic service system is designed to provide appropriate services to users depending on their needs; thus, we applied a user model‐based application, which can help to select and filter user information for our system in order to provide appropriate services to users.

  4. Hybrid Microgrid Model based on Solar Photovoltaics with Batteries and Fuel Cells system for intermittent applications

    Science.gov (United States)

    Patterson, Maxx

    Microgrids are a subset of the modern power structure, using distributed generation (DG) to supply power to communities rather than vast regions. The reduced scale mitigates losses, allowing the power produced to do more with better control and giving greater security, reliability, and design flexibility. This paper explores the performance and cost viability of a hybrid grid-tied microgrid that utilizes photovoltaic (PV), battery, and fuel cell (FC) technology. The concept proposes that each community home is equipped with more PV than is required for normal operation. As the homes are part of a microgrid, excess or unused energy from one home is collected for use elsewhere within the microgrid footprint. The surplus power that would have been discarded becomes a community asset, and is used to run intermittent services. In this paper, the modeled community does not have parking adjacent to each home, which precludes the installation of a privately owned (slower) Level 2 charger and makes EV ownership untenable. A solution is to provide a Level 3 DC Quick Charger (DCQC) as the intermittent service. The addition of batteries and fuel cells is meant to increase load leveling and reliability and to provide limited islanding capability.
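    The pooled-surplus idea can be sketched as a toy hourly energy balance: each home's excess PV covers in-community deficits first, then charges a shared battery, and the remainder is available for the intermittent DCQC service. The battery size, efficiency and kWh figures are illustrative assumptions, not the paper's model.

    ```python
    def pool_surplus(pv_kwh, load_kwh, battery_kwh=10.0, soc=0.0, eff=0.9):
        """One hour of the community balance; pv_kwh and load_kwh are per-home lists."""
        surplus = sum(max(p - l, 0.0) for p, l in zip(pv_kwh, load_kwh))
        deficit = sum(max(l - p, 0.0) for p, l in zip(pv_kwh, load_kwh))
        net = surplus - deficit / eff            # community deficits served first
        if net > 0:
            charge = min(net, (battery_kwh - soc) / eff)
            soc += charge * eff                  # round-trip losses on charging
            net -= charge
        return soc, max(net, 0.0)                # new state of charge, kWh for the DCQC

    soc, dcqc = pool_surplus(pv_kwh=[6, 5, 7, 4], load_kwh=[2, 3, 1, 2])
    ```

    In this made-up hour the four homes generate 14 kWh of surplus, the shared battery fills to 10 kWh, and the rest is handed to the quick charger.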

  5. Multiscale enhanced path sampling based on the Onsager-Machlup action: Application to a model polymer

    CERN Document Server

    Fujisaki, Hiroshi; Moritsugu, Kei; Kidera, Akinori

    2013-01-01

    We propose a novel path sampling method based on the Onsager-Machlup (OM) action by generalizing the multiscale enhanced sampling (MSES) technique suggested by Moritsugu and coworkers (J. Chem. Phys. 133, 224105 (2010)). The basic idea of this method is that the system we want to study (for example, some molecular system described by molecular mechanics) is coupled to a coarse-grained (CG) system, which can move more quickly and be computed more efficiently than the original system. We simulate this combined system (original + CG system) using (underdamped) Langevin dynamics, where different heat baths are coupled to the two systems. When the coupling is strong enough, the original system is guided by the CG system and able to sample the configuration and path space more efficiently. We need to correct the bias caused by the coupling, however, by employing the Hamiltonian replica exchange, where we prepare many path replicas with different coupling strengths. As a result, an unbiased path ensemble for the original ...
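    As a toy illustration of the action that path sampling of this kind is built on, the discretized Onsager-Machlup action of a 1D overdamped Langevin path can be computed directly; the double-well potential, step sizes and overdamped (rather than underdamped) dynamics are simplifying assumptions, not the paper's setup.

    ```python
    import numpy as np

    def om_action(path, dt, grad_v, diffusion=1.0):
        """Discretized OM action of a 1D overdamped Langevin path."""
        dx = np.diff(path)
        drift = -grad_v(path[:-1])              # deterministic drift at each step
        return float(np.sum((dx / dt - drift) ** 2) * dt / (4.0 * diffusion))

    def grad_v(x):
        return 4.0 * x * (x ** 2 - 1.0)         # gradient of the double well (x^2 - 1)^2

    resting = om_action(np.full(101, -1.0), 0.01, grad_v)        # stays in one well
    crossing = om_action(np.linspace(-1.0, 1.0, 101), 0.01, grad_v)
    ```

    A path that sits at a minimum of the potential has zero action, while a path forced over the barrier pays a positive action; path sampling methods weight trajectories by exactly this quantity.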

  6. X-parameter Based GaN Device Modeling and its Application to a High-efficiency PA Design

    DEFF Research Database (Denmark)

    Wang, Yelin; Nielsen, Troels Studsgaard; Jensen, Ole Kiel

    2014-01-01

    X-parameters are supersets of S-parameters and applicable to both linear and nonlinear system modeling. In this paper, a packaged 6 W Gallium Nitride (GaN) RF power transistor is modeled using load-dependent X-parameters by simulations. During the device characterization the load impedance is tuned...

  7. A Codon-Based Model of Host-Specific Selection in Parasites, with an Application to the Influenza A Virus

    DEFF Research Database (Denmark)

    Forsberg, Ronald; Christiansen, Freddy Bugge

    2003-01-01

    involved in host-specific adaptation. We discuss the applicability of the model to the more general problem of ascertaining whether the selective regime differs between two groups of related organisms. The utility of the model is illustrated on a dataset of nucleoprotein sequences from the influenza A virus...

  8. Application of Model-Based Systems Engineering (MBSE) to Compare Legacy and Future Forces in Mine Warfare (MIW) Missions

    Science.gov (United States)

    2014-12-01

    performance parameters for use within the model. A simple proof-of-concept model was first created using Microsoft Excel; this was followed by a...cavitating disk inside a venturi tube MK105 Magnetic sweep consisting of a gas turbine generator mounted on a sled MK106 Combination sweep consists... mathematical or logical functions, allowing it to be applied to many potential modeling problems. It can implement either time-based or event-based modeling

  9. Modeling and Experimental Validation of a Low-Cost Radiation Sensor Based on the Photovoltaic Effect for Building Applications

    Directory of Open Access Journals (Sweden)

    Ángel Gómez-Moreno

    2016-11-01

    Full Text Available The energy consumed to cool buildings is very high, and solar gains represent a large percentage of these cooling loads. To minimize the thermal load it is necessary to control external shading systems. This control requires continuous measurement of solar radiation at different locations of the building. However, for such applications the use of conventional irradiance sensors increases the cost and reduces the profitability of the installation. This paper is focused on the development, modeling, and experimental validation of low-cost irradiation sensors based on the photovoltaic effect, in order to reduce the costs of dynamic external shading devices and to improve the profitability of the system. With this proposal, firstly, small commercial photovoltaic cells have been adapted for use as an irradiation measurement device. Subsequently, quasi-stationary and continuous experimental measurements of these silicon cells, facing south and installed horizontally, have been carried out in Jaén (Spain) in 2009 and 2010. Finally, a nonlinear multiparameter function has been developed to evaluate the irradiance using the electric current generated by the cell, cell temperature, ambient temperature, and absolute humidity. A favorable agreement between the model predictions and experimental data has been observed, with a coefficient of determination around 0.996 for all cells.
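    The calibration step can be sketched as a multiparameter least-squares fit of irradiance against cell current, temperatures and humidity. The linear-in-parameters form, the coefficient values and the synthetic "measurements" below are assumptions for illustration, not the paper's fitted model.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 500
    I = rng.uniform(0.1, 5.0, n)      # cell current (A), hypothetical range
    Tc = rng.uniform(20, 60, n)       # cell temperature (deg C)
    Ta = rng.uniform(10, 35, n)       # ambient temperature (deg C)
    w = rng.uniform(5, 20, n)         # absolute humidity (g/m^3)

    true = np.array([200.0, 1.5, -0.8, 0.3])   # made-up "true" coefficients
    G_meas = true[0] * I + true[1] * Tc + true[2] * Ta + true[3] * w \
        + rng.normal(0, 5.0, n)                # reference pyranometer + noise

    X = np.column_stack([I, Tc, Ta, w])
    coef, *_ = np.linalg.lstsq(X, G_meas, rcond=None)
    pred = X @ coef
    r2 = 1 - np.sum((G_meas - pred) ** 2) / np.sum((G_meas - G_meas.mean()) ** 2)
    ```

    With a fit of this kind, the coefficient of determination plays the same role as the 0.996 figure reported in the abstract: it measures how well the low-cost cell reproduces the reference irradiance.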

  10. Use of Unified Modeling Language (UML) in Model-Based Development (MBD) For Safety-Critical Applications

    Science.gov (United States)

    2014-12-01

    requirements development phases. Although not fully realized today, the direction seems to be to produce fewer stacks of paper documents and have the...model be the artifact or be able to be used to produce the artifacts. Naturally, this produces fewer stand-alone paper artifacts and makes greater use...followed. For example, RTCA DO-178C [7], NASA-STD-8719.13B [27], and International Electrotechnical Commission (IEC) 61508 (specifically, Parts 3 [28

  11. PEM Fuel Cells - Fundamentals, Modeling and Applications

    Directory of Open Access Journals (Sweden)

    Maher A.R. Sadiq Al-Baghdadi

    2013-01-01

    Full Text Available Part I: Fundamentals Chapter 1: Introduction. Chapter 2: PEM fuel cell thermodynamics, electrochemistry, and performance. Chapter 3: PEM fuel cell components. Chapter 4: PEM fuel cell failure modes. Part II: Modeling and Simulation Chapter 5: PEM fuel cell models based on semi-empirical simulation. Chapter 6: PEM fuel cell models based on computational fluid dynamics. Part III: Applications Chapter 7: PEM fuel cell system design and applications.

  12. A Model for Water Quality Assessment Based on the Information Entropy and Its Application in the Case of Huiji River

    Institute of Scientific and Technical Information of China (English)

    Bingdong Zhao; Qingliang Zhao; Jianhua Ma; Hua Guan

    2004-01-01

    Based on the information entropy, a model for water quality assessment is established. Using this model, the paper gives a case study on the water quality assessment of the Huiji River. The space-time variation law of the water quality is analyzed as well. The result indicates that the model possesses clear mathematical and physical meaning, and that it is simple, practical and accurate.
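    The entropy-weight idea behind this kind of assessment can be sketched directly: indicators that vary more across sites carry more information and get larger weights. The indicator matrix below is made up for illustration, not Huiji River data.

    ```python
    import numpy as np

    # rows: sampling sites, columns: pollutant indicators (made-up concentrations)
    X = np.array([
        [0.8, 2.1, 0.30],
        [1.5, 3.4, 0.55],
        [0.6, 1.8, 0.20],
        [2.2, 4.0, 0.70],
    ])

    P = X / X.sum(axis=0)                          # proportion of each indicator per site
    m = X.shape[0]
    e = -(P * np.log(P)).sum(axis=0) / np.log(m)   # information entropy per indicator
    w = (1 - e) / (1 - e).sum()                    # entropy weights, summing to 1
    score = (X / X.max(axis=0)) @ w                # higher score = worse quality here
    ```

    The site with uniformly highest concentrations gets the worst score, and the weights follow directly from the data rather than from expert judgment, which is the practical appeal of the entropy approach.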

  13. A knowledge- and model-based system for automated weaning from mechanical ventilation: technical description and first clinical application.

    Science.gov (United States)

    Schädler, Dirk; Mersmann, Stefan; Frerichs, Inéz; Elke, Gunnar; Semmel-Griebeler, Thomas; Noll, Oliver; Pulletz, Sven; Zick, Günther; David, Matthias; Heinrichs, Wolfgang; Scholz, Jens; Weiler, Norbert

    2014-10-01

    To describe the principles and the first clinical application of a novel prototype automated weaning system called Evita Weaning System (EWS). EWS allows an automated control of all ventilator settings in pressure controlled and pressure support mode with the aim of decreasing the respiratory load of mechanical ventilation. Respiratory load takes inspired fraction of oxygen, positive end-expiratory pressure, pressure amplitude and spontaneous breathing activity into account. Spontaneous breathing activity is assessed by the number of controlled breaths needed to maintain a predefined respiratory rate. EWS was implemented as a knowledge- and model-based system that autonomously and remotely controlled a mechanical ventilator (Evita 4, Dräger Medical, Lübeck, Germany). In a selected case study (n = 19 patients), ventilator settings chosen by the responsible physician were compared with the settings 10 min after the start of EWS and at the end of the study session. Neither unsafe ventilator settings nor failure of the system occurred. All patients were successfully transferred from controlled ventilation to assisted spontaneous breathing in a mean time of 37 ± 17 min (± SD). Early settings applied by the EWS did not significantly differ from the initial settings, except for the fraction of oxygen in inspired gas. During the later course, EWS significantly modified most of the ventilator settings and reduced the imposed respiratory load. A novel prototype automated weaning system was successfully developed. The first clinical application of EWS revealed that its operation was stable, safe ventilator settings were defined and the respiratory load of mechanical ventilation was decreased.

  14. Mathematical modeling with multidisciplinary applications

    CERN Document Server

    Yang, Xin-She

    2013-01-01

    Features mathematical modeling techniques and real-world processes with applications in diverse fields Mathematical Modeling with Multidisciplinary Applications details the interdisciplinary nature of mathematical modeling and numerical algorithms. The book combines a variety of applications from diverse fields to illustrate how the methods can be used to model physical processes, design new products, find solutions to challenging problems, and increase competitiveness in international markets. Written by leading scholars and international experts in the field, the

  15. A Robust Design Applicability Model

    DEFF Research Database (Denmark)

    Ebro, Martin; Lars, Krogstie; Howard, Thomas J.

    2015-01-01

    This paper introduces a model for assessing the applicability of Robust Design (RD) in a project or organisation. The intention of the Robust Design Applicability Model (RDAM) is to provide support for decisions by engineering management considering the relevant level of RD activities. The applic...

  16. A Unified Architecture Model of Web Applications

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    With the increasing popularity, scale and complexity of web applications, their design and development are becoming more and more difficult. However, the current state of web application design and development is characterized by anarchy and ad hoc methodologies. One cause of this chaotic situation is that different researchers and designers have different understandings of web applications. In this paper, based on an explicit understanding of web applications, we present a unified architecture model of web applications, the four-view model, which addresses the analysis and design issues of web applications from four perspectives, namely the logical view, data view, navigation view and presentation view, each addressing a specific set of concerns. The purpose of the model is to provide a clear picture of web applications to alleviate the chaotic situation and facilitate their analysis, design and implementation.

  17. Application of remote sensing-based two-source energy balance model for mapping field surface fluxes with composite and component surface temperatures

    Science.gov (United States)

    Operational application of a remote sensing-based two-source energy balance model (TSEB) to estimate evapotranspiration (ET) and its components, evaporation (E) and transpiration (T), at a range of space and time scales is very useful for managing water resources in arid and semiarid watersheds. The TSE...

  18. Application of model studies for quality control of bottom pressure based GLOSS sea level gauge at Takoradi Harbour (Ghana, West Africa)

    Digital Repository Service at National Institute of Oceanography (India)

    Joseph, A.; Mehra, P.; Desai, R.G.P.; Dotse, J.; Odammetey, J.T.; Nkebi, E.K.; VijayKumar, K.; Prabhudesai, S.


  19. Application of thermodynamics-based rate-dependent constitutive models of concrete in the seismic analysis of concrete dams

    Directory of Open Access Journals (Sweden)

    Fei LENG

    2008-09-01

    Full Text Available This paper discusses the seismic analysis of concrete dams with consideration of material nonlinearity. Based on a consistent rate-dependent model and two thermodynamics-based models, two thermodynamics-based rate-dependent constitutive models were developed with consideration of the influence of the strain rate. They can describe the dynamic behavior of concrete and be applied to nonlinear seismic analysis of concrete dams taking into account the rate sensitivity of concrete. With the two models, a nonlinear analysis of the seismic response of the Koyna Gravity Dam and the Dagangshan Arch Dam was conducted. The results were compared with those of a linear elastic model and two rate-independent thermodynamics-based constitutive models, and the influences of constitutive models and strain rate on the seismic response of concrete dams were discussed. It can be concluded from the analysis that, during seismic response, the tensile stress is the control stress in the design and seismic safety evaluation of concrete dams. In different models, the plastic strain and plastic strain rate of concrete dams show a similar distribution. When the influence of the strain rate is considered, the maximum plastic strain and plastic strain rate decrease.

  20. Wind-Climate Estimation Based on Mesoscale and Microscale Modeling: Statistical-Dynamical Downscaling for Wind Energy Applications

    DEFF Research Database (Denmark)

    Badger, Jake; Frank, Helmut; Hahmann, Andrea N.;

    2014-01-01

    turbine site. The method is divided into two parts: 1) preprocessing, in which the configurations for the mesoscale model simulations are determined, and 2) postprocessing, in which the data from the mesoscale simulations are prepared for wind energy application. Results from idealized mesoscale modeling...... experiments for a challenging wind farm site in northern Spain are presented to support the preprocessing method. Comparisons of modeling results with measurements from the same wind farm site are presented to support the postprocessing method. The crucial element in postprocessing is the bridging...... of mesoscale modeling data to microscale modeling input data, via a so-called generalization method. With this method, very high-resolution wind resource mapping can be achieved....

  1. A nonparametric urn-based approach to interacting failing systems with an application to credit risk modeling

    CERN Document Server

    Cirillo, Pasquale; Muliere, Pietro

    2010-01-01

    In this paper we propose a new nonparametric approach to interacting failing systems (FS), that is, systems whose probability of failure is not negligible over a fixed time horizon, typical examples being firms and financial bonds. The main purpose when studying an FS is to calculate the probability of default and the distribution of the number of failures that may occur during the observation period. A model used to study a failing system is called a default model. In particular, we present a general recursive model constructed by means of interacting urns. After introducing the theoretical model and its properties, we show a first application to credit risk modeling, describing how to assess the idiosyncratic probability of default of an obligor and the joint probability of failure of a set of obligors in a portfolio of risks that are divided into reliability classes.
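    The flavor of a reinforced-urn default model can be shown with a toy Polya-style simulation: each default adds red balls to a shared urn, so defaults make further defaults likelier (contagion). The urn composition and reinforcement rule are illustrative, not the paper's recursive construction.

    ```python
    import random

    def simulate_defaults(n_obligors=100, red=1, black=9, reinforce=2, seed=7):
        """Sequentially draw each obligor's fate from a shared reinforced urn."""
        random.seed(seed)
        defaults = 0
        for _ in range(n_obligors):
            if random.random() < red / (red + black):
                defaults += 1
                red += reinforce    # a default reinforces the default colour
            else:
                black += 1          # a survival reinforces the survival colour
        return defaults

    d = simulate_defaults()
    ```

    Because the urn is shared, the obligors' fates are exchangeable but not independent, which is exactly the dependence structure needed to model joint failures in a portfolio.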

  2. Logic models to predict continuous outputs based on binary inputs with an application to personalized cancer therapy

    Science.gov (United States)

    Knijnenburg, Theo A.; Klau, Gunnar W.; Iorio, Francesco; Garnett, Mathew J.; McDermott, Ultan; Shmulevich, Ilya; Wessels, Lodewyk F. A.

    2016-11-01

    Mining large datasets using machine learning approaches often leads to models that are hard to interpret and not amenable to the generation of hypotheses that can be experimentally tested. We present ‘Logic Optimization for Binary Input to Continuous Output’ (LOBICO), a computational approach that infers small and easily interpretable logic models of binary input features that explain a continuous output variable. Applying LOBICO to a large cancer cell line panel, we find that logic combinations of multiple mutations are more predictive of drug response than single gene predictors. Importantly, we show that the use of the continuous information leads to robust and more accurate logic models. LOBICO implements the ability to uncover logic models around predefined operating points in terms of sensitivity and specificity. As such, it represents an important step towards practical application of interpretable logic models.
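    The core idea, searching for a small logic formula over binary features that best explains a continuous response, can be sketched with a brute-force search over single features and pairwise AND/OR combinations. This search space is far smaller than LOBICO's, and the synthetic mutation/response data are assumptions for illustration.

    ```python
    import itertools
    import numpy as np

    def best_logic_model(X, y):
        """Score each candidate formula by the squared error of its two-group means."""
        n, p = X.shape
        candidates = [(("feat", j), X[:, j].astype(bool)) for j in range(p)]
        for j, k in itertools.combinations(range(p), 2):
            a, b = X[:, j].astype(bool), X[:, k].astype(bool)
            candidates.append((("and", j, k), a & b))
            candidates.append((("or", j, k), a | b))
        best = None
        for desc, mask in candidates:
            if mask.all() or (~mask).all():
                continue                              # degenerate split
            pred = np.where(mask, y[mask].mean(), y[~mask].mean())
            sse = float(((y - pred) ** 2).sum())
            if best is None or sse < best[0]:
                best = (sse, desc)
        return best

    # synthetic panel: response is high only when mutations 0 AND 2 co-occur
    rng = np.random.default_rng(3)
    X = rng.integers(0, 2, size=(200, 5))
    y = 5.0 * (X[:, 0] & X[:, 2]) + rng.normal(0, 0.5, 200)
    sse, desc = best_logic_model(X, y)
    ```

    Using the continuous response (rather than thresholding it first) is what lets the search recover the AND combination cleanly, mirroring the abstract's point about continuous information yielding more robust logic models.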

  3. Advances in inline quantification of co-eluting proteins in chromatography: Process-data-based model calibration and application towards real-life separation issues.

    Science.gov (United States)

    Brestrich, Nina; Sanden, Adrian; Kraft, Axel; McCann, Karl; Bertolini, Joseph; Hubbuch, Jürgen

    2015-07-01

    Pooling decisions in preparative liquid chromatography for protein purification are usually based on univariate UV absorption measurements that are not able to differentiate between product and co-eluting contaminants. This can result in inconsistent pool purities or yields, if there is a batch-to-batch variability of the feedstock. To overcome this analytical bottleneck, a tool for selective inline quantification of co-eluting model proteins using mid-UV absorption spectra and Partial Least Squares Regression (PLS) was presented in a previous study and applied for real-time pooling decisions. In this paper, a process-data-based method for the PLS model calibration will be introduced that allows the application of the tool towards chromatography steps of real-life processes. The process-data-based calibration method uses recorded inline mid-UV absorption spectra that are correlated with offline fraction analytics to calibrate PLS models. In order to generate average spectra from the inline data, a Visual Basic for Application macro was successfully developed. The process-data-based model calibration was established using a ternary model protein system. Afterwards, it was successfully demonstrated in two case studies that the calibration method is applicable towards real-life separation issues. The calibrated PLS models allowed a successful quantification of the co-eluting species in a cation-exchange-based aggregate and fraction removal during the purification of monoclonal antibodies and of co-eluting serum proteins in an anion-exchange-based purification of Cohn supernatant I. Consequently, the presented process-data-based PLS model calibration in combination with the tool for selective inline quantification has a great potential for the monitoring of future chromatography steps and may contribute to manage batch-to-batch variability by real-time pooling decisions.
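    The calibration idea, regressing one species' concentration on full mixture spectra, can be sketched with a minimal NIPALS PLS1 implementation. The two Gaussian "pure spectra", the noise level and the number of latent components are assumptions; the paper's models are calibrated on recorded mid-UV process data, not synthetic mixtures.

    ```python
    import numpy as np

    def pls1_fit(X, y, n_components=2):
        """Minimal NIPALS PLS1: returns a predictor for one response."""
        Xm, ym = X.mean(axis=0), y.mean()
        Xk, yk = X - Xm, y - ym
        W, P, q = [], [], []
        for _ in range(n_components):
            w = Xk.T @ yk
            w /= np.linalg.norm(w)
            t = Xk @ w
            p = Xk.T @ t / (t @ t)
            W.append(w); P.append(p); q.append(yk @ t / (t @ t))
            Xk = Xk - np.outer(t, p)          # deflate X
            yk = yk - q[-1] * t               # deflate y
        W, P, q = np.array(W).T, np.array(P).T, np.array(q)
        B = W @ np.linalg.solve(P.T @ W, q)   # coefficients in original space
        return lambda Xnew: (Xnew - Xm) @ B + ym

    rng = np.random.default_rng(0)
    lam = np.linspace(0.0, 1.0, 60)
    s1 = np.exp(-((lam - 0.3) ** 2) / 0.01)   # pure spectrum, target species
    s2 = np.exp(-((lam - 0.6) ** 2) / 0.01)   # co-eluting species
    c = rng.uniform(0.0, 1.0, (80, 2))        # concentrations per "fraction"
    spectra = c @ np.vstack([s1, s2]) + rng.normal(0, 0.01, (80, 60))
    predict = pls1_fit(spectra, c[:, 0])
    resid = predict(spectra) - c[:, 0]
    r2 = 1 - np.sum(resid ** 2) / np.sum((c[:, 0] - c[:, 0].mean()) ** 2)
    ```

    Because the latent components capture the correlated spectral channels, the model can quantify the target species even when the second species overlaps it, which is the analytical bottleneck the tool addresses.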

  4. A Model of Application System for Man-Machine-Environment System Engineering in Vessels Based on IDEF0

    Institute of Scientific and Technical Information of China (English)

    Zhen Shang; Changhua Qiu; Shifan Zhu

    2011-01-01

    Applying man-machine-environment system engineering (MMESE) in vessels is a method to improve the effectiveness of the interaction between equipment, environment, and humans for the purpose of advancing the operating efficiency, performance, safety, and habitability of a vessel and its subsystems. In the following research, the life cycle of vessels was divided into 9 phases, and 15 research subjects were identified from among these phases. The 15 subjects were systemized, and the man-machine-environment system engineering application model for vessels was then developed using the ICAM definition method 0 (IDEF0), a systematic modeling method. This system model bridges the gap between the data and information flow of every two associated subjects, with the major basic research methods and approaches included, which brings the formerly relatively independent subjects together as a whole. The application of this systematic model should facilitate the application of man-machine-environment system engineering in vessels, especially at the conceptual and embodiment design phases. Managers and designers can deal with detailed tasks quickly and efficiently while reducing repetitive work.

  5. Roadway management plan based on rockfall modelling calibration and validation. Application along the Ma-10 road in Mallorca (Spain)

    Science.gov (United States)

    Mateos, Rosa Maria; Garcia, Inmaculada; Reichenbach, Paola; Herrera, Gerardo; Sarro, Roberto; Rius, Joan; Aguilo, Raul

    2016-04-01

    The Tramuntana range, in the northwestern sector of the island of Mallorca (Spain), is frequently affected by rockfalls which have caused significant damage, mainly along the road network. The Ma-10 road constitutes the main transportation corridor on the range, with heavy traffic estimated at 7,200 vehicles per day on average. With a length of 111 km and a tortuous path, the road is the connecting track for 12 municipalities and constitutes a strategic road on the island for many tourist resorts. For the period spanning from 1995 to the present, 63 rockfalls have affected the Ma-10 road, with volumes ranging from 0.3 m3 to 30,000 m3. Fortunately, no fatalities occurred, but numerous blockages of the road took place, causing significant economic losses valued at around 11 MEuro (Mateos et al., 2013). In this work we present the procedure we have applied to calibrate and validate rockfall modelling in the Tramuntana region, using 103 cases of the available detailed rockfall inventory (Mateos, 2006). We have exploited STONE (Guzzetti et al. 2002), a GIS-based rockfall simulation software which computes 2D and 3D rockfall trajectories starting from a DTM and maps of the dynamic rolling friction coefficient and of the normal and tangential energy restitution coefficients. The appropriate identification of these parameters determines the accuracy of the simulation. To calibrate them, we have selected 40 rockfalls along the range which include a wide variety of outcropping lithologies. Coefficient values were varied in numerous attempts in order to select those for which the extent and shape of the simulation matched the field mapping. The best results were summarized with the average statistical values for each parameter and for each geotechnical unit, determining that mode values represent the data more precisely. Initially, for the validation stage, 10 well-known rockfalls exploited in the calibration phase were selected. Confidence tests have been applied
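    The role of the restitution coefficients being calibrated can be illustrated with a lumped-mass bounce sketch: a rock flies ballistically over a planar slope and, at each impact, its velocity components in slope coordinates are scaled by the tangential (Rt) and normal (Rn) coefficients. The slope angle, launch velocity and coefficient values are illustrative, not STONE's calibrated values.

    ```python
    import math

    def rockfall_bounces(rn=0.35, rt=0.75, slope_deg=35.0,
                         vx0=1.0, vy0=4.0, n_bounces=8):
        """Track successive impacts in slope-aligned coordinates
        (x down-slope, y normal to the slope surface)."""
        g = 9.81
        th = math.radians(slope_deg)
        gpar, gperp = g * math.sin(th), g * math.cos(th)
        x, vx, vy = 0.0, vx0, vy0
        impacts = []
        for _ in range(n_bounces):
            t = 2.0 * vy / gperp               # flight time back to the surface
            x += vx * t + 0.5 * gpar * t * t   # down-slope travel during flight
            vx = rt * (vx + gpar * t)          # tangential restitution at impact
            vy = rn * vy                       # normal restitution at impact
            impacts.append((x, vx, vy))
        return impacts

    impacts = rockfall_bounces()
    ```

    Calibration then amounts to tuning Rn and Rt (per geotechnical unit) until simulated trajectories and runout match the mapped rockfall extents, which is what the 40-event calibration set is used for.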

  6. Model Based Definition

    Science.gov (United States)

    Rowe, Sidney E.

    2010-01-01

    In September 2007, the Engineering Directorate at the Marshall Space Flight Center (MSFC) created the Design System Focus Team (DSFT). MSFC was responsible for the in-house design and development of the Ares 1 Upper Stage and the Engineering Directorate was preparing to deploy a new electronic Configuration Management and Data Management System with the Design Data Management System (DDMS) based upon a Commercial Off The Shelf (COTS) Product Data Management (PDM) System. The DSFT was to establish standardized CAD practices and a new data life cycle for design data. Of special interest here, the design teams were to implement Model Based Definition (MBD) in support of the Upper Stage manufacturing contract. It is noted that this MBD does use partially dimensioned drawings for auxiliary information to the model. The design data lifecycle implemented several new release states to be used prior to formal release that allowed the models to move through a flow of progressive maturity. The DSFT identified some 17 Lessons Learned as outcomes of the standards development, pathfinder deployments and initial application to the Upper Stage design completion. Some of the high value examples are reviewed.

  7. A Habitat-based Wind-Wildlife Collision Model with Application to the Upper Great Plains Region

    Energy Technology Data Exchange (ETDEWEB)

    Forcey, Greg, M.

    2012-08-28

    compared among species, our model outputs provide a convenient and easy landscape-level tool to quickly screen for siting issues at a high level. The model resolution is suitable for state or multi-county siting, but users are cautioned against using these models for micrositing. The U.S. Fish and Wildlife Service recently released voluntary land-based wind energy guidelines for assessing impacts of a wind facility on wildlife using a tiered approach. The tiered approach uses an iterative process for assessing impacts to wildlife in levels of increasing detail, from landscape-level screening to site-specific field studies. The models presented in this paper are applicable as tools for screening at the tier 1 level and would not be appropriate for smaller scale tier 2 and tier 3 level studies. For smaller-scale screening, ancillary field studies should be conducted at the site-specific level to validate collision predictions.

  8. MIRAGE: a functional genomics-based approach for metabolic network model reconstruction and its application to cyanobacteria networks.

    Science.gov (United States)

    Vitkin, Edward; Shlomi, Tomer

    2012-11-29

    Genome-scale metabolic network reconstructions are considered a key step in quantifying the genotype-phenotype relationship. We present a novel gap-filling approach, MetabolIc Reconstruction via functionAl GEnomics (MIRAGE), which identifies missing network reactions by integrating metabolic flux analysis and functional genomics data. MIRAGE's performance is demonstrated on the reconstruction of metabolic network models of E. coli and Synechocystis sp. and validated against existing networks for these species. It is then applied to reconstruct genome-scale metabolic network models for 36 sequenced cyanobacteria, amenable to constraint-based modeling analysis and specifically to metabolic engineering. The reconstructed network models are supplied as standard SBML files.

  9. Structural equation modeling methods and applications

    CERN Document Server

    Wang, Jichuan

    2012-01-01

    A reference guide for applications of SEM using Mplus Structural Equation Modeling: Applications Using Mplus is intended as both a teaching resource and a reference guide. Written in non-mathematical terms, this book focuses on the conceptual and practical aspects of Structural Equation Modeling (SEM). Basic concepts and examples of various SEM models are demonstrated along with recently developed advanced methods, such as mixture modeling and model-based power analysis and sample size estimate for SEM. The statistical modeling program, Mplus, is also featured and provides researchers with a

  10. Generic Models of Wind Turbine Generators for Advanced Applications in a VSC-based Offshore HVDC Network

    DEFF Research Database (Denmark)

    Zeni, Lorenzo; Margaris, Ioannis; Hansen, Anca Daniela;

    This paper focuses on generic Type 4 wind turbine generator models, their applicability in modern HVDC connections and their capability to provide advanced ancillary services. A point-to-point HVDC offshore connection is considered. Issues concerning coordinated HVDC and wind farm...... involving the HVDC converters. The performance against frequency disturbances of the two presented configurations is assessed and discussed by means of simulations.

  11. Research and Application of Model Transformation Based on MDA

    Institute of Scientific and Technical Information of China (English)

    王永涛; 刘勇

    2011-01-01

    Model-driven methods address the low efficiency and poor portability of software development; model transformation is the key technology in developing application tools based on Model Driven Architecture (MDA). This paper introduces MDA technology and proposes a pattern-based method for transforming a Platform Independent Model (PIM) into a Platform Specific Model (PSM). Transformation rules are established according to this method and verified in an MDA application development instance, realizing the transformation from a platform-independent model to the Enterprise JavaBeans (EJB) model of the J2EE platform.
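    The PIM-to-PSM step described above can be pictured as a rule table keyed on model-element stereotypes. A minimal sketch, with entirely hypothetical class names and an invented EJB mapping (the paper's actual transformation rules are not reproduced):

```python
# Minimal rule-based PIM -> PSM transformation in the spirit of MDA.
# Class names, rules and the EJB target mapping are hypothetical.
PIM = [
    {"name": "Order",    "kind": "entity"},
    {"name": "Checkout", "kind": "service"},
]

# Pattern-based transformation rules: PIM stereotype -> J2EE/EJB artifact.
RULES = {
    "entity":  lambda c: {"artifact": f"{c['name']}EntityBean",  "platform": "EJB"},
    "service": lambda c: {"artifact": f"{c['name']}SessionBean", "platform": "EJB"},
}

def transform(pim, rules):
    """Apply the matching rule to every PIM element to obtain the PSM."""
    return [rules[c["kind"]](c) for c in pim]

PSM = transform(PIM, RULES)
for element in PSM:
    print(element["artifact"])   # OrderEntityBean, CheckoutSessionBean
```

    Real MDA toolchains express such rules in dedicated transformation languages (e.g. QVT or ATL); the dictionary of lambdas above only illustrates the pattern-matching idea.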

  12. APPLICATION OF TWO VERSIONS OF A RNG BASED k-ε MODEL TO NUMERICAL SIMULATIONS OF TURBULENT IMPINGING JET FLOW

    Institute of Scientific and Technical Information of China (English)

    Chen Qing-guang; Xu Zhong; Zhang Yong-jian

    2003-01-01

    Two independent versions of the RNG-based k-ε turbulence model, in conjunction with the law of the wall, have been applied to the numerical simulation of an axisymmetric turbulent impinging jet flow field. The two model predictions are compared with those of the standard k-ε model and with experimental data measured by LDV (Laser Doppler Velocimetry). It is shown that the original version of the RNG k-ε model with the choice Cε1=1.063 cannot yield good results; the predicted turbulent kinetic energy profiles in the vicinity of the stagnation region are even worse than those predicted by the standard k-ε model. However, the new version of the RNG k-ε model behaves well. This is mainly due to the corrections to the constants Cε1 and Cε2 along with a modification of the production term to account for non-equilibrium strain rates in the flow.
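    For context, the k-ε family solves two transport equations for the turbulent kinetic energy k and its dissipation rate ε. A standard textbook form (not the specific RNG variants compared in this record) is:

```latex
\frac{\partial(\rho k)}{\partial t} + \frac{\partial(\rho k u_i)}{\partial x_i}
 = \frac{\partial}{\partial x_j}\!\left[\left(\mu + \frac{\mu_t}{\sigma_k}\right)\frac{\partial k}{\partial x_j}\right] + P_k - \rho\varepsilon
```

```latex
\frac{\partial(\rho \varepsilon)}{\partial t} + \frac{\partial(\rho \varepsilon u_i)}{\partial x_i}
 = \frac{\partial}{\partial x_j}\!\left[\left(\mu + \frac{\mu_t}{\sigma_\varepsilon}\right)\frac{\partial \varepsilon}{\partial x_j}\right]
 + C_{\varepsilon 1}\,\frac{\varepsilon}{k}\,P_k - C_{\varepsilon 2}\,\rho\,\frac{\varepsilon^2}{k}
```

    The standard model uses Cε1 = 1.44 and Cε2 = 1.92; the RNG derivation yields Cε1 ≈ 1.42 and Cε2 ≈ 1.68 together with an additional strain-rate-dependent term in the ε equation, which is the non-equilibrium correction the abstract refers to. The value Cε1 = 1.063 is the particular choice tested (and found wanting) in the paper.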

  13. Graph Model Based Indoor Tracking

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard; Lu, Hua; Yang, Bin

    2009-01-01

    The tracking of the locations of moving objects in large indoor spaces is important, as it enables a range of applications related to, e.g., security and indoor navigation and guidance. This paper presents a graph model based approach to indoor tracking that offers a uniform data management...... infrastructure for different symbolic positioning technologies, e.g., Bluetooth and RFID. More specifically, the paper proposes a model of indoor space that comprises a base graph and mappings that represent the topology of indoor space at different levels. The resulting model can be used for one or several...... indoor positioning technologies. Focusing on RFID-based positioning, an RFID specific reader deployment graph model is built from the base graph model. This model is then used in several algorithms for constructing and refining trajectories from raw RFID readings. Empirical studies with implementations...
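    The deployment-graph idea above can be sketched with a plain adjacency structure: readers are nodes, edges connect readers whose detection ranges are adjacent along walkable paths, and raw readings are refined into a trajectory by dropping duplicates and graph-impossible jumps. Reader names, topology and the refinement rule here are illustrative, not the paper's algorithms:

```python
# Sketch of graph-model-based refinement of raw RFID readings
# (hypothetical deployment graph; not the paper's data model).
deployment_graph = {
    "r1": {"r2"},
    "r2": {"r1", "r3"},
    "r3": {"r2", "r4"},
    "r4": {"r3"},
}

def refine_trajectory(raw_readings, graph):
    """Collapse repeated detections and drop transitions between
    readers that are not adjacent in the deployment graph."""
    trajectory = []
    for reader in raw_readings:
        if trajectory and reader == trajectory[-1]:
            continue                      # same cell detected again
        if trajectory and reader not in graph[trajectory[-1]]:
            continue                      # spurious reading: unreachable jump
        trajectory.append(reader)
    return trajectory

readings = ["r1", "r1", "r2", "r4", "r3", "r3", "r4"]
print(refine_trajectory(readings, deployment_graph))  # ['r1', 'r2', 'r3', 'r4']
```

    The paper's contribution is richer than this (multi-level mappings over a base graph, support for several positioning technologies), but the refinement step follows this shape: the graph supplies the reachability constraint that cleans the raw reading stream.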

  14. Effect of experimental design on the prediction performance of calibration models based on near-infrared spectroscopy for pharmaceutical applications.

    Science.gov (United States)

    Bondi, Robert W; Igne, Benoît; Drennen, James K; Anderson, Carl A

    2012-12-01

    Near-infrared spectroscopy (NIRS) is a valuable tool in the pharmaceutical industry, presenting opportunities for online analyses to achieve real-time assessment of intermediates and finished dosage forms. The purpose of this work was to investigate the effect of experimental designs on prediction performance of quantitative models based on NIRS using a five-component formulation as a model system. The following experimental designs were evaluated: five-level, full factorial (5-L FF); three-level, full factorial (3-L FF); central composite; I-optimal; and D-optimal. The factors for all designs were acetaminophen content and the ratio of microcrystalline cellulose to lactose monohydrate. Other constituents included croscarmellose sodium and magnesium stearate (content remained constant). Partial least squares-based models were generated using data from individual experimental designs that related acetaminophen content to spectral data. The effect of each experimental design was evaluated by determining the statistical significance of differences in bias and standard error of prediction for each model's performance. The calibration model derived from the I-optimal design had prediction performance similar to that of the model derived from the 5-L FF design, despite containing 16 fewer design points. It also outperformed all other models estimated from designs with similar or fewer numbers of samples. This suggests that experimental-design selection for calibration-model development is critical, and that optimum performance can be achieved with efficient experimental designs (i.e., optimal designs).
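    The effect of design choice on calibration error can be illustrated with a toy simulation: build full-factorial designs at different level counts, simulate linear two-wavelength "spectra", fit a calibration, and compare standard error of prediction (SEP) on an independent test set. Everything here is invented for illustration (ordinary least squares stands in for PLS; sensitivities and noise are hypothetical):

```python
import numpy as np

# Illustrative sketch, not the paper's data or PLS models.
rng = np.random.default_rng(0)

def make_design(levels):
    """Full factorial: API content crossed with a filler-ratio factor."""
    api = np.repeat(levels, len(levels))
    ratio = np.tile(levels, len(levels))
    return np.column_stack([api, ratio])

def simulate_spectra(design):
    """Toy 2-wavelength spectra: linear in composition plus noise."""
    K = np.array([[1.0, 0.2], [0.3, 1.5]])    # hypothetical sensitivities
    return design @ K + rng.normal(0, 0.02, (len(design), 2))

def calibrate(X, y):
    """Ordinary least squares with intercept (stand-in for PLS)."""
    A = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

design5 = make_design(np.linspace(0.1, 0.5, 5))   # 25 calibration points
design3 = make_design(np.linspace(0.1, 0.5, 3))   # 9 calibration points
test = make_design(np.linspace(0.12, 0.48, 4))    # independent test set

for name, d in [("5-level", design5), ("3-level", design3)]:
    beta = calibrate(simulate_spectra(d), d[:, 0])    # predict API content
    pred = np.column_stack([np.ones(len(test)), simulate_spectra(test)]) @ beta
    sep = np.sqrt(np.mean((pred - test[:, 0]) ** 2))
    print(f"{name} design: SEP = {sep:.4f}")
```

    In this idealized linear setting both designs calibrate well; the paper's point is that with real spectra and multivariate models, design geometry (optimal vs. factorial) materially changes bias and SEP for a given number of samples.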

  15. PLiMoS, a DSML to Reify Semantics Relationships: An Application to Model-Based Product Lines

    OpenAIRE

    2013-01-01

    In the Model-Based Product Line Engineering (MBPLE) context, modularization and separation of concerns have been introduced to master the inherent complexity of current developments. With the aim to exploit efficiently the variabilities and commonalities in MBPLs, the challenge of management of dependencies becomes essential (e.g. hierarchical and variability decomposition, inter-dependencies between models). However, one may observe that, in existing approaches, relational information (i) is m...

  16. Hyaluronic acid algorithm-based models for assessment of liver fibrosis: translation from basic science to clinical application

    Institute of Scientific and Technical Information of China (English)

    Zeinab Babaei; Hadi Parsian

    2016-01-01

    BACKGROUND: The estimation of liver fibrosis is usually dependent on liver biopsy evaluation. Because of its disadvantages and side effects, researchers have tried to find non-invasive methods for the assessment of liver injuries. Hyaluronic acid has been proposed as an index for scoring the severity of fibrosis, alone or in algorithm models. Algorithm models in which hyaluronic acid is a major constituent are more reliable and accurate in diagnosis than hyaluronic acid alone. This review describes various hyaluronic acid algorithm-based models for assessing liver fibrosis. DATA SOURCE: A PubMed database search was performed to identify articles relevant to hyaluronic acid algorithm-based models for estimating liver fibrosis. RESULT: The use of hyaluronic acid in an algorithm model is an extra and valuable tool for assessing liver fibrosis. CONCLUSIONS: Although hyaluronic acid algorithm-based models have good diagnostic power in liver fibrosis assessment, they cannot render liver biopsy obsolete, and it is better to use them in parallel with liver biopsy. They can be used when frequent liver biopsy is not possible, such as when monitoring the efficacy of a treatment protocol for liver fibrosis.

  17. Application of Kriging-based optimization and smoothed particle hydrodynamics in development of a microchannel heat exchanger model

    Energy Technology Data Exchange (ETDEWEB)

    McLellan, J.; Kaya, T.; Goldak, J., E-mail: joshuamclellan@cmail.carleton.ca, E-mail: Tarik.Kaya@carleton.ca, E-mail: jgoldak@mrco2.carleton.ca [Carleton Univ., Ottawa, Ontario (Canada)

    2014-07-01

    A microchannel heat exchanger (MCHX) is a technology that provides increased thermal efficiency in a small volume relative to other types of heat exchangers via an extremely high surface area-to-volume ratio. This characteristic is specifically valued when considering use in small modular reactors. With relatively little design information for commercial MCHXs available in open literature, development of a robust model and optimization thereof for use in nuclear reactors takes on significant importance. Some applications of this technology involve phase change, which is a challenging modelling problem given large volumetric changes of the liquid and gas phases, as well as moving boundaries at the phase interface. This problem is mitigated by use of a Lagrangian formulation such as smoothed particle hydrodynamics (SPH), whose use in the development of the MCHX model is discussed. Additionally, the Kriging optimization algorithm is introduced, including its use to generate a suitable MCHX design. (author)

  18. Noise model based ν-support vector regression with its application to short-term wind speed forecasting.

    Science.gov (United States)

    Hu, Qinghua; Zhang, Shiguang; Xie, Zongxia; Mi, Jusheng; Wan, Jie

    2014-09-01

    Support vector regression (SVR) techniques aim to discover a linear or nonlinear structure hidden in sample data. Most existing regression techniques assume that the error distribution is Gaussian. However, it has been observed that the noise in some real-world applications, such as wind power forecasting and the direction-of-arrival estimation problem, does not follow a Gaussian distribution but rather a beta distribution, Laplacian distribution, or other model. In these cases the current regression techniques are not optimal. Following the Bayesian approach, we derive a general loss function and develop a uniform model of ν-support vector regression for the general noise model (N-SVR). The augmented Lagrange multiplier method is introduced to solve N-SVR. Numerical experiments on artificial data sets, UCI data and short-term wind speed prediction are conducted, and the results show the effectiveness of the proposed technique.
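    The core idea, that the loss function should match the noise model, can be shown with a toy linear fit: under Laplacian noise, the Bayesian-optimal loss is absolute error (L1), not the squared error (L2) implied by a Gaussian assumption. The sketch below uses plain (sub)gradient descent, not the paper's N-SVR algorithm; all parameters are illustrative:

```python
import numpy as np

# Toy demonstration: fit y = w*x + b under Laplacian noise with an
# L2 loss (Gaussian assumption) vs an L1 loss (Laplacian assumption).
rng = np.random.default_rng(1)
n = 2000
x = rng.uniform(-1, 1, n)
y = 2.0 * x + 0.5 + rng.laplace(0, 0.3, n)   # true w=2.0, b=0.5

def fit(loss_grad, steps=3000, lr=0.01):
    """Gradient descent on mean loss; loss_grad maps residuals to d(loss)/dr."""
    w, b = 0.0, 0.0
    for _ in range(steps):
        r = w * x + b - y
        g = loss_grad(r)
        w -= lr * np.mean(g * x)
        b -= lr * np.mean(g)
    return w, b

w2, b2 = fit(lambda r: 2 * r)        # L2: gradient of r**2
w1, b1 = fit(lambda r: np.sign(r))   # L1: subgradient of |r|
print(f"L2 fit: w={w2:.3f}, b={b2:.3f}")
print(f"L1 fit: w={w1:.3f}, b={b1:.3f}")
```

    Both estimators are consistent here because the Laplacian is symmetric, but the L1 fit is the maximum-likelihood choice for this noise and is markedly more robust when heavy-tailed outliers are added; N-SVR generalizes this by deriving the loss from an arbitrary assumed noise density.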

  19. Three-parameter-based streamflow elasticity model: application to MOPEX basins in the USA at annual and seasonal scales

    Science.gov (United States)

    Konapala, Goutam; Mishra, Ashok K.

    2016-07-01

    We present a three-parameter streamflow elasticity model as a function of precipitation, potential evaporation, and change in groundwater storage applicable at both seasonal and annual scales. The model was applied to 245 Model Parameter Estimation Experiment (MOPEX) basins spread across the continental USA. The analysis of the modified equation at annual and seasonal scales indicated that the groundwater and surface water storage change contributes significantly to the streamflow elasticity. Overall, in case of annual as well as seasonal water balances, precipitation has higher elasticity values when compared to both potential evapotranspiration and storage changes. The streamflow elasticities show significant nonlinear associations with the climate conditions of the catchments indicating a complex interplay between elasticities and climate variables with substantial seasonal variations.
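    A common nonparametric estimator of streamflow elasticity to precipitation takes the median of year-to-year sensitivities scaled by the long-term means; the paper's three-parameter model extends this idea to potential evaporation and storage change, but the exact formulation below is only the standard single-driver version for illustration:

```python
import numpy as np

def precip_elasticity(Q, P):
    """Nonparametric precipitation elasticity of streamflow:
    median of (dQ/dP) * (mean P / mean Q) over the record."""
    Q, P = np.asarray(Q, float), np.asarray(P, float)
    Qm, Pm = Q.mean(), P.mean()
    dQ, dP = Q - Qm, P - Pm
    mask = dP != 0                      # avoid division by zero
    return np.median(dQ[mask] / dP[mask] * (Pm / Qm))

# Sanity check: if runoff is strictly proportional to precipitation,
# a 1% change in P gives a 1% change in Q, i.e. elasticity 1.
P = np.array([800.0, 950.0, 700.0, 1100.0, 900.0, 1000.0])  # mm/yr (made up)
Q = 0.5 * P
print(precip_elasticity(Q, P))          # → 1.0
```

    Elasticity values above 1 (common in the MOPEX basins) indicate that streamflow amplifies precipitation anomalies, which is where the competing potential-evaporation and storage-change terms of the three-parameter model become informative.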

  20. Volterra-series-based nonlinear system modeling and its engineering applications: A state-of-the-art review

    Science.gov (United States)

    Cheng, C. M.; Peng, Z. K.; Zhang, W. M.; Meng, G.

    2017-03-01

    Nonlinear problems have drawn great interest and extensive attention from engineers, physicists, mathematicians and many other scientists, because most real systems are inherently nonlinear in nature. To model and analyze nonlinear systems, many mathematical theories and methods have been developed, including the Volterra series. In this paper, the basic definition of the Volterra series is recapitulated, together with some frequency-domain concepts derived from it, including the generalized frequency response function (GFRF), the nonlinear output frequency response function (NOFRF), the output frequency response function (OFRF) and the associated frequency response function (AFRF). The relationships between the Volterra series and other nonlinear system models and nonlinear problem-solving methods are discussed, including the Taylor series, Wiener series, NARMAX model, Hammerstein model, Wiener model, Wiener-Hammerstein model, harmonic balance method, perturbation method and Adomian decomposition. Challenging problems and the state of the art in series convergence and kernel identification are comprehensively introduced. In addition, a detailed review is given of the applications of the Volterra series in mechanical engineering, aeroelasticity, control engineering, and electronic and electrical engineering.
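    For reference, the Volterra series represents the output of a (time-invariant, fading-memory) nonlinear system as a sum of multidimensional convolutions of the input with kernels h_n:

```latex
y(t) = h_0 + \sum_{n=1}^{\infty}
\int_{-\infty}^{\infty}\!\cdots\!\int_{-\infty}^{\infty}
h_n(\tau_1,\ldots,\tau_n)\,\prod_{i=1}^{n} x(t-\tau_i)\;
d\tau_1 \cdots d\tau_n
```

    and the generalized frequency response functions mentioned in the abstract are the multidimensional Fourier transforms of these kernels:

```latex
H_n(j\omega_1,\ldots,j\omega_n) =
\int_{-\infty}^{\infty}\!\cdots\!\int_{-\infty}^{\infty}
h_n(\tau_1,\ldots,\tau_n)\,
e^{-j(\omega_1\tau_1+\cdots+\omega_n\tau_n)}\;
d\tau_1 \cdots d\tau_n
```

    The n = 1 term reduces to the ordinary impulse response and frequency response of linear system theory, which is why the Volterra series is often described as a generalization of the convolution integral to nonlinear systems.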

  1. RIGID-PLASTIC/RIGID-VISCOPLASTIC FEM BASED ON LINEAR PROGRAMMING—THEORETICAL MODELING AND APPLICATION FOR AXISYMMETRICAL PROBLEMS

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Compared with the traditional rigid-plastic/rigid-viscoplastic (RP/RVP) FEM (based on iterative solution), RP/RVP FEM based on linear programming (LP) has some remarkable advantages: it is free of convergence problems and is convenient in treating contact, rigid zones, and friction forces. The numerical model of RP/RVP FEM based on LP for axisymmetric metal forming simulation is studied, and some related key factors and their treatment methods in the formulation of constraint conditions are proposed. Solution examples are provided to validate its accuracy and efficiency.

  2. Modeling of a Membrane Based Humidifier for Fuel Cell Applications Subject to End-Of-Life Conditions

    DEFF Research Database (Denmark)

    Nielsen, Mads Pagh; Olesen, Anders Christian; Menard, Alan

    2014-01-01

    Proton Exchange Membrane (PEM) fuel cell stacks efficiently convert the chemical energy in hydrogen to electricity through electrochemical reactions occurring on either side of a proton-conducting electrolyte. This is a promising and very robust energy conversion process which can be used in many...... applications, for instance in automotive applications and various backup power systems substituting batteries. Humidification of the inlet air of PEM fuel cell stacks is essential to obtain optimum proton conductivity. Operational humidities of the anode and cathode streams with dew points close to the fuel...... cell operating temperature are required. These conditions must be met at the Beginning-Of-Life (BOL) as well as at the End-Of-Life (EOL) of the fuel cell system. This paper presents results of a numerical 1D model of the heat- and mass-transport phenomena in a membrane humidifier with a Nafion...

  3. CORBA Based CIMS Application Integration

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    Common object request broker architecture (CORBA) provides the framework and the mechanism for distributed object operation. It can also be applied to computer integrated manufacturing system (CIMS) application integration. This paper studies the CIMS information service requirement, presents a CORBA based integration approach including the CORBA based CIM information system architecture and the application integration mechanism, and discusses the relationship between CORBA and the CIM application integration platform.

  4. Application of GIS-based models for the evaluation of water resources; Analisi GIS-based dei fenomeni idrologici per la pianificazione territoriale

    Energy Technology Data Exchange (ETDEWEB)

    Pistocchi, A.; Neri, D. [Studio di Ingegneria per l' Ambiente e il Territorio, Cesena, FO (Italy)

    2000-08-01

    The paper illustrates the application of GIS-based models for the evaluation of water resources in a hilly area for land-use planning at the municipality level. The method, suitable for integration with more detailed analyses for specialist purposes, provides the fundamental elements for decisions in areas of the kind considered here, by describing the terms of the water budget, diffuse pollution and soil erosion at a regional level. [Translated from Italian] This work illustrates the application of model-based cartography techniques to characterize the hydrological aspects relevant to the land-use planning of a hilly municipality in Emilia Romagna. The methodology is proposed in a general sense for the preliminary evaluation of water resources and of the factors limiting their quality and use, through a regionalized description of the terms of the water budget, diffuse pollution and soil erosion. While not replacing more detailed analyses for specific purposes, the evaluation is considered to provide the fundamental elements for land-use decisions in areas of the type considered.

  5. Model for water pollution remote sensing based on double scattering and its application in the Zhujiang River outfall

    Institute of Scientific and Technical Information of China (English)

    DENG Ruru; LIU Qinhuo; KE Ruiping; CHENG Lei; LIU Xiaoping

    2004-01-01

    Building a model from the physical mechanisms of scattering and absorption by suspended substances, pollutants, and water molecules is a valid route to quantitative remote sensing of water pollution. A remote sensing model for water pollution based on single scattering is simple and easy to use, but its precision is affected by the turbidity of the water. The characteristics of the energy composition of multiple scattering are analyzed, and it is proposed that, starting from the single-scattering model, if the flux of the second scattering is additionally considered, the precision of the model is remarkably improved while the calculation remains very simple. The factor of the second scattering is derived to build a double-scattering model, and a practical algorithm for its calculation is put forward. The result of applying this model to the water area around the Zhujiang (Pearl) River outfall shows that the precision is obviously improved. The result also shows that the seriously polluted water areas are distributed in the northeast of Lingding Sea, the Victoria Bay of Hong Kong, and Shenzhen Bay.

  6. Application of Intelligence Based Genetic Algorithm for Job Sequencing Problem on Parallel Mixed-Model Assembly Line

    Directory of Open Access Journals (Sweden)

    A. Norozi

    2010-01-01

    Problem statement: In the era of globalization, the degree of competition in the market has increased, and many companies attempt to manufacture products efficiently to overcome the challenges they face. Approach: A mixed-model assembly line is able to provide a continuous flow of material and flexibility with regard to model changes. The problem under study formulates the mathematical programming model for minimizing the overall makespan and a balancing objective for a set of parallel lines. Results: The proposed mixed-integer model is only able to find the best job sequence in each line for a given allocation of jobs to lines; using it for large problems is therefore time-consuming and inefficient, as many job allocation values must be checked. This study presents an intelligence-based genetic algorithm to optimize the problem objectives while reducing the problem complexity. A heuristic algorithm is introduced to generate the initial population for the intelligence-based genetic algorithm, which then finds the best sequence of jobs for each line based on that population. By this means, the intelligence-based genetic algorithm concentrates only on initial populations that produce better solutions instead of probing the entire search space. Conclusion/Recommendations: The results obtained from the intelligence-based genetic algorithm are used as a starting point for fine-tuning by simulated annealing to increase solution quality. Several experiments on a set of problems were performed to check the capability of the proposed algorithm. As the total objective values in most problems could not be improved by simulated annealing, this demonstrates that the proposed intelligence-based genetic algorithm performs well in reaching near-optimal solutions.
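    The genetic-algorithm portion of the approach can be sketched compactly: encode a chromosome as a job permutation, decode it by greedily assigning each job to the least-loaded line, and evolve the population toward minimum makespan. Job times, operators and parameters below are invented; the paper's heuristic seeding and mixed-model details are not reproduced:

```python
import random

# Illustrative GA for job sequencing on parallel lines (not the paper's
# algorithm; processing times and GA parameters are hypothetical).
JOBS = [7, 3, 9, 4, 6, 2, 8, 5]        # processing times
LINES = 2

def makespan(perm):
    """Decode: assign each job in sequence to the least-loaded line."""
    loads = [0] * LINES
    for j in perm:
        loads[loads.index(min(loads))] += JOBS[j]
    return max(loads)

def order_crossover(a, b):
    """Copy a slice from parent a, fill the rest in parent b's order."""
    i, j = sorted(random.sample(range(len(a)), 2))
    child = [None] * len(a)
    child[i:j] = a[i:j]
    fill = [g for g in b if g not in child]
    for k in range(len(a)):
        if child[k] is None:
            child[k] = fill.pop(0)
    return child

def mutate(perm, rate=0.2):
    """Occasionally swap two positions to preserve diversity."""
    if random.random() < rate:
        i, j = random.sample(range(len(perm)), 2)
        perm[i], perm[j] = perm[j], perm[i]
    return perm

def evolve(pop_size=30, generations=60):
    random.seed(42)
    pop = [random.sample(range(len(JOBS)), len(JOBS)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=makespan)
        elite = pop[: pop_size // 2]            # keep the better half
        children = [mutate(order_crossover(*random.sample(elite, 2)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return min(pop, key=makespan)

best = evolve()
print("best sequence:", best, "makespan:", makespan(best))
```

    With total work 44 split over two lines, the ideal balanced makespan is 22; the GA's answer (or the SA fine-tuning step the paper adds afterwards) should land at or very near that lower bound.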

  7. Multilevel Models Applications Using SAS

    CERN Document Server

    Wang, Jichuan; Fisher, James

    2011-01-01

    This book covers a broad range of topics about multilevel modeling. The goal is to help readers to understand the basic concepts, theoretical frameworks, and application methods of multilevel modeling. It is at a level also accessible to non-mathematicians, focusing on the methods and applications of various multilevel models and using the widely used statistical software SAS®. Examples are drawn from analysis of real-world research data.

  8. On-line forecasting model for zinc output based on self-tuning support vector regression and its application

    Institute of Scientific and Technical Information of China (English)

    胡志坤; 桂卫华; 彭小奇

    2004-01-01

    An on-line forecasting model based on self-tuning support vector regression for zinc output was put forward to maximize zinc output by adjusting operational parameters in the imperial smelting furnace process. In this model, the mathematical model of support vector regression was converted into the same format as a support vector machine for classification. A simplified sequential minimal optimization for classification was then applied to train the regression coefficient vector α - α* and the threshold b. The penalty parameter C was subsequently tuned dynamically from the forecasting results during the training process. Finally, an on-line forecasting algorithm for zinc output was proposed. Simulation results show that, despite a relatively small industrial data set, the effective error is less than 10% with remarkable real-time performance. The model was applied to the optimization operation and fault diagnosis system for the imperial smelting furnace.

  9. Model-Based Reasoning

    Science.gov (United States)

    Ifenthaler, Dirk; Seel, Norbert M.

    2013-01-01

    In this paper, there will be a particular focus on mental models and their application to inductive reasoning within the realm of instruction. A basic assumption of this study is the observation that the construction of mental models and related reasoning is a slowly developing capability of cognitive systems that emerges effectively with proper…

  10. LSER-based modeling vapor pressures of (solvent+salt) systems by application of Xiang-Tan equation

    Institute of Scientific and Technical Information of China (English)

    Aynur Senol

    2015-01-01

    The study deals with modeling the vapor pressures of (solvent + salt) systems based on linear solvation energy relationship (LSER) principles. The LSER-based vapor pressure model captures the simultaneous impact of the vapor pressure of the pure solvent estimated by the Xiang-Tan equation, the solubility and solvatochromic parameters of the solvent, and the physical properties of the ionic salt. Two structural forms of the generalized solvation model have been implemented independently, i.e., the unified solvation model with integrated properties (USMIP) containing nine physical descriptors, and a reduced property-basis solvation model. The vapor pressure data of fourteen (solvent + salt) systems have been processed to analyze statistically the reliability of the existing models in terms of a log-ratio objective function. The proposed vapor pressure approaches reproduce the observed behavior relatively accurately, yielding overall design factors of 1.0643 and 1.0702 for the integrated property-basis and reduced property-basis solvation models, respectively.

  11. Model Predictions and Ground-based Observations for Jupiter's Magnetospheric Environment: Application to the JUICE and Juno Missions

    Science.gov (United States)

    Achilleos, Nicholas; Guio, Patrick; Arridge, Christopher S.; Ray, Licia C.; Yates, Japheth N.; Fossey, Stephen J.; Savini, Giorgio; Pearson, Mick; Fernando, Nathalie; Gerasimov, Roman; Murat, Thomas

    2016-10-01

    The advent of new missions to the Jovian system such as Juno (recently arrived) and JUICE (scheduled for 2022 launch) makes timely the provision of model-based predictions for the physical conditions to be encountered by these spacecraft, as well as the planning of simultaneous, ground-based observations of the Jovian system. Using the UCL Jovian magnetodisc model, which calculates magnetic field and plasma distributions according to Caudal's (1986) force-balance formalism, we provide predictions of the following quantities along representative Juno / JUICE orbits through the middle magnetosphere: (i) magnetic field strength and direction; (ii) density and / or pressure of the 'cold' and 'hot' particle populations; (iii) plasma angular velocity. The characteristic variation in these parameters is mainly influenced by the periodic approaches towards and recessions from the magnetodisc imposed on the 'synthetic spacecraft' by the planet's rotating, tilted dipole field. We also include some corresponding predictions for ionospheric / thermospheric conditions at the magnetic footpoint of the spacecraft, using the JASMIN model (Jovian Atmospheric Simulator with Magnetosphere, Ionosphere and Neutrals). We also present preliminary imaging results from 'IoSpot', a planned, ground-based programme of observations based at the University College London Observatory (UCLO) which targets ionized sulphur emissions from the Io plasma torus. Such programmes, conducted simultaneously with the above missions, will provide valuable context for the overall physical conditions within the Jovian magnetosphere, for which Io's volcanoes are the principal source of plasma.

  12. Constraint-based modeling of heterologous pathways: application and experimental demonstration for overproduction of fatty acids in Escherichia coli.

    Science.gov (United States)

    Ip, Kuhn; Donoghue, Neil; Kim, Min Kyung; Lun, Desmond S

    2014-10-01

    Constraint-based modeling has been shown, in many instances, to be useful for metabolic engineering by allowing the prediction of the metabolic phenotype resulting from genetic manipulations. But the basic premise of constraint-based modeling, that of applying constraints to preclude certain behaviors, only makes sense for certain genetic manipulations (such as knockouts and knockdowns). In particular, when genes (such as those associated with a heterologous pathway) are introduced under artificial control, it is unclear how to predict the correct behavior. In this paper, we introduce a modeling method that we call proportional flux forcing (PFF) to model artificially induced enzymatic genes. The model modifications introduced by PFF can be transformed into a set of simple mass balance constraints, which allows computational methods for strain optimization based on flux balance analysis (FBA) to be utilized. We applied PFF to the metabolic engineering of Escherichia coli (E. coli) for free fatty acid (FFA) production, a metabolic engineering problem that has attracted significant attention because FFAs are precursors to liquid transportation fuels such as biodiesel and biogasoline. We show that PFF used in conjunction with FBA-based computational strain optimization methods can yield non-obvious genetic manipulation strategies that significantly increase FFA production in E. coli. The two mutant strains constructed and successfully tested in this work had peak fatty acid (FA) yields of 0.050 g FA/g carbon source (17.4% of theoretical yield) and 0.035 g FA/g carbon source (12.3% of theoretical yield) when grown on a mixed carbon source of glucose and casamino acids in a 2-to-1 ratio. These yields represent increases of 5.4- and 3.8-fold, respectively, over the baseline strain.
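    FBA itself is a linear program: maximize an objective flux subject to steady-state mass balance S·v = 0 and flux bounds, and a PFF-style modification simply adds a linear proportionality constraint tying a forced flux to another flux. The toy network, ratio and bounds below are invented for illustration and are not the paper's E. coli model:

```python
import numpy as np
from scipy.optimize import linprog

# Toy FBA with a proportional-flux-forcing-style constraint.
# Reactions: v0 uptake -> A, v1 A -> B, v2 A -> P (heterologous), v3 B -> biomass
# Steady-state mass balance S v = 0 for internal metabolites A and B.
S = np.array([
    [1, -1, -1,  0],   # metabolite A
    [0,  1,  0, -1],   # metabolite B
])
c = [0, 0, 0, -1]                        # linprog minimizes, so maximize v3
bounds = [(0, 10), (0, None), (0, None), (0, None)]   # uptake capped at 10

# PFF-style constraint: force the heterologous flux to a fixed proportion
# of uptake, v2 = 0.3 * v0, i.e. 0.3*v0 - v2 = 0.
A_eq = np.vstack([S, [0.3, 0, -1, 0]])
b_eq = np.zeros(3)

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print("biomass flux:", res.x[3])         # 0.7 * 10 = 7.0
print("heterologous flux:", res.x[2])    # 0.3 * 10 = 3.0
```

    The point of recasting PFF this way is that the forced-flux constraint is just another row of the equality system, so any FBA-based strain-optimization machinery can be reused unchanged.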

  13. Development of Hierarchical Bayesian Model Based on Regional Frequency Analysis and Its Application to Estimate Areal Rainfall in South Korea

    Science.gov (United States)

    Kim, J.; Kwon, H. H.

    2014-12-01

    Existing regional frequency analysis has the disadvantage that it is difficult to consider geographical characteristics in estimating areal rainfall. This study therefore aims to develop a hierarchical Bayesian model-based regional frequency analysis in which spatial patterns of the design rainfall are explicitly incorporated through geographical information. The parameters of the Gumbel distribution are assumed to be a function of geographical characteristics (e.g., altitude, latitude and longitude) within a general linear regression framework. Posterior distributions of the regression parameters are estimated by the Bayesian Markov chain Monte Carlo (MCMC) method, and the identified functional relationship is used to spatially interpolate the parameters of the Gumbel distribution using digital elevation models (DEM) as inputs. The proposed model is applied to derive design rainfalls over the entire Han River watershed. The proposed Bayesian regional frequency analysis showed results similar to L-moment-based regional frequency analysis, with the added advantages of quantifying the uncertainty of the design rainfall and estimating areal rainfall considering geographical information. Acknowledgement: This research was supported by a grant (14AWMP-B079364-01) from the Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.
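    The Gumbel building block of the analysis is simple to state: a design rainfall with return period T is the (1 - 1/T) quantile of the fitted distribution. The sketch below uses a method-of-moments fit, a common non-Bayesian baseline that the hierarchical model generalizes by regressing the location and scale parameters on altitude, latitude and longitude; the numbers are illustrative:

```python
import numpy as np

def gumbel_quantile(mu, beta, T):
    """Design value with return period T years: F(x) = 1 - 1/T
    for the Gumbel CDF F(x) = exp(-exp(-(x - mu)/beta))."""
    return mu - beta * np.log(-np.log(1.0 - 1.0 / T))

def fit_gumbel_moments(x):
    """Method-of-moments estimates of the Gumbel parameters."""
    x = np.asarray(x, float)
    beta = x.std(ddof=1) * np.sqrt(6.0) / np.pi
    mu = x.mean() - 0.5772 * beta       # Euler-Mascheroni constant
    return mu, beta

# e.g., with location mu = 50 mm and scale beta = 10 mm, the 100-year event:
print(round(gumbel_quantile(50.0, 10.0, 100), 1))   # → 96.0
```

    In the hierarchical Bayesian version, mu and beta at an ungauged DEM cell are obtained from the posterior of the regression on geographical covariates rather than from a local moment fit, which is what yields both the spatial interpolation and the uncertainty quantification the abstract highlights.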

  14. Development and application of EEAST: a life cycle based model for use of harvested rainwater and composting toilets in buildings.

    Science.gov (United States)

    Devkota, J; Schlachter, H; Anand, C; Phillips, R; Apul, Defne

    2013-11-30

    Harvested rainwater systems and composting toilets are expected to be an important part of sustainable solutions in buildings. Yet, to date, a model evaluating their economic and environmental impact has been missing. To address this need, a life cycle based model, EEAST, was developed. EEAST was designed to compare the business-as-usual (BAU) case of using potable water for toilet flushing and irrigation to alternative scenarios based on rainwater harvesting and composting toilet technologies. In EEAST, building characteristics, occupancy, and precipitation are used to size the harvested rainwater and composting toilet systems. Then, life cycle costing and life cycle assessment methods are used to estimate cost, energy, and greenhouse gas (GHG) emission payback periods (PPs) for five alternative scenarios. The scenarios modeled include use of harvested rainwater for toilet flushing, for irrigation, or both; and use of composting toilets with or without harvested rainwater use for irrigation. A sample simulation using EEAST showed that for the office building modeled, the cost PPs were greater than the energy PPs, which in turn were greater than the GHG emission PPs. This was primarily due to the energy- and emission-intensive nature of the centralized water and wastewater infrastructure. The sample simulation also suggested that composting toilets may have the best performance in all criteria. However, EEAST does not explicitly model solids management and as such may give composting toilets an unfair advantage compared to flush-based toilets. EEAST results were found to be very sensitive to the cost values used in the model. With the availability of EEAST, life cycle cost, energy, and GHG emission analyses can now be performed fairly easily by building designers and researchers. Future work is recommended to further improve EEAST and evaluate it for different types of buildings and climates so as to better understand when composting toilets and harvested rainwater systems
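
    The payback periods (PPs) compared in the abstract reduce to a ratio of the upfront increment over BAU to the annual savings. The numbers below are invented for illustration and are not EEAST outputs; they only reproduce the reported ordering (cost PP > energy PP > GHG PP):

```python
def payback_period(extra_capital, annual_savings):
    """Simple payback period: years until cumulative annual savings
    (in cost, energy, or GHG terms) offset the upfront increment."""
    if annual_savings <= 0:
        return float("inf")
    return extra_capital / annual_savings

# Hypothetical rainwater-harvesting retrofit: $12,000 extra capital
# saving $800/yr; 45,000 MJ embodied energy saving 9,000 MJ/yr;
# 2,700 kg CO2e embodied saving 900 kg CO2e/yr.
cost_pp = payback_period(12000.0, 800.0)     # 15.0 years
energy_pp = payback_period(45000.0, 9000.0)  # 5.0 years
ghg_pp = payback_period(2700.0, 900.0)       # 3.0 years
```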

  15. Fusion strategies for selecting multiple tuning parameters for multivariate calibration and other penalty based processes: A model updating application for pharmaceutical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Tencate, Alister J. [Department of Chemistry, Idaho State University, Pocatello, ID 83209 (United States); Kalivas, John H., E-mail: kalijohn@isu.edu [Department of Chemistry, Idaho State University, Pocatello, ID 83209 (United States); White, Alexander J. [Department of Physics and Optical Engineering, Rose-Hulman Institute of Technology, Terre Haute, IN 47803 (United States)

    2016-05-19

    New multivariate calibration methods and other processes are being developed that require selection of multiple tuning parameter (penalty) values to form the final model. With one or more tuning parameters, using only one measure of model quality to select final tuning parameter values is not sufficient. Optimization of several model quality measures is challenging. Thus, three fusion ranking methods are investigated for simultaneous assessment of multiple measures of model quality for selecting tuning parameter values. One is a supervised learning fusion rule named sum of ranking differences (SRD). The other two are non-supervised learning processes based on the sum and median operations. The effect of the number of models evaluated on the three fusion rules is also assessed using three procedures. One procedure uses all models from all possible combinations of the tuning parameters. To reduce the number of models evaluated, an iterative process (only applicable to SRD) is applied, and thresholding a model quality measure before applying the fusion rules is also used. A near infrared pharmaceutical data set requiring model updating is used to evaluate the three fusion rules. In this case, calibration of the primary conditions is for the active pharmaceutical ingredient (API) of tablets produced in a laboratory. The secondary conditions for calibration updating are for tablets produced in the full batch setting. Two model updating processes requiring selection of two unique tuning parameter values are studied. One is based on Tikhonov regularization (TR) and the other is a variation of partial least squares (PLS). The three fusion methods are shown to provide equivalent and acceptable results allowing automatic selection of the tuning parameter values. Best tuning parameter values are selected when model quality measures used with the fusion rules are for the small secondary sample set used to form the updated models. In this model updating situation, evaluation of
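
    The two non-supervised fusion rules (sum and median of per-measure ranks) can be sketched directly; the three candidate models and three quality measures below are hypothetical:

```python
import statistics

def rank(values, reverse=False):
    """1-based ranks of the candidates; reverse=True treats larger as better."""
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=reverse)
    r = [0] * len(values)
    for pos, i in enumerate(order, start=1):
        r[i] = pos
    return r

# Three candidate tuning-parameter combinations scored by three
# quality measures (lower RMSE better, higher R2 better, lower norm better).
rmse = [0.30, 0.20, 0.25]
r2   = [0.90, 0.95, 0.80]
norm = [1.4, 2.0, 1.1]

ranks = [rank(rmse), rank(r2, reverse=True), rank(norm)]
per_model = list(zip(*ranks))                          # ranks per candidate
sum_fused = [sum(r) for r in per_model]                # sum rule
med_fused = [statistics.median(r) for r in per_model]  # median rule
best = min(range(3), key=lambda i: sum_fused[i])       # winning candidate index
```

SRD, the supervised rule, instead compares each candidate's ranking against a reference ranking and sums the absolute rank differences; the fusion output is the same kind of single consensus ordering.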

  16. Fundamental Study on Applicability of Powder-Based 3D Printer for Physical Modeling in Rock Mechanics

    Science.gov (United States)

    Fereshtenejad, Sayedalireza; Song, Jae-Joon

    2016-06-01

    Applications of 3D printing technology are becoming more widespread in many research fields because of its rapid development and valuable capabilities. In rock mechanics and mining engineering, this technology has the potential to become a useful tool that might help implement a number of research studies previously considered impractical. Most commercial 3D printers cannot print prototypes whose mechanical properties precisely match those of natural rock samples. Therefore, some additional enhancements are required before 3D printers can be effectively utilized for rock mechanics applications. In this study, we printed and studied specimens using a powder-based commercial ZPrinter® 450, with ZP® 150 powder and Zb® 63 binder used as raw materials. The specimens printed by this 3D printer exhibited relatively low strength and ductile behavior, implying that further improvements were needed. Hence, we focused on determining the best combination of printing options and post-processing, including the effects of the printing direction, printing layer thickness, binder saturation level, and heating process on the uniaxial compressive strength (UCS) and stress-strain behavior of the printed samples. The suggested procedures demonstrated their effectiveness by producing printed samples that behave similarly to natural rocks with low UCS. Although our optimization methods were successful, further improvements are required to expand 3D printer applications in rock mechanics.

  17. An efficient algorithm for computing fixed length attractors based on bounded model checking in synchronous Boolean networks with biochemical applications.

    Science.gov (United States)

    Li, X Y; Yang, G W; Zheng, D S; Guo, W S; Hung, W N N

    2015-01-01

    Genetic regulatory networks are the key to understanding biochemical systems. A genetic regulatory network under a given living environment can be modeled as a synchronous Boolean network. The attractors of these Boolean networks help biologists to identify determinant and stable factors. Existing methods identify attractors from a random initial state or over the entire state space simultaneously, and cannot identify fixed-length attractors directly; their time complexity increases exponentially with the number and length of the attractors. This study uses bounded model checking to quickly locate fixed-length attractors. Based on a SAT solver, we propose a new algorithm for efficiently computing fixed-length attractors, which is better suited to large Boolean networks and networks with numerous attractors. Empirical comparisons against the tool BooleNet on biochemical systems demonstrate the feasibility and efficiency of our approach.
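
    For intuition, the objects being searched for (attractors of an exact length in a synchronous Boolean network) can be enumerated by brute force on a tiny example network. This is only a sketch of the target; the paper's SAT-based bounded model checking is what makes the search tractable at scale:

```python
from itertools import product

def attractors_of_length(update, n, length):
    """Enumerate attractors of exactly `length` states in a synchronous
    Boolean network with n nodes, where `update` maps a state tuple to
    the next state. Brute force over all 2^n states."""
    found = set()
    for state in product((0, 1), repeat=n):
        s = state
        for _ in range(length):
            s = update(s)
        if s == state:                 # state lies on a cycle dividing `length`
            cycle = [state]
            s = update(state)
            while s != state:
                cycle.append(s)
                s = update(s)
            if len(cycle) == length:   # keep only the exact period
                found.add(min(cycle))  # canonical representative of the cycle
    return found

# Example network (made up for illustration): x1' = x2, x2' = x1, x3' = x1 AND x2
f = lambda s: (s[1], s[0], s[0] & s[1])
fixed_points = attractors_of_length(f, 3, 1)  # {(0,0,0), (1,1,1)}
two_cycles = attractors_of_length(f, 3, 2)    # one cycle: (0,1,0) <-> (1,0,0)
```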

  18. Models for Dynamic Applications

    DEFF Research Database (Denmark)

    Sales-Cruz, Mauricio; Morales Rodriguez, Ricardo; Heitzig, Martina;

    2011-01-01

    This chapter covers aspects of the dynamic modelling and simulation of several complex operations that include a controlled blending tank, a direct methanol fuel cell that incorporates a multiscale model, a fluidised bed reactor, a standard chemical reactor and finally a polymerisation reactor...

  19. Modelling of Impulsional pH Variations Using ChemFET-Based Microdevices: Application to Hydrogen Peroxide Detection

    OpenAIRE

    Abdou Karim Diallo; Lyes Djeghlaf; Jerome Launay; Pierre Temple-Boyer

    2014-01-01

    This work presents the modelling of impulsional pH variations in microvolume related to water-based electrolysis and hydrogen peroxide electrochemical oxidation using an Electrochemical Field Effect Transistor (ElecFET) microdevice. This ElecFET device consists of a pH-Chemical FET (pH-ChemFET) with an integrated microelectrode around the dielectric gate area in order to trigger electrochemical reactions. Combining oxidation/reduction reactions on the microelectrode, water self-ionization and...

  20. Land Use Allocation Based on a Multi-Objective Artificial Immune Optimization Model: An Application in Anlu County, China

    Directory of Open Access Journals (Sweden)

    Xiaoya Ma

    2015-11-01

    As the main feature of land use planning, land use allocation (LUA) optimization is an important means of creating a balance between the land-use supply and demand in a region and promoting the sustainable utilization of land resources. In essence, LUA optimization is a multi-objective optimization problem under the land use supply and demand constraints of a region. In order to obtain a better sustainable multi-objective LUA optimization solution, the present study proposes a LUA model based on the multi-objective artificial immune optimization algorithm (the MOAIM-LUA model). The main achievements of the present study are as follows: (a) the land-use supply and demand factors are analyzed and the constraint conditions of LUA optimization problems are constructed based on the analysis framework of the balance between the land use supply and demand; (b) the optimization objectives of LUA optimization problems are defined and modeled using ecosystem service value theory and land rent and price theory; and (c) a multi-objective optimization algorithm is designed for solving multi-objective LUA optimization problems based on the novel immune clonal algorithm (NICA). On the basis of the aforementioned achievements, MOAIM-LUA was applied to a real case study of land-use planning in Anlu County, China. Compared to the current land use situation in Anlu County, the optimized LUA solutions offer improvements in the social and ecological objectives. Compared to existing models, such as the non-dominated sorting genetic algorithm-II, experimental results demonstrate that the model designed in the present study can obtain better non-dominated solution sets and is superior in terms of algorithm stability.
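
    The "non-dominated solution sets" compared here rest on the standard Pareto dominance test, which any multi-objective optimizer (MOAIM-LUA or NSGA-II alike) uses to filter candidates. A minimal sketch with illustrative objective values (all objectives minimized):

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated(solutions):
    """Return the non-dominated (Pareto) subset of a solution list."""
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t != s)]

# Hypothetical candidate allocations scored on two minimized objectives,
# e.g. (negative ecosystem-service value, negative land rent):
cands = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0)]
front = non_dominated(cands)   # (3,4) drops out: it is dominated by (2,3)
```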

  2. Model-Based Pseudo-Quad-Pol Reconstruction from Compact Polarimetry and Its Application to Oil-Spill Observation

    Directory of Open Access Journals (Sweden)

    Junjun Yin

    2015-01-01

    Compact polarimetry is an effective imaging mode for wide-area observation, especially over the open ocean. In this study, we propose a new method for pseudo-quad-polarization reconstruction from compact polarimetry based on the three-component decomposition. By using the decomposed powers, the reconstruction model is established as a power-weighted model. Further, the phase of the copolarized correlation is taken into consideration. The phase of double-bounce scattering is closer to π than to 0, while the phase of surface scattering is closer to 0 than to π. By considering the negative (double-bounce reflection) and positive (surface reflection) copolarized correlations, the reconstruction model for full polarimetry shows good consistency with real polarimetric SAR data. L-band ALOS/PALSAR-1 fully polarimetric data acquired on August 27, 2006, over an oil-spill area are used for demonstration. Reconstruction performance is evaluated with a set of typical polarimetric oil-spill indicators, and a quantitative comparison is given. Results show that the proposed model-based method is of great potential for oil-spill observation.

  3. GIS-Based (W+-W-) Weight of Evidence Model and Its Application to Gold Resources Assessment in Abitibi, Canada

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    The weight of evidence (WofE) model has been widely used for mineral potential mapping. When a multiclass map is converted into a binary map, much mineralization information is artificially added or lost, because each class within a cumulative distance interval to a linear feature is generalized on the basis of the maximum contrast matching that interval. Additionally, some categorical data evidence cannot be generated by this method because a maximum contrast does not exist. In this article, an alternative (W+ - W-)-based WofE model is proposed. In this model, whether "(W+ - W-) is greater than zero or not" is used as the criterion to reclassify the corresponding categorical class into a presence or absence class when converting a multiclass map into a binary map. This model can be applied to both categorical data and successive data; the latter can be treated as categorical data. The W+ and W- of the generated binary maps can be recalculated, and several binary maps can be integrated on the condition that the reclassified binary evidences are conditionally independent of each other. This method effectively reduces artificially introduced information, and both nominal and categorical data can be handled. A case study of gold potential mapping in the Abitibi area, Ontario, Canada, shows that the gold potential map produced by the (W+ - W-) model displays a smaller potential area but a higher posterior probability (POP), whereas the potential map produced by the traditional WofE model exhibits a larger potential area but a lower POP.
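
    The weights in question follow directly from conditional probabilities of evidence presence given deposit presence. A sketch with made-up cell counts (not the Abitibi data):

```python
import math

def wofe_weights(n_total, n_deposit, n_evid, n_evid_deposit):
    """Weights of evidence from unit-cell counts:
    W+ = ln[P(B|D)/P(B|~D)], W- = ln[P(~B|D)/P(~B|~D)],
    contrast C = W+ - W-, where B = evidence present, D = deposit present."""
    d, nd = n_deposit, n_total - n_deposit
    p_b_d = n_evid_deposit / d
    p_b_nd = (n_evid - n_evid_deposit) / nd
    w_plus = math.log(p_b_d / p_b_nd)
    w_minus = math.log((1 - p_b_d) / (1 - p_b_nd))
    return w_plus, w_minus, w_plus - w_minus

# Hypothetical study area: 1000 cells, 100 with deposits; the evidence
# layer covers 400 cells, 80 of which contain deposits.
wp, wm, contrast = wofe_weights(1000, 100, 400, 80)
# wp > 0 and wm < 0, so under the "(W+ - W-) > 0" criterion this class
# would be reclassified as a presence class in the binary map.
```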

  4. A CCA+ICA based model for multi-task brain imaging data fusion and its application to schizophrenia.

    Science.gov (United States)

    Sui, Jing; Adali, Tülay; Pearlson, Godfrey; Yang, Honghui; Sponheim, Scott R; White, Tonya; Calhoun, Vince D

    2010-05-15

    Collection of multiple-task brain imaging data from the same subject has now become common practice in medical imaging studies. In this paper, we propose a simple yet effective model, "CCA+ICA", as a powerful tool for multi-task data fusion. This joint blind source separation (BSS) model takes advantage of two multivariate methods, canonical correlation analysis and independent component analysis, to achieve both high estimation accuracy and the correct connection between two datasets in which sources can have either common or distinct between-dataset correlation. In both simulated and real fMRI applications, we compare the proposed scheme with other joint BSS models and examine the different modeling assumptions. The contrast images of two tasks, sensorimotor (SM) and Sternberg working memory (SB), derived from a general linear model (GLM), provided the real multi-task fMRI data, collected from 50 schizophrenia patients and 50 healthy controls. When examining the relationship with duration of illness, CCA+ICA revealed a significant negative correlation with temporal lobe activation. Furthermore, CCA+ICA located sensorimotor cortex as the group-discriminative region for both tasks and identified the superior temporal gyrus in SM and prefrontal cortex in SB as task-specific group-discriminative brain networks. In summary, we compared the new approach to competing methods with different assumptions and found consistent results regarding each of their hypotheses on connecting the two tasks. Such an approach fills a gap in existing multivariate methods for identifying biomarkers from brain imaging data.
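
    The CCA half of the fusion can be computed from orthonormal bases of the two centered data sets: the singular values of Qx.T @ Qy are the canonical correlations. A sketch on synthetic data (not fMRI, and omitting the ICA stage):

```python
import numpy as np

def canonical_correlations(X, Y):
    """Canonical correlations between two data sets (rows = samples):
    center each set, take orthonormal column bases via QR, and return
    the singular values of the basis cross-product."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(Xc)
    Qy, _ = np.linalg.qr(Yc)
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 4))
Y = np.column_stack([X @ rng.standard_normal(4),   # component shared with X
                     rng.standard_normal(200)])    # independent component
rho = canonical_correlations(X, Y)
# rho[0] ~ 1 for the shared direction; rho[1] stays small.
```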

  5. A Statistically-Based Low-Level Cloud Scheme and Its Tentative Application in a General Circulation Model

    Institute of Scientific and Technical Information of China (English)

    DAI Fushan; YU Rucong; ZHANG Xuehong; YU Yongqiang

    2005-01-01

    In this study, a statistical cloud scheme is first introduced and coupled with a first-order turbulence scheme whose second-order turbulence moments are parameterized by the timescale of turbulence dissipation and the vertical turbulent diffusion coefficient. The ability of the scheme to simulate cloud fraction at different relative humidities, vertical temperature profiles, and turbulent dissipation timescales is then examined by numerical simulation. The simulated cloud fraction is found to be sensitive to the parameter used in the statistical cloud scheme and to the timescale of turbulent dissipation. Based on these analyses, the introduced statistical cloud scheme is modified. By combining the modified statistical cloud scheme with a boundary layer cumulus scheme, a new statistically-based low-level cloud scheme is proposed and tentatively applied in NCAR (National Center for Atmospheric Research) CCM3 (Community Climate Model version 3). With the statistically-based low-level cloud scheme applied in CCM3, the simulation of low-level cloud fraction is markedly improved, and the maxima of low-level cloud fraction off the western coasts over cold oceans are well reproduced. This suggests that the new statistically-based low-level cloud scheme has great potential for improving low-level cloud parameterization in general circulation models.
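
    For contrast with the statistical scheme described, the classic relative-humidity-based diagnostic that such schemes aim to improve on (a Sundqvist-type closure, not the paper's scheme) is a one-liner:

```python
import math

def cloud_fraction(rh, rh_crit=0.8):
    """Sundqvist-type diagnostic cloud fraction:
    C = 1 - sqrt((1 - RH) / (1 - RH_crit)) for RH_crit <= RH <= 1,
    with C = 0 below the critical relative humidity."""
    if rh >= 1.0:
        return 1.0
    if rh <= rh_crit:
        return 0.0
    return 1.0 - math.sqrt((1.0 - rh) / (1.0 - rh_crit))

# Fraction grows monotonically from 0 at RH_crit to full cover at saturation.
samples = [cloud_fraction(rh) for rh in (0.7, 0.8, 0.9, 0.95, 1.0)]
```

A statistical scheme instead derives the fraction from an assumed subgrid distribution of total water, which is why it can respond to the turbulence moments mentioned in the abstract.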

  6. Study and application of monitoring plane displacement of a similarity model based on time-series images

    Institute of Scientific and Technical Information of China (English)

    Xu Jiankun; Wang Enyuan; Li Zhonghui; Wang Chao

    2011-01-01

    In order to compensate for the deficiencies of present methods for monitoring plane displacement in similarity model tests, such as inadequate real-time monitoring and extensive manual intervention, an effective monitoring method was proposed in this study. The major steps of the method are as follows: first, time-series images of the similarity model were obtained by a camera during the test; second, measuring points marked as artificial targets were automatically tracked and recognized in the time-series images; finally, the real-time plane displacement field was calculated from the fixed magnification between object and image under the specific conditions. An application device for the method was then designed and tested. In addition, a sub-pixel location method and a distortion error model were used to improve the measuring accuracy. The results indicate that this method can record the entire test, especially detailed non-uniform deformation and sudden deformation. Compared with traditional methods, it offers a number of advantages, such as greater measurement accuracy and reliability, less manual intervention, higher automation, strong practicality, and much more measurement information.
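
    The automatic target-tracking step amounts to template matching. A minimal sum-of-squared-differences search on synthetic data (real implementations add the sub-pixel refinement the abstract mentions):

```python
import numpy as np

def locate_target(image, template):
    """Return the (row, col) offset where the template best matches the
    image, minimizing the sum of squared differences (SSD)."""
    ih, iw = image.shape
    th, tw = template.shape
    best, best_pos = None, None
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            ssd = np.sum((image[r:r + th, c:c + tw] - template) ** 2)
            if best is None or ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

rng = np.random.default_rng(1)
frame = rng.random((20, 20))             # one frame of the time series
target = frame[7:10, 12:15].copy()       # a marked measuring point
pos = locate_target(frame, target)       # (7, 12)
```

Tracking `pos` across consecutive frames, then scaling by the object-to-image magnification, yields the plane displacement field.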

  7. Distributed Parameter Modelling Applications

    DEFF Research Database (Denmark)

    2011-01-01

    sands processing. The fertilizer granulation model considers the dynamics of MAP-DAP (mono and diammonium phosphates) production within an industrial granulator, that involves complex crystallisation, chemical reaction and particle growth, captured through population balances. A final example considers...

  8. Application of a Distributed, Physically Based, Hydrologic Model to Improve Streamflow Forecasts in the Upper Rio Grande Basin

    Science.gov (United States)

    Gorham, T. A.; Boyle, D. P.; McConnell, J. R.; Lamorey, G. W.; Markstrom, S.; Viger, R.; Leavesley, G.

    2001-12-01

    Approximately two-thirds of the runoff in the Rio Grande begins as seasonal snowpack in the headwaters above the USGS stream gaging stations at several points (nodes) above Albuquerque, New Mexico. Resource managers in the Rio Grande Basin rely on accurate short and long term forecasts of water availability and flow at these nodes to make important decisions aimed at achieving a balance among many different and competing water uses such as municipal, fish and wildlife, agricultural, and water quality. In this study, a distributed, physically based hydrologic model is used to investigate the degree of spatial and temporal distribution of snow and the processes that control snowmelt necessary to accurately simulate streamflow at seven of these nodes. Specifically, snow distribution and surface runoff are estimated using a combination of the USGS Modular Modeling System (MMS), GIS Weasel, Precipitation-Runoff Modeling System (PRMS), and XYZ snow distribution model. This highly collaborative work between researchers at the Desert Research Institute and the USGS is an important part of SAHRA (Sustainability of semi-Arid Hydrology and Riparian Areas) efforts aimed at improving models of snow distribution and snowmelt processes.

  9. Selection of Polynomial Chaos Bases via Bayesian Model Uncertainty Methods with Applications to Sparse Approximation of PDEs with Stochastic Inputs

    Energy Technology Data Exchange (ETDEWEB)

    Karagiannis, Georgios; Lin, Guang

    2014-02-15

    Generalized polynomial chaos (gPC) expansions allow the representation of the solution of a stochastic system as a series of polynomial terms. The number of gPC terms increases dramatically with the dimension of the random input variables. When the number of gPC terms is larger than that of the available samples, a scenario that often occurs if evaluations of the system are expensive, the evaluation of the gPC expansion can be inaccurate due to over-fitting. We propose a fully Bayesian approach that allows for global recovery of the stochastic solution, in both the spatial and random domains, by coupling Bayesian model uncertainty and regularization regression methods. It allows the evaluation of the PC coefficients on a grid of spatial points via (1) Bayesian model averaging or (2) the median probability model, and their construction as functions on the spatial domain via spline interpolation. The former accounts for model uncertainty and provides Bayes-optimal predictions, while the latter additionally provides a sparse representation of the solution by evaluating the expansion on a subset of dominating gPC bases. Moreover, the method quantifies the importance of the gPC bases through inclusion probabilities. We design an MCMC sampler that evaluates all the unknown quantities without the need for ad hoc techniques. The proposed method is suitable for, but not restricted to, problems whose stochastic solution is sparse at the stochastic level with respect to the gPC bases while the deterministic solver involved is expensive. We demonstrate the good performance of the proposed method and make comparisons with others on 1D, 14D and 40D elliptic stochastic partial differential equations in random space.
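
    A one-dimensional sketch of a gPC expansion in probabilists' Hermite polynomials, fitted by ordinary least squares (the paper's Bayesian machinery replaces this simple regression and adds inclusion probabilities):

```python
import numpy as np
from numpy.polynomial import hermite_e as He

# Represent f(xi) = xi^2 in probabilists' Hermite polynomials:
# since He_2(x) = x^2 - 1, the exact expansion is 1*He_0 + 0*He_1 + 1*He_2.
xi = np.linspace(-3.0, 3.0, 50)       # samples of the random input
y = xi ** 2                           # model output at those samples
coeffs = He.hermefit(xi, y, deg=2)    # least-squares gPC coefficients
# coeffs ~ [1, 0, 1]; zeroing the negligible coefficients gives the
# sparse representation the abstract alludes to.
```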

  10. Estimating daily time series of streamflow using hydrological model calibrated based on satellite observations of river water surface width: Toward real world applications.

    Science.gov (United States)

    Sun, Wenchao; Ishidaira, Hiroshi; Bastola, Satish; Yu, Jingshan

    2015-05-01

    A lack of observational data for calibration constrains the application of hydrological models to estimate daily time series of streamflow. Recent improvements in remote sensing enable detection of river water-surface width from satellite observations, making it possible to track streamflow from space. In this study, a method for calibrating hydrological models using river width derived from remote sensing is demonstrated through application to the ungauged Irrawaddy Basin in Myanmar. Generalized likelihood uncertainty estimation (GLUE) is selected as the tool for automatic calibration and uncertainty analysis. Of 50,000 randomly generated parameter sets, 997 are identified as behavioral, based on comparing model simulations with satellite observations. The uncertainty band of the streamflow simulation spans most of the 10-year average monthly observed streamflow for moderate and high flow conditions. The Nash-Sutcliffe efficiency is 95.7% for the simulated streamflow at the 50% quantile. These results indicate that application to the target basin is generally successful. Beyond evaluating the method in a basin lacking streamflow data, difficulties and possible solutions for real-world applications are addressed to promote future use of the proposed method in more ungauged basins.
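
    GLUE as used here (random parameter sampling, an informal likelihood, a behavioral threshold) can be sketched on a toy model. The one-parameter linear model, the threshold, and the synthetic "observations" below are illustrative, not the study's hydrological model:

```python
import numpy as np

rng = np.random.default_rng(42)
x = np.linspace(1.0, 10.0, 20)
obs = 2.0 * x                        # synthetic "observed" series (true a = 2)

def nse(sim, obs):
    """Nash-Sutcliffe efficiency, used as the informal GLUE likelihood."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Monte Carlo sampling of the single parameter a in y = a * x, keeping
# parameter sets whose NSE exceeds a behavioral threshold.
a_samples = rng.uniform(0.0, 4.0, 5000)
scores = np.array([nse(a * x, obs) for a in a_samples])
behavioral = a_samples[scores > 0.9]
lo, hi = behavioral.min(), behavioral.max()  # uncertainty band around a = 2
```

Quantiles of the simulations produced by the behavioral set (here, the interval [lo, hi]) give the uncertainty band reported for the streamflow estimates.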

  11. A statistically based seasonal precipitation forecast model with automatic predictor selection and its application to central and south Asia

    Science.gov (United States)

    Gerlitz, Lars; Vorogushyn, Sergiy; Apel, Heiko; Gafurov, Abror; Unger-Shayesteh, Katy; Merz, Bruno

    2016-11-01

    The study presents a statistically based seasonal precipitation forecast model, which automatically identifies suitable predictors from globally gridded sea surface temperature (SST) and climate variables by means of an extensive data-mining procedure and explicitly avoids the utilization of typical large-scale climate indices. This leads to an enhanced flexibility of the model and enables its automatic calibration for any target area without any prior assumption concerning adequate predictor variables. Potential predictor variables are derived by means of a cell-wise correlation analysis of precipitation anomalies with gridded global climate variables under consideration of varying lead times. Significantly correlated grid cells are subsequently aggregated to predictor regions by means of a variability-based cluster analysis. Finally, for every month and lead time, an individual random-forest-based forecast model is constructed from the preliminarily generated predictor variables. Monthly predictions are aggregated to running 3-month periods in order to generate a seasonal precipitation forecast. The model is applied and evaluated for selected target regions in central and south Asia. Particularly for winter and spring in westerly-dominated central Asia, correlation coefficients between forecasted and observed precipitation reach values up to 0.48, although the variability of precipitation rates is strongly underestimated. Likewise, for the monsoonal precipitation amounts in the south Asian target area, correlations of up to 0.5 were detected. The skill of the model for the dry winter season over south Asia is found to be low. A sensitivity analysis with well-known climate indices, such as the El Niño-Southern Oscillation (ENSO), the North Atlantic Oscillation (NAO) and the East Atlantic (EA) pattern, reveals the major large-scale controlling mechanisms of the seasonal precipitation climate for each target area. For the central Asian target areas, both
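
    The cell-wise correlation screening that generates candidate predictors can be sketched as follows; the grid, target series, and correlation threshold are synthetic stand-ins for the SST fields and significance test:

```python
import numpy as np

def screen_predictors(field, target, r_min=0.5):
    """Cell-wise Pearson correlation of a gridded field (time, ny, nx)
    with a target series; returns a boolean mask of grid cells whose
    absolute correlation exceeds r_min. Selected cells would then be
    clustered into predictor regions for the forecast model."""
    t, ny, nx = field.shape
    f = field.reshape(t, ny * nx)
    fz = (f - f.mean(0)) / f.std(0)
    tz = (target - target.mean()) / target.std()
    r = fz.T @ tz / t
    return (np.abs(r) >= r_min).reshape(ny, nx)

rng = np.random.default_rng(7)
target = rng.standard_normal(120)           # e.g. monthly precipitation anomaly
field = rng.standard_normal((120, 5, 5))    # e.g. gridded SST anomalies
field[:, 2, 3] = target + 0.1 * rng.standard_normal(120)  # one teleconnected cell
mask = screen_predictors(field, target)     # flags only the planted cell
```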

  12. Applicability of a noise-based model to estimate in-traffic exposure to black carbon and particle number concentrations in different cultures.

    Science.gov (United States)

    Dekoninck, Luc; Botteldooren, Dick; Panis, Luc Int; Hankey, Steve; Jain, Grishma; S, Karthik; Marshall, Julian

    2015-01-01

    Several studies show that a significant portion of daily air pollution exposure, in particular black carbon (BC), occurs during transport. In a previous work, a model for the in-traffic exposure of bicyclists to BC was proposed based on spectral evaluation of mobile noise measurements and validated with BC measurements in Ghent, Belgium. In this paper, applicability of this model in a different cultural context with a totally different traffic and mobility situation is presented. In addition, a similar modeling approach is tested for particle number (PN) concentration. Indirectly assessing BC and PN exposure through a model based on noise measurements is advantageous because of the availability of very affordable noise monitoring devices. Our previous work showed that a model including specific spectral components of the noise that relate to engine and rolling emission and basic meteorological data, could be quite accurate. Moreover, including a background concentration adjustment improved the model considerably. To explore whether this model could also be used in a different context, with or without tuning of the model parameters, a study was conducted in Bangalore, India. Noise measurement equipment, data storage, data processing, continent, country, measurement operators, vehicle fleet, driving behavior, biking facilities, background concentration, and meteorology are all very different from the first measurement campaign in Belgium. More than 24h of combined in-traffic noise, BC, and PN measurements were collected. It was shown that the noise-based BC exposure model gives good predictions in Bangalore and that the same approach is also successful for PN. Cross validation of the model parameters was used to compare factors that impact exposure across study sites. A pooled model (combining the measurements of the two locations) results in a correlation of 0.84 when fitting the total trip exposure in Bangalore. 
Estimating particulate matter exposure with traffic

  13. [Measuring water ecological carrying capacity with the ecosystem-service-based ecological footprint (ESEF) method: Theory, models and application].

    Science.gov (United States)

    Jiao, Wen-jun; Min, Qing-wen; Li, Wen-hua; Fuller, Anthony M

    2015-04-01

    Integrated watershed management based on aquatic ecosystems has been increasingly acknowledged. Such a change in the philosophy of water environment management requires recognizing the carrying capacity of aquatic ecosystems for human society from a more general perspective. The concept of water ecological carrying capacity is therefore put forward, which considers both water resources and the water environment, connects socio-economic development to aquatic ecosystems, and provides strong support for integrated watershed management. In this paper, the authors propose an ESEF-based measure of water ecological carrying capacity and construct ESEF-based models of water ecological footprint and capacity, aiming to evaluate water ecological carrying capacity with footprint methods. A regional model of the Taihu Lake Basin was constructed and applied to evaluate the water ecological carrying capacity of Changzhou City, located in the upper reaches of the basin. Results showed that human demand for water ecosystem services in this city had exceeded the supply capacity of local aquatic ecosystems, and that the significant gap between demand and supply had jeopardized the sustainability of local aquatic ecosystems. Considering aqua-product provision, water supply, and pollutant absorption in an integrated way, the scale of population and economy that the aquatic ecosystems in Changzhou could bear was only 54% of the current level.

  14. Development of a pyrolysis waste recovery model with designs, test plans, and applications for space-based habitats

    Science.gov (United States)

    Roberson, Bobby J.

    1992-01-01

    Extensive literature searches revealed the numerous advantages of using pyrolysis as a means of recovering usable resources from inedible plant biomass, paper, plastics, other polymers, and human waste. A possible design of a pyrolysis reactor, with test plans and applications for use on a space-based habitat, is proposed. The proposed system will accommodate the wastes generated by a four-person crew while requiring solar energy as the only power source. Waste materials will be collected and stored during the 15-day lunar darkness periods. Resource recovery will occur during the daylight periods. Usable gases such as methane and hydrogen and a solid char will be produced while greatly reducing the mass and volume of the waste. The system will be operated economically, safely, and in a non-polluting manner.

  15. Application of physiologically based pharmacokinetic modeling in predicting drug–drug interactions for sarpogrelate hydrochloride in humans

    Directory of Open Access Journals (Sweden)

    Min JS

    2016-09-01

    Full Text Available Jee Sun Min,1 Doyun Kim,1 Jung Bae Park,1 Hyunjin Heo,1 Soo Hyeon Bae,2 Jae Hong Seo,1 Euichaul Oh,1 Soo Kyung Bae1 1Integrated Research Institute of Pharmaceutical Sciences, College of Pharmacy, The Catholic University of Korea, Bucheon, 2Department of Pharmacology, College of Medicine, The Catholic University of Korea, Seocho-gu, Seoul, South Korea Background: Evaluating the potential risk of metabolic drug–drug interactions (DDIs) is clinically important. Objective: To develop a physiologically based pharmacokinetic (PBPK) model for sarpogrelate hydrochloride and its active metabolite, (R,S)-1-{2-[2-(3-methoxyphenyl)ethyl]-phenoxy}-3-(dimethylamino)-2-propanol (M-1), in order to predict DDIs between sarpogrelate and the clinically relevant cytochrome P450 (CYP) 2D6 substrates, metoprolol, desipramine, dextromethorphan, imipramine, and tolterodine. Methods: The PBPK model was developed, incorporating the physicochemical and pharmacokinetic properties of sarpogrelate hydrochloride and M-1, based on the findings from in vitro and in vivo studies. Subsequently, the model was verified by comparing the predicted concentration-time profiles and pharmacokinetic parameters of sarpogrelate and M-1 to the observed clinical data. Finally, the verified model was used to simulate clinical DDIs between sarpogrelate hydrochloride and sensitive CYP2D6 substrates. The predictive performance of the model was assessed by comparing predicted results to observed data after coadministering sarpogrelate hydrochloride and metoprolol. Results: The developed PBPK model accurately predicted sarpogrelate and M-1 plasma concentration profiles after single or multiple doses of sarpogrelate hydrochloride. The simulated ratios of area under the curve and maximum plasma concentration of metoprolol in the presence of sarpogrelate hydrochloride to baseline were in good agreement with the observed ratios. The predicted fold-increases in the area under the curve ratios of metoprolol
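    The fold-change in victim-drug exposure that such PBPK simulations predict can be cross-checked against the classic static mechanistic DDI equation for reversible inhibition. The sketch below is illustrative only: the parameter values (fraction metabolized by CYP2D6, inhibitor concentration, Ki) are hypothetical and not taken from the study.

```python
def auc_ratio(fm_cyp2d6, inhibitor_conc, ki):
    """Static mechanistic DDI model for reversible inhibition:
    fold-increase in victim-drug AUC when the fraction of clearance
    mediated by CYP2D6 (fm) is partially blocked by an inhibitor at
    concentration [I] with inhibition constant Ki."""
    inhibited_fraction = fm_cyp2d6 / (1.0 + inhibitor_conc / ki)
    return 1.0 / (inhibited_fraction + (1.0 - fm_cyp2d6))

# Illustrative (hypothetical) values, not from the study: fm = 0.8, [I]/Ki = 1
r = auc_ratio(fm_cyp2d6=0.8, inhibitor_conc=1.0, ki=1.0)
```

    With no inhibitor present the ratio reduces to 1, a useful sanity check on any dynamic PBPK prediction.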

  16. Application of a Theory and Simulation based Convective Boundary Mixing model for AGB Star Evolution and Nucleosynthesis

    CERN Document Server

    Battino, U; Ritter, C; Herwig, F; Denisenkov, P; Hartogh, J W Den; Trappitsch, R; Hirschi, R; Freytag, B; Thielemann, F; Paxton, B

    2016-01-01

    The s-process nucleosynthesis in Asymptotic Giant Branch (AGB) stars depends on the modeling of convective boundaries. We present models and s-process simulations that adopt a treatment of convective boundaries based on the results of hydrodynamic simulations and on the theory of mixing due to gravity waves in the vicinity of convective boundaries. Hydrodynamics simulations suggest the presence of convective boundary mixing (CBM) at the bottom of the thermal pulse-driven convective zone. Similarly, convection-induced mixing processes are proposed for the mixing below the convective envelope during third dredge-up where the 13C pocket for the s process in AGB stars forms. In this work we apply a CBM model motivated by simulations and theory to models with initial mass M = 2 and M = 3 M⊙, and with initial metal content Z = 0.01 and Z = 0.02. As reported previously, the He-intershell abundances of 12C and 16O are increased by CBM at the bottom of the pulse-driven convection zone. This mixing is affecting the 22Ne(alph...

  17. Computational models of upper-limb motion during functional reaching tasks for application in FES-based stroke rehabilitation.

    Science.gov (United States)

    Freeman, Chris; Exell, Tim; Meadmore, Katie; Hallewell, Emma; Hughes, Ann-Marie

    2015-06-01

    Functional electrical stimulation (FES) has been shown to be an effective approach to upper-limb stroke rehabilitation, where it is used to assist arm and shoulder motion. Model-based FES controllers have recently confirmed significant potential to improve accuracy of functional reaching tasks, but they typically require a reference trajectory to track. Few upper-limb FES control schemes embed a computational model of the task; however, this is critical to ensure the controller reinforces the intended movement with high accuracy. This paper derives computational motor control models of functional tasks that can be directly embedded in real-time FES control schemes, removing the need for a predefined reference trajectory. Dynamic models of the electrically stimulated arm are first derived, and constrained optimisation problems are formulated to encapsulate common activities of daily living. These are solved using iterative algorithms, and results are compared with kinematic data from 12 subjects and found to fit closely (mean fitting between 63.2% and 84.0%). The optimisation is performed iteratively using kinematic variables and hence can be transformed into an iterative learning control algorithm by replacing simulation signals with experimental data. The approach is therefore capable of controlling FES in real time to assist tasks in a manner corresponding to unimpaired natural movement. By ensuring that assistance is aligned with voluntary intention, the controller hence maximises the potential effectiveness of future stroke rehabilitation trials.
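    The step from an iteratively solved optimisation to an iterative learning controller, as described above, can be illustrated with a toy example. The plant below is a hypothetical first-order linear system standing in for the stimulated arm (the paper's models are nonlinear), and the update u_{k+1}(t) = u_k(t) + L·e_k(t+1) is the standard ILC law, not the authors' specific controller.

```python
import numpy as np

# Hypothetical first-order plant standing in for the stimulated arm:
#   y[t+1] = a*y[t] + b*u[t]
a, b, T = 0.3, 0.5, 40
ref = np.sin(np.linspace(0, np.pi, T))   # desired joint trajectory

def simulate(u):
    y = np.zeros(T)
    for t in range(T - 1):
        y[t + 1] = a * y[t] + b * u[t]
    return y

u = np.zeros(T)
L_gain = 1.0                             # chosen so that |1 - L_gain*b| < 1
errors = []
for trial in range(30):                  # one assisted reaching trial per iteration
    y = simulate(u)
    e = ref - y
    errors.append(np.linalg.norm(e))
    u[:-1] += L_gain * e[1:]             # ILC update: u_{k+1}(t) = u_k(t) + L*e_k(t+1)
```

    The tracking error shrinks trial by trial, mirroring how experimental kinematic data can replace the simulated signals in the real-time controller.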

  18. Establishment and application of drilling sealing model in the spherical grouting mode based on the loosing-circle theory

    Institute of Scientific and Technical Information of China (English)

    Hao; Zhiyong; Lin; Baiquan; Gao; Yabin; Cheng; Yanying

    2012-01-01

    There are quite a few studies that have been done on borehole sealing theory both domestically and internationally.The existing researches usually consider drilling of the surroundings as a dense homogeneous elastic body which does not meet the characteristics of real drilling of the fractured body.Based on the loosing-circle theory and analyses of the surrounding rock stress field,cracks and seepage fields,combined with Newtonian fluid spherical grouting model,we deduced the dynamic relationship between the seepage coefficient and rock or grouting parameters of the drilling sealing fluid mode of spherical fissure grouting.In this experiment,mucus was injected in the simulated coal seam and the permeability coefficient of the sealing body was calculated by using the model.To verify the validity of the model,the calculated sealing body number was compared with the extreme negative pressure that the sealing body could withstand.The theoretical model revealed the drilling sealing fluid mechanism,provided a method for the quantitative calculation of the drilling sealing fluid effect by grouting mode and a reference for the subsequent research of sealing mechanism.

  19. Application of Large-Scale Database-Based Online Modeling to Plant State Long-Term Estimation

    Science.gov (United States)

    Ogawa, Masatoshi; Ogai, Harutoshi

    Recently, attention has been drawn to local modeling techniques based on a new idea called “Just-In-Time (JIT) modeling”. To apply JIT modeling online to a large database, “Large-scale database-based Online Modeling (LOM)” has been proposed. LOM is a technique that makes the retrieval of neighboring data more efficient by using both “stepwise selection” and quantization. In order to predict the long-term state of the plant without using future data of manipulated variables, an Extended Sequential Prediction method of LOM (ESP-LOM) has been proposed. In this paper, the LOM and the ESP-LOM are introduced.
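    The core of JIT/LOM-style local modeling — retrieve the neighbors of the current query from the database, then fit a model only on them — can be sketched in a few lines. This is a generic nearest-neighbor local linear regression, not the LOM algorithm itself (which adds stepwise selection and quantization); all data here are synthetic.

```python
import numpy as np

def jit_predict(X, y, x_query, k=20):
    """Just-In-Time (lazy) local modeling: retrieve the k samples
    nearest to the query and fit a local linear model on them only."""
    d = np.linalg.norm(X - x_query, axis=1)
    idx = np.argsort(d)[:k]                      # neighbor retrieval
    Xk = np.hstack([X[idx], np.ones((k, 1))])    # add intercept column
    coef, *_ = np.linalg.lstsq(Xk, y[idx], rcond=None)
    return np.append(x_query, 1.0) @ coef

# Toy database: y = sin(x0) + 0.5*x1, no global model is ever fit
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]
pred = jit_predict(X, y, np.array([1.0, 1.0]))
```

    Because a fresh local model is fit at every query, the database can grow online without retraining a global model, which is the appeal of the JIT approach.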

  20. The application of dynamic micro-simulation model of urban planning based on multi-agent system

    Science.gov (United States)

    Xu, J.; Shiming, W.

    2012-12-01

    The dynamic micro-simulation model of urban planning based on multi-agent systems is mainly used to measure and predict the impact of policy on urban land use, employment opportunities and real estate prices. The model represents the supply and characteristics of land and of real estate development at a spatial scale. It uses real estate markets as a central organizing focus, with consumer choices and supplier choices explicitly represented, as well as the resulting effects on real estate prices. Tying agents to real estate at specific locations provides a clean accounting of space and its use. Finally, the model produces a map that combines the dynamic demographic distribution and the dynamic employment transfer with geographic spatial data. With the data produced by the urban micro-simulation model, planners can obtain a favorable forecast reference for scientific urban land use.

  1. Empirically Based, Agent-based models

    Directory of Open Access Journals (Sweden)

    Elinor Ostrom

    2006-12-01

    Full Text Available There is an increasing drive to combine agent-based models with empirical methods. An overview is provided of the various empirical methods that are used for different kinds of questions. Four categories of empirical approaches are identified in which agent-based models have been empirically tested: case studies, stylized facts, role-playing games, and laboratory experiments. We discuss how these different types of empirical studies can be combined. The various ways empirical techniques are used illustrate the main challenges of contemporary social sciences: (1) how to develop models that are generalizable and still applicable in specific cases, and (2) how to scale up the processes of interactions of a few agents to interactions among many agents.
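    A minimal agent-based model of the kind discussed here — a stylized common-pool resource harvested by heterogeneous agents — fits in a few lines. Everything below (the harvest rule, the logistic regrowth, the parameter values) is a hypothetical illustration, not a model from the reviewed studies.

```python
import random

class Harvester:
    """Stylized agent: harvests a fraction of the stock scaled by its greed."""
    def __init__(self, greed):
        self.greed = greed
    def harvest(self, stock, capacity):
        return self.greed * stock / capacity

def run(n_agents=10, capacity=100.0, growth=0.25, steps=50, seed=1):
    random.seed(seed)
    agents = [Harvester(random.uniform(0.0, 0.05)) for _ in range(n_agents)]
    stock = capacity
    history = []
    for _ in range(steps):
        taken = sum(a.harvest(stock, capacity) for a in agents)
        stock = max(stock - taken, 0.0)
        stock += growth * stock * (1 - stock / capacity)  # logistic regrowth
        history.append(stock)
    return history

h = run()
```

    Aggregate dynamics (here, the resource stock trajectory) emerge from simple individual rules; empirical calibration would replace the random greed parameters with survey or experimental data.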

  2. Ontology-based application integration

    CERN Document Server

    Paulheim, Heiko

    2011-01-01

    Ontology-based Application Integration introduces UI-level (User Interface Level) application integration and discusses current problems which can be remedied by using ontologies. It shows a novel approach for applying ontologies in system integration. While ontologies have been used for integration of IT systems on the database and on the business logic layer, integration on the user interface layer is a novel field of research. This book also discusses how end users, not only developers, can benefit from semantic technologies. Ontology-based Application Integration presents the development o

  3. Numerical discretization-based estimation methods for ordinary differential equation models via penalized spline smoothing with applications in biomedical research.

    Science.gov (United States)

    Wu, Hulin; Xue, Hongqi; Kumar, Arun

    2012-06-01

    Differential equations are extensively used for modeling dynamics of physical processes in many scientific fields such as engineering, physics, and biomedical sciences. Parameter estimation of differential equation models is a challenging problem because of high computational cost and high-dimensional parameter space. In this article, we propose a novel class of methods for estimating parameters in ordinary differential equation (ODE) models, which is motivated by HIV dynamics modeling. The new methods exploit the form of numerical discretization algorithms for an ODE solver to formulate estimating equations. First, a penalized-spline approach is employed to estimate the state variables, and the estimated state variables are then plugged into a discretization formula of an ODE solver to obtain the ODE parameter estimates via a regression approach. We consider three discretization methods of different orders: Euler's method, the trapezoidal rule, and the Runge-Kutta method. A higher-order numerical algorithm reduces numerical error in the approximation of the derivative, which produces a more accurate estimate, but its computational cost is higher. To balance computational cost and estimation accuracy, we demonstrate, via simulation studies, that the trapezoidal discretization-based estimate is the best and is recommended for practical use. The asymptotic properties of the proposed numerical discretization-based estimators are established. Comparisons between the proposed methods and existing methods show a clear benefit of the proposed methods in regards to the trade-off between computational cost and estimation accuracy. We apply the proposed methods to an HIV study to further illustrate the usefulness of the proposed approaches.
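    The two-step idea — smooth the observed trajectory, then plug the smoothed states into a discretization formula and solve a regression — can be sketched for a one-parameter ODE. The example below uses a toy model x' = -θx and a moving average as a crude stand-in for the penalized-spline smoother; the true θ and all settings are hypothetical.

```python
import numpy as np

# Toy ODE x' = -theta * x with known truth, observed with noise.
theta_true, h, n = 0.5, 0.1, 200
t = np.arange(n) * h
rng = np.random.default_rng(42)
x_obs = np.exp(-theta_true * t) + rng.normal(0, 0.005, n)

# Crude stand-in for the penalized-spline smoother: a short moving average.
kernel = np.ones(5) / 5
x_hat = np.convolve(x_obs, kernel, mode="same")
x_hat = x_hat[2:-2]                     # drop edge-distorted points

# Trapezoidal estimating equation:
#   x_{i+1} - x_i = (h/2)(f(x_i) + f(x_{i+1})) = -(theta*h/2)(x_i + x_{i+1})
lhs = x_hat[:-1] - x_hat[1:]
rhs = (h / 2) * (x_hat[:-1] + x_hat[1:])
theta_est = (rhs @ lhs) / (rhs @ rhs)   # one-parameter least squares
```

    No ODE solver is ever called: the discretization formula turns parameter estimation into ordinary regression, which is the computational advantage the article exploits.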

  4. Modeling and preliminary characterization of passive, wireless temperature sensors for harsh environment applications based on periodic structures

    Science.gov (United States)

    Delfin Manriquez, Diego I.

    Wireless temperature sensing has attained significant attention in recent years due to the increasing need to develop reliable and affordable sensing solutions for energy conversion systems and other harsh environment applications. The development of next generation sensors for monitoring energy production process parameters, such as temperature and pressure, can result in better performance of the system. In particular, continuous temperature monitoring in energy conversion systems can yield enhancements such as better system integrity, less pollution and higher thermal efficiencies. However, the conditions experienced in these system components hinder the performance of current solutions due to the presence of semi-conductor materials and welded joints. Additionally, the use of wired systems can result in complex wiring networks, increasing the cost of installation, maintenance and sensor replacement. Therefore, next generation sensing solutions must be developed to overcome current challenges in systems where adverse conditions are present. This research project proposes two novel passive, wireless temperature sensor designs based on concepts of guided mode resonance filters (GMRF) and metamaterials. For the GMRF, a tri-layer structure using a metallic encasing and a circular aperture grating layer was developed to have a resonance frequency of 10 GHz. For the metamaterial-based sensor, previous work was extended by utilizing a dielectric substrate and an array of commercially available metallic washers divided into two layers. For both designs, the High Frequency Structure Simulator (HFSS) from ANSYS was employed to assess the feasibility of the sensor as well as to optimize the geometry and guide the fabrication process. A systematic approach consisting of evaluating the unit cell, then assessing the number of periods needed, and finally characterizing the response of the final sensor was followed for each case. 
After the modeling process was

  5. A drought hazard assessment index based on the VIC-PDSI model and its application on the Loess Plateau, China

    Science.gov (United States)

    Zhang, Baoqing; Wu, Pute; Zhao, Xining; Wang, Yubao; Gao, Xiaodong; Cao, Xinchun

    2013-10-01

    Drought is a complex natural hazard that is poorly understood and difficult to assess. This paper describes a VIC-PDSI model approach to understanding drought, in which the Variable Infiltration Capacity (VIC) model was combined with the Palmer Drought Severity Index (PDSI). Simulated results obtained using the VIC model were used to replace the output of the more conventional two-layer bucket-type model for hydrological accounting, and a two-class-based procedure for calibrating the characteristic climate coefficient (K_j) was introduced to allow for a more reliable computation of the PDSI. The VIC-PDSI model was used in conjunction with GIS technology to create a new drought assessment index (DAI) that provides a comprehensive overview of drought duration, intensity, frequency, and spatial extent. This new index was applied to drought hazard assessment across six subregions of the whole Loess Plateau. The results show that the DAI over the whole Loess Plateau ranged between 11 and 26 (a greater DAI value indicates a more severe drought hazard level). The drought hazards in the upper reaches of the Yellow River were more severe than those in the middle reaches. The drought-prone regions in the study area were mainly concentrated in the Inner Mongolian small rivers and the Zuli and Qingshui River basins, while the drought hazards in the drainage area between Hekouzhen-Longmen and the Weihe River basin were relatively mild during 1971-2010. The most serious drought vulnerabilities were associated with the areas around Lanzhou, Zhongning, and Yinchuan, where the development of water-saving irrigation is the most direct and effective way to defend against and reduce losses from drought. For the relatively humid regions, it will be necessary to establish rainwater harvesting systems, which could help to relieve the risk of water shortage and guarantee regional food security. 
Because the DAI considers the multiple characteristics of drought duration, intensity, frequency

  6. Evaluation Theory, Models, and Applications

    Science.gov (United States)

    Stufflebeam, Daniel L.; Shinkfield, Anthony J.

    2007-01-01

    "Evaluation Theory, Models, and Applications" is designed for evaluators and students who need to develop a commanding knowledge of the evaluation field: its history, theory and standards, models and approaches, procedures, and inclusion of personnel as well as program evaluation. This important book shows how to choose from a growing…

  7. Agent Based Multiviews Requirements Model

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Based on current research on viewpoint-oriented requirements engineering and intelligent agents, we present the concept of the viewpoint agent and its abstract model based on a meta-language for multiview requirements engineering. It provides a basis for consistency checking and integration of different viewpoint requirements; at the same time, this checking and integration can be realized automatically by virtue of the intelligent agent's autonomy, proactiveness and social ability. Finally, we introduce a practical application of the model through a case study of a data flow diagram.

  8. Application of a Theory and Simulation-based Convective Boundary Mixing Model for AGB Star Evolution and Nucleosynthesis

    Science.gov (United States)

    Battino, U.; Pignatari, M.; Ritter, C.; Herwig, F.; Denisenkov, P.; Den Hartogh, J. W.; Trappitsch, R.; Hirschi, R.; Freytag, B.; Thielemann, F.; Paxton, B.

    2016-08-01

    The s-process nucleosynthesis in Asymptotic giant branch (AGB) stars depends on the modeling of convective boundaries. We present models and s-process simulations that adopt a treatment of convective boundaries based on the results of hydrodynamic simulations and on the theory of mixing due to gravity waves in the vicinity of convective boundaries. Hydrodynamics simulations suggest the presence of convective boundary mixing (CBM) at the bottom of the thermal pulse-driven convective zone. Similarly, convection-induced mixing processes are proposed for the mixing below the convective envelope during third dredge-up (TDU), where the 13C pocket for the s process in AGB stars forms. In this work, we apply a CBM model motivated by simulations and theory to models with initial mass M = 2 and M = 3 M⊙, and with initial metal content Z = 0.01 and Z = 0.02. As reported previously, the He-intershell abundances of 12C and 16O are increased by CBM at the bottom of the pulse-driven convection zone. This mixing is affecting the 22Ne(α, n)25Mg activation and the s-process efficiency in the 13C pocket. In our model, CBM at the bottom of the convective envelope during the TDU represents gravity wave mixing. Furthermore, we take into account the fact that hydrodynamic simulations indicate a mixing efficiency that, compared to mixing-length theory, already declines about a pressure scale height from the convective boundaries. We obtain the formation of the 13C pocket with a mass of ≈10^-4 M⊙. The final s-process abundances are characterized by 0.36 < [s/Fe] < 0.78, and the heavy-to-light s-process ratio is -0.23 < [hs/ls] < 0.45. Finally, we compare our results with stellar observations, presolar grain measurements and previous work.

  9. 农作物空间格局变化模拟模型的MATLAB实现及应用%Model application of an agent-based model for simulating crop pattern dynamics at regional scale based on MATLAB

    Institute of Scientific and Technical Information of China (English)

    余强毅; 吴文斌; 陈羊阳; 杨鹏; 孟超英; 周清波; 唐华俊

    2014-01-01

    Crop pattern is a key element of agricultural land systems, in addition to land use and land cover. Crop pattern changes take place frequently but are not always easily observable, which makes analysis difficult. As an effective tool for understanding the drivers, processes and consequences of agricultural land system changes, spatially explicit agent-based land change models have been successfully applied to represent human and natural interactions in agricultural landscapes. Under the assumption that the crop pattern at the regional level is the aggregation of crop choices at the field level, we conceptualized an agent-based model to simulate crop pattern dynamics at a regional scale (CroPaDy), intended to represent the frequent but not easily observed crop pattern changes in agricultural land systems. The conceptualization of the CroPaDy model was designed strictly following the standard protocol for agent-based modeling. However, the computational model hinders its application because it needs a grid-based representation and the model itself is complicated, with multiple objectives and three nested, interactive sub-modules. Because the CroPaDy model can hardly be developed with common agent-based modeling platforms such as RePast, NetLogo, and Swarm, we used MATLAB as an alternative to realize an empirically based application in an agricultural region of Northeast China, taking advantage of MATLAB's powerful and openly accessible matrix computing ability. We coded the three interactive sub-modules in steps: 1) Agent generating module. The Monte Carlo method was used to generate the internal factors (family attributes) for each individual agent across the full coverage of the study region by combining GIS data, statistical data, survey data and individual-based blanket rules. 2) Agent classifying module. The back propagation artificial neural network method was used to automatically classify the generated agents into groups

  10. Constraint Based Modeling Going Multicellular.

    Science.gov (United States)

    Martins Conde, Patricia do Rosario; Sauter, Thomas; Pfau, Thomas

    2016-01-01

    Constraint based modeling has seen applications in many microorganisms. For example, there are now established methods to determine potential genetic modifications and external interventions to increase the efficiency of microbial strains in chemical production pipelines. In addition, multiple models of multicellular organisms have been created including plants and humans. While initially the focus here was on modeling individual cell types of the multicellular organism, this focus recently started to switch. Models of microbial communities, as well as multi-tissue models of higher organisms have been constructed. These models thereby can include different parts of a plant, like root, stem, or different tissue types in the same organ. Such models can elucidate details of the interplay between symbiotic organisms, as well as the concerted efforts of multiple tissues and can be applied to analyse the effects of drugs or mutations on a more systemic level. In this review we give an overview of the recent development of multi-tissue models using constraint based techniques and the methods employed when investigating these models. We further highlight advances in combining constraint based models with dynamic and regulatory information and give an overview of these types of hybrid or multi-level approaches.
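    At its core, a constraint-based (flux balance analysis) model is a linear program: maximize an objective flux subject to steady-state mass balance S·v = 0 and flux bounds. A minimal sketch on a hypothetical three-reaction toy network, not any published reconstruction:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical toy network, for illustration only:
#   R1: -> A (uptake)   R2: A -> B   R3: B -> (biomass export)
S = np.array([[1, -1,  0],    # metabolite A mass balance
              [0,  1, -1]])   # metabolite B mass balance
bounds = [(0, 10), (0, 1000), (0, 1000)]   # uptake R1 capped at 10

# FBA: maximize biomass flux v3 (linprog minimizes, hence -1)
# subject to the steady-state constraint S v = 0.
res = linprog(c=[0, 0, -1], A_eq=S, b_eq=[0, 0], bounds=bounds)
optimal_biomass = -res.fun
```

    Multi-tissue or community models follow the same pattern, with block-structured stoichiometric matrices coupled by shared exchange reactions between compartments.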

  11. Fault-diagnosis applications. Model-based condition monitoring. Acutators, drives, machinery, plants, sensors, and fault-tolerant systems

    Energy Technology Data Exchange (ETDEWEB)

    Isermann, Rolf [Technische Univ. Darmstadt (DE). Inst. fuer Automatisierungstechnik (IAT)]

    2011-07-01

    Supervision, condition-monitoring, fault detection, fault diagnosis and fault management play an increasing role for technical processes and vehicles in order to improve reliability, availability, maintenance and lifetime. For safety-related processes, fault-tolerant systems with redundancy are required in order to reach comprehensive system integrity. This book is a sequel to "Fault-Diagnosis Systems", published in 2006, where the basic methods were described. After a short introduction to fault-detection and fault-diagnosis methods, the book shows how these methods can be applied to a selection of 20 real technical components and processes as examples, such as: electrical drives (DC, AC), electrical actuators, fluidic actuators (hydraulic, pneumatic), centrifugal and reciprocating pumps, pipelines (leak detection), industrial robots, machine tools (main and feed drives, drilling, milling, grinding), and heat exchangers. Realized fault-tolerant systems for electrical drives, actuators and sensors are also presented. The book describes why and how the various signal-model-based and process-model-based methods were applied, and which experimental results could be achieved. In several cases a combination of different methods was most successful. The book is dedicated to graduate students of electrical, mechanical and chemical engineering and computer science, and to engineers. (orig.)

  12. Formulation and Application of a Physically-Based Rupture Probability Model for Large Earthquakes on Subduction Zones: A Case Study of Earthquakes on Nazca Plate

    Science.gov (United States)

    Mahdyiar, M.; Galgana, G.; Shen-Tu, B.; Klein, E.; Pontbriand, C. W.

    2014-12-01

    Most time dependent rupture probability (TDRP) models are designed for a single-mode rupture, i.e., a single characteristic earthquake on a fault. However, most subduction zones rupture in complex patterns that create overlapping earthquakes of different magnitudes. Additionally, the limited historic earthquake data do not provide sufficient information to estimate reliable mean recurrence intervals for earthquakes. This makes it difficult to identify a single characteristic earthquake for TDRP analysis. Physical models based on geodetic data have been successfully used to obtain information on the state of coupling and slip deficit rates for subduction zones. Coupling information provides valuable insight into the complexity of subduction zone rupture processes. In this study we present a TDRP model that is formulated based on the subduction zone slip deficit rate distribution. A subduction zone is represented by an integrated network of cells. Each cell ruptures multiple times from numerous earthquakes that have overlapping rupture areas. The rate of rupture for each cell is calculated using a moment balance concept that is calibrated against historic earthquake data. This information, in conjunction with estimates of coseismic slip from past earthquakes, is used to formulate time dependent rupture probability models for cells. Earthquakes on the subduction zone and their rupture probabilities are calculated by integrating different combinations of cells. The resulting rupture probability estimates are fully consistent with the state of coupling of the subduction zone and the regional and local earthquake history, as the model takes into account the impact of all large (M>7.5) earthquakes on the subduction zone. The granular rupture model developed in this study allows estimating rupture probabilities for large earthquakes other than just a single characteristic magnitude earthquake. This provides a general framework for formulating physically-based
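    The moment-balance idea — a cell's long-run rupture rate is its slip deficit accumulation rate divided by the characteristic coseismic slip per event — can be illustrated with a time-independent (Poisson) baseline. This is a simplification of the paper's time-dependent formulation, and the numbers below are hypothetical.

```python
import math

def cell_rupture_rate(slip_deficit_rate_mm_yr, coseismic_slip_mm):
    """Moment balance: if a cell accumulates slip deficit at a steady rate
    and each earthquake releases a characteristic slip, the long-term
    rupture rate is their ratio (events per year)."""
    return slip_deficit_rate_mm_yr / coseismic_slip_mm

def poisson_probability(rate_per_yr, window_yr):
    """Time-independent probability of at least one rupture in the window."""
    return 1.0 - math.exp(-rate_per_yr * window_yr)

# Hypothetical numbers: 40 mm/yr slip deficit, 4 m characteristic slip
rate = cell_rupture_rate(40.0, 4000.0)   # 0.01 events/yr (100-yr mean recurrence)
p50 = poisson_probability(rate, 50.0)    # probability of rupture within 50 years
```

    A time-dependent model would replace the exponential with a renewal distribution conditioned on the time since the last event, but the moment-balance rate enters the same way.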

  13. An ANN-Based Synthesis Model for Parallel Coupled Microstrip Lines with Floating Ground-Plane Conductor and Its Applications

    Directory of Open Access Journals (Sweden)

    Yuan Cao

    2016-01-01

    Full Text Available To directly obtain physical dimensions of parallel coupled microstrip lines with a floating ground-plane conductor (PCMLFGPC, an accurate synthesis model based on an artificial neural network (ANN is proposed. The synthesis model is validated by using the conformal mapping technique (CMT analysis contours. Using the synthesis model and the CMT analysis, the PCMLFGPC having equal even- and odd-mode phase velocities can be obtained by adjusting the width of the floating ground-plane conductor. Applying the method, a 7 dB coupler with the measured isolation better than 27 dB across a wide bandwidth (more than 120%, a 90° Schiffman phase shifter with phase deviation ±2.5° and return loss more than 17.5 dB covering 63.4% bandwidth, and a bandpass filter with completely eliminated second-order spurious band are implemented. The performances of the current designs are superior to those of the previous components configured with the PCMLFGPC.

  14. A Space-Time Network-Based Modeling Framework for Dynamic Unmanned Aerial Vehicle Routing in Traffic Incident Monitoring Applications.

    Science.gov (United States)

    Zhang, Jisheng; Jia, Limin; Niu, Shuyun; Zhang, Fan; Tong, Lu; Zhou, Xuesong

    2015-06-12

    It is essential for transportation management centers to equip and manage a network of fixed and mobile sensors in order to quickly detect traffic incidents and further monitor the related impact areas, especially for high-impact accidents with dramatic traffic congestion propagation. As emerging small Unmanned Aerial Vehicles (UAVs) start to have a more flexible regulation environment, it is critically important to fully explore the potential of using UAVs for monitoring recurring and non-recurring traffic conditions and special events on transportation networks. This paper presents a space-time network-based modeling framework for integrated fixed and mobile sensor networks, in order to provide a rapid and systematic road traffic monitoring mechanism. By constructing a discretized space-time network to characterize not only the speed for UAVs but also the time-sensitive impact areas of traffic congestion, we formulate the problem as a linear integer programming model to minimize the detection delay cost and operational cost, subject to feasible flying route constraints. A Lagrangian relaxation solution framework is developed to decompose the original complex problem into a series of computationally efficient time-dependent and least cost path finding sub-problems. Several examples are used to demonstrate the results of proposed models in UAVs' route planning for small and medium-scale networks.
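    The least-cost path sub-problem that the Lagrangian relaxation framework solves repeatedly can be sketched on a tiny hypothetical space-time network, where each node is a (location, time) pair and arc costs stand in for the detection-delay plus operating costs:

```python
import heapq

def least_cost_path(arcs, source, sink):
    """Least-cost path (Dijkstra) over a space-time network whose nodes
    are (location, time) pairs; this is the sub-problem solved repeatedly
    inside a Lagrangian relaxation framework."""
    dist, prev = {source: 0.0}, {}
    pq = [(0.0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == sink:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, cost in arcs.get(u, []):
            nd = d + cost
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path = [sink]
    while path[-1] != source:
        path.append(prev[path[-1]])
    return list(reversed(path)), dist[sink]

# Tiny hypothetical network: locations A, B over 3 time steps.
# Waiting at A is cheap; flying A->B early costs more than flying later.
arcs = {
    ("A", 0): [(("A", 1), 1.0), (("B", 1), 3.0)],
    ("A", 1): [(("B", 2), 2.0)],
    ("B", 1): [(("B", 2), 0.5)],
}
path, cost = least_cost_path(arcs, ("A", 0), ("B", 2))
```

    In the full framework, the Lagrange multipliers on the relaxed coverage constraints are added to these arc costs at each iteration, so the same routine prices out candidate UAV routes.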

  15. Collective estimation of multiple bivariate density functions with application to angular-sampling-based protein loop modeling

    KAUST Repository

    Maadooliat, Mehdi

    2015-10-21

    This paper develops a method for simultaneous estimation of density functions for a collection of populations of protein backbone angle pairs using a data-driven, shared basis that is constructed by bivariate spline functions defined on a triangulation of the bivariate domain. The circular nature of angular data is taken into account by imposing appropriate smoothness constraints across boundaries of the triangles. Maximum penalized likelihood is used to fit the model and an alternating blockwise Newton-type algorithm is developed for computation. A simulation study shows that the collective estimation approach is statistically more efficient than estimating the densities individually. The proposed method was used to estimate neighbor-dependent distributions of protein backbone dihedral angles (i.e., Ramachandran distributions). The estimated distributions were applied to protein loop modeling, one of the most challenging open problems in protein structure prediction, by feeding them into an angular-sampling-based loop structure prediction framework. Our estimated distributions compared favorably to the Ramachandran distributions estimated by fitting a hierarchical Dirichlet process model; and in particular, our distributions showed significant improvements on the hard cases where existing methods do not work well.

  16. A Space-Time Network-Based Modeling Framework for Dynamic Unmanned Aerial Vehicle Routing in Traffic Incident Monitoring Applications

    Directory of Open Access Journals (Sweden)

    Jisheng Zhang

    2015-06-01

    Full Text Available It is essential for transportation management centers to equip and manage a network of fixed and mobile sensors in order to quickly detect traffic incidents and further monitor the related impact areas, especially for high-impact accidents with dramatic traffic congestion propagation. As emerging small Unmanned Aerial Vehicles (UAVs) start to have a more flexible regulation environment, it is critically important to fully explore the potential of using UAVs for monitoring recurring and non-recurring traffic conditions and special events on transportation networks. This paper presents a space-time network-based modeling framework for integrated fixed and mobile sensor networks, in order to provide a rapid and systematic road traffic monitoring mechanism. By constructing a discretized space-time network to characterize not only the speed for UAVs but also the time-sensitive impact areas of traffic congestion, we formulate the problem as a linear integer programming model to minimize the detection delay cost and operational cost, subject to feasible flying route constraints. A Lagrangian relaxation solution framework is developed to decompose the original complex problem into a series of computationally efficient time-dependent least-cost path finding sub-problems. Several examples are used to demonstrate the results of the proposed models in UAVs’ route planning for small and medium-scale networks.

  17. A Space-Time Network-Based Modeling Framework for Dynamic Unmanned Aerial Vehicle Routing in Traffic Incident Monitoring Applications

    Science.gov (United States)

    Zhang, Jisheng; Jia, Limin; Niu, Shuyun; Zhang, Fan; Tong, Lu; Zhou, Xuesong

    2015-01-01

    It is essential for transportation management centers to equip and manage a network of fixed and mobile sensors in order to quickly detect traffic incidents and further monitor the related impact areas, especially for high-impact accidents with dramatic traffic congestion propagation. As emerging small Unmanned Aerial Vehicles (UAVs) start to have a more flexible regulation environment, it is critically important to fully explore the potential of using UAVs for monitoring recurring and non-recurring traffic conditions and special events on transportation networks. This paper presents a space-time network-based modeling framework for integrated fixed and mobile sensor networks, in order to provide a rapid and systematic road traffic monitoring mechanism. By constructing a discretized space-time network to characterize not only the speed for UAVs but also the time-sensitive impact areas of traffic congestion, we formulate the problem as a linear integer programming model to minimize the detection delay cost and operational cost, subject to feasible flying route constraints. A Lagrangian relaxation solution framework is developed to decompose the original complex problem into a series of computationally efficient time-dependent least-cost path finding sub-problems. Several examples are used to demonstrate the results of the proposed models in UAVs’ route planning for small and medium-scale networks. PMID:26076404

  18. A Location-Based Interactive Model of Internet of Things and Cloud (IoT-Cloud) for Mobile Cloud Computing Applications.

    Science.gov (United States)

    Dinh, Thanh; Kim, Younghan; Lee, Hyukjoon

    2017-03-01

    This paper presents a location-based interactive model of Internet of Things (IoT) and cloud integration (IoT-cloud) for mobile cloud computing applications, in comparison with the periodic sensing model. In the latter, sensing collections are performed without awareness of sensing demands: sensors must report their sensing data periodically regardless of whether or not there is demand for their sensing services, which leads to unnecessary energy loss through redundant transmission. In the proposed model, the IoT-cloud provides sensing services on demand, based on the interest and location of mobile users. By taking advantage of the cloud as a coordinator, the sensing schedule of the sensors is controlled by the cloud, which knows when and where mobile users request sensing services. Therefore, when there is no demand, sensors are put into an inactive mode to save energy. Through extensive analysis and experimental results, we show that the location-based model achieves a significant improvement in network lifetime compared to the periodic model.
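
    The energy argument can be illustrated with a toy comparison between periodic reporting and cloud-coordinated on-demand activation. The per-report energy cost and the demand pattern below are invented for illustration and are not figures from the paper:

```python
# Toy comparison of sensor energy use: periodic reporting vs.
# cloud-coordinated on-demand activation. Numbers are illustrative.

REPORT_COST = 1.0   # energy units per sensing report (assumed)

def periodic_energy(n_slots):
    """Sensor reports every slot regardless of demand."""
    return n_slots * REPORT_COST

def on_demand_energy(demand):
    """Cloud activates the sensor only in slots with user demand."""
    return sum(REPORT_COST for d in demand if d)

# 24 one-hour slots; mobile users request sensing in only 5 of them.
demand = [False] * 24
for hour in (8, 9, 12, 17, 18):
    demand[hour] = True

e_periodic = periodic_energy(len(demand))
e_on_demand = on_demand_energy(demand)
savings = 1 - e_on_demand / e_periodic
print(e_periodic, e_on_demand, round(savings, 3))
```

    The saving scales directly with how sparse the demand is, which is why the paper's gain is expressed as extended network lifetime rather than a fixed percentage.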

  19. Methane emissions from floodplains in the Amazon Basin: towards a process-based model for global applications

    Directory of Open Access Journals (Sweden)

    B. Ringeval

    2013-10-01

    Full Text Available Tropical wetlands are estimated to represent about 50% of natural wetland emissions and explain a large fraction of the observed CH4 variability on time scales ranging from glacial-interglacial cycles to the currently observed year-to-year variability. Despite their importance, however, tropical wetlands are poorly represented in global models aiming to predict global CH4 emissions. This study documents the first regional-scale, process-based model of CH4 emissions from tropical floodplains. The LPX-Bern Dynamic Global Vegetation Model (LPX hereafter) was modified to represent floodplain hydrology, vegetation and associated CH4 emissions. The extent of tropical floodplains was prescribed using output from the spatially explicit hydrology model PCR-GLOBWB. We introduced new Plant Functional Types (PFTs) that explicitly represent floodplain vegetation. The PFT parameterizations were evaluated against available remote sensing datasets (GLC2000 land cover and MODIS Net Primary Productivity). Simulated CH4 flux densities were evaluated against field observations and regional flux inventories. Simulated CH4 emissions at the Amazon Basin scale were compared to model simulations performed in the WETCHIMP intercomparison project. We found that the LPX-simulated CH4 flux densities are in reasonable agreement with observations at the field scale, but with a tendency to overestimate the flux observed at specific sites. In addition, the model did not reproduce between-site variations or between-year variations within a site. Unfortunately, site information is too limited to confirm or refute some model features. At the Amazon Basin scale, our results underline the large uncertainty in the magnitude of wetland CH4 emissions. In particular, uncertainties in floodplain extent (i.e., the difference between GLC2000 and PCR-GLOBWB output) modulate the simulated emissions by a factor of about 2. Our best estimates, using PCR-GLOBWB in combination with GLC2000, lead to

  20. The Application of the Activity-Based Costing Allocation Model in Warehousing Enterprises

    Institute of Scientific and Technical Information of China (English)

    杨静; 于桂平

    2012-01-01

    To address the problem of resource allocation in warehousing enterprises, this paper applies ABC (Activity-Based Costing) to inventory cost allocation and uses a worked example to analyze the specific application of the model. The method exhibits open outcomes, an easily understood process and meaningful inputs, with the purpose of promoting the rational allocation of enterprise resources, optimizing the industry and maximizing efficiency.
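
    The activity-based allocation idea can be sketched as follows: each activity's cost pool is assigned to products in proportion to the cost-driver units each product consumes. All activities, driver counts, and cost figures below are hypothetical, not drawn from the paper:

```python
# Sketch of activity-based costing (ABC): each activity's cost pool is
# allocated in proportion to the cost-driver units each product consumes.
# Activities and figures are hypothetical.

activity_costs = {"receiving": 6000.0, "storage": 9000.0, "picking": 5000.0}

# driver units consumed per product, e.g. pallets handled, bin-days, picks
driver_usage = {
    "receiving": {"product_a": 200, "product_b": 100},
    "storage":   {"product_a": 300, "product_b": 600},
    "picking":   {"product_a": 250, "product_b": 250},
}

def abc_allocate(activity_costs, driver_usage):
    allocated = {}
    for activity, pool in activity_costs.items():
        usage = driver_usage[activity]
        rate = pool / sum(usage.values())      # cost per driver unit
        for product, units in usage.items():
            allocated[product] = allocated.get(product, 0.0) + rate * units
    return allocated

costs = abc_allocate(activity_costs, driver_usage)
print(costs)
```

    Because every pool is fully distributed, the allocated totals always reconcile to the overhead being traced, which is what gives the method its "open outcome" property.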

  1. Small signal model parameters analysis of GaN and GaAs based HEMTs over temperature for microwave applications

    Science.gov (United States)

    Alim, Mohammad A.; Rezazadeh, Ali A.; Gaquiere, Christophe

    2016-05-01

    Thermal and small-signal model parameter analysis has been carried out on a 0.5 μm × (2 × 100 μm) AlGaAs/GaAs HEMT grown on a semi-insulating GaAs substrate and a 0.25 μm × (2 × 100 μm) AlGaN/GaN HEMT grown on a SiC substrate. The two technologies are investigated in order to establish a detailed understanding of their capabilities in terms of frequency and temperature, using on-wafer S-parameter measurements over the temperature range from -40 to 150 °C and up to 50 GHz. The equivalent circuit parameters, as well as their temperature-dependent behavior, were analyzed and discussed for the two technologies for the first time. The observed elevation or degradation of transistor parameters with temperature demonstrates the great potential of GaN devices for high-frequency and high-temperature applications. The results provide valuable insights for future design optimization of advanced GaN devices and a comparison with GaAs technology.

  2. Web Based VRML Modelling

    NARCIS (Netherlands)

    Kiss, S.

    2001-01-01

    Presents a method to connect VRML (Virtual Reality Modeling Language) and Java components in a Web page using EAI (External Authoring Interface), which makes it possible to interactively generate and edit VRML meshes. The meshes used are based on regular grids, to provide an interaction and modeling

  3. Expansion and Application of the Observer Model Based on the MVC Model

    Institute of Scientific and Technical Information of China (English)

    肖力涛; 亓常松

    2012-01-01

    This paper first discusses the features of the MVC design pattern. Based on this discussion, the Observer pattern is appropriately extended by abstracting a presentation layer for display logic, which increases code reusability and reduces coupling. Finally, the extended Observer pattern is used to design the architecture of a collision-avoidance warning system.
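
    A minimal sketch of the extended pattern, with a presentation layer inserted between the model and concrete views; all class names and the warning scenario are illustrative, not taken from the paper:

```python
# Sketch: Observer pattern extended with a presentation layer.
# The model notifies observers; a Presenter observer formats the raw
# state before any view renders it, decoupling display logic from views.

class Model:
    def __init__(self):
        self._observers = []
        self.distance_m = None

    def attach(self, observer):
        self._observers.append(observer)

    def set_distance(self, meters):
        self.distance_m = meters
        for obs in self._observers:     # notify all registered observers
            obs.update(self)

class Presenter:
    """Presentation layer: turns raw model state into display text."""
    def __init__(self, view):
        self.view = view

    def update(self, model):
        if model.distance_m < 50:
            self.view.render(f"WARNING: obstacle at {model.distance_m} m")
        else:
            self.view.render(f"clear: {model.distance_m} m")

class ConsoleView:
    def __init__(self):
        self.last = None

    def render(self, text):
        self.last = text

view = ConsoleView()
model = Model()
model.attach(Presenter(view))
model.set_distance(30)
print(view.last)  # WARNING: obstacle at 30 m
```

    Swapping the view (console, HUD, log file) requires no change to the model or the presenter, which is the reuse and decoupling benefit the abstract describes.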

  4. Web-based applications for virtual laboratories

    NARCIS (Netherlands)

    Bier, H.H.

    2011-01-01

    Web-based applications for academic education facilitate, usually, exchange of multimedia files, while design-oriented domains such as architectural and urban design require additional support in collaborative real-time drafting and modeling. In this context, multi-user interactive interfaces employ

  5. Variability of tsunami inundation footprints considering stochastic scenarios based on a single rupture model: Application to the 2011 Tohoku earthquake

    KAUST Repository

    Goda, Katsuichiro

    2015-06-30

    The sensitivity and variability of spatial tsunami inundation footprints in coastal cities and towns due to a megathrust subduction earthquake in the Tohoku region of Japan are investigated by considering different fault geometry and slip distributions. Stochastic tsunami scenarios are generated based on the spectral analysis and synthesis method with regards to an inverted source model. To assess spatial inundation processes accurately, tsunami modeling is conducted using bathymetry and elevation data with 50 m grid resolutions. Using the developed methodology for assessing variability of tsunami hazard estimates, stochastic inundation depth maps can be generated for local coastal communities. These maps are important for improving disaster preparedness by understanding the consequences of different situations/conditions, and by communicating uncertainty associated with hazard predictions. The analysis indicates that the sensitivity of inundation areas to the geometrical parameters (i.e., top-edge depth, strike, and dip) depends on the tsunami source characteristics and the site location, and is therefore complex and highly nonlinear. The variability assessment of inundation footprints indicates significant influence of slip distributions. In particular, topographical features of the region, such as ria coast and near-shore plain, have major influence on the tsunami inundation footprints.

  6. Application of a rule-based model to estimate mercury exchange for three background biomes in the continental United States.

    Science.gov (United States)

    Hartman, Jelena S; Weisberg, Peter J; Pillai, Rekha; Ericksen, Jody A; Kuiken, Todd; Lindberg, Steve E; Zhang, Hong; Rytuba, James J; Gustin, Mae S

    2009-07-01

    Ecosystems that have low mercury (Hg) concentrations (i.e., not enriched or impacted by geologic or anthropogenic processes) cover most of the terrestrial surface area of the earth yet their role as a net source or sink for atmospheric Hg is uncertain. Here we use empirical data to develop a rule-based model implemented within a geographic information system framework to estimate the spatial and temporal patterns of Hg flux for semiarid deserts, grasslands, and deciduous forests representing 45% of the continental United States. This exercise provides an indication of whether these ecosystems are a net source or sink for atmospheric Hg as well as a basis for recommendation of data to collect in future field sampling campaigns. Results indicated that soil alone was a small net source of atmospheric Hg and that emitted Hg could be accounted for based on Hg input by wet deposition. When foliar assimilation and wet deposition are added to the area estimate of soil Hg flux these biomes are a sink for atmospheric Hg.
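
    The rule-based structure can be sketched as a lookup that assigns each grid cell a net Hg flux from its biome, then sums sources and sinks over the covered area. All rules, flux values, and areas below are invented for illustration; the study's actual rules are spatially and temporally resolved within a GIS:

```python
# Sketch of a rule-based mercury (Hg) exchange estimate: each grid cell
# gets a soil emission flux from simple biome rules, offset by wet
# deposition and foliar uptake. All numbers are invented, not from the study.

# rule table: biome -> soil emission flux (ng Hg per m^2 per hour), assumed
SOIL_EMISSION = {"desert": 0.8, "grassland": 0.5, "deciduous_forest": 0.3}
WET_DEPOSITION = 1.0        # ng m^-2 h^-1, assumed uniform
FOLIAR_UPTAKE = {"desert": 0.0, "grassland": 0.2, "deciduous_forest": 0.6}

def net_flux(cells):
    """cells: list of (biome, area_m2). Positive = net source to atmosphere."""
    total = 0.0
    for biome, area in cells:
        cell_flux = SOIL_EMISSION[biome] - WET_DEPOSITION - FOLIAR_UPTAKE[biome]
        total += cell_flux * area
    return total    # ng Hg per hour over the whole domain

cells = [("desert", 4e6), ("grassland", 3e6), ("deciduous_forest", 3e6)]
print(net_flux(cells))
```

    With these invented numbers the domain is a net sink, mirroring the abstract's qualitative finding that deposition and foliar assimilation outweigh soil emission in these background biomes.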

  7. Survival analysis models and applications

    CERN Document Server

    Liu, Xian

    2012-01-01

    Survival analysis concerns sequential occurrences of events governed by probabilistic laws.  Recent decades have witnessed many applications of survival analysis in various disciplines. This book introduces both classic survival models and theories along with newly developed techniques. Readers will learn how to perform analysis of survival data by following numerous empirical illustrations in SAS. Survival Analysis: Models and Applications: Presents basic techniques before leading onto some of the most advanced topics in survival analysis.Assumes only a minimal knowledge of SAS whilst enablin

  8. Fractional order model reduction approach based on retention of the dominant dynamics: application in IMC based tuning of FOPI and FOPID controllers.

    Science.gov (United States)

    Tavakoli-Kakhki, Mahsan; Haeri, Mohammad

    2011-07-01

    Fractional order PI and PID controllers are the most common fractional order controllers used in practice. In this paper, a simple analytical method is proposed for tuning the parameters of these controllers. The proposed method is useful in designing fractional order PI and PID controllers for control of complicated fractional order systems. To achieve the goal, at first a reduction technique is presented for approximating complicated fractional order models. Then, based on the obtained reduced models some analytical rules are suggested to determine the parameters of fractional order PI and PID controllers. Finally, numerical results are given to show the efficiency of the proposed tuning algorithm.

  9. Relationships between snowfall density and solid hydrometeors, based on measured size and fall speed, for snowpack modeling applications

    Science.gov (United States)

    Ishizaka, Masaaki; Motoyoshi, Hiroki; Yamaguchi, Satoru; Nakai, Sento; Shiina, Toru; Muramoto, Ken-ichiro

    2016-11-01

    The initial density of deposited snow is mainly controlled by snowfall hydrometeors. The relationship between snowfall density and hydrometeors has been qualitatively examined by previous researchers; however, a quantitative relationship has not yet been established due to difficulty in parameterizing the hydrometeor characteristics of a snowfall event. Thus, in an earlier study, we developed a new variable, the centre of mass flux distribution (CMF), which we used to describe the main hydrometeors contributing to a snowfall event. The CMF is based on average size and fall speed weighted by the mass flux estimated from all measured hydrometeors in a snowfall event. It provides a quantitative representation of the predominant hydrometeor characteristics of the event. In this study, we examine the relationships between the density of newly fallen snow and predominant snow type as indicated by the CMFs. We measured snowfall density at Nagaoka, Japan, where riming and aggregation are predominant, simultaneously observing the size and fall speed of snowfall hydrometeors, and deduced the predominant hydrometeor characteristics of each snowfall event from their CMFs. Snow density measurements were carried out for short periods, 1 or 2 h, during which the densification of the deposited snow was negligible. Also, we grouped snowfall events based on similar hydrometeor characteristics. As a result, we were able to obtain not only the qualitative relationships between the main types of snow and snowfall density as reported by previous researchers, but also quantitative relationships between snowfall density and the CMF density introduced here. CMF density is defined as the ratio between mass and volume, assuming the diameter of a sphere is equal to the CMF size component. This quantitative relationship provides a means for more precise estimation of snowfall density based on snow type (hydrometeor characteristics), by using hydrometeor size and fall speed data to derive
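
    The CMF construction described above (mass-flux-weighted mean size and fall speed, plus a CMF density from the equivalent sphere) can be sketched as follows. The particle records are invented, and the mass used for the CMF density is taken here to be the flux-weighted mean mass, which is an assumption about the definition rather than a detail stated in the abstract:

```python
import math

# Sketch of the centre of mass flux distribution (CMF): average particle
# size and fall speed weighted by each particle's mass flux (mass x speed).
# Particle records are invented; the mass used for the CMF density is
# assumed to be the flux-weighted mean mass.

# (diameter_mm, fall_speed_m_s, mass_mg) per measured hydrometeor
particles = [
    (1.0, 0.8, 0.05),
    (2.0, 1.2, 0.30),
    (3.0, 1.5, 0.90),
]

def cmf(particles):
    fluxes = [m * v for (_, v, m) in particles]
    total = sum(fluxes)
    size = sum(d * f for (d, _, _), f in zip(particles, fluxes)) / total
    speed = sum(v * f for (_, v, _), f in zip(particles, fluxes)) / total
    mass = sum(m * f for (_, _, m), f in zip(particles, fluxes)) / total
    # CMF density: characteristic mass over the volume of a sphere whose
    # diameter equals the CMF size component (mg / mm^3 == g / cm^3)
    volume = math.pi / 6 * size ** 3
    return size, speed, mass / volume

size, speed, density = cmf(particles)
print(round(size, 3), round(speed, 3), round(density, 4))
```

    Weighting by mass flux means the few large, fast particles that dominate accumulation also dominate the CMF, which is why it characterizes the snowfall event better than a simple number-weighted mean.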

  10. A model for assessment of telemedicine applications

    DEFF Research Database (Denmark)

    Kidholm, Kristian; Ekeland, Anne Granstrøm; Jensen, Lise Kvistgaard;

    2012-01-01

    Telemedicine applications could potentially solve many of the challenges faced by the healthcare sectors in Europe. However, a framework for assessment of these technologies is needed by decision makers to assist them in choosing the most efficient and cost-effective technologies. Therefore, in 2009 the European Commission initiated the development of a framework for assessing telemedicine applications, based on the users' need for information for decision making. This article presents the Model for ASsessment of Telemedicine applications (MAST) developed in this study.

  11. Constructing probabilistic models for realistic velocity distributions based on forward modeling and tomographic inversion: applications for active and passive source observation schemes

    Science.gov (United States)

    Koulakov, I. Yu.

    2009-04-01

    Seismic tomography is like a photograph taken by a camera with deformed and blurred lenses. In the resulting tomograms, colors (amplitudes of anomalies) and shapes of objects are often strongly biased and usually do not represent reality. We propose an approach that allows investigating the properties of the "camera" and retrieving the most probable shapes and amplitudes of anomalies in the real Earth. The main idea of this approach is to construct a synthetic model which, after forward modeling and tomographic inversion, reproduces the same amplitudes and shapes of patterns as the inversion of the observed data. In this modeling, the conditions of the tomographic inversion (damping, grid spacing, source location parameters, etc.) should be absolutely identical to those used in processing the observed data. Any available a priori information should be taken into account in this modeling to decrease the uncertainty related to the fundamental non-uniqueness of the inversion problem. In the talk, several examples of applying this approach at various scales for different data schemes are presented: (1) a regional scheme which uses the global data of the ISC catalogue (with examples of regional upper mantle models in Europe and central Asia); (2) a local earthquake tomography scheme (illustrated with models in the Toba caldera area and in Central Java); (3) seismic profiling based on active source refraction travel time data (with examples of several deep seismic sounding profiles in the Central Pacific and subduction zones in Chile).

  12. Application of hybrid robust three-axis attitude control approach to overactuated spacecraft-A quaternion based model

    Institute of Scientific and Technical Information of China (English)

    A. H. Mazinan

    2016-01-01

    A novel hybrid robust three-axis attitude control approach, namely HRTAC, is considered in light of recent developments in the area of space systems, where there is a consensus among experts that new insights can serve as decision points for outperforming existing methods. Traditional control approaches can generally be upgraded, provided a number of modifications are made with respect to the state of the art, in order to deliver high-precision outcomes. A robust sliding mode finite-time control approach is first designed to handle the three-axis angular rates in the inner control loop, which comprises pulse width pulse frequency modulations in line with a control allocation scheme and the system dynamics. The main reason for employing these modulations in association with the control allocation scheme is to handle a class of overactuated systems in particular. A proportional-derivative based linear quadratic regulator approach is then designed to handle the three-axis rotational angles in the outer control loop, which comprises the system kinematics and correspondingly deals with the quaternion-based model. The simultaneous use of the linear and nonlinear terms is the motivation of this research, and the performance results represent an improvement over recently investigated outcomes. A stability analysis is then given to verify and guarantee closed-loop performance over the full set of nominal reference commands. Finally, the effectiveness of the proposed approach is highlighted against a number of recent benchmarks.

  13. Applicability of heat and gas trans-port models in biocover design based on a case study from Denmark

    DEFF Research Database (Denmark)

    Nielsen, A. A. F.; Binning, Philip John; Kjeldsen, Peter

    2015-01-01

    Both models used the heat equation for heat transfer, and the numerical model used an advection-diffusion model with dual Monod kinetics for gas transport. The results were validated with data from a Danish landfill. The models correlated well with the observed data: the coefficient of determination (R2

  14. Behavior Modeling -- Foundations and Applications

    DEFF Research Database (Denmark)

    This book constitutes revised selected papers from the six International Workshops on Behavior Modelling - Foundations and Applications, BM-FA, which took place annually between 2009 and 2014. The 9 papers presented in this volume were carefully reviewed and selected from a total of 58 papers...

  15. Improved version of BTOPMC model and its application in event-based hydrologic simulations

    Institute of Scientific and Technical Information of China (English)

    王国强; 周买春; 竹内邦良; 石平博

    2007-01-01

    In this paper, a grid-based distributed hydrological model, BTOPMC (Block-wise use of TOPMODEL), is introduced, which was developed from the original TOPMODEL. In order to broaden the model's application to arid regions, an improvement methodology is also implemented: canopy interception and soil infiltration processes were incorporated into the original BTOPMC to support event-based runoff simulation in large arid regions. An infiltration model applying the time compression approximation method is emphasized and validated for improving the model's performance in event-based hydrological simulations, with a case study of the Lushi River basin.

  16. Hydraulic model for multi-sources reclaimed water pipe network based on EPANET and its applications in Beijing, China

    Institute of Scientific and Technical Information of China (English)

    Haifeng JIA; Wei WEI; Kunlun XIN

    2008-01-01

    Water shortage is one of the major water-related problems for many cities in the world, and plans for the utilization of reclaimed water have been, or will be, drafted in these cities. To use reclaimed water soundly, Beijing planned to build a large-scale reclaimed water pipe network with multiple sources. In order to support the plan, an integrated hydraulic model of the planned pipe network was developed based on EPANET, supported by a geographic information system (GIS). The complicated pipe network was divided into four weakly connected subzones according to the distribution of reclaimed water plants and the elevation, which provides a better solution to the problem of excessively high pressure in several regions of the network. Through scenario analysis in the different subzones, some of the initial pipe diameters in the network were adjusted. Finally, the pipe network planning scheme for reclaimed water was proposed. The proposed scheme achieves a balance between reclaimed water requirements and supplies, and provides a scientific basis for reclaimed water utilization in Beijing. The scheme has now been adopted by the Beijing municipal government.
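
    The kind of hydraulic relation an EPANET-style model solves network-wide can be illustrated with the Hazen-Williams head-loss formula applied along a single supply path. The pipe data, roughness, and source head below are invented, not values from the Beijing network:

```python
# Sketch: Hazen-Williams head loss along a supply path, the basic
# hydraulic relation an EPANET-style model solves network-wide.
# SI form: h = 10.67 * L * Q^1.852 / (C^1.852 * D^4.87). Data are invented.

def hazen_williams_headloss(length_m, flow_m3s, c, diam_m):
    return 10.67 * length_m * flow_m3s ** 1.852 / (c ** 1.852 * diam_m ** 4.87)

# A hypothetical path of pipes from a reclaimed-water plant to a user node.
pipes = [  # (length m, flow m^3/s, roughness C, diameter m)
    (1200.0, 0.15, 130.0, 0.40),
    (800.0, 0.10, 130.0, 0.30),
]

source_head = 45.0   # m, assumed
total_loss = sum(hazen_williams_headloss(*p) for p in pipes)
node_head = source_head - total_loss
print(round(total_loss, 2), round(node_head, 2))
```

    Because head loss grows with the 4.87th power of the inverse diameter, small diameter adjustments of the kind made in the scenario analysis have a large effect on downstream pressure.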

  17. Study on the Estimation of Groundwater Withdrawals Based on Groundwater Flow Modeling and Its Application in the North China Plain

    Institute of Scientific and Technical Information of China (English)

    Jingli Shao; Yali Cui; Qichen Hao; Zhong Han; Tangpei Cheng

    2014-01-01

    The amount of water withdrawn by wells is one of the quantitative variables that can be applied to estimate groundwater resources and further evaluate the human influence on groundwater systems. The accuracy of the calculated amount of water withdrawal significantly influences regional groundwater resource evaluation and management. However, decentralized groundwater pumping, inefficient management, measurement errors and uncertainties have resulted in considerable errors in groundwater withdrawal estimation. In this study, to improve the estimation of groundwater withdrawal, an innovative approach was proposed using an inversion method based on a regional groundwater flow numerical model, and this method was then applied in the North China Plain. The principle of the method is to match the simulated water levels with the observed ones by adjusting the amount of groundwater withdrawal. In addition, an uncertainty analysis of hydraulic conductivity and specific yield for the estimation of the groundwater withdrawal was conducted. Using the proposed inversion method, the estimated annual average groundwater withdrawal was approximately 24.92×10⁹ m³ in the North China Plain from 2002 to 2008. The inversion method also significantly improved the simulation results for both the hydrographs and the flow field. Results of the uncertainty analysis showed that the inversion results were more sensitive to the hydraulic conductivity than to the specific yield.
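
    The inversion principle (adjust withdrawal until simulated water levels match observations) can be sketched with a toy one-parameter forward model. The linear head response and all coefficients below are stand-ins for the regional numerical model, chosen only so the loop is easy to follow:

```python
# Sketch of inverting groundwater withdrawal from observed water levels.
# A toy linear forward model stands in for the regional numerical model:
# simulated head declines in proportion to net pumping. Values are invented.

RECHARGE = 10.0e9        # m^3/yr, assumed
RESPONSE = 0.8e-9        # m of head decline per m^3/yr net withdrawal, assumed
H0 = 20.0                # initial head (m), assumed

def simulate_head(withdrawal):
    """Toy forward model: head after one stress period."""
    return H0 - RESPONSE * (withdrawal - RECHARGE)

def invert_withdrawal(observed_head, lo=0.0, hi=100.0e9, tol=1e-4):
    """Bisection: adjust withdrawal until simulated head matches observed.
    Head decreases monotonically with withdrawal, so bisection applies."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if simulate_head(mid) > observed_head:
            lo = mid            # simulated head too high -> pump more
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Synthetic truth: generate an "observed" head, then recover the pumping.
true_withdrawal = 24.92e9
observed = simulate_head(true_withdrawal)
estimate = invert_withdrawal(observed)
print(estimate / 1e9)
```

    In the real application the forward model is the calibrated regional flow model and the match is over many observation wells, but the feedback loop (simulate, compare, adjust withdrawal) is the same.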

  18. A New Continuous Rotation IMU Alignment Algorithm Based on Stochastic Modeling for Cost Effective North-Finding Applications

    Directory of Open Access Journals (Sweden)

    Yun Li

    2016-12-01

    Full Text Available Based on stochastic modeling of Coriolis vibration gyros by the Allan variance technique, this paper discusses Angle Random Walk (ARW), Rate Random Walk (RRW) and Markov process gyroscope noises which have significant impacts on the North-finding accuracy. A new continuous rotation alignment algorithm for a Coriolis vibration gyroscope Inertial Measurement Unit (IMU) is proposed in this paper, in which the extended observation equations are used for the Kalman filter to enhance the estimation of gyro drift errors, thus improving the north-finding accuracy. Theoretical and numerical comparisons between the proposed algorithm and the traditional ones are presented. The experimental results show that the new continuous rotation alignment algorithm using the extended observation equations in the Kalman filter is more efficient than the traditional two-position alignment method. Using Coriolis vibration gyros with bias instability of 0.1°/h, a north-finding accuracy of 0.1° (1σ) is achieved by the new continuous rotation alignment algorithm, compared with 0.6° (1σ) north-finding accuracy for the two-position alignment and 1° (1σ) for the fixed-position alignment.
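
    The role of the Kalman filter in estimating a slowly varying gyro drift can be illustrated with a one-state sketch. The noise levels and the measurement model below are invented and far simpler than the extended observation equations of the paper; they show only the predict-update cycle:

```python
import random

# One-state Kalman filter sketch: estimate a constant gyro bias from
# noisy rate measurements taken while the unit is stationary (true
# rate = 0, so each measurement is bias + noise). Values are invented.

random.seed(0)
TRUE_BIAS = 0.05     # deg/h, assumed
MEAS_STD = 0.5       # deg/h measurement noise, assumed

x, p = 0.0, 1.0      # state estimate and its variance
q, r = 1e-8, MEAS_STD ** 2   # process and measurement noise variances

for _ in range(2000):
    z = TRUE_BIAS + random.gauss(0.0, MEAS_STD)
    p += q                      # predict (bias assumed nearly constant)
    k = p / (p + r)             # Kalman gain
    x += k * (z - x)            # update with the measurement innovation
    p *= (1 - k)                # variance shrinks as evidence accumulates

print(round(x, 3))
```

    The extended observation equations in the paper add further measurements tied to the continuous rotation, which is what sharpens the drift estimate beyond this plain averaging behaviour.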

  19. A New Continuous Rotation IMU Alignment Algorithm Based on Stochastic Modeling for Cost Effective North-Finding Applications

    Science.gov (United States)

    Li, Yun; Wu, Wenqi; Jiang, Qingan; Wang, Jinling

    2016-01-01

    Based on stochastic modeling of Coriolis vibration gyros by the Allan variance technique, this paper discusses Angle Random Walk (ARW), Rate Random Walk (RRW) and Markov process gyroscope noises which have significant impacts on the North-finding accuracy. A new continuous rotation alignment algorithm for a Coriolis vibration gyroscope Inertial Measurement Unit (IMU) is proposed in this paper, in which the extended observation equations are used for the Kalman filter to enhance the estimation of gyro drift errors, thus improving the north-finding accuracy. Theoretical and numerical comparisons between the proposed algorithm and the traditional ones are presented. The experimental results show that the new continuous rotation alignment algorithm using the extended observation equations in the Kalman filter is more efficient than the traditional two-position alignment method. Using Coriolis vibration gyros with bias instability of 0.1°/h, a north-finding accuracy of 0.1° (1σ) is achieved by the new continuous rotation alignment algorithm, compared with 0.6° (1σ) north-finding accuracy for the two-position alignment and 1° (1σ) for the fixed-position alignment. PMID:27983585

  20. A model of the radiation-induced bystander effect based on an analogy with ferromagnets. Application to modelling tissue response in a uniform field

    Science.gov (United States)

    Vassiliev, O. N.

    2014-12-01

    We propose a model of the radiation-induced bystander effect based on an analogy with magnetic systems. The main benefit of this approach is that it allowed us to apply powerful methods of statistical mechanics. The model exploits the similarity between how spin-spin interactions result in correlations of spin states in ferromagnets, and how signalling from a damaged cell reduces the chances of survival of neighbouring cells, resulting in correlated cell states. At the root of the model is a classical Hamiltonian, similar to that of an Ising ferromagnet with long-range interactions. The formalism is developed in the framework of Mean Field Theory. It is applied to modelling tissue response in a uniform radiation field. In this case the results are remarkably simple and at the same time nontrivial. They include cell survival curves, expressions for the tumour control probability, and the effects of fractionation. The model extends beyond what is normally considered as bystander effects. It offers insight into low-dose hypersensitivity and into the mechanisms behind threshold doses for deterministic effects.
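
    For an Ising-type Hamiltonian, the mean-field treatment reduces to a self-consistency condition of the familiar form m = tanh(β(Jm + h)). A fixed-point iteration for this condition is sketched below; the parameter values are illustrative and the identification of m with an order parameter for correlated cell states follows the analogy only loosely:

```python
import math

# Mean-field self-consistency for an Ising-like model: the order
# parameter m satisfies m = tanh(beta * (J * m + h)), solved here by
# fixed-point iteration. Parameter values are illustrative.

def mean_field_m(beta, J, h, m0=0.5, tol=1e-12, max_iter=10000):
    m = m0
    for _ in range(max_iter):
        m_new = math.tanh(beta * (J * m + h))
        if abs(m_new - m) < tol:
            return m_new
        m = m_new
    return m

# Above the mean-field critical point (beta * J > 1) a nonzero solution
# exists even at h = 0; below it, m relaxes to 0.
m_ordered = mean_field_m(beta=2.0, J=1.0, h=0.0)
m_disordered = mean_field_m(beta=0.5, J=1.0, h=0.0)
print(round(m_ordered, 4), round(m_disordered, 6))
```

    The existence of two regimes separated by a threshold in the coupling is the structural feature that lets such a model produce threshold-like dose responses.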

  1. Cell-Based Biosensors Principles and Applications

    CERN Document Server

    Wang, Ping

    2009-01-01

    Written by recognized experts in the field, this leading-edge resource is the first book to systematically introduce the concept, technology, and development of cell-based biosensors. You will find details on the latest cell-based biosensor models and novel micro-structure biosensor techniques. Taking an interdisciplinary approach, this unique volume presents the latest innovative applications of cell-based biosensors in a variety of biomedical fields. The book also explores future trends of cell-based biosensors, including integrated chips, nanotechnology and microfluidics. Over 140 illustrations hel

  2. Aviation Safety Modeling and Simulation (ASMM) Propulsion Fleet Modeling: A Tool for Semi-Automatic Construction of CORBA-based Applications from Legacy Fortran Programs

    Science.gov (United States)

    Sang, Janche

    2003-01-01

    Within NASA's Aviation Safety Program, NASA GRC participates in the Modeling and Simulation Project called ASMM. NASA GRC's focus is to characterize propulsion system performance from a fleet management and maintenance perspective by modeling and simulation, predicting the characteristics of two classes of commercial engines (CFM56 and GE90). In prior years, the High Performance Computing and Communication (HPCC) program funded NASA Glenn to develop a large-scale, detailed simulation for the analysis and design of aircraft engines, called the Numerical Propulsion System Simulation (NPSS). Three major aspects of this modeling (the integration of different engine components, the coupling of multiple disciplines, and engine component zooming at the appropriate level of fidelity) require relatively tight coupling of different analysis codes. Most of these codes in aerodynamics and solid mechanics are written in Fortran. Refitting these legacy Fortran codes with distributed objects can increase their reusability. Aviation Safety's use of modeling and simulation to characterize fleet management has similar needs. The modeling and simulation of these propulsion systems use existing Fortran and C codes that are instrumental in determining the performance of the fleet. The research centers on building a CORBA-based development environment that lets programmers easily wrap and couple legacy Fortran codes. This environment consists of a C++ wrapper library to hide the details of CORBA and an efficient remote variable scheme to facilitate data exchange between the client and the server model. Additionally, a Web Service model should also be constructed to evaluate this technology's use over the next two to three years.

  3. Model-Based Security Testing

    CERN Document Server

    Schieferdecker, Ina; Schneider, Martin; 10.4204/EPTCS.80.1

    2012-01-01

    Security testing aims at validating software system requirements related to security properties like confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Although security testing techniques have been available for many years, there have been few approaches that allow for the specification of test cases at a higher level of abstraction, for enabling guidance on test identification and specification, or for automated test generation. Model-based security testing (MBST) is a relatively new field, dedicated especially to the systematic and efficient specification and documentation of security test objectives, security test cases and test suites, as well as to their automated or semi-automated generation. In particular, the combination of security modelling and test generation approaches is still a challenge in research and of high interest for industrial applications. MBST includes e.g. security functional testing, model-based fuzzing, risk- and threat-oriented testing,...

  4. A fingerprint orientation model based on 2D Fourier expansion (FOMFE) and its application to singular-point detection and fingerprint indexing.

    Science.gov (United States)

    Wang, Yi; Hu, Jiankun; Phillips, Damien

    2007-04-01

    In this paper, we have proposed a fingerprint orientation model based on 2D Fourier expansions (FOMFE) in the phase plane. The FOMFE does not require prior knowledge of singular points (SPs). It is able to describe the overall ridge topology seamlessly, including the SP regions, even for noisy fingerprints. Our statistical experiments on a public database show that the proposed FOMFE can significantly improve the accuracy of fingerprint feature extraction and thus that of fingerprint matching. Moreover, the FOMFE has a low computational cost and can work very efficiently on large fingerprint databases. The FOMFE provides a comprehensive description of orientation features, which has enabled its beneficial use in feature-related applications such as fingerprint indexing. Unlike most indexing schemes, which use raw orientation data, we exploit FOMFE model coefficients to generate the feature vector. Our indexing experiments show remarkable results on different fingerprint databases.
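    The core FOMFE idea, expanding the doubled-angle orientation components cos(2θ) and sin(2θ) in a 2D Fourier basis and solving for the coefficients by least squares, might be sketched as follows. The synthetic orientation field and the basis order are assumptions for illustration, not the paper's configuration; note that ridge orientations are only defined modulo π, which is why the doubled angle is modelled:

    ```python
    import numpy as np

    # Synthetic smooth orientation field on a grid (stand-in for real
    # fingerprint ridge orientations); values live in [0, pi).
    gx, gy = np.meshgrid(np.linspace(0, 1, 24, endpoint=False),
                         np.linspace(0, 1, 24, endpoint=False))
    theta = (np.pi * (gx + gy)) % np.pi

    def fourier_basis(x, y, order=2):
        """Truncated 2-D Fourier basis evaluated at points (x, y)."""
        cols = []
        for m in range(order + 1):
            for n in range(order + 1):
                arg = 2 * np.pi * (m * x + n * y)
                cols += [np.cos(arg), np.sin(arg)]
        return np.column_stack(cols)

    # Fit cos(2*theta) and sin(2*theta) separately by least squares,
    # then recover the orientation from the fitted components.
    A = fourier_basis(gx.ravel(), gy.ravel())
    coef_c, *_ = np.linalg.lstsq(A, np.cos(2 * theta).ravel(), rcond=None)
    coef_s, *_ = np.linalg.lstsq(A, np.sin(2 * theta).ravel(), rcond=None)
    theta_hat = (0.5 * np.arctan2(A @ coef_s, A @ coef_c)).reshape(theta.shape) % np.pi

    d = np.abs(theta_hat - theta)
    err = np.minimum(d, np.pi - d).max()   # angular error modulo pi
    ```

    The coefficient vectors (`coef_c`, `coef_s`) are the kind of compact model parameters the paper uses as an indexing feature vector in place of raw orientation data.
    
    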

  5. Optimal pricing decision model based on activity-based costing

    Institute of Scientific and Technical Information of China (English)

    王福胜; 常庆芳

    2003-01-01

    In order to determine the applicability of the optimal pricing decision model based on the conventional cost behavior model, after activity-based costing has dealt a strong shock to the conventional cost behavior model and its assumptions, detailed analyses have been made using the activity-based cost behavior and cost-volume-profit analysis model. It is concluded from these analyses that the theory behind the construction of the optimal pricing decision model is still tenable under activity-based costing, but the conventional optimal pricing decision model must be modified as appropriate to the activity-based cost behavior model and cost-volume-profit analysis model; an optimal pricing decision model is really a product pricing decision model constructed by following the economic principle of profit maximization.

  6. The Development of Dynamic Brand Equity Chase Model and Its Application to Digital Industry Based on Scanner Data

    Directory of Open Access Journals (Sweden)

    Nam Yongsik

    2009-12-01

    Full Text Available The purpose of this research is to develop a comprehensive model for measuring the dynamics of brand power. We define brand power as brand-specific coefficients that yield the sales volume for each period. The model consists of a multinomial logit model for each product category, the brand-specific coefficients, mixture modeling and a fuzzy clustering algorithm. We apply our model to TV scanner data from Tianjin, China. The results show that the brand power of the 5 brands changed 12 to 23 times in a year; the duration of brand power ranged from 1 week to 12 weeks.

  7. Application of flood risk modelling in a web-based geospatial decision support tool for coastal adaptation to climate change

    Directory of Open Access Journals (Sweden)

    P. J. Knight

    2015-02-01

    Full Text Available A pressing problem facing coastal decision makers is the conversion of "high level" but plausible climate change assessments into an effective basis for climate change adaptation at the local scale. Here, we describe a web-based, geospatial decision-support tool (DST) that provides an assessment of the potential flood risk for populated coastal lowlands arising from future sea-level rise, coastal storms and high river flows. This DST has been developed to support operational and strategic decision making by enabling the user to explore the flood hazard from extreme events, changes in the extent of the flood-prone areas with sea-level rise, and thresholds of sea-level rise where current policy and resource options are no longer viable. The DST is built in an open source GIS that uses freely available geospatial data. Flood risk assessments from a combination of LISFLOOD-FP and SWAB models are embedded within the tool; the user interface enables interrogation of different combinations of coastal and river events under rising sea-level scenarios. Users can readily vary the input parameters (sea level, storms, wave height and river flow) relative to the present-day topography and infrastructure to identify combinations where significant regime shifts or "tipping points" occur. Two case studies are used to demonstrate the attributes of the DST with respect to the wider coastal community and the UK energy sector. Examples report on the assets at risk and illustrate the extent of flooding in relation to infrastructure access. This informs an economic assessment of potential losses due to climate change and thus provides local authorities and energy operators with essential information on the feasibility of investment for building resilience into vulnerable components of their area of responsibility.

  8. Research and Application of Role-Based Access Control Model in Web Application System

    Institute of Scientific and Technical Information of China (English)

    黄秀文

    2015-01-01

    Access control is the main strategy for security and protection in Web systems, and traditional access control can no longer meet growing security needs. Using the role-based access control (RBAC) model and introducing the concept of a role into the web system, each user is mapped to a role within an organization and access permissions are granted to the corresponding role; access is then authorized and controlled according to the user's role in the organization, improving the flexibility and security of permission assignment and access control in the web system.
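    The user-to-role-to-permission indirection described above can be sketched in a few lines; the role and permission names here are hypothetical, not from the paper:

    ```python
    # Minimal RBAC sketch: users map to roles, roles map to permissions,
    # and the access check never consults the user directly -- changing a
    # user's responsibilities is a single reassignment of role.
    ROLE_PERMS = {
        "clerk":   {"order:read"},
        "manager": {"order:read", "order:approve"},
    }
    USER_ROLES = {"alice": "manager", "bob": "clerk"}

    def can(user, perm):
        """Return True if the user's role grants the permission."""
        role = USER_ROLES.get(user)
        return perm in ROLE_PERMS.get(role, set())
    ```

    This indirection is what gives RBAC its flexibility: permissions are administered per role, not per user.
    
    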

  9. Location Based Services and Applications

    Directory of Open Access Journals (Sweden)

    Elenis Gorrita Michel

    2012-05-01

    Full Text Available Location Based Services (LBS) continue to grow in popularity, effectiveness and reliability, to the extent that applications are designed and implemented taking into account the facilities of user location information. In this work, some of the main applications are addressed in order to assess the current importance of LBS as a branch of technology in full swing. In addition, the main techniques for location estimation, information essential to LBS, are studied. Because this is a highly topical issue, ongoing work and research are also discussed.

  10. Research protocol: EB-GIS4HEALTH UK – foundation evidence base and ontology-based framework of modular, reusable models for UK/NHS health and healthcare GIS applications

    Directory of Open Access Journals (Sweden)

    Boulos Maged

    2005-01-01

    Full Text Available Abstract EB-GIS4HEALTH UK aims at building a UK-oriented foundation evidence base and modular conceptual models for GIS applications and programmes in health and healthcare to improve the currently poor GIS state of affairs within the NHS; help the NHS understand and harness the importance of spatial information in the health sector in order to better respond to national health plans, priorities, and requirements; and also foster the much-needed NHS-academia GIS collaboration. The project will focus on diabetes and dental care, which together account for about 11% of the annual NHS budget, and are thus important topics where GIS can help optimise resource utilisation and outcomes. Virtual e-focus groups will ensure all UK/NHS health GIS stakeholders are represented. The models will be built using Protégé ontology editor http://protege.stanford.edu/ based on the best evidence pooled in the project's evidence base (from critical literature reviews and e-focus groups). We will disseminate our evidence base, GIS models, and documentation through the project's Web server. The models will be human-readable in different ways to inform NHS GIS implementers, and it will be possible to also use them to generate the necessary template databases (and even to develop "intelligent" health GIS solutions using software agents) for running the modelled applications. Our products and experience in this project will be transferable to address other national health topics based on the same principles. Our ultimate goal is to provide the NHS with practical, vendor-neutral, modular workflow models, and ready-to-use, evidence-based frameworks for developing successful GIS business plans and implementing GIS to address various health issues. NHS organisations adopting such frameworks will achieve a common understanding of spatial data and processes, which will enable them to efficiently and effectively share, compare, and integrate their data silos and results for

  11. An On-Line Modeling Based Kalman Filtering Process for Time-Interval-Variable Sequences with Application to Astronomic Surveying

    Institute of Scientific and Technical Information of China (English)

    韩建国; 孙才红; 李彦琴

    2003-01-01

    The problem of variable sampling time intervals, which appears in applications of Kalman filtering, is analyzed, and the corresponding filtering process with or without a present transition matrix is suggested; an application experiment in astronomical surveying is then introduced. In this process, the known but stochastically variable sampling time intervals serve as deterministic input sequences of the state-space description, and the corresponding matrices and (if needed) the state transition matrix can be established by performing real-time, structure-linear system identification.
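    The central idea, rebuilding the transition matrix each step from the known but varying interval so that dt acts as a deterministic input, can be sketched with a 1-D constant-velocity filter. The model, noise levels and data below are illustrative assumptions, not the paper's astronomical setup:

    ```python
    import numpy as np

    def kf_variable_dt(zs, dts, q=0.05, r=0.25):
        """1-D constant-velocity Kalman filter; the transition matrix F and
        process noise Q are rebuilt every step from the varying interval dt."""
        x = np.zeros(2)                        # state: [position, velocity]
        P = np.eye(2) * 10.0
        H = np.array([[1.0, 0.0]])             # position-only measurements
        est = []
        for z, dt in zip(zs, dts):
            F = np.array([[1.0, dt], [0.0, 1.0]])          # dt enters F
            Q = q * np.array([[dt ** 3 / 3.0, dt ** 2 / 2.0],
                              [dt ** 2 / 2.0, dt]])
            x = F @ x                          # predict
            P = F @ P @ F.T + Q
            S = (H @ P @ H.T)[0, 0] + r        # innovation variance
            K = (P @ H.T) / S                  # gain, shape (2, 1)
            x = x + K[:, 0] * (z - x[0])       # update
            P = (np.eye(2) - K @ H) @ P
            est.append(x.copy())
        return np.array(est)

    # Synthetic target at a constant 1.5 units/s, observed at irregular
    # intervals with noisy position measurements.
    rng = np.random.default_rng(3)
    dts = rng.uniform(0.5, 1.5, size=80)
    t = np.cumsum(dts)
    zs = 1.5 * t + rng.normal(0.0, 0.5, size=80)
    est = kf_variable_dt(zs, dts)
    ```
    
    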

  12. Neural Network-Based Modeling of PEM fuel cell and Controller Synthesis of a stand-alone system for residential application

    OpenAIRE

    Khaled Mammar; Abdelkader Chaker

    2012-01-01

    The paper focuses on presenting the possibilities of applying artificial neural networks to create an optimal PEM fuel cell model. Various ANN approaches have been tested; the back-propagation feed-forward networks show satisfactory performance with regard to cell voltage prediction. The model is then used in a power system for residential application. This model includes an ANN fuel cell stack model, a reformer model and a DC/AC inverter model. Furthermore a neural network (NNTC) an...

  13. A Role-Based Fuzzy Assignment Model

    Institute of Scientific and Technical Information of China (English)

    ZUO Bao-he; FENG Shan

    2002-01-01

    It is very important to dynamically assign tasks to the corresponding actors in a workflow management system, especially in complex applications; this improves the flexibility of workflow systems. In this paper, a role-based workflow model with fuzzy optimized intelligent assignment is proposed and applied in an investment management system. A groupware-based software model is also proposed.

  14. Application of a dynamic population-based model for evaluation of exposure reduction strategies in the baking industry

    NARCIS (Netherlands)

    Meijster, T.; Warren, N.; Heederik, D.; Tielemans, E.

    2009-01-01

    Recently a dynamic population model was developed that simulates a population of bakery workers longitudinally through time and tracks the development of work-related sensitisation and respiratory symptoms in each worker. Input for this model comes from cross-sectional and longitudinal epidemiologic

  15. A Gaussian Mixture MRF for Model-Based Iterative Reconstruction with Applications to Low-Dose X-ray CT

    CERN Document Server

    Zhang, Ruoqiao; Pal, Debashish; Thibault, Jean-Baptiste; Sauer, Ken D; Bouman, Charles A

    2016-01-01

    Markov random fields (MRFs) have been widely used as prior models in various inverse problems such as tomographic reconstruction. While MRFs provide a simple and often effective way to model the spatial dependencies in images, they suffer from the fact that parameter estimation is difficult. In practice, this means that MRFs typically have very simple structure that cannot completely capture the subtle characteristics of complex images. In this paper, we present a novel Gaussian mixture Markov random field model (GM-MRF) that can be used as a very expressive prior model for inverse problems such as denoising and reconstruction. The GM-MRF forms a global image model by merging together individual Gaussian-mixture models (GMMs) for image patches. In addition, we present a novel analytical framework for computing MAP estimates using the GM-MRF prior model through the construction of surrogate functions that result in a sequence of quadratic optimizations. We also introduce a simple but effective method to adjust...

  16. Comparison between traditional laboratory tests, permeability measurements and CT-based fluid flow modelling for cultural heritage applications.

    Science.gov (United States)

    De Boever, Wesley; Bultreys, Tom; Derluyn, Hannelore; Van Hoorebeke, Luc; Cnudde, Veerle

    2016-06-01

    In this paper, we examine the possibility to use on-site permeability measurements for cultural heritage applications as an alternative for traditional laboratory tests such as determination of the capillary absorption coefficient. These on-site measurements, performed with a portable air permeameter, were correlated with the pore network properties of eight sandstones and one granular limestone that are discussed in this paper. The network properties of the 9 materials tested in this study were obtained from micro-computed tomography (μCT) and compared to measurements and calculations of permeability and the capillary absorption rate of the stones under investigation, in order to find the correlation between pore network characteristics and fluid management characteristics of these sandstones. Results show a good correlation between capillary absorption, permeability and network properties, opening the possibility of using on-site permeability measurements as a standard method in cultural heritage applications.

  17. An Effective Security Mechanism for M-Commerce Applications Exploiting Ontology Based Access Control Model for Healthcare System

    OpenAIRE

    S.M. Roychoudri; Dr. M. Aramudhan

    2016-01-01

    Health organizations have begun moving to mobile commerce services in recent years to enhance services and quality without spending much investment on IT infrastructure. Medical records are very sensitive and private for any individual, so an effective security mechanism is required. The challenges of our research work are to maintain privacy for the users and to provide a smart and secure environment for accessing the application. This is achieved with the help of personalization. Internet has pr...

  18. Yin-yang of space travel: lessons from the ground-based models of microgravity and their applications to disease and health for life on Earth

    Science.gov (United States)

    Kulkarni, A.; Yamauchi, K.; Hales, N.; Sundaresan, A.; Pellis, N.; Yamamoto, S.; Andrassy, R.

    The space flight environment has numerous clinical effects on human physiology; however, the advances made in physical and biological sciences have benefited humans on Earth. Space flight induces adverse effects on bone, muscle, cardiovascular, neurovestibular, gastrointestinal, and immune function. Similar pathophysiologic changes are also observed in aging, with debilitating consequences. Anti-orthostatic tail-suspension (AOS) of rodents is an in vivo model to study many of these effects induced by the microgravity environment of space travel. Over the years AOS has been used by several researchers to study bone demineralization, muscle atrophy, neurovestibular and stress-related effects. Recently we employed the AOS model in parallel with an in vitro cell culture microgravity analog (Bioreactor) to document the decrease in immune function and its reversal by a nutritional countermeasure. We have modified the rodent model to study nutrient effects and benefits in a short period of time, usually within one to two weeks, in contrast to conventional aging research models, which take several weeks to months to get the same results. This model has the potential for further development to study the role of nutrition in other pathophysiologies in an expedited manner. Using this model it is possible to evaluate the response of space travelers of various ages to microgravity stressors for long-term space travel. Hence this modified model will have a significant impact on research time and budget. For the first time our group has documented a true potential immunonutritional countermeasure for the space flight induced effects on the immune system (Clinical Nutrition 2002). Based on our nutritional and immunological studies we propose application of these microgravity analogs and their benefits and utility for nutritional effects on other physiologic parameters, especially in aging. (Supported by NASA NCC8-168 grant, ADK)

  19. Model of Hot Metal Silicon Content in Blast Furnace Based on Principal Component Analysis Application and Partial Least Square

    Institute of Scientific and Technical Information of China (English)

    SHI Lin; LI Zhi-ling; YU Tao; LI Jiang-peng

    2011-01-01

    In the blast furnace (BF) iron-making process, hot metal silicon content is usually used to measure the quality of hot metal and to reflect the thermal state of the BF. Principal component analysis (PCA) and partial least-squares (PLS) regression methods were used to predict the hot metal silicon content. Under relatively stable BF conditions, PCA and PLS regression models of hot metal silicon content were established using data from Baotou Steel No. 6 BF, providing accuracies of 88.4% and 89.2%, respectively. The PLS model used fewer variables and less time than the principal component analysis model, and it was simple to calculate. It is shown that the model gives good results and is helpful for practical production.
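    The principal-component-regression half of this approach (reduce correlated process variables to a few scores, then regress the target on the scores) might be sketched as follows. The "process variables" here are synthetic stand-ins driven by two latent factors, since the blast-furnace data are not public:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-in: 8 correlated process variables; the target
    # ("silicon content") is driven by two latent factors plus noise.
    n, p = 200, 8
    latent = rng.normal(size=(n, 2))
    X = latent @ rng.normal(size=(2, p)) + 0.1 * rng.normal(size=(n, p))
    y = latent @ np.array([1.0, -0.5]) + 0.05 * rng.normal(size=n)

    # PCA: center, take the SVD, keep k components as scores.
    Xc = X - X.mean(axis=0)
    U, sv, Vt = np.linalg.svd(Xc, full_matrices=False)
    k = 2
    T = Xc @ Vt[:k].T                       # principal-component scores

    # Regress the (centered) target on the scores.
    beta, *_ = np.linalg.lstsq(T, y - y.mean(), rcond=None)
    y_hat = T @ beta + y.mean()
    r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
    ```

    PLS differs in that it chooses components to maximise covariance with the target rather than variance of X alone, which is why the abstract finds it needs fewer variables.
    
    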

  20. Application of the Pareto Principle in Rapid Application Development Model

    Directory of Open Access Journals (Sweden)

    Vishal Pandey

    2013-06-01

    Full Text Available The Pareto principle, most popularly termed the 80/20 rule, is one of the well-known theories in the field of economics. This rule of thumb was named after the great economist Vilfredo Pareto. The Pareto principle was proposed by the renowned management consultant Joseph M. Juran. The rule states that 80% of the required work can be completed in 20% of the time allotted. The idea is to apply this rule of thumb in the Rapid Application Development (RAD) process model of software engineering. The rapid application development model integrates end-users in development using iterative prototyping, emphasizing the delivery of a series of fully functional prototypes to designated user experts. During the application of the Pareto principle, other concepts like the Pareto indifference curve and Pareto efficiency also come into the picture. This enables the development team to invest the major amount of time focusing on the major functionalities of the project as per the requirement prioritization of the customer. The paper involves an extensive study of different unsatisfactory projects in terms of time and financial resources, and the reasons for their failures are analyzed. Based on the possible reasons for failure, a customized RAD model is proposed, integrating the 80/20 rule and advanced software development strategies to develop and deploy excellent-quality software products in minimum time. The proposed methodology is such that its application will directly affect the quality of the end product for the better.
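    The 80/20 screening step can be sketched as ranking features by value and keeping the smallest prefix that covers 80% of the total; the feature names and values below are hypothetical, not from the paper:

    ```python
    def pareto_core(feature_value, threshold=0.8):
        """Return the highest-value features that together cover `threshold`
        of total value -- the 'vital few' to build first in a RAD iteration."""
        ranked = sorted(feature_value.items(), key=lambda kv: kv[1], reverse=True)
        total = sum(v for _, v in ranked)
        core, acc = [], 0.0
        for name, value in ranked:
            core.append(name)
            acc += value
            if acc >= threshold * total:
                break
        return core

    # Hypothetical prioritization scores from customer requirements.
    features = {"login": 40, "search": 25, "report": 20, "theme": 10, "about": 5}
    core = pareto_core(features)
    ```

    In a RAD iteration, the returned prefix is what the first fully functional prototype would cover; the remaining "trivial many" are deferred to later iterations.
    
    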

  1. A finite-element-based perturbation model for the rotordynamic analysis of shrouded pump impellers: Part 1: Model development and applications

    Science.gov (United States)

    Baskharone, Erian A.

    1993-01-01

    This study concerns the rotor dynamic characteristics of fluid-encompassed rotors, with special emphasis on shrouded pump impellers. The core of the study is a versatile and categorically new finite-element-based perturbation model, which is based on a rigorous flow analysis and what we have generically termed the 'virtually' deformable finite-element approach. The model is first applied to the case of a smooth annular seal for verification purposes. The rotor excitation components, in this sample problem, give rise to a purely cylindrical, purely conical, and a simultaneous cylindrical/conical rotor whirl around the housing centerline. In all cases, the computed results are compared to existing experimental and analytical data involving the same seal geometry and operating conditions. Next, two labyrinth-seal configurations, which share the same tooth-to-tooth chamber geometry but differ in the total number of chambers, were investigated. The results, in this case, are compared to experimental measurements for both seal configurations. The focus is finally shifted to the shrouded-impeller problem, where the stability effects of the leakage flow in the shroud-to-housing secondary passage are investigated. To this end, the computational model is applied to a typical shrouded-impeller pump stage, fabricated and rotor dynamically tested by Sulzer Bros., and the results compared to those of a simplified 'bulk-flow' analysis and Sulzer Bros.' test data. In addition to assessing the computed rotor dynamic coefficients, the shrouded-impeller study also covers a controversial topic, namely that of the leakage-passage inlet swirl, which was previously cited as the origin of highly unconventional (resonance-like) trends of the fluid-exerted forces. In order to validate this claim, a 'microscopic' study of the fluid/shroud interaction mechanism is conducted, with the focus being on the structure of the perturbed flow field associated with the impeller whirl. 
The conclusions

  2. A model-based approach to adjust microwave observations for operational applications: results of a campaign at Munich Airport in winter 2011/2012

    Directory of Open Access Journals (Sweden)

    J. Güldner

    2013-10-01

    Full Text Available In the frame of the project "LuFo iPort VIS", which focuses on the implementation of a site-specific visibility forecast, a field campaign was organised to offer detailed information to a numerical fog model. As part of additional observing activities, a 22-channel microwave radiometer profiler (MWRP) was operating at the Munich Airport site in Germany from October 2011 to February 2012 in order to provide vertical temperature and humidity profiles as well as cloud liquid water information. Independently of the model-related aims of the campaign, the MWRP observations were used to study their capabilities to work in operational meteorological networks. Over the past decade a growing number of MWRPs has been introduced and a user community (MWRnet) was established to encourage activities directed at the set-up of an operational network. On that account, the comparability of observations from different network sites plays a fundamental role for any application in climatology and numerical weather forecasting. In practice, however, systematic temperature and humidity differences (bias) between MWRP retrievals and co-located radiosonde profiles were observed and reported by several authors. This bias can be caused by instrumental offsets and by the absorption model used in the retrieval algorithms, as well as by applying a non-representative training data set. At the Lindenberg observatory, besides a neural network provided by the manufacturer, a measurement-based regression method was developed to reduce the bias. These regression operators are calculated on the basis of coincident radiosonde observations and MWRP brightness temperature (TB) measurements. However, MWRP applications in a network require comparable results at any site, even if no radiosondes are available.
The motivation of this work is directed to a verification of the suitability of the operational local forecast model COSMO-EU of the Deutscher Wetterdienst (DWD for the calculation

  3. A Brief Introduction of the Achievements of Key Project Image-based Modeling and Rendering for Virtual Reality Applications

    Institute of Scientific and Technical Information of China (English)

    Jiaoying Shi; Zhanyi Hu; Enhua Wu; Qunsheng Peng

    2006-01-01

    1. Background: Virtual reality (VR) technology is now at the frontier of modern information science. VR is based on computer graphics, computer vision, and other emerging topics in today's computer technology.

  4. RIGID-PLASTIC/RIGID-VISCOPLASTIC FEM BASED ON LINEAR PROGRAMMING-THEORETICAL MODELING AND APPLICATION FOR PLANE-STRAIN PROBLEMS

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    A new rigid-plastic/rigid-viscoplastic (RP/RVP) FEM based on linear programming (LP) for plane-strain metal forming simulation is proposed. Compared with the traditional RP/RVP FEM based on iterative solution, it has some remarkable advantages: it is free of convergence problems, and it handles contact, the incompressibility constraint and rigid zones conveniently. Two solution examples are provided to validate its accuracy and efficiency.

  5. Comparative Study of Hybrid Models Based on a Series of Optimization Algorithms and Their Application in Energy System Forecasting

    Directory of Open Access Journals (Sweden)

    Xuejiao Ma

    2016-08-01

    Full Text Available Big data mining, analysis, and forecasting play vital roles in modern economic and industrial fields, especially in the energy system. Inaccurate forecasting may cause waste of scarce energy or electricity shortages. However, forecasting in the energy system has proven to be a challenging task due to various unstable factors, such as high fluctuations, autocorrelation and stochastic volatility. Forecasting time series data with hybrid models is a feasible alternative to conventional single-model approaches. This paper develops a group of hybrid models to solve the problems above by eliminating the noise in the original data sequence and optimizing the parameters of a back-propagation neural network. One of the contributions of this paper is to integrate the existing algorithms and models, which jointly show advances over the present state of the art. The results of comparative studies demonstrate that the proposed hybrid models not only satisfactorily approximate the actual values but also can be an effective tool in the planning and dispatching of smart grids.

  6. Applications of species distribution modeling to paleobiology

    DEFF Research Database (Denmark)

    Svenning, J.-C.; Fløjgaard, Camilla; A. Marske, Katharine;

    2011-01-01

    Species distribution modeling (SDM: statistical and/or mechanistic approaches to the assessment of range determinants and prediction of species occurrence) offers new possibilities for estimating and studying past organism distributions. SDM complements fossil and genetic evidence by providing (i...... of applications of SDM to paleobiology, outlining the methodology, reviewing SDM-based studies to paleobiology or at the interface of paleo- and neobiology, discussing assumptions and uncertainties as well as how to handle them, and providing a synthesis and outlook. Key methodological issues for SDM applications...

  7. A framework for the application of physically-oriented glacio-hydrological models in the Himalaya-Karakorum region based on a new approach of uncertainty evaluation

    Science.gov (United States)

    Ragettli, S.; Pellicciotti, F.

    2013-12-01

    distinct climatic patterns. Our results suggest that in both regions, melt below debris covered glaciers and gravitational snow redistribution are key processes that need to be represented accurately by the model. The application of the method was finally iterated using a unique dataset of locally acquired field data from the debris covered Lirung glacier in the Langtang catchment, which allowed repeating the sensitivity analysis with a new set of better constrained parameters. The results suggest that an iterative framework based on the presented methodology feeding local data collection and vice versa can efficiently reduce overall model uncertainty.

  8. Application of multilevel scheme and two level discretization for POD based model order reduction of nonlinear transient heat transfer problems

    Science.gov (United States)

    Gaonkar, A. K.; Kulkarni, S. S.

    2015-01-01

    In the present paper, a method to reduce the computational cost associated with solving a nonlinear transient heat conduction problem is presented. The proposed method combines the ideas of two-level discretization and multilevel time integration schemes with the proper orthogonal decomposition (POD) model order reduction technique. The accuracy and computational efficiency of the proposed method are discussed, and several numerical examples are presented to validate the approach. Compared to the full finite element model, the proposed method significantly reduces the computational time while maintaining an acceptable level of accuracy.
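The POD step this record relies on can be illustrated on a linear heat-conduction toy problem: snapshots of a full finite-difference model yield a reduced basis via SVD, and the projected 3-equation system is integrated instead of the 50-equation one. This is a minimal linear sketch with explicit Euler, not the paper's nonlinear multilevel scheme.

```python
import numpy as np

# Full model: 1D transient heat conduction u_t = u_xx, finite differences,
# explicit Euler (a linear stand-in for the paper's nonlinear FE model).
n = 50
dx = 1.0 / (n + 1)
dt = 1e-4                                  # satisfies dt/dx^2 < 0.5 (stability)
x = np.linspace(dx, 1.0 - dx, n)
A = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1)) / dx**2
u0 = np.sin(np.pi * x) + 0.5 * np.sin(3 * np.pi * x)

u = u0.copy()
snapshots = []
for _ in range(400):
    u = u + dt * (A @ u)
    snapshots.append(u.copy())
S = np.array(snapshots).T                  # n x m snapshot matrix

# POD: dominant left singular vectors of the snapshot matrix form the basis.
U, s, _ = np.linalg.svd(S, full_matrices=False)
Phi = U[:, :3]                             # keep 3 modes
Ar = Phi.T @ A @ Phi                       # 3x3 reduced operator

# Integrate the reduced model from the same initial condition.
a = Phi.T @ u0
for _ in range(400):
    a = a + dt * (Ar @ a)
u_rom = Phi @ a

err = float(np.linalg.norm(u_rom - u) / np.linalg.norm(u))   # relative ROM error
```

For this linear problem the snapshot trajectory lies almost entirely in two modes, so the 3-mode ROM reproduces the full solution to near machine precision; the paper's contribution is making this pay off when the operator is nonlinear.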

  9. Comparison between traditional laboratory tests, permeability measurements and CT-based fluid flow modelling for cultural heritage applications

    Energy Technology Data Exchange (ETDEWEB)

    De Boever, Wesley, E-mail: Wesley.deboever@ugent.be [UGCT/PProGRess, Dept. of Geology, Ghent University, Krijgslaan 281, 9000 Ghent (Belgium); Bultreys, Tom; Derluyn, Hannelore [UGCT/PProGRess, Dept. of Geology, Ghent University, Krijgslaan 281, 9000 Ghent (Belgium); Van Hoorebeke, Luc [UGCT/Radiation Physics, Dept. of Physics & Astronomy, Ghent University, Proeftuinstraat 86, 9000 Ghent (Belgium); Cnudde, Veerle [UGCT/PProGRess, Dept. of Geology, Ghent University, Krijgslaan 281, 9000 Ghent (Belgium)

    2016-06-01

    In this paper, we examine the possibility of using on-site permeability measurements in cultural heritage applications as an alternative to traditional laboratory tests such as determination of the capillary absorption coefficient. These on-site measurements, performed with a portable air permeameter, were correlated with the pore network properties of eight sandstones and one granular limestone. The network properties of the nine materials tested in this study were obtained from micro-computed tomography (μCT) and compared to measurements and calculations of permeability and the capillary absorption rate of the stones under investigation, in order to find the correlation between the pore network characteristics and the fluid management characteristics of these stones. Results show a good correlation between capillary absorption, permeability and network properties, opening the possibility of using on-site permeability measurements as a standard method in cultural heritage applications. - Highlights: • Measurements of capillary absorption are compared to in-situ permeability. • We obtain pore size distribution and connectivity by using micro-CT. • These properties explain the correlation between permeability and capillarity. • Correlation between both methods is good to excellent. • Permeability measurements could be a good alternative to capillarity measurements.

  10. Early FDI Based on Residuals Design According to the Analysis of Models of Faults: Application to DAMADICS

    Directory of Open Access Journals (Sweden)

    Yahia Kourd

    2011-01-01

    Full Text Available The increased complexity of plants and the development of sophisticated control systems have encouraged the parallel development of efficient, rapid fault detection and isolation (FDI) systems. FDI in industrial systems has lately become highly significant. This paper proposes a new technique for short-time fault detection and diagnosis in nonlinear dynamic systems with multiple inputs and outputs. The main contribution of this paper is to develop an FDI scheme based on reference models of fault-free and faulty behaviors designed with neural networks. Fault detection is obtained from residuals that result from the comparison of measured signals with the outputs of the fault-free reference model. Then, the Euclidean distance from the outputs of the models of faults to the measurements leads to fault isolation. The advantage of this method is that it provides not only early detection but also early diagnosis, thanks to the parallel computation of the models of faults and to the proposed decision algorithm. The effectiveness of this approach is illustrated with simulations on the DAMADICS benchmark.
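The detect-then-isolate logic of this record can be sketched with simple analytic stand-ins in place of the paper's neural-network reference models: a residual against the fault-free model triggers detection, and the nearest model of faults in Euclidean distance gives isolation. The models, fault shapes and threshold below are all hypothetical.

```python
import numpy as np

# Reference models of behavior -- simple stand-ins for the paper's neural networks.
fault_free = lambda u: 2.0 * u             # nominal input-output behavior
fault_models = {
    "actuator offset": lambda u: 2.0 * u + 0.8,
    "gain degradation": lambda u: 1.2 * u,
}

u = np.linspace(0.0, 1.0, 100)             # excitation signal
rng = np.random.default_rng(1)
measured = fault_models["actuator offset"](u) + rng.normal(0, 0.02, u.size)

# Detection: residual between measurements and the fault-free reference model.
residual = measured - fault_free(u)
detected = float(np.sqrt(np.mean(residual**2))) > 0.1   # hypothetical threshold

# Isolation: nearest model of faults, in Euclidean distance, to the measurements.
isolated = min(fault_models,
               key=lambda k: float(np.linalg.norm(measured - fault_models[k](u))))
```

Because every fault model is evaluated on the same window, the distance computations can run in parallel, which is what gives the scheme its early-diagnosis property.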

  11. Application of a dynamic population-based model for evaluation of exposure reduction strategies in the baking industry

    Energy Technology Data Exchange (ETDEWEB)

    Meijster, Tim; Tielemans, Erik [TNO Quality of Life, Business unit Quality and Safety, Zeist (Netherlands); Warren, Nick [Health and Safety Laboratory, Harpur Hill, Buxton, Derbyshire (United Kingdom); Heederik, Dick, E-mail: Tim.meijster@tno.n [Utrecht University, Institute of Risk Assessment Sciences, Division of Environmental Epidemiology, Utrecht (Netherlands)

    2009-02-01

    Recently a dynamic population model was developed that simulates a population of bakery workers longitudinally through time and tracks the development of work-related sensitisation and respiratory symptoms in each worker. Input for this model comes from cross-sectional and longitudinal epidemiological studies, which allowed estimation of exposure-response relationships and disease transition probabilities. This model allows us to study the development of diseases and transitions between disease states over time in relation to determinants of disease, including flour dust and/or allergen exposure. Furthermore, it enables more realistic modelling of the health impact of different intervention strategies at the workplace (e.g. changes in exposure may take several years to impact on ill-health and often occur as a gradual trend). A large dataset of individual full-shift exposure measurements and real-time exposure measurements was used to obtain detailed insight into the effectiveness of control measures and other determinants of exposure. Given this information, a population-wide reduction of the median exposure by 50% was evaluated in this paper.
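In its simplest form, a dynamic population model of this kind is a Markov chain over disease states, simulated per worker under different exposure scenarios. The sketch below assumes three states and invented yearly transition probabilities; halving the transition risks under the intervention is an illustrative assumption, not the paper's estimated exposure-response relationship.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical yearly transition probabilities between three states
# (0 = healthy, 1 = sensitised, 2 = symptomatic); values are illustrative.
P_high = np.array([[0.95, 0.05, 0.00],
                   [0.00, 0.90, 0.10],
                   [0.00, 0.00, 1.00]])
# A 50% exposure reduction is assumed here to halve the transition risks.
P_low = np.array([[0.975, 0.025, 0.000],
                  [0.000, 0.950, 0.050],
                  [0.000, 0.000, 1.000]])

def simulate(P, workers=5000, years=20):
    state = np.zeros(workers, dtype=int)           # everyone starts healthy
    for _ in range(years):
        u = rng.random(workers)
        cum = P[state].cumsum(axis=1)              # per-worker transition CDF
        state = (u[:, None] > cum).sum(axis=1)     # sample next state per worker
    return float(np.mean(state == 2))              # fraction symptomatic

baseline = simulate(P_high)
intervention = simulate(P_low)
```

A faithful implementation would make the transition probabilities functions of each worker's exposure history, which is exactly what lets the model capture the delayed, gradual health impact the abstract mentions.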

  12. Pattern-based Automatic Translation of Structured Power System Data to Functional Models for Decision Support Applications

    DEFF Research Database (Denmark)

    Heussen, Kai; Weckesser, Johannes Tilman Gabriel; Kullmann, Daniel

    2013-01-01

    Improved information and insight for decision support in operations and design are central promises of a smart grid. Well-structured information about the composition of power systems is increasingly becoming available in the domain, e.g. due to standard information models (e.g. CIM or IEC61850...

  13. 3D micro-particle image modeling and its application in measurement resolution investigation for visual sensing based axial localization in an optical microscope

    Science.gov (United States)

    Wang, Yuliang; Li, Xiaolai; Bi, Shusheng; Zhu, Xiaofeng; Liu, Jinhua

    2017-01-01

    Visual-sensing-based three-dimensional (3D) particle localization in an optical microscope is important for both fundamental studies and practical applications. Compared with lateral (X and Y) localization, it is more challenging to achieve high-resolution measurement of the axial particle location. In this study, we investigate the effect of different factors on axial measurement resolution through an analytical approach. Analytical models were developed to simulate 3D particle imaging in an optical microscope. A radius vector projection method was applied to convert the simulated particle images into radius vectors. With the obtained radius vectors, a term called the axial changing rate was proposed to evaluate the measurement resolution of axial particle localization. Experiments were also conducted for comparison with the simulation results. Moreover, with the proposed method, the effects of particle size on measurement resolution were discussed. The results show that the method provides an efficient approach to investigating the resolution of axial particle localization.
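The radius vector projection step can be sketched as binning image intensity by distance from the particle center; comparing the vectors obtained at two defocus levels then gives a crude analogue of the axial changing rate. The Gaussian-ring images below are toy stand-ins for the paper's analytical particle-imaging model.

```python
import numpy as np

def radius_vector(img, center, n_bins=30):
    """Average intensity as a function of radius from the particle center."""
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - center[0], xx - center[1])
    bins = np.linspace(0, r.max(), n_bins + 1)
    idx = np.clip(np.digitize(r.ravel(), bins) - 1, 0, n_bins - 1)
    tot = np.bincount(idx, weights=img.ravel(), minlength=n_bins)
    cnt = np.bincount(idx, minlength=n_bins)
    return tot / np.maximum(cnt, 1)

def ring_image(radius, size=64, width=2.0):
    # Toy defocused-particle image: a Gaussian ring whose radius grows with
    # defocus (an illustrative stand-in for the diffraction pattern).
    yy, xx = np.mgrid[0:size, 0:size]
    r = np.hypot(yy - size / 2, xx - size / 2)
    return np.exp(-((r - radius) ** 2) / (2 * width**2))

v_near = radius_vector(ring_image(6.0), (32, 32))
v_far = radius_vector(ring_image(12.0), (32, 32))

# Crude "axial changing rate": how much the radius vector changes with defocus.
rate = float(np.linalg.norm(v_far - v_near))
```

The larger this rate per unit of axial displacement, the finer the axial resolution a lookup against a calibration stack of radius vectors can achieve.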

  14. Applications of a thermal-based two-source energy balance model using Priestley-Taylor approach for surface temperature partitioning under advective conditions

    Science.gov (United States)

    Song, Lisheng; Kustas, William P.; Liu, Shaomin; Colaizzi, Paul D.; Nieto, Hector; Xu, Ziwei; Ma, Yanfei; Li, Mingsong; Xu, Tongren; Agam, Nurit; Tolk, Judy A.; Evett, Steven R.

    2016-09-01

    In this study, ground-measured soil and vegetation component temperatures and composite temperatures from a high-spatial-resolution thermal camera and a network of thermal-IR sensors, collected in an irrigated maize field and in an irrigated cotton field, are used to assess and refine the component temperature partitioning approach in the Two-Source Energy Balance (TSEB) model. A refinement to TSEB was developed using a non-iterative approach based on the Priestley-Taylor formulation for surface temperature partitioning and on estimating soil evaporation from soil moisture observations under advective conditions (TSEB-A). This modified TSEB formulation improved the agreement between observed and modeled soil and vegetation temperatures. In addition, the TSEB-A output of evapotranspiration (ET) and its components, evaporation (E) and transpiration (T), showed good agreement with ground observations obtained using the stable isotopic method and the eddy covariance (EC) technique in the HiWATER experiment, and with microlysimeters and a large monolithic weighing lysimeter in the BEAREX08 experiment. Differences between modeled and measured ET were less than 10% and 20% on a daytime basis for the HiWATER and BEAREX08 data sets, respectively. The TSEB-A model was found to accurately reproduce the temporal dynamics of E, T and ET over a full growing season under the advective conditions existing for these irrigated crops located in arid/semi-arid climates. With satellite data, this TSEB-A modeling framework could potentially be used as a tool for improving water use efficiency and conservation practices in water-limited regions. However, TSEB-A requires soil moisture information, which is not currently available routinely from satellite at the field scale.
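For reference, the Priestley-Taylor step in standard TSEB formulations (after Norman et al., 1995) approximates the canopy latent heat flux, and the composite radiometric temperature is partitioned by the angular vegetation fraction. The notation below follows the common TSEB literature and may differ from this study's exact refinement:

```latex
% Priestley-Taylor approximation for canopy latent heat flux
LE_c = \alpha_{PT}\, f_g\, \frac{\Delta}{\Delta + \gamma}\, R_{n,c}

% Composite radiometric temperature partitioned between canopy and soil
T_{rad}^4(\theta) = f_c(\theta)\, T_c^4 + \left[ 1 - f_c(\theta) \right] T_s^4
```

where \(\alpha_{PT}\) is the Priestley-Taylor coefficient (nominally 1.26), \(f_g\) the fraction of green vegetation, \(\Delta\) the slope of the saturation vapor pressure curve, \(\gamma\) the psychrometric constant, \(R_{n,c}\) the net radiation divergence in the canopy, and \(f_c(\theta)\) the fractional vegetation cover at view angle \(\theta\). Under strong advection the first relation tends to underestimate canopy transpiration, which is what motivates the TSEB-A adjustments described above.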

  15. Extension and validation of ARTM (atmospheric radionuclide transportation model) for the application as dispersion calculation model in AVV (general administrative provision) and SBG (incident calculation bases); Erweiterung und Validierung von ARTM fuer den Einsatz als Ausbreitungsmodell in AVV und SBG

    Energy Technology Data Exchange (ETDEWEB)

    Martens, Reinhard; Bruecher, Wenzel; Richter, Cornelia; Sentuc, Florence; Sogalla, Martin; Thielen, Harald

    2012-02-15

    In the medium-term time scale the Gaussian plume model used so far for atmospheric dispersion calculations in the General Administrative Provision (AVV) relating to Section 47 of the Radiation Protection Ordinance (StrlSchV), as well as in the Incident Calculation Bases (SBG) relating to Section 49 StrlSchV, is to be replaced by a Lagrangian particle model. Meanwhile the Atmospheric Radionuclide Transportation Model (ARTM) is available, which allows the simulation of the atmospheric dispersion of operational releases from nuclear installations. ARTM is based on the program package AUSTAL2000, which is designed for the simulation of atmospheric dispersion of non-radioactive operational releases from industrial plants and was adapted to airborne radioactive releases. In the context of the research project 3608S05005, possibilities for an upgrade of ARTM were investigated and, as far as possible, implemented in the program system. The work program comprises the validation and evaluation of ARTM, the implementation of technical-scientific extensions of the model system, and the continuation of the experience exchange between developers and users. In particular, the suitability of the model approach for simulations of radiological consequences according to the German SBG and the representation of the influence of buildings typical for nuclear power stations have been validated and further evaluated. Moreover, post-processing modules for the calculation of dose-relevant decay products and for dose calculations have been developed and implemented. In order to continue the experience feedback and exchange, a web page has been established and maintained, questions by users and other feedback have been dealt with, and a common workshop has been held. The continued development and validation of ARTM has strengthened the basis for applications of this model system in line with the German regulations AVV and SBG.
Further activity in this field can contribute to maintain and

  16. An operational weather radar-based Quantitative Precipitation Estimation and its application in catchment water resources modeling

    DEFF Research Database (Denmark)

    He, Xin; Vejen, Flemming; Stisen, Simon

    2011-01-01

    of precipitation compared with rain-gauge-based methods, thus providing the basis for better water resources assessments. The radar QPE algorithm called ARNE is a distance-dependent areal estimation method that merges radar data with ground surface observations. The method was applied to the Skjern River catchment...... reliable simulations of stream flow and water balance. The potential of using radar-based precipitation was found to be especially high at a smaller scale, where the impact of spatial resolution was evident from the stream discharge results. Also, groundwater recharge was shown to be sensitive...

  17. Behavior and Design Intent Based Product Modeling

    Directory of Open Access Journals (Sweden)

    László Horváth

    2004-11-01

    Full Text Available A knowledge-based modeling approach for mechanical products in industrial CAD/CAM systems is presented. An active model is proposed that comprises knowledge from modeling procedures, generic part models and engineers. Present-day models of mechanical systems do not contain data about the background of human decisions. This situation motivated the authors to investigate the exchange of design intent information between engineers; their concept was to extend product models to be capable of describing design intent information. Several human-computer and human-human communication issues were considered, and the complex communication problem was divided into four sub-problems: communication of the human intent source with the computer system, representation of human intent, exchange of intent data between modeling procedures, and communication of the represented intent with humans. The paper discusses the scenario of intelligent-modeling-based engineering, then details key concepts for the application of computational intelligence in computer-model-based engineering systems, including knowledge-driven models and their areas of application. Next, behavior-based models with intelligent content involving specifications and knowledge for the design processes are emphasized; an active part modeling is proposed and possibilities for its application are outlined. Finally, design-intent-supported intelligent modeling is discussed.

  18. A Habitat-based Wind-Wildlife Collision Model with Application to the Upper Great Plains Region

    Energy Technology Data Exchange (ETDEWEB)

    Forcey, Greg, M.

    2012-08-28

    Most previous studies on collision impacts at wind facilities have taken place at the site-specific level and have only examined small-scale influences on mortality. In this study, we examine landscape-level influences using a hierarchical spatial model combined with existing datasets and life history knowledge for: Horned Lark, Red-eyed Vireo, Mallard, American Avocet, Golden Eagle, Whooping Crane, red bat, silver-haired bat, and hoary bat. These species were modeled in the central United States within Bird Conservation Regions 11, 17, 18, and 19. For the bird species, we modeled bird abundance from existing datasets as a function of habitat variables known to be preferred by each species to develop a relative abundance prediction for each species. For bats, there are no existing abundance datasets, so we identified preferred habitat in the landscape for each species and assumed that greater amounts of preferred habitat would equate to greater abundance of bats. The abundance predictions for birds and bats were modeled with additional exposure factors known to influence collisions, such as visibility, wind, temperature, precipitation, topography, and behavior, to form a final mapped output of predicted collision risk within the study region. We reviewed published mortality studies from wind farms in our study region and collected data on reported mortality of our focal species to compare with our modeled predictions. We performed a sensitivity analysis evaluating model performance under six different scenarios in which habitat and exposure factors were weighted differently. We compared the model performance in each scenario by evaluating observed data vs. our model predictions using Spearman's rank correlations. Horned Lark collision risk was predicted to be highest in the northwestern and west-central portions of the study region with lower risk predicted elsewhere.
Red-eyed Vireo collision risk was predicted to be the highest in the eastern portions of the study region and in

  19. A thermodynamically-based model for predicting microbial growth and community composition coupled to system geochemistry: Application to uranium bioreduction

    Energy Technology Data Exchange (ETDEWEB)

    Istok, Jonathan D.; Park, Melora M.; Michalsen, Mandy M.; Spain, A. M.; Krumholz, Lee R.; Liu, Chongxuan; McKinley, James P.; Long, Philip E.; Roden, Eric E.; Peacock, Aaron D.; Baldwin, Brett R.

    2010-04-01

    ‘Bioimmobilization’ of redox-sensitive heavy metals and radionuclides is being investigated as a way to remediate contaminated groundwater and sediments. In one approach, growth-limiting substrates are added to the subsurface to stimulate the activity of targeted groups of indigenous microorganisms and create conditions favorable for the microbially-mediated reductive precipitation (‘bioreduction’) of targeted contaminants. We present a theoretical framework for modeling this process that modifies conventional geochemical reaction path modeling to include thermodynamic descriptions for microbial growth and may be called biogeochemical reaction path modeling. In this approach, the actual microbial community is represented by a synthetic microbial community consisting of a collection of microbial groups, each with a unique growth equation that couples a specific pair of energy-yielding redox reactions. The growth equations and their computed standard-state free energy yields are appended to the thermodynamic database used in conventional geochemical reaction path modeling, providing a direct coupling between chemical species participating in both microbial growth and geochemical reactions. To compute the biogeochemical reaction paths, growth substrates are added incrementally to a defined geochemical environment and the coupled equations are solved simultaneously to predict microbial biomass, community composition (i.e. the fraction of total biomass in each microbial group), and the aqueous and mineral composition of the system, including aqueous speciation and oxidation state of the targeted contaminants. The approach, with growth equations derived from the literature using well known bioenergetics principles, was used to predict the results of a laboratory microcosm experiment and an in situ field experiment that investigated the bioreduction of uranium. Predicted effects of ethanol or acetate addition on uranium concentration and speciation, major ion
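The incremental reaction-path coupling described above can be caricatured in a few lines: substrate is added in increments, and biomass accrues to each microbial group in proportion to the energy its redox couple captures. The groups, free-energy yields and growth yields below are invented placeholders, not values from the study, and a real implementation would simultaneously solve the aqueous speciation equations at every increment.

```python
# Toy biogeochemical reaction-path loop with two hypothetical microbial groups.
# dG values (kJ per mol substrate) and growth yields Y (g biomass per kJ)
# are placeholders for the literature-derived bioenergetics the paper uses.
groups = {
    "Fe(III) reducers": {"dG": -100.0, "Y": 0.05},
    "sulfate reducers": {"dG": -60.0,  "Y": 0.05},
}
biomass = {name: 1e-3 for name in groups}            # seed biomass (g)

for _ in range(50):                                  # 50 substrate increments
    substrate = 1e-3                                 # mol added per increment
    # Energy each group can capture, weighted by its current biomass.
    energy = {n: biomass[n] * -g["dG"] for n, g in groups.items()}
    total = sum(energy.values())
    for n, g in groups.items():
        share = energy[n] / total                    # fraction of substrate consumed
        biomass[n] += g["Y"] * share * substrate * -g["dG"]

total_biomass = sum(biomass.values())
fractions = {n: biomass[n] / total_biomass for n in biomass}
```

Even this caricature reproduces the qualitative prediction that the group with the more favorable free-energy yield comes to dominate the community as substrate is added.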

  20. A thermodynamically-based model for predicting microbial growth and community composition coupled to system geochemistry: Application to uranium bioreduction

    Science.gov (United States)

    Istok, J. D.; Park, M.; Michalsen, M.; Spain, A. M.; Krumholz, L. R.; Liu, C.; McKinley, J.; Long, P.; Roden, E.; Peacock, A. D.; Baldwin, B.

    2010-03-01

    'Bioimmobilization' of redox-sensitive heavy metals and radionuclides is being investigated as a way to remediate contaminated groundwater and sediments. In one approach, growth-limiting substrates are added to the subsurface to stimulate the activity of targeted groups of indigenous microorganisms and create conditions favorable for the microbially-mediated reductive precipitation ('bioreduction') of targeted contaminants. We present a theoretical framework for modeling this process that modifies conventional geochemical reaction path modeling to include thermodynamic descriptions for microbial growth and may be called biogeochemical reaction path modeling. In this approach, the actual microbial community is represented by a synthetic microbial community consisting of a collection of microbial groups, each with a unique growth equation that couples a specific pair of energy-yielding redox reactions. The growth equations and their computed standard-state free energy yields are appended to the thermodynamic database used in conventional geochemical reaction path modeling, providing a direct coupling between chemical species participating in both microbial growth and geochemical reactions. To compute the biogeochemical reaction paths, growth substrates are reacted incrementally with the defined geochemical environment and the coupled equations are solved simultaneously to predict reaction paths that display changing microbial biomass, community composition (i.e. the fraction of total biomass in each microbial group), and the aqueous and mineral composition of the system, including aqueous speciation and oxidation state of the targeted contaminants. The approach, with growth equations derived from the literature using well-known bioenergetics principles, was used to predict the results of a laboratory microcosm experiment and an in situ field experiment that investigated the bioreduction of uranium. Predicted effects of ethanol or acetate addition on uranium

  1. A thermodynamically-based model for predicting microbial growth and community composition coupled to system geochemistry: Application to uranium bioreduction.

    Science.gov (United States)

    Istok, J D; Park, M; Michalsen, M; Spain, A M; Krumholz, L R; Liu, C; McKinley, J; Long, P; Roden, E; Peacock, A D; Baldwin, B

    2010-03-01

    'Bioimmobilization' of redox-sensitive heavy metals and radionuclides is being investigated as a way to remediate contaminated groundwater and sediments. In one approach, growth-limiting substrates are added to the subsurface to stimulate the activity of targeted groups of indigenous microorganisms and create conditions favorable for the microbially-mediated reductive precipitation ('bioreduction') of targeted contaminants. We present a theoretical framework for modeling this process that modifies conventional geochemical reaction path modeling to include thermodynamic descriptions for microbial growth and may be called biogeochemical reaction path modeling. In this approach, the actual microbial community is represented by a synthetic microbial community consisting of a collection of microbial groups, each with a unique growth equation that couples a specific pair of energy-yielding redox reactions. The growth equations and their computed standard-state free energy yields are appended to the thermodynamic database used in conventional geochemical reaction path modeling, providing a direct coupling between chemical species participating in both microbial growth and geochemical reactions. To compute the biogeochemical reaction paths, growth substrates are reacted incrementally with the defined geochemical environment and the coupled equations are solved simultaneously to predict reaction paths that display changing microbial biomass, community composition (i.e. the fraction of total biomass in each microbial group), and the aqueous and mineral composition of the system, including aqueous speciation and oxidation state of the targeted contaminants. The approach, with growth equations derived from the literature using well-known bioenergetics principles, was used to predict the results of a laboratory microcosm experiment and an in situ field experiment that investigated the bioreduction of uranium. Predicted effects of ethanol or acetate addition on uranium

  2. Stabilization effect of traffic flow in an extended car-following model based on an intelligent transportation system application.

    Science.gov (United States)

    Ge, H X; Dai, S Q; Dong, L Y; Xue, Y

    2004-12-01

    An extended car-following model is proposed by incorporating an intelligent transportation system into traffic. The stability condition of this model is obtained using linear stability theory. The results show that anticipating the behavior of more vehicles ahead stabilizes traffic systems. The modified Korteweg-de Vries equation (the mKdV equation, for short) near the critical point is derived by applying the reductive perturbation method, so the traffic jam can be described by the kink-antikink soliton solution of the mKdV equation. Simulations of the space-time evolution of the vehicle headway show that the traffic jam is suppressed efficiently when information about the motion of more vehicles in front is taken into account, and the analytical result is consistent with the simulation.
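The stabilizing effect of anticipation can be reproduced in a toy optimal-velocity simulation on a ring road: weighting in an averaged two-car headway (parameter `p` below, a minimal stand-in for the paper's multi-vehicle anticipation term) suppresses the jam that grows from a small perturbation. The OV function and parameter values are common textbook choices, not necessarily those of the paper.

```python
import numpy as np

def simulate(p, N=50, L=100.0, a=1.6, dt=0.1, steps=5000):
    """Optimal-velocity car-following on a ring road; p weights an averaged
    two-car headway as a crude form of multi-vehicle anticipation."""
    V = lambda h: np.tanh(h - 2.0) + np.tanh(2.0)      # common OV function
    x = np.linspace(0.0, L, N, endpoint=False)
    x[0] += 0.1                                        # small initial perturbation
    v = np.full(N, V(L / N))                           # start in uniform flow
    for _ in range(steps):
        h1 = (np.roll(x, -1) - x) % L                  # headway to the car ahead
        h2 = ((np.roll(x, -2) - x) % L) / 2.0          # averaged two-car headway
        acc = a * (V((1.0 - p) * h1 + p * h2) - v)
        v = np.maximum(v + acc * dt, 0.0)
        x = (x + v * dt) % L
    return float(((np.roll(x, -1) - x) % L).std())     # headway spread: 0 = uniform

spread_plain = simulate(p=0.0)     # no anticipation: the perturbation grows into a jam
spread_anticip = simulate(p=0.5)   # anticipation: uniform flow is stabilized
```

With the mean headway at the inflection point of the OV function, linear analysis of this toy model gives instability for a < 2V'/(1 + p), so the chosen a = 1.6 is unstable without anticipation and stable with p = 0.5, mirroring the paper's qualitative conclusion.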

  3. Lévy-based growth models

    DEFF Research Database (Denmark)

    Jónsdóttir, Kristjana Ýr; Schmiegel, Jürgen; Jensen, Eva Bjørn Vedel

    2008-01-01

    In the present paper, we give a condensed review, for the nonspecialist reader, of a new modelling framework for spatio-temporal processes, based on Lévy theory. We show the potential of the approach in stochastic geometry and spatial statistics by studying Lévy-based growth modelling of planar...... objects. The growth models considered are spatio-temporal stochastic processes on the circle. As a by-product, flexible new models for space–time covariance functions on the circle are provided. An application of the Lévy-based growth models to tumour growth is discussed....

  4. Struts Framework Application Based on the MVC Design Model

    Institute of Scientific and Technical Information of China (English)

    毕磊; 邓忠华

    2007-01-01

    Introduces the concept and architecture of Struts and, through program examples, explores the internal relationships and processing flows of its three main functional modules (Controller, Model, View), demonstrating how Struts helps Java developers exploit the advantages of J2EE for building large Web applications.

  5. Estimation of the physico-chemical parameters of materials based on rare earth elements with the application of computational model

    Science.gov (United States)

    Mamaev, K.; Obkhodsky, A.; Popov, A.

    2016-01-01

    The computational model, technique and basic operating principles of a program complex for quantum-chemical calculation of the physico-chemical parameters of materials with rare earth elements are discussed. The computing system is scalable and includes both CPU and GPU resources. Job control and operation, together with Globus Toolkit 5 software, make it possible to join users' computers into a unified peer-to-peer data-processing system. CUDA software is used to integrate graphics processors into the calculation system.

  6. An Evaluation of Diagnostic Atmospheric Dispersion Models for ’Cold Spill’ Applications at Vandenberg Air Force Base, California

    Science.gov (United States)

    1992-12-30

    Several diagnostic dispersion modeling suites were evaluated as serious contenders: NUATMOS/CITPUFF, CALMET/CALPUFF, PGEMS, WOCSS/MACHWIND/Adaptive plume, LINCOM/RIMPUFF/HEAVYPUFF, and MATHEW/ADPIC. The report concludes that a diagnostic wind model used in tandem with ADPIC, RIMPUFF, or CALPUFF diffusion may provide the most robust diagnostic modeling suite, and includes a cost/benefit analysis of computer hardware.

  7. Security Assessment of Web Based Distributed Applications

    Directory of Open Access Journals (Sweden)

    Catalin BOJA

    2010-01-01

    Full Text Available This paper presents an overview of the evaluation of risks and vulnerabilities in a web-based distributed application, emphasizing the process of security assessment with regard to the audit field. In the audit process, an important activity is the measurement of the characteristics taken into consideration for evaluation; from this point of view, the quality of the audit process depends on the quality of the assessment methods and techniques. By reviewing the fields involved in the research process, the approach reflects the main concerns of web-based distributed applications using exploratory research techniques. The results show that many aspects must be handled carefully across a distributed system, and that they can be revealed by an in-depth analysis of the information flow and the internal processes that are part of the system. The paper highlights the lack of a unified security risk assessment model that could prevent the risks and vulnerabilities discussed. Based on such standardized models, secure web-based distributed applications could be audited more easily, and many vulnerabilities arising from the lack of access to information could be avoided.

  8. Study and Application of Reinforcement Learning in Cooperative Strategy of the Robot Soccer Based on BDI Model

    Directory of Open Access Journals (Sweden)

    Wu Bo-ying

    2009-11-01

    Full Text Available A dynamic cooperation model for multi-agent systems is formed by combining reinforcement learning with the BDI model. In this model, the concept of individual optimization loses its meaning, because the payoff of each agent depends not only on itself but also on the choices of the other agents. All agents pursue a common optimal solution and try to realize the united intention as a whole to the maximum extent. Each robot moves toward its goal depending on the present positions of the robots cooperating with it and the present position of the ball; one of the cooperating robots is controlled manually with a joystick. In this way, the agent is ensured to explore each state-action pair as frequently as possible when choosing its movements, shortening the search over the action space and improving the convergence speed of reinforcement learning. The validity of the proposed cooperative strategy for robot soccer has been proved by combining theoretical analysis with a simulated robot soccer match (11 vs 11).

  9. A Copula-based Mechanical System Reliability Model and Its Application

    Institute of Scientific and Technical Information of China (English)

    何成铭; 吴纬; 孟庆均

    2012-01-01

    Mechanical system reliability involves complex correlations. Taking advantage of the strength of copulas in describing correlation, a Copula-based mechanical system reliability model is presented. The application of the model to mechanical system reliability prediction is described, taking the suspension system of an armored vehicle as an example. The result shows that the model handles the reliability prediction of mechanical systems well.
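The effect a copula has on system reliability can be sketched with a two-component series system: exponential component lifetimes are coupled through a Gaussian copula, and the resulting joint survival probability is compared with the independence assumption. All rates, the correlation and the mission time below are illustrative, not values from the armored-vehicle study.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(7)

# Hypothetical two-component series system with exponential lifetimes,
# made dependent through a Gaussian copula with correlation rho.
lam1, lam2 = 1 / 500.0, 1 / 800.0     # failure rates (per hour)
rho, t = 0.7, 300.0                   # copula correlation, mission time (h)

# Sample correlated standard normals, map them to uniforms (the copula)...
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=100_000)
std_cdf = np.vectorize(lambda v: 0.5 * (1.0 + erf(v / sqrt(2.0))))
u = std_cdf(z)

# ...then through the inverse exponential CDF to correlated lifetimes.
T = -np.log(1.0 - u) / np.array([lam1, lam2])

r_joint = float(np.mean((T[:, 0] > t) & (T[:, 1] > t)))   # series-system reliability
r_indep = float(np.exp(-lam1 * t) * np.exp(-lam2 * t))    # independence assumption
```

With positive dependence the joint survival probability exceeds the independence product, which is why ignoring correlation biases the prediction for a series system pessimistically.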

  10. Voxel-Based LIDAR Analysis and Applications

    Science.gov (United States)

    Hagstrom, Shea T.

    One of the greatest recent changes in the field of remote sensing is the addition of high-quality Light Detection and Ranging (LIDAR) instruments. In particular, the past few decades have been greatly beneficial to these systems because of increases in data collection speed and accuracy, as well as a reduction in the costs of components. These improvements allow modern airborne instruments to resolve sub-meter details, making them ideal for a wide variety of applications. Because LIDAR uses active illumination to capture 3D information, its output is fundamentally different from other modalities. Despite this difference, LIDAR datasets are often processed using methods appropriate for 2D images and that do not take advantage of its primary virtue of 3-dimensional data. It is this problem we explore by using volumetric voxel modeling. Voxel-based analysis has been used in many applications, especially medical imaging, but rarely in traditional remote sensing. In part this is because the memory requirements are substantial when handling large areas, but with modern computing and storage this is no longer a significant impediment. Our reason for using voxels to model scenes from LIDAR data is that there are several advantages over standard triangle-based models, including better handling of overlapping surfaces and complex shapes. We show how incorporating system position information from early in the LIDAR point cloud generation process allows radiometrically-correct transmission and other novel voxel properties to be recovered. This voxelization technique is validated on simulated data using the Digital Imaging and Remote Sensing Image Generation (DIRSIG) software, a first-principles based ray-tracer developed at the Rochester Institute of Technology. Voxel-based modeling of LIDAR can be useful on its own, but we believe its primary advantage is when applied to problems where simpler surface-based 3D models conflict with the requirement of realistic geometry. To
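The voxel representation argued for in this record is straightforward to prototype: points are mapped to integer grid indices, and per-voxel occupancy and point counts follow from a unique-rows pass. The ground-plus-wall toy cloud below is invented for illustration; it shows how overlapping surfaces that would confound a 2D raster stay distinct in the voxel grid.

```python
import numpy as np

def voxelize(points, voxel_size):
    """Map an (N, 3) point cloud to occupied voxel indices and point counts."""
    idx = np.floor(points / voxel_size).astype(int)
    voxels, counts = np.unique(idx, axis=0, return_counts=True)
    return voxels, counts

rng = np.random.default_rng(3)
# Toy LIDAR-like scene: a flat "ground" plane plus a vertical "wall".
ground = np.column_stack([rng.uniform(0, 10, 2000),
                          rng.uniform(0, 10, 2000),
                          rng.normal(0.0, 0.02, 2000)])
wall = np.column_stack([5.0 + rng.normal(0.0, 0.02, 1000),
                        rng.uniform(0, 10, 1000),
                        rng.uniform(0, 3, 1000)])
cloud = np.vstack([ground, wall])

voxels, counts = voxelize(cloud, voxel_size=0.5)
occupied = len(voxels)     # number of occupied 0.5 m voxels
```

Production systems typically store such grids sparsely (hash maps or octrees) and attach per-voxel attributes such as return intensity or, as the thesis suggests, radiometrically corrected transmission.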

  11. Tip-tilt disturbance model identification for Kalman-based control scheme: application to XAO and ELT systems.

    Science.gov (United States)

    Meimon, Serge; Petit, Cyril; Fusco, Thierry; Kulcsar, Caroline

    2010-11-01

    Adaptive optics (AO) systems have to correct tip-tilt (TT) disturbances down to a fraction of the diffraction-limited spot. This becomes a key issue for very or extremely large telescopes affected by mechanical vibration peaks or wind shake effects. Linear quadratic Gaussian (LQG) control achieves optimal TT correction when provided with a temporal model of the disturbance. We propose a nonsupervised identification procedure that does not require any auxiliary system or loop opening, and validate it on synthetic profiles as well as on experimental data.
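
    The identified disturbance model is what feeds the Kalman filter inside the LQG controller. As a hedged sketch (not the authors' code), a single vibration peak can be modelled as an AR(2) process whose next value is predicted by a scalar Kalman filter; all numerical values below are illustrative.

```python
import numpy as np

def ar2_coeffs(f0, damping, fs):
    """AR(2) coefficients for a damped resonance at f0 Hz, sampled at fs Hz."""
    r = np.exp(-2 * np.pi * damping * f0 / fs)   # pole radius < 1
    a1 = 2 * r * np.cos(2 * np.pi * f0 / fs)
    a2 = -r * r
    return a1, a2

def kalman_predict_series(y, a1, a2, q=1e-3, r_noise=1e-2):
    """One-step-ahead Kalman prediction of a scalar AR(2) disturbance."""
    A = np.array([[a1, a2], [1.0, 0.0]])         # state transition
    C = np.array([[1.0, 0.0]])                   # we observe the current value
    Q = np.array([[q, 0.0], [0.0, 0.0]])
    x, P = np.zeros((2, 1)), np.eye(2)
    preds = []
    for yk in y:
        # measurement update
        S = (C @ P @ C.T)[0, 0] + r_noise
        K = P @ C.T / S
        x = x + K * (yk - (C @ x)[0, 0])
        P = (np.eye(2) - K @ C) @ P
        # time update: this prediction is what the controller would act on
        x, P = A @ x, A @ P @ A.T + Q
        preds.append(x[0, 0])
    return np.array(preds)
```

In a real AO loop the predicted tip-tilt value would drive the corrector; the identification step of the paper amounts to estimating the peak frequency, damping, and noise levels of such a model from closed-loop telemetry.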

  12. DNA computing model based on lab-on-a-chip and its application to solving the timetabling problem

    Institute of Scientific and Technical Information of China (English)

    Fengyue Zhang; Bo Liu; Wenbin Liu; Qiang Zhang

    2008-01-01

    The essential characteristic of DNA computation is its massive parallelism in obtaining and managing information. With the development of molecular biology techniques, the field of DNA computation has made great progress. Using an advanced biochip technique, lab-on-a-chip, a new DNA computing model is presented in this paper to solve a simple timetabling problem, a special type of optimization problem that also plays an important role in education and other industries. With a simulated biological experiment, the results suggest that DNA computation with lab-on-a-chip has the potential to solve a real, complex timetabling problem.

  13. Application of Multicast-based Video Conference on CERNET Backbone

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Multicast-based video conferencing is a representative application of advanced networks. In multi-point video conferencing, multicast achieves better efficiency by exploiting the in-group broadcast mechanism. In such an application, multicast-based network resource assignment, management, and security must be considered together. This paper presents a three-layer framework model for multicast-based video conferencing applications, and a practical multicast-based video conferencing system implemented on the CERNET (China Education and Research Network) backbone. This experience is valuable for the development of multicast-based video conferencing applications in China.

  14. A robust hybrid model integrating enhanced inputs based extreme learning machine with PLSR (PLSR-EIELM) and its application to intelligent measurement.

    Science.gov (United States)

    He, Yan-Lin; Geng, Zhi-Qiang; Xu, Yuan; Zhu, Qun-Xiong

    2015-09-01

    In this paper, a robust hybrid model integrating an enhanced inputs based extreme learning machine with partial least squares regression (PLSR-EIELM) is proposed. The proposed PLSR-EIELM model overcomes two main flaws of the extreme learning machine (ELM): the intractable problem of determining the optimal number of hidden layer neurons, and the over-fitting phenomenon. First, a traditional ELM is selected. Second, the weights between the input layer and the hidden layer are randomly assigned, and the nonlinear transformation of the independent variables is obtained from the output of the hidden layer neurons. In particular, the original input variables are regarded as enhanced inputs: the enhanced inputs and the nonlinearly transformed variables are tied together as the full set of independent variables. In this way, PLSR can identify PLS components not only from the nonlinearly transformed variables but also from the original input variables, which removes the collinearity among the independent variables with respect to the expected outputs. Finally, the optimal relationship between the full set of independent variables and the expected outputs is obtained by PLSR, yielding the PLSR-EIELM model. The PLSR-EIELM model then served as an intelligent measurement tool for the key variables of the Purified Terephthalic Acid (PTA) process and the High Density Polyethylene (HDPE) process. The experimental results show that the predictive accuracy of PLSR-EIELM is stable, indicating good robustness. Moreover, compared with ELM, PLSR, hierarchical ELM (HELM), and PLSR-ELM, PLSR-EIELM achieves much smaller relative prediction errors in these two applications.
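
    A minimal sketch of the PLSR-EIELM construction described above, assuming a sigmoid ELM hidden layer and a textbook PLS1 (NIPALS) routine; the toy data, layer width, and component count are invented for illustration and are not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_hidden(X, W, b):
    """Random (fixed) hidden layer of an ELM: sigmoid(X @ W + b)."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def pls1_fit(X, y, n_comp):
    """Textbook PLS1 (NIPALS); returns a regression vector and the means."""
    xm, ym = X.mean(axis=0), y.mean()
    Xc, yc = X - xm, y - ym
    Ws, Ps, qs = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)
        t = Xc @ w
        tt = t @ t
        p = Xc.T @ t / tt
        q = (yc @ t) / tt
        Xc = Xc - np.outer(t, p)      # deflate X
        yc = yc - q * t               # deflate y
        Ws.append(w); Ps.append(p); qs.append(q)
    W, P, q = np.array(Ws).T, np.array(Ps).T, np.array(qs)
    beta = W @ np.linalg.solve(P.T @ W, q)
    return beta, xm, ym

# Toy data: one nonlinear and one linear input effect (illustrative only).
X = rng.uniform(-1, 1, (200, 2))
y = np.sin(2 * X[:, 0]) + 0.5 * X[:, 1] + 0.01 * rng.standard_normal(200)

# "Enhanced inputs": original variables tied together with the random
# nonlinear features of the ELM hidden layer, then regressed with PLS.
Wh = rng.normal(scale=3.0, size=(2, 20))
bh = rng.normal(scale=2.0, size=20)
Z = np.hstack([X, elm_hidden(X, Wh, bh)])
beta, zm, ym = pls1_fit(Z, y, n_comp=8)
pred = (Z - zm) @ beta + ym
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
```

Because PLS selects the informative directions, the number of hidden neurons no longer needs careful tuning, which is the practical point of the hybrid.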

  15. DESIGN OF LOW CYTOTOXICITY DIARYLANILINE DERIVATIVES BASED ON QSAR RESULTS: AN APPLICATION OF ARTIFICIAL NEURAL NETWORK MODELLING

    Directory of Open Access Journals (Sweden)

    Ihsanul Arief

    2016-11-01

    Full Text Available A study of the cytotoxicity of diarylaniline derivatives using quantitative structure-activity relationships (QSAR) has been carried out. The structures and cytotoxicities of the diarylaniline derivatives were obtained from the literature. Molecular and electronic parameters were calculated using the Austin Model 1 (AM1), Parameterized Model 3 (PM3), Hartree-Fock (HF), and density functional theory (DFT) methods. Artificial neural network (ANN) analysis was used to produce the best equation, with an input-hidden-output configuration of 5-8-1, r² = 0.913, and PRESS = 0.069. The best equation was then used to design and predict new diarylaniline derivatives. The results show that compound N1-(4′-Cyanophenyl-5-(4″-cyanovinyl-2″,6″-dimethyl-phenoxy-4-dimethylether benzene-1,2-diamine is the best proposed compound, with a cytotoxicity value (CC50) of 93.037 μM.

  16. Galerkin solution of Winkler foundation-based irregular Kirchhoff plate model and its application in crown pillar optimization

    Institute of Scientific and Technical Information of China (English)

    彭康; 尹旭岩; 尹光志; 许江; 黄滚; 殷志强

    2016-01-01

    Irregular plates are very common structures in engineering, such as ore structures in mining. In this work, the Galerkin solution is derived for a Kirchhoff plate lying on a Winkler foundation with two edges simply supported and the other two clamped. A coordinate transformation technique is used in the solution process so that the solution applies to irregularly shaped plates. The mechanical model and the solution are then used to model the crown pillars between two adjacent levels in the Sanshandao gold mine, which uses the backfill method for its mining operations. An objective function that takes security, economic profit and filling effect into consideration is then built to evaluate design proposals. Thickness optimizations for crown pillars are finally conducted both when the vertical stiffness of the foundation is known and when it is unknown. The procedure presented in this work provides guidance for the thickness design of complex-shaped crown pillars and for the preparation of backfill materials, so as to achieve the best balance between security and profit.

  17. Studies of acid-base homeostasis during simulated weightlessness: Application of the water immersion model to man

    Science.gov (United States)

    Epstein, M.

    1975-01-01

    The effects of water immersion on acid-base homeostasis were investigated under carefully controlled conditions. Studies of renal acidification were carried out on seven healthy male subjects, each consuming a diet containing 150 meq sodium and 100 meq potassium. Control and immersion studies were carried out on each subject on the fourth and sixth days, respectively, of dietary equilibration, by which time all subjects had achieved sodium balance. The experimental protocols on study days were similar (except for the amount of water administered).

  18. Model-based predictive direct power control of brushless doubly fed reluctance generator for wind power applications

    Directory of Open Access Journals (Sweden)

    Maryam Moazen

    2016-09-01

    Full Text Available In this paper, a predictive direct power control (PDPC) method for the brushless doubly fed reluctance generator (BDFRG) is proposed. First, the BDFRG active and reactive power equations are derived, and the active and reactive power variations are predicted over a fixed sampling period. The predicted power variations are used to calculate the required voltage of the secondary winding so that the power errors at the end of the following sampling period are eliminated. Switching pulses are produced using the space vector pulse width modulation (SVPWM) approach, which yields a fixed switching frequency. The BDFRG model and the proposed control method are simulated in MATLAB/Simulink. Simulation results indicate the good performance of the control system in tracking the active and reactive power references under both power-step and speed-variation conditions. In addition, fast dynamic response and low output power ripple are further advantages of this control method.

  19. INCLUSION RATIO BASED ESTIMATOR FOR THE MEAN LENGTH OF THE BOOLEAN LINE SEGMENT MODEL WITH AN APPLICATION TO NANOCRYSTALLINE CELLULOSE

    Directory of Open Access Journals (Sweden)

    Mikko Niilo-Rämä

    2014-06-01

    Full Text Available A novel estimator for the mean length of fibres is proposed for censored data observed in square windows. Instead of observing the fibre lengths directly, we observe the ratio between the intensity estimates of minus-sampling and plus-sampling. It is well known that both intensity estimators are biased. In the current work, we derive the ratio of these biases as a function of the mean length, assuming a Boolean line segment model with exponentially distributed lengths and uniformly distributed directions. Given the observed ratio of the intensity estimators, the inverse of the derived function is proposed as a new estimator for the mean length, and an approximation of its variance is derived. The accuracy of the approximations is evaluated by means of simulation experiments. The novel method is compared with other methods and applied to real-world industrial data on nanocrystalline cellulose.
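
    The bias ratio that the estimator inverts can be illustrated by a quick simulation. The sketch below is our own simplified approximation, not the authors' derivation: "minus-sampling" counts segments fully inside a square window, and "plus-sampling" is approximated by segments with at least one endpoint inside; the ratio decreases as the mean segment length grows, which is the monotonicity the estimator relies on.

```python
import numpy as np

rng = np.random.default_rng(3)

def sampling_ratio(mean_len, side=1.0, n=200_000):
    """Monte Carlo ratio of minus- to (approximate) plus-sampling counts."""
    margin = 8 * mean_len                             # simulate a larger region
    c = rng.uniform(-margin, side + margin, (n, 2))   # segment start points
    L = rng.exponential(mean_len, n)                  # exponential lengths
    ang = rng.uniform(0.0, np.pi, n)                  # uniform directions
    e = c + np.stack([L * np.cos(ang), L * np.sin(ang)], axis=1)
    inside = lambda p: (p >= 0.0).all(axis=1) & (p <= side).all(axis=1)
    minus = inside(c) & inside(e)                     # fully inside the window
    plus = inside(c) | inside(e)                      # touches the window (approx.)
    return minus.sum() / plus.sum()

r_short, r_long = sampling_ratio(0.05), sampling_ratio(0.3)
```

Inverting the mapping from mean length to this ratio is exactly the estimation principle; the paper does it with the analytically derived bias ratio rather than by simulation.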

  20. Optimal multi-agent path planning for fast inverse modeling in UAV-based flood sensing applications

    KAUST Repository

    Abdelkader, Mohamed

    2014-05-01

    Floods are the most common natural disasters, causing thousands of casualties every year around the world. Flash flood events are particularly deadly because of the short timescales on which they occur. Unmanned air vehicles equipped with mobile microsensors could be capable of sensing flash floods in real time, saving lives and greatly improving the efficiency of the emergency response. However, one of the main issues in sensing floods is the difficulty of planning the paths of the sensing agents in advance so as to obtain meaningful data as quickly as possible. In this article, we present a fast numerical scheme to quickly compute the trajectories of a set of UAVs in order to maximize the accuracy of model parameter estimation over a time horizon. Simulation results are presented, a preliminary testbed is briefly described, and future research directions and problems are discussed. © 2014 IEEE.

  1. Application of the Johnson-Kendall-Roberts model in AFM-based mechanical measurements on cells and gel.

    Science.gov (United States)

    Efremov, Yu M; Bagrov, D V; Kirpichnikov, M P; Shaitan, K V

    2015-10-01

    The force-distance curves (FCs) obtained by the atomic force microscope (AFM) with colloid probes contain information about both the viscoelastic properties and the adhesion of a sample. Here, we processed both the approach and retraction parts of FCs obtained on polyacrylamide (PAA) gels (in water or PBS) and Vero cells (in a culture medium). The Johnson-Kendall-Roberts (JKR) model was applied to the retraction curves to account for adhesion. The effects of loading rate, holding time and indentation depth on the adhesion force and Young's modulus, calculated from the approach and retraction curves, were studied. It was shown that both bulk and local interfacial viscoelasticity can affect the observed approach-retraction hysteresis and the measured parameters. The addition of 1% bovine serum albumin (BSA) decreased the adhesion of the probe to the PAA gel surface, so interfacial viscoelastic effects were diminished. In contrast, the adhesiveness of Vero cells increased after BSA addition, indicating the complex nature of the cell-probe interaction.
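
    For reference, the JKR relations used in such an analysis fit in a few lines. This is a generic sketch of the textbook formulas, with illustrative parameter values rather than those of the study (R is the probe radius, K the reduced elastic modulus, w the work of adhesion).

```python
import numpy as np

def jkr_contact_radius(F, R, K, w):
    """JKR contact radius a for applied load F; adhesion enlarges the contact:
    a^3 = (R/K) * (F + 3*pi*R*w + sqrt(6*pi*R*w*F + (3*pi*R*w)^2))."""
    Fc = 3.0 * np.pi * R * w
    return ((R / K) * (F + Fc + np.sqrt(2.0 * Fc * F + Fc ** 2))) ** (1.0 / 3.0)

def jkr_pull_off(R, w):
    """Most negative sustainable load before detachment: F = -(3/2)*pi*R*w."""
    return -1.5 * np.pi * R * w

# Illustrative values: a 2.5 um colloid probe and an 8 nN measured adhesion
# force; inverting the pull-off relation gives the work of adhesion.
R = 2.5e-6
F_pull = -8e-9
w = -F_pull / (1.5 * np.pi * R)
```

In practice w (together with the modulus) is obtained by fitting the whole retraction curve, but the pull-off relation above already links the measured adhesion force to the surface energetics.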

  2. Research and application on integration modeling of 3D bodies in coal mine with blended data model based on TIN and ARTP

    Institute of Scientific and Technical Information of China (English)

    HAN Zuo-zhen; HAN Rui-dong; MAO Shan-jun; HAN Jing-min

    2007-01-01

    Data modeling is the foundation of three-dimensional visualization technology. This paper first proposes a 3D integrated data model of strata, laneways and drill holes on the basis of TIN and ARTP, designs the corresponding conceptual and logical models, and describes the data structures of the geometric elements of the model using object-oriented modeling. It then studies the key modeling technologies for strata, laneways and drill holes, introduces the ARTP modeling processes for each, and examines the 3D geometric modeling of laneways with different cross-sections. Finally, a coal-mine-oriented three-dimensional visualization system was implemented, using SQL Server as the back-end database and Visual C++ 6.0 and OpenGL as the front-end development tools.

  3. Research on the Application of Perceptual Model-Based Image Compression

    Institute of Scientific and Technical Information of China (English)

    刘阳; 周泓; 徐小良

    2001-01-01

    In image processing, the evaluation of compression and decompression methods has lacked a model that correlates well with human vision; to address this, a model based on human vision is proposed and applied to the two classic test images, lena and fruit, yielding conclusions that agree with human observation. Most existing efforts in image and video compression have focused on developing methods that minimize mathematically tractable distortion metrics (such as MSE). While such distortion measures were found to perform well for many applications, they do not always correlate well with perceived quality. We present a perceptual model that can decide which codec is best at a given code rate. The main idea is to first decompose both the original image and the recovered image using the wavelet transform, and then compare them at different subbands and orientations to obtain a perceptual PSNR. We use this perceptual model to compare four different codecs, i.e., JPEG, PIC4x4, PIC8x8, and the Said-Pearlman codec, on typical images such as lena. The results correlate well with the human visual system. Another interesting finding is that this wavelet-based perceptual metric somewhat favors the wavelet-based Said-Pearlman codec, especially at lower bit rates.

  4. A Cotton Assorting Optimization Model Based on HVI Data and Its Application

    Institute of Scientific and Technical Information of China (English)

    邱兆宝

    2012-01-01

    The establishment and application of a cotton assorting optimization model based on HVI data are studied. With HVI data as the basis and cotton assorting technical standards as the foundation, the ideas and methods of systems engineering are applied while following the principles of cotton assorting. Using a small amount of quantitative information, the decision-making process is expressed mathematically, providing a relatively simple decision method for multi-objective, multi-criteria cotton assorting problems. The model is solved by implicit enumeration, which effectively handles the nonlinear integer programming formulation of cotton assorting. The optimization model, together with the method for evaluating cotton assorting schemes, can serve as a component of an expert guidance system for cotton spinning quality and processing.

  5. Model-Based Security Testing

    Directory of Open Access Journals (Sweden)

    Ina Schieferdecker

    2012-02-01

    Full Text Available Security testing aims at validating software system requirements related to security properties like confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Although security testing techniques have been available for many years, there have been few approaches that allow for the specification of test cases at a higher level of abstraction, enable guidance on test identification and specification, and support automated test generation. Model-based security testing (MBST) is a relatively new field dedicated to the systematic and efficient specification and documentation of security test objectives, security test cases and test suites, as well as to their automated or semi-automated generation. In particular, the combination of security modelling and test generation approaches is still a research challenge and of high interest for industrial applications. MBST includes, e.g., security functional testing, model-based fuzzing, risk- and threat-oriented testing, and the usage of security test patterns. This paper provides a survey of MBST techniques and the related models, as well as samples of new methods and tools under development in the European ITEA2 project DIAMONDS.

  6. DATA MODELING METHOD BASED ON PARTIAL LEAST SQUARE REGRESSION AND APPLICATION IN CORRELATION ANALYSIS OF THE STATOR BARS CONDITION PARAMETERS

    Institute of Scientific and Technical Information of China (English)

    李锐华; 高乃奎; 谢恒堃; 史维祥

    2004-01-01

    Objective To investigate the data characteristics of the stator bar condition parameters when only a few samples are available, especially the correlation between the nondestructive parameters and the residual breakdown voltage of the stator bars. Methods Artificial stator bars were designed to simulate generator bars. Partial discharge (PD) and dielectric loss experiments were performed to obtain the nondestructive parameters, and the residual breakdown voltage was acquired by an AC damage experiment. To eliminate the effect of dimension on the measurement data, the raw data were preprocessed by centering and compression. Based on the idea of extracting principal components, a partial least squares (PLS) method was applied to easily screen and synthesize the correlation information between the nondestructive parameters and the residual breakdown voltage. Various characteristics of the condition parameters are also discussed. Results The graphical analysis functions of PLS make the data characteristics of the stator bar condition parameters easy to understand. The analysis results are consistent with those of aging tests. Conclusion The method can select and extract PLS components of the condition parameters from sample data, effectively addressing the problems of small sample size and multicollinearity in regression analysis.

  7. Two Strategies Of Agent-Based Modelling Application For Management Of Lakeland Landscapes At A Regional Scale

    Directory of Open Access Journals (Sweden)

    Giełda-Pinas Katarzyna

    2015-09-01

    Full Text Available This work presents two different strategies of agent-based modelling (ABM) for the management of selected lakeland landscapes and their impact on sustainable development. Two different lakeland research areas as well as two different sets of agents and their decision rules were compared. In Strategy 1, the decisions made by farmers and their influence on the land use/cover pattern, as well as the indirect consequence of phosphorus and nitrogen delivery to water bodies, were investigated. In this strategy, a group of farmer agents is encouraged to participate in an agri-environmental program. Strategy 2 combines the decisions of farmers, foresters and local authorities; the agents in the model share the common goal of producing a spatial plan. The land use/cover patterns arising from the different attitudes and decision rules of the involved actors were investigated. As the basic spatial unit, the first strategy employed a landscape unit, i.e. the lake catchment, whereas the second strategy used an administrative unit, i.e. the commune. Both strategies resulted in different land use/cover patterns and changes, which were evaluated in terms of sustainability policy. The main conclusion for Strategy 1 is that during five years of farmers' participation in the agri-environmental program there was a significant decrease in nutrient leaching to the lake. The main conclusion for Strategy 2 is that cooperation among the agents is better for the natural environment than competition between them. In both strategies, agents' decisions influence the environment, but the different spatial units of analysis express this environment differently.

  8. Cardiac C-arm CT: 4D non-model based heart motion estimation and its application

    Science.gov (United States)

    Prümmer, M.; Fahrig, R.; Wigström, L.; Boese, J.; Lauritsch, G.; Strobel, N.; Hornegger, J.

    2007-03-01

    The combination of real-time fluoroscopy and 3D cardiac imaging on the same C-arm system is a promising technique that might improve therapy planning, guiding, and monitoring in the interventional suite. In principle, to reconstruct a 3D image of the beating heart at a particular cardiac phase, a complete set of X-ray projection data representing that phase is required. One approximate approach is the retrospectively ECG-gated FDK reconstruction (RG-FDK): from the acquired data set of N_s multiple C-arm sweeps, those projection images acquired closest in time to the desired cardiac phase are retrospectively selected. However, this approach uses only 1/N_s of the obtained data. Our goal is to utilize data from other cardiac phases as well. In order to minimize blurring and motion artifacts, cardiac motion has to be compensated for, which can be achieved using a temporally dependent spatial 3D warping of the filtered backprojections. In this work we investigate the computation of the 4D heart motion based on prior reconstructions of several cardiac phases using RG-FDK. A 4D motion estimation framework is presented using standard fast non-rigid registration. A smooth 4D motion vector field (MVF) represents the relative deformation with respect to a reference cardiac phase. A 4D deformation regridding by adaptive supersampling allows any reference phase to be selected independently of the set of phases used in the RG-FDK for a motion-corrected reconstruction. Initial promising results from in vivo experiments are shown. The subject's individual 4D cardiac MVF could be computed from only three RG-FDK image volumes. In addition, all acquired projection data were motion corrected and subsequently used for image reconstruction to improve the signal-to-noise ratio compared to RG-FDK.

  9. Application of Improved Radiation Modeling to General Circulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Michael J Iacono

    2011-04-07

    This research has accomplished its primary objectives of developing accurate and efficient radiation codes, validating them with measurements and higher resolution models, and providing these advancements to the global modeling community to enhance the treatment of cloud and radiative processes in weather and climate prediction models. A critical component of this research has been the development of the longwave and shortwave broadband radiative transfer code for general circulation model (GCM) applications, RRTMG, which is based on the single-column reference code, RRTM, also developed at AER. RRTMG is a rigorously tested radiation model that retains a considerable level of accuracy relative to higher resolution models and measurements despite the performance enhancements that have made it possible to apply this radiation code successfully to global dynamical models. This model includes the radiative effects of all significant atmospheric gases, and it treats the absorption and scattering from liquid and ice clouds and aerosols. RRTMG also includes a statistical technique for representing small-scale cloud variability, such as cloud fraction and the vertical overlap of clouds, which has been shown to improve cloud radiative forcing in global models. This development approach has provided a direct link from observations to the enhanced radiative transfer provided by RRTMG for application to GCMs. Recent comparison of existing climate model radiation codes with high resolution models has documented the improved radiative forcing capability provided by RRTMG, especially at the surface, relative to other GCM radiation models. Due to its high accuracy, its connection to observations, and its computational efficiency, RRTMG has been implemented operationally in many national and international dynamical models to provide validated radiative transfer for improving weather forecasts and enhancing the prediction of global climate change.

  10. Multiscale modeling of nanostructured ZnO based devices for optoelectronic applications: Dynamically-coupled structural fields, charge, and thermal transport processes

    Science.gov (United States)

    Abdullah, Abdulmuin; Alqahtani, Saad; Nishat, Md Rezaul Karim; Ahmed, Shaikh; SIU Nanoelectronics Research Group Team

    Recently, hybrid ZnO nanostructures (such as ZnO deposited on ZnO-alloys, Si, GaN, polymer, conducting oxides, and organic compounds) have attracted much attention for their possible applications in optoelectronic devices (such as solar cells, light emitting and laser diodes), as well as in spintronics (such as spin-based memory, and logic). However, efficiency and performance of these hybrid ZnO devices strongly depend on an intricate interplay of complex, nonlinear, highly stochastic and dynamically-coupled structural fields, charge, and thermal transport processes at different length and time scales, which have not yet been fully assessed experimentally. In this work, we study the effects of these coupled processes on the electronic and optical emission properties in nanostructured ZnO devices. The multiscale computational framework employs the atomistic valence force-field molecular mechanics, models for linear and non-linear polarization, the 8-band sp3s* tight-binding models, and coupling to a TCAD toolkit to determine the terminal properties of the device. A series of numerical experiments are performed (by varying different nanoscale parameters such as size, geometry, crystal cut, composition, and electrostatics) that mainly aim to improve the efficiency of these devices. Supported by the U.S. National Science Foundation Grant No. 1102192.

  11. Application of simulation models for the optimization of business processes

    Science.gov (United States)

    Jašek, Roman; Sedláček, Michal; Chramcov, Bronislav; Dvořák, Jiří

    2016-06-01

    The paper deals with the application of modeling and simulation tools to the optimization of business processes, especially the optimization of signal flow in a security company. Simul8 was selected as the modeling tool; it performs process modeling based on discrete-event simulation and enables the creation of visual models of production and distribution processes.
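
    The discrete-event mechanism behind tools such as Simul8 can be sketched in a few lines. The example below is a hypothetical single-operator queue (arrival and service rates invented for illustration), not the authors' Simul8 model of the security company.

```python
import random

def mm1_sim(n_events=10_000, lam=0.8, mu=1.0, seed=0):
    """Mean waiting time in a single-server queue via the Lindley recursion:
    each entity arrives, waits while the server is busy, then is served."""
    rng = random.Random(seed)
    t, busy_until, waits = 0.0, 0.0, []
    for _ in range(n_events):
        t += rng.expovariate(lam)          # next arrival time
        start = max(t, busy_until)         # wait if the operator is busy
        waits.append(start - t)
        busy_until = start + rng.expovariate(mu)
    return sum(waits) / len(waits)

mean_wait = mm1_sim()
```

For these rates queueing theory predicts a mean wait of rho/(mu - lam) = 4 time units, so the simulation can be sanity-checked analytically; graphical packages add visualization and routing on top of exactly this event logic.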

  12. Research and Application of a Data Integration Transition Model Based on the Campus Network

    Institute of Scientific and Technical Information of China (English)

    郭政慧

    2012-01-01

    With the ongoing construction of information systems, a wide variety of isolated applications has emerged. This paper analyzes the characteristics and software architecture patterns of the business systems on a campus network and proposes a transitional data-integration scheme based on a service-oriented architecture. Following multi-granularity service design principles, legacy systems are encapsulated as service components, and after comparison a unified Web service standard interface is selected. Finally, the paper describes the steps and methods of business integration on the campus network, analyzes their feasibility, and applies them in practice.

  13. Auditory model inversion and its application

    Institute of Scientific and Technical Information of China (English)

    ZHAO Heming; WANG Yongqi; CHEN Xueqin

    2005-01-01

    Auditory models have been applied to several aspects of the speech signal processing field and appear to be effective. This paper presents the inverse transform of each stage of one widely used auditory model. First, it is necessary to invert the correlogram and reconstruct phase information by repeated iterations in order to obtain the auditory-nerve firing rate. The next step is to recover the negative parts of the signal by reversing the half-wave rectification (HWR). Finally, the inner hair cell/synapse model and the Gammatone filters are inverted, completing the inversion of the whole auditory model. An application to noisy speech enhancement based on the auditory model inversion algorithm is proposed. Experiments show that this method is effective in reducing noise, and it outperforms other methods especially when the SNR of the noisy speech is low. The auditory model inversion method presented here is therefore applicable to the speech enhancement field.

  14. Comparison and applicability of landslide susceptibility models based on landslide ratio-based logistic regression, frequency ratio, weight of evidence, and instability index methods in an extreme rainfall event

    Science.gov (United States)

    Wu, Chunhung

    2016-04-01

    Few studies have discussed the applicability of statistical landslide susceptibility (LS) models to extreme rainfall-induced landslide events. This research focuses on the comparison and applicability of LS models based on four methods, namely landslide ratio-based logistic regression (LRBLR), frequency ratio (FR), weight of evidence (WOE), and instability index (II), in an extreme rainfall-induced landslide case. The landslide inventory of the Chishan river watershed, Southwestern Taiwan, after 2009 Typhoon Morakot provides the main material for this research. The Chishan river watershed is a tributary of the Kaoping river watershed, a landslide- and erosion-prone watershed with an annual average suspended load of 3.6×10^7 MT/yr (ranking 11th in the world). Typhoon Morakot struck Southern Taiwan from Aug. 6 to 10, 2009, and dumped nearly 2,000 mm of rainfall on the Chishan river watershed. The 24-hour, 48-hour, and 72-hour accumulated rainfalls in the watershed exceeded the 200-year return period values. After the typhoon, 2,389 landslide polygons were extracted from SPOT 5 images of the Chishan river watershed. The total landslide area is around 33.5 km², corresponding to a landslide ratio of 4.1%. The main landslide types, based on Varnes' (1978) classification, are rotational and translational slides. The two characteristics of this extreme rainfall-induced landslide event are the dense landslide distribution and the large share of downslope landslide areas caused by headward and bank erosion during the flooding; the downslope landslide area in the Chishan river watershed after Typhoon Morakot is 3.2 times that of the upslope landslide area. The prediction accuracies of the LS models based on the LRBLR, FR, WOE, and II methods have been proven to exceed 70%. The model performance and applicability of four models in a landslide-prone watershed with dense distribution of rainfall
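
    Of the four methods, the frequency ratio is the easiest to show compactly. The sketch below applies FR to synthetic cell-based data (the real study works with mapped polygons and real causative factors): for each class of a factor, FR is the landslide share of the class divided by its area share, and a cell's susceptibility index sums the FR values of its classes.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
slope_class = rng.integers(0, 4, n)        # e.g. four slope-angle classes
litho_class = rng.integers(0, 3, n)        # e.g. three lithology classes
# synthetic landslide cells, more likely on steeper slopes
landslide = rng.random(n) < (0.02 + 0.04 * slope_class)

def frequency_ratio(cls, landslide):
    """FR per class: (landslide share of the class) / (area share of the class)."""
    fr = {}
    for c in np.unique(cls):
        in_c = cls == c
        share_ls = landslide[in_c].sum() / landslide.sum()
        share_area = in_c.sum() / len(cls)
        fr[c] = share_ls / share_area
    return fr

fr_slope = frequency_ratio(slope_class, landslide)
fr_litho = frequency_ratio(litho_class, landslide)
ls_index = np.array([fr_slope[s] + fr_litho[l]
                     for s, l in zip(slope_class, litho_class)])
```

Cells with a high index are mapped as susceptible; a comparison such as the one in the paper then scores each method's index map against the observed landslide inventory.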

  15. Crowdsourcing Based 3D Modeling

    Science.gov (United States)

    Somogyi, A.; Barsi, A.; Molnar, B.; Lovas, T.

    2016-06-01

    Web-based photo albums that support organizing and viewing the users' images are widely used. These services provide a convenient solution for storing, editing and sharing images. In many cases, the users attach geotags to the images in order to enable using them, e.g. in location based applications on social networks. Our paper discusses a procedure that collects open access images from a site frequently visited by tourists. Geotagged pictures showing a sight or tourist attraction are selected and processed in photogrammetric processing software that produces the 3D model of the captured object. For this particular investigation we selected three attractions in Budapest. To assess the geometric accuracy, we used laser scanning as well as DSLR and smartphone photography to derive reference values for verifying the spatial model obtained from the web-album images. The investigation shows how detailed and accurate models can be derived by applying photogrammetric processing software, simply by using images from the community, without visiting the site.

  16. Cluster Based Text Classification Model

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah; Wiil, Uffe Kock

    2011-01-01

    We propose a cluster based classification model for suspicious email detection and other text classification tasks. The text classification tasks comprise many training examples that require a complex classification model. Using clusters for classification makes the model simpler and increases...

  17. System identification application using Hammerstein model

    Indian Academy of Sciences (India)

    SABAN OZER; HASAN ZORLU; SELCUK METE

    2016-06-01

    Generally, a memoryless polynomial nonlinear model for the nonlinear part and a finite impulse response (FIR) or infinite impulse response (IIR) model for the linear part are preferred for Hammerstein models in the literature. In this paper, system identification applications of a Hammerstein model that is a cascade of a nonlinear second-order Volterra model and a linear FIR model are studied. A recursive least squares algorithm is used to identify the proposed Hammerstein model's parameters. Furthermore, the results are compared to assess the success of the proposed Hammerstein model against different types of models

  18. Sticker DNA computer model -- Part II: Application

    Institute of Scientific and Technical Information of China (English)

    XU Jin; LI Sanping; DONG Yafei; WEI Xiaopeng

    2004-01-01

    The sticker model is one of the basic models of DNA computing. This model is coded with single- and double-stranded DNA molecules. Its advantages are that the operations require no strand extension and use no enzymes; moreover, the materials are reusable. It has therefore attracted the attention and interest of scientists in many fields. In this paper, we extend and improve the sticker model, which will be definitely beneficial to the construction of DNA computers. This paper is the second part of our series, focusing on applications of the sticker model. It consists of the following three sections: first, the matrix representation of the sticker model is presented; then a brief review of past research on graph and combinatorial optimization problems, such as the minimal set covering problem, the vertex covering problem, the Hamiltonian path or cycle problem, the maximal clique problem, the maximal independent set problem and the Steiner spanning tree problem, is given; finally, a DNA algorithm for the graph isomorphism problem based on the sticker model is presented.

  19. Grid-based Meteorological and Crisis Applications

    Science.gov (United States)

    Hluchy, Ladislav; Bartok, Juraj; Tran, Viet; Lucny, Andrej; Gazak, Martin

    2010-05-01

    forecast model is subject to parameterization and parameter optimization before its real deployment. The parameter optimization requires tens of evaluations of the parameterized model's accuracy, and each evaluation of the model parameters requires re-running hundreds of meteorological situations collected over the years and comparing the model output with the observed data. The architecture and inherent heterogeneity of both examples, their computational complexity and their interfaces to other systems and services make them well suited for decomposition into a set of web and grid services. Such decomposition has been performed within several projects in which we participated or participate in cooperation with the academic sphere, namely int.eu.grid (a dispersion model deployed as a pilot application on an interactive grid), SEMCO-WS (semantic composition of web and grid services), DMM (development of a significant meteorological phenomena prediction system based on data mining), VEGA 2009-2011 and EGEE III. We present useful and practical applications of high performance computing technologies. The use of grid technology provides access to much higher computation power not only for modeling and simulation, but also for model parameterization and validation. This results in optimized model parameters and more accurate simulation outputs. Taking into account that the simulations are used for aviation, road traffic and crisis management, even a small improvement in prediction accuracy may result in a significant improvement of safety as well as cost reduction. We found grid computing useful for our applications. We are satisfied with this technology and our experience encourages us to extend its use. Within an ongoing project (DMM) we plan to include processing of satellite images, which extends our computation requirements very rapidly. We believe that thanks to grid computing we are able to handle the job almost in real time.

  20. Model-based design of an agricultural biogas plant: application of anaerobic digestion model no.1 for an improved four chamber scheme.

    Science.gov (United States)

    Wett, B; Schoen, M; Phothilangka, P; Wackerle, F; Insam, H

    2007-01-01

    Different digestion technologies for various substrates are addressed by the generic process description of Anaerobic Digestion Model No. 1. In the case of manure or agricultural wastes, a priori knowledge about the substrate in terms of ADM1 compounds is lacking and influent characterisation becomes a major issue. The actual project has been initiated to promote biogas technology in agriculture and to extend profitability to rather small capacity systems. In order to avoid costly individual planning and installation of each facility, a standardised design approach needs to be elaborated. This intention argues for biokinetic modelling as a systematic tool for process design and optimisation. Co-fermentation under field conditions was observed, quality and flow data were recorded, and mass flow balances were calculated. In the laboratory, different substrates were digested separately in parallel under specified conditions. A configuration of four ADM1 model reactors was set up. Model calibration identified the disintegration rate, the decay rates for sugar degraders and the half saturation constant for sugar as the three most sensitive parameters, showing values (except the latter) about one order of magnitude higher than the default parameters. Finally, the model is applied to the comparison of different reactor configurations and volume partitions. Another optimisation objective is robustness and load flexibility, i.e. the same configuration should adapt to different load situations only by a simple recycle control in order to establish a standardised design.

  1. Dynamic modeling of breast tissue with application of model reference adaptive system identification technique based on clinical robot-assisted palpation.

    Science.gov (United States)

    Keshavarz, M; Mojra, A

    2015-11-01

    Accurate identification of breast tissue's dynamic behavior in physical examination is critical to successful diagnosis and treatment. In this study a model reference adaptive system identification (MRAS) algorithm is utilized to estimate the dynamic behavior of breast tissue from mechanical stress-strain datasets. A robot-assisted device (Robo-Tac-BMI) was used to mimic physical palpation on a 45-year-old woman having a benign mass in the left breast. Stress-strain datasets were collected over 14 regions of both breasts in a specific period of time. Then, a 2nd-order linear model was adapted to the experimental datasets. It was confirmed that a unique dynamic model with a maximum error of about 0.89% is descriptive of the breast tissue behavior, while mass detection may be achieved by a 56.1% difference from the normal tissue.

  2. Intelligent Model for Traffic Safety Applications

    Directory of Open Access Journals (Sweden)

    C. Chellappan

    2012-01-01

    Full Text Available Problem statement: This study presents an analysis of a road traffic system focused on the use of communications to detect dangerous vehicles on roads and highways and how it could be used to enhance driver safety. Approach: The intelligent traffic safety application model is based on the traffic flow theories developed in recent years, leading to reliable representations of road traffic, which is of major importance in achieving the attenuation of traffic problems. The model also includes the driver's decision making process in accelerating, decelerating and changing lanes. Results: The individuality of each of these processes arises from the model parameters that are randomly generated from statistical distributions introduced as input parameters. Conclusion: This allows the integration of the individuality factor of the population elements, yielding knowledge of various driving modes in a wide variety of situations.

  3. Development of EMC-based empirical model for estimating spatial distribution of pollutant loads and its application in rural areas of Korea.

    Science.gov (United States)

    Yi, Qitao; Li, Hui; Lee, Jin-Woo; Kim, Youngchul

    2015-09-01

    An integrated approach to easily calculate pollutant loads from agricultural watersheds is suggested and verified in this research. The basic concept of this empirical tool is the assumption that variations in event mean concentrations (EMCs) of pollutants from a given agricultural watershed during rainstorms are only attributable to the rainfall pattern. Fifty-one sets of EMC values were obtained from nine different watersheds located in the rural areas of Korea, and these data were used to develop predictive tools for the EMCs in rainfall runoff. The results of statistical tests of these formulas show that they are fairly good at predicting actual EMC values for some parameters, and useful for calculating pollutant loads over any rainfall event time span, such as daily, weekly, monthly, and yearly. This model was further checked for its field applicability in a reservoir receiving stormwater after a cleanup of the sediments, covering 17 consecutive rainfall events from 1 July to 15 August 2007. Overall the predicted values matched the observed values, indicating the feasibility of this empirical tool as a simple and useful solution for evaluating the temporal distribution of nonpoint source pollution loads from small rural watersheds of Korea.
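    The load calculation this tool supports reduces to multiplying a (predicted or measured) EMC by the event runoff volume and summing over the events in the time span of interest. A minimal sketch; the function names and unit choices are illustrative assumptions, not the paper's actual formulation:

    ```python
    def event_pollutant_load_kg(emc_mg_per_l, runoff_volume_m3):
        """Load for one rainfall event: EMC [mg/L] x runoff volume [m^3].

        1 m^3 = 1000 L, and 1e6 mg = 1 kg.
        """
        return emc_mg_per_l * runoff_volume_m3 * 1000.0 / 1.0e6

    def total_load_kg(events):
        """Aggregate loads over any time span (daily, weekly, monthly, yearly).

        events: iterable of (EMC [mg/L], runoff volume [m^3]) pairs.
        """
        return sum(event_pollutant_load_kg(emc, vol) for emc, vol in events)
    ```

    For example, an event with an EMC of 10 mg/L and 500 m³ of runoff exports 5 kg of the pollutant.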

  4. Application of measuring {sup 99m}Tc-MAG3 plasma clearance based on one-compartment model (MPC method) to pediatric patients

    Energy Technology Data Exchange (ETDEWEB)

    Koizumi, Kiyoshi [Tokyo Medical Coll., Hachioji (Japan). Hachioji Medical Center; Higashida, Kousuke; Arbab, A.S.; Toyama, Keiji; Arai, Takao; Yoshitomi, Tatsuya

    1997-04-01

    Measurement of {sup 99m}Tc-MAG3 plasma clearance based on a 1-compartment model (MPC method) was applied to 12 pediatric patients and evaluated for the factors which might affect the calculated results. Depth correction is a critical factor for the measurement of renal uptake. Three different equations for estimating renal depth were compared with the real depth measured by ultrasonography. The equation proposed by K. Itoh was suitable, whereas the equations by T. Ito and Raynaud were insufficient. Estimation of the distribution volume, which is regarded as circulating plasma volume (CPV), is also critical for the calculation of MAG3 clearance by the MPC method. Precisely, the hematocrit measured by venous sampling and the circulating blood volume (CBV) calculated as 7.5% of body weight are used for estimation of CPV. However, a CPV assumed as 5% of body weight was acceptable if the hematocrit did not deviate severely from the normal value. The simplified MPC method utilizing the two factors mentioned above gave a positive correlation with Russell's one-point sampling method. In conclusion, the MPC method is applicable to pediatric patients. (author)
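    The one-compartment arithmetic underlying such clearance estimates can be illustrated generically: clearance is the elimination rate constant times the distribution volume, with CPV approximated as 5% of body weight as in the abstract (taking 1 kg of body weight as roughly 1 L). The function names and the two-sample estimate of the rate constant are illustrative assumptions, not the authors' exact MPC equations:

    ```python
    import math

    def distribution_volume_ml(body_weight_kg, plasma_fraction=0.05):
        # CPV approximated as 5% of body weight (1 kg ~ 1 L of body volume)
        return body_weight_kg * plasma_fraction * 1000.0

    def elimination_rate(c1, c2, t1_min, t2_min):
        # slope of the mono-exponential decline C(t) = C0 * exp(-k t),
        # estimated from two plasma concentrations
        return math.log(c1 / c2) / (t2_min - t1_min)

    def clearance_ml_per_min(c1, c2, t1_min, t2_min, body_weight_kg):
        # one-compartment clearance: Cl = k * Vd
        k = elimination_rate(c1, c2, t1_min, t2_min)
        return k * distribution_volume_ml(body_weight_kg)
    ```

    For a 30 kg child whose plasma activity halves in 20 minutes, this gives a clearance of about 52 mL/min.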

  5. Chemistry Teachers' Knowledge and Application of Models

    Science.gov (United States)

    Wang, Zuhao; Chi, Shaohui; Hu, Kaiyan; Chen, Wenting

    2014-01-01

    Teachers' knowledge and application of models play an important role in students' development of modeling ability and scientific literacy. In this study, we investigated Chinese chemistry teachers' knowledge and application of models. Data were collected through a test questionnaire and analyzed quantitatively and qualitatively. The result indicated…

  6. Probabilistic Model-Based Background Subtraction

    DEFF Research Database (Denmark)

    Krüger, Volker; Andersen, Jakob; Prehn, Thomas

    2005-01-01

    … is the correlation between pixels. In this paper we introduce a model-based background subtraction approach which facilitates prior knowledge of pixel correlations for clearer and better results. Model knowledge is being learned from good training video data; the data is stored for fast access in a hierarchical manner. Bayesian propagation over time is used for proper model selection and tracking during model-based background subtraction. Bayes propagation is attractive in our application as it allows to deal with uncertainties during tracking. We have tested our approach on suitable outdoor video data.

  7. Dynamic modelling and analysis of biochemical networks: mechanism-based models and model-based experiments.

    Science.gov (United States)

    van Riel, Natal A W

    2006-12-01

    Systems biology applies quantitative, mechanistic modelling to study genetic networks, signal transduction pathways and metabolic networks. Mathematical models of biochemical networks can look very different. An important reason is that the purpose and application of a model are essential for the selection of the best mathematical framework. Fundamental aspects of selecting an appropriate modelling framework and a strategy for model building are discussed. Concepts and methods from system and control theory provide a sound basis for the further development of improved and dedicated computational tools for systems biology. Identification of the network components and rate constants that are most critical to the output behaviour of the system is one of the major problems raised in systems biology. Current approaches and methods of parameter sensitivity analysis and parameter estimation are reviewed. It is shown how these methods can be applied in the design of model-based experiments which iteratively yield models that are decreasingly wrong and increasingly gain predictive power.
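    The parameter sensitivity analysis reviewed above is often carried out with finite-difference approximations of normalized sensitivity coefficients, S = (p/y)·(dy/dp), which rank the rate constants most critical to the system output. A minimal sketch on a toy Michaelis-Menten rate; the model and parameter names are illustrative, not taken from the review:

    ```python
    def normalized_sensitivity(model, params, name, rel_step=1e-6):
        """Normalized sensitivity S = (p / y) * dy/dp via central differences."""
        y0 = model(params)
        h = params[name] * rel_step
        up = dict(params); up[name] += h
        down = dict(params); down[name] -= h
        dydp = (model(up) - model(down)) / (2.0 * h)
        return params[name] / y0 * dydp

    def mm_rate(p):
        # toy network output: Michaelis-Menten rate at a fixed substrate level
        return p["vmax"] * p["s"] / (p["km"] + p["s"])
    ```

    At s = km, the rate is exactly proportional to vmax (sensitivity 1.0), while km has a normalized sensitivity of -0.5, so vmax would be ranked as the more critical parameter here.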

  8. PBG based terahertz antenna for aerospace applications

    CERN Document Server

    Choudhury, Balamati; Jha, Rakesh Mohan

    2016-01-01

    This book focuses on high-gain antennas in the terahertz spectrum and their optimization. The terahertz spectrum is an unallocated EM spectrum, which is being explored for a number of applications, especially to meet increasing demands of high data rates for wireless space communications. Space communication systems using the terahertz spectrum can resolve the problems of limited bandwidth of present wireless communications without radio-frequency interference. This book describes design of such high-gain antennas and their performance enhancement using photonic band gap (PBG) substrates. Further, optimization of antenna models using evolutionary algorithm based computational engine has been included. The optimized high-performance compact antenna may be used for various wireless applications, such as inter-orbital communications and on-vehicle satellite communications.

  9. Java Applications Development Based on Component and Metacomponent Approach

    OpenAIRE

    Danijel Radošević; Mario Konecki; Tihomir Orehovački

    2008-01-01

    Component based modeling offers a new and improved approach to the design, construction, implementation and evolution of software applications. This kind of software application development is usually represented by an appropriate component model/diagram. UML, for example, offers the component diagram for representation of this kind of model. On the other hand, metacomponent usage offers some new features which could hardly be achieved by using generic components. Firstly, implementation of ...

  10. Multiagent-Based Model For ESCM

    OpenAIRE

    Delia MARINCAS

    2011-01-01

    Web based applications for Supply Chain Management (SCM) are now a necessity for every company in order to meet increasing customer demands, to face global competition and to make a profit. A multiagent-based approach is appropriate for eSCM because it shows many of the characteristics an SCM system should have. For this reason, we have proposed a multiagent-based eSCM model which configures a virtual SC and automates the SC activities: selling, purchasing, manufacturing, planning, inventory,...

  11. Signal reconstruction model based on SVD and its application

    Institute of Scientific and Technical Information of China (English)

    何希平; 杨劲; 刘波

    2015-01-01

    Against the background of low-rank approximation and noise elimination for big data, and aiming at the needs of signal approximation and reconstruction, a linear low-rank approximation model based on the singular value decomposition (SVD) of a signal is presented. To handle one-dimensional signals with the SVD approximation model, three structural matrix-construction models are introduced from the perspective of structural similarity, and their structural characteristics are analyzed. Then, the general signal reconstruction algorithm and its application to signal denoising are described. Finally, contrast experiments between the SVD low-rank approximation method and the wavelet threshold shrinkage denoising method were carried out; the visual effect and statistical characteristics such as the mean-squared error and signal-to-noise ratio verify the practicability of the model.
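    The abstract does not spell out the three structural matrix constructions; the Hankel (trajectory) matrix used below is one common way to turn a 1-D signal into a low-rank matrix for SVD processing. A pure-Python sketch of rank-1 SVD reconstruction, using power iteration in place of a full SVD; all names are illustrative, not the authors' algorithm:

    ```python
    import math
    import random

    def hankel(signal, rows):
        # rows x cols Hankel (trajectory) matrix built from a 1-D signal
        cols = len(signal) - rows + 1
        return [[signal[i + j] for j in range(cols)] for i in range(rows)]

    def matvec(A, v):
        return [sum(a * b for a, b in zip(row, v)) for row in A]

    def transpose(A):
        return [list(col) for col in zip(*A)]

    def leading_triplet(A, iters=200):
        # dominant singular triplet (u, s, v) via power iteration on A^T A
        At = transpose(A)
        v = [random.random() + 0.1 for _ in range(len(A[0]))]
        for _ in range(iters):
            w = matvec(At, matvec(A, v))
            norm = math.sqrt(sum(x * x for x in w))
            v = [x / norm for x in w]
        Av = matvec(A, v)
        s = math.sqrt(sum(x * x for x in Av))
        u = [x / s for x in Av]
        return u, s, v

    def rank1_reconstruct(signal, rows=4):
        # truncate to rank 1, then average anti-diagonals back into a 1-D signal
        A = hankel(signal, rows)
        u, s, v = leading_triplet(A)
        n = len(signal)
        acc, cnt = [0.0] * n, [0] * n
        for i in range(rows):
            for j in range(len(v)):
                acc[i + j] += s * u[i] * v[j]
                cnt[i + j] += 1
        return [a / c for a, c in zip(acc, cnt)]
    ```

    Keeping more singular triplets (and subtracting each rank-1 term before iterating again) would give the general low-rank approximation; thresholding the singular values is what turns this into a denoiser.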

  12. Multi-variable grey model (MGM (1,n,q)) based on genetic algorithm and its application in urban water consumption

    Institute of Scientific and Technical Information of China (English)

    Yan Han; Shi Guoxu

    2007-01-01

    Urban water consumption has grey characteristics because it is influenced by the economy, population, standard of living and so on. The multi-variable grey model (MGM(1,n)), as an expansion and complement of the GM(1,1) model, reveals the relationship of restriction and stimulation among variables, and the genetic algorithm has global-optimization and parallel-search characteristics. In this paper, the parameter q of the MGM(1,n) model was optimized, and a multi-variable grey model (MGM(1,n,q)) was built by using the genetic algorithm. The model was validated by examining the urban water consumption from 1990 to 2003 in Dalian City. The result indicated that the multi-variable grey model (MGM(1,n,q)) based on the genetic algorithm was better than the MGM(1,n) model, and the MGM(1,n) model was better than the GM(1,1) model.
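    The MGM(1,n,q) model itself is not specified in the abstract, but its single-variable ancestor GM(1,1) can be sketched: accumulate the series (1-AGO), fit the grey equation x0[k] + a·z1[k] = b by least squares, and predict by differencing the whitened exponential solution. A generic textbook formulation, not the authors' code:

    ```python
    import math

    def gm11_fit(x0):
        """Least-squares fit of GM(1,1): x0[k] + a*z1[k] = b. Returns (a, b)."""
        n = len(x0)
        x1 = [sum(x0[:k + 1]) for k in range(n)]                 # 1-AGO series
        z1 = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]    # background values
        y = x0[1:]
        # normal equations for design rows (-z, 1) against targets y
        s11 = sum(z * z for z in z1)
        s12 = -sum(z1)
        s22 = float(len(z1))
        r1 = -sum(z * t for z, t in zip(z1, y))
        r2 = sum(y)
        det = s11 * s22 - s12 * s12
        a = (s22 * r1 - s12 * r2) / det
        b = (-s12 * r1 + s11 * r2) / det
        return a, b

    def gm11_predict(x0, a, b, k):
        """k-th value of the fitted series (k = 0 is the first observation)."""
        if k == 0:
            return x0[0]
        c = x0[0] - b / a
        return c * (math.exp(-a * k) - math.exp(-a * (k - 1)))
    ```

    MGM(1,n) generalizes this to n coupled series, and MGM(1,n,q) additionally tunes the background weight (here fixed at 0.5) with the genetic algorithm.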

  13. Structural Equation Modeling with Mplus Basic Concepts, Applications, and Programming

    CERN Document Server

    Byrne, Barbara M

    2011-01-01

    Modeled after Barbara Byrne's other best-selling structural equation modeling (SEM) books, this practical guide reviews the basic concepts and applications of SEM using Mplus Versions 5 & 6. The author reviews SEM applications based on actual data taken from her own research. Using non-mathematical language, it is written for the novice SEM user. With each application chapter, the author "walks" the reader through all steps involved in testing the SEM model including: an explanation of the issues addressed illustrated and annotated testing of the hypothesized and post hoc models expl

  14. Model Construct Based Enterprise Model Architecture and Its Modeling Approach

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    In order to support enterprise integration, a model construct based enterprise model architecture and its modeling approach are studied in this paper. First, the structural makeup and internal relationships of the enterprise model architecture are discussed. Then, the concept of a reusable model construct (MC), which belongs to the control view and can help to derive other views, is proposed. The modeling approach based on model constructs consists of three steps: reference model architecture synthesis, enterprise model customization, and system design and implementation. Following the MC based modeling approach, a case study with the background of one-kind-product machinery manufacturing enterprises is illustrated. It is shown that the proposed model construct based enterprise model architecture and modeling approach are practical and efficient.

  15. Application of stakeholder-based and modelling approaches for supporting robust adaptation decision making under future climatic uncertainty and changing urban-agricultural water demand

    Science.gov (United States)

    Bhave, Ajay; Dessai, Suraje; Conway, Declan; Stainforth, David

    2016-04-01

    Deep uncertainty in future climate change and socio-economic conditions necessitates the use of assess-risk-of-policy approaches over predict-then-act approaches for adaptation decision making. Robust Decision Making (RDM) approaches embody this principle and help evaluate the ability of adaptation options to satisfy stakeholder preferences under wide-ranging future conditions. This study involves the simultaneous application of two RDM approaches, qualitative and quantitative, in the Cauvery River Basin in Karnataka (population ~23 million), India. The study aims to (a) determine robust water resources adaptation options for the 2030s and 2050s and (b) compare the usefulness of a qualitative stakeholder-driven approach with a quantitative modelling approach. For developing a large set of future scenarios, a combination of climatic and socio-economic narratives was used. Climatic narratives were developed using structured expert elicitation with a group of experts on the Indian Summer Monsoon. Socio-economic narratives were developed to reflect potential future urban and agricultural water demand. In the qualitative RDM approach, a stakeholder workshop helped elicit key vulnerabilities, water resources adaptation options and performance criteria for evaluating options. During a second workshop, stakeholders discussed and evaluated adaptation options against the performance criteria for a large number of scenarios of climatic and socio-economic change in the basin. In the quantitative RDM approach, a Water Evaluation And Planning (WEAP) model was forced by precipitation and evapotranspiration data coherent with the climatic narratives, together with water demand data based on the socio-economic narratives. We find that compared to business-as-usual conditions, options addressing urban water demand satisfy performance criteria across scenarios and provide co-benefits like energy savings and reduction in groundwater depletion, while options reducing

  16. HMM-based Trust Model

    DEFF Research Database (Denmark)

    ElSalamouny, Ehab; Nielsen, Mogens; Sassone, Vladimiro

    2010-01-01

    … with their dynamic behaviour. Using Hidden Markov Models (HMMs) for both modelling and approximating the behaviours of principals, we introduce the HMM-based trust model as a new approach to evaluating trust in systems exhibiting dynamic behaviour. This model avoids the fixed behaviour assumption which is considered the major limitation of the existing Beta trust model. We show the consistency of the HMM-based trust model and contrast it against the well known Beta trust model with the decay principle in terms of estimation precision.
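    The Beta trust model with the decay principle, against which the HMM-based model is contrasted, can be sketched as follows; the decay factor and the uniform prior are illustrative choices, not the paper's parameters:

    ```python
    def beta_trust(outcomes, decay=0.9):
        """Beta reputation with the decay principle: old evidence fades geometrically.

        outcomes: sequence of booleans, True meaning a satisfactory interaction.
        Returns the expected value of Beta(s + 1, f + 1), i.e. the trust estimate.
        """
        s = f = 0.0
        for ok in outcomes:
            s *= decay   # decay discounts earlier observations,
            f *= decay   # partially accommodating behaviour change
            if ok:
                s += 1.0
            else:
                f += 1.0
        return (s + 1.0) / (s + f + 2.0)
    ```

    With no evidence the estimate is 0.5; a run of successes pushes it towards 1, and a subsequent run of failures pulls it back down faster than without decay. The limitation the abstract points at is that a single decayed Beta cannot represent a principal that genuinely switches between behaviour modes, which is what the HMM formulation addresses.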

  17. On the applicability of unimodal and bimodal van Genuchten-Mualem based models to peat and other organic soils under evaporation conditions

    Science.gov (United States)

    Dettmann, Ullrich; Bechtold, Michel; Frahm, Enrico; Tiemeyer, Bärbel

    2014-07-01

    Soil moisture is one of the key parameters controlling biogeochemical processes in peat and other organic soils. To understand and accurately model soil moisture dynamics and peatland hydrological functioning in general, knowledge about soil hydraulic properties is crucial. As peat differs in several aspects from mineral soils, the applicability of standard hydraulic functions (e.g. the van Genuchten-Mualem model) developed for mineral soils to peat soil moisture dynamics might be questionable. In this study, the hydraulic properties of five types of peat and other organic soils from different German peatlands have been investigated by laboratory evaporation experiments. Soil hydraulic parameters of the commonly applied van Genuchten-Mualem model and the bimodal model by Durner (1994) were inversely estimated using HYDRUS-1D and global optimization. The objective function included measured pressure heads and cumulative evaporation. The performance of eight model set-ups differing in the degree of complexity and the choice of fitting parameters was evaluated. Depending on the model set-up, botanical origin and degree of peat decomposition, the quality of the model results differed strongly. We show that the fitted ‘tortuosity’ parameter τ of the van Genuchten-Mualem model can deviate greatly from the default value of 0.5 that is frequently applied to mineral soils. Results indicate a rather small decrease of the hydraulic conductivity with increasing suction compared to mineral soils. Optimizing τ therefore strongly reduced the model error at dry conditions when high pressure head gradients occurred. As strongly negative pressure heads rarely occur in the investigated peatlands, we also reduced the range of pressure heads in the inversion to a ‘wet range’ from 0 to -200 cm. For the ‘wet range’, model performance was highly dependent on the inclusion of macropores. Here, fitting only the macropore fraction of the bimodal model as immediately drainable

  18. Model-based Software Engineering

    DEFF Research Database (Denmark)

    2010-01-01

    The vision of model-based software engineering is to make models the main focus of software development and to automatically generate software from these models. Part of that idea already works today. But there are still difficulties when it comes to behaviour. Actually, there is no lack in models...

  19. Element-Based Computational Model

    Directory of Open Access Journals (Sweden)

    Conrad Mueller

    2012-02-01

    Full Text Available A variation on the data-flow model is proposed for developing parallel architectures. While the model is a data driven model, it has significant differences from the data-flow model. The proposed model has an evaluation cycle of processing elements (encapsulated data) that is similar to the instruction cycle of the von Neumann model. The elements contain the information required to process them. The model is inherently parallel. An emulation of the model has been implemented. The objective of this paper is to motivate support for taking the research further. Using matrix multiplication as a case study, the element/data-flow based model is compared with the instruction-based model. This is done using complexity analysis followed by empirical testing to verify this analysis. The positive results are given as motivation for the research to be taken to the next stage - that is, implementing the model using FPGAs.

  20. Software Testing Method Based on Model Comparison

    Institute of Scientific and Technical Information of China (English)

    XIE Xiao-dong; LU Yan-sheng; MAO Cheng-yin

    2008-01-01

    A model comparison based software testing method (MCST) is proposed. In this method, the requirements and programs of the software under test are transformed into the same form and described by the same model description language (MDL). Then, the requirements are transformed into a specification model and the programs into an implementation model. Thus, the elements and structures of the two models are compared, and the differences between them are obtained. Based on the differences, a test suite is generated. Different MDLs can be chosen for the software under test. The usage of two classical MDLs in MCST, the equivalence classes model and the extended finite state machine (EFSM) model, is described with example applications. The results show that the test suites generated by MCST are more efficient and smaller than those of some other testing methods, such as the path-coverage testing method, the object state diagram testing method, etc.

  1. Application model based on domain-driven design

    Institute of Scientific and Technical Information of China (English)

    李引; 袁峰

    2013-01-01

    Domain-Driven Design (DDD) was introduced by Evans to tackle core complexity in software systems, and its effectiveness has been proved in practice. However, it lacks fine-grained definitions of some details and places high demands on the quality of designers. Based on the analysis of and practice with a large number of business systems, the common operations of business objects are abstracted, and an application system model based on DDD is proposed to guide system design and development. A development framework was built to encapsulate the common properties and operations of business systems. Its application in actual projects has proved that this framework can assist system design and development, improve software development efficiency and reduce defects.

  2. Model validation, science and application

    NARCIS (Netherlands)

    Builtjes, P.J.H.; Flossmann, A.

    1998-01-01

    In recent years there has been growing interest in establishing a proper validation of atmospheric chemistry-transport (ATC) models. Model validation deals with the comparison of model results with experimental data, and in this way addresses both model uncertainty and uncertainty in, and adequac

  3. Application of discriminant analysis-based model for prediction of risk of low back disorders due to workplace design in industrial jobs.

    Science.gov (United States)

    Ganga, G M D; Esposto, K F; Braatz, D

    2012-01-01

    The occupational exposure limits of different risk factors for development of low back disorders (LBDs) have not yet been established. One of the main problems in setting such guidelines is the limited understanding of how different risk factors for LBDs interact in causing injury, since the nature and mechanism of these disorders are relatively unknown phenomena. Industrial ergonomists' role becomes further complicated because the potential risk factors that may contribute towards the onset of LBDs interact in a complex manner, which makes it difficult to discriminate in detail among the jobs that place workers at high or low risk of LBDs. The purpose of this paper was to develop a comparative study between predictions based on the neural network-based model proposed by Zurada, Karwowski & Marras (1997) and a linear discriminant analysis model, for making predictions about industrial jobs according to their potential risk of low back disorders due to workplace design. The results obtained through applying the discriminant analysis-based model proved that it is as effective as the neural network-based model. Moreover, the discriminant analysis-based model proved to be more advantageous regarding cost and time savings for future data gathering.
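    The abstract does not give the discriminant function used; a generic two-feature Fisher linear discriminant, the kind of model such an analysis builds on, can be sketched in plain Python. The feature meanings, sample data and midpoint decision rule are illustrative assumptions:

    ```python
    def fisher_lda(low_risk, high_risk):
        """Two-feature Fisher discriminant: returns (weights, threshold).

        Each sample is an (x, y) pair of job attributes; classify a job as
        high risk of LBDs when w[0]*x + w[1]*y > threshold.
        """
        def mean(points):
            n = float(len(points))
            return [sum(p[0] for p in points) / n, sum(p[1] for p in points) / n]

        def scatter(points, m):
            s = [[0.0, 0.0], [0.0, 0.0]]
            for p in points:
                d = [p[0] - m[0], p[1] - m[1]]
                for i in range(2):
                    for j in range(2):
                        s[i][j] += d[i] * d[j]
            return s

        m0, m1 = mean(low_risk), mean(high_risk)
        s0, s1 = scatter(low_risk, m0), scatter(high_risk, m1)
        sw = [[s0[i][j] + s1[i][j] for j in range(2)] for i in range(2)]  # pooled within-class scatter
        det = sw[0][0] * sw[1][1] - sw[0][1] * sw[1][0]
        dm = [m1[0] - m0[0], m1[1] - m0[1]]
        w = [(sw[1][1] * dm[0] - sw[0][1] * dm[1]) / det,
             (-sw[1][0] * dm[0] + sw[0][0] * dm[1]) / det]  # w = Sw^-1 (m1 - m0)
        # threshold midway between the projected class means
        threshold = 0.5 * (w[0] * (m0[0] + m1[0]) + w[1] * (m0[1] + m1[1]))
        return w, threshold

    def is_high_risk(job, w, threshold):
        return w[0] * job[0] + w[1] * job[1] > threshold
    ```

    Real applications, as in the paper, would use the known risk factors (e.g. lifting frequency, trunk motion measures) as features and validate against jobs of known outcome.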

  4. Atom-Role-Based Access Control Model

    Science.gov (United States)

    Cai, Weihong; Huang, Richeng; Hou, Xiaoli; Wei, Gang; Xiao, Shui; Chen, Yindong

    The role-based access control (RBAC) model has been widely recognized as an efficient access control model and has become a hot research topic in information security. However, in large-scale enterprise application environments, the traditional RBAC model based on role hierarchy has the following deficiencies. First, it cannot effectively reflect role relationships in complicated cases, which does not accord with practical applications. Second, a senior role unconditionally inherits all permissions of its junior roles, so a user assigned a supervisor role may accumulate all permissions; this easily causes the abuse of permissions and violates the least-privilege principle, one of the main security principles. To deal with these problems, after analyzing permission types and role relationships, we propose the concept of the atom role and build an atom-role-based access control model, called ATRBAC, by dividing the permission set of each regular role according to inheritance-path relationships. Application-specific analysis shows that this model can well meet access control requirements.
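
The least-privilege point can be illustrated with a minimal sketch (our own, not the paper's formal ATRBAC definition): users are granted small atom roles directly, so a senior user does not automatically inherit every junior permission:

```python
class AtomRole:
    """An atom role bundles a small, indivisible permission set."""
    def __init__(self, name, permissions):
        self.name = name
        self.permissions = frozenset(permissions)

class User:
    def __init__(self, name):
        self.name = name
        self.atom_roles = set()

    def grant(self, atom_role):
        self.atom_roles.add(atom_role)

    def can(self, permission):
        return any(permission in r.permissions for r in self.atom_roles)

# Instead of a senior role inheriting *all* junior permissions,
# each user is assigned only the atom roles the job requires,
# which keeps the least-privilege principle intact.
read_orders  = AtomRole("read_orders",  {"order:read"})
write_orders = AtomRole("write_orders", {"order:create", "order:update"})
approve      = AtomRole("approve",      {"order:approve"})

clerk = User("clerk")
clerk.grant(read_orders)
clerk.grant(write_orders)

supervisor = User("supervisor")
supervisor.grant(read_orders)
supervisor.grant(approve)   # no blanket inheritance of write_orders
```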

  5. Clinical applications of plasma based electrosurgical systems

    Science.gov (United States)

    Woloszko, Jean; Endler, Ashley; Ryan, Thomas P.; Stalder, Kenneth R.

    2013-02-01

    Over the past 18 years, several electrosurgical systems generating a low temperature plasma in an aqueous conductive solution have been commercialized for various clinical applications and have been used in over 10 million patients to date. The most popular utilizations are in arthroscopic surgery, otorhinolaryngology surgery, spine and neurosurgery, urology and wound care. These devices can be configured to bring saline to the tip and to have concomitant aspiration to remove by-products and excess fluid. By tuning the electrode geometry, waveform and fluid dynamic at the tip of the devices, tissue resection and thermal effects can be adjusted individually. This allows one to design products that can operate as precise tissue dissectors for treatment of articular cartilage or debridement of chronic wounds, as well as global tissue debulking devices providing sufficient concomitant hemostasis for applications like tonsillectomies. Effects of these plasma based electrosurgical devices on cellular biology, healing response and nociceptive receptors has also been studied in various models. This talk will include a review of the clinical applications, with product descriptions, results and introductory review of some of the research on the biological effects of these devices.

  6. Java Applications Development Based on Component and Metacomponent Approach

    Directory of Open Access Journals (Sweden)

    Danijel Radošević

    2008-12-01

    Component-based modeling offers a new and improved approach to the design, construction, implementation and evolution of software applications. This kind of development is usually represented by an appropriate component model/diagram; UML, for example, offers the component diagram for representing such models. Metacomponents, on the other hand, offer features that can hardly be achieved using generic components. First, they allow the implementation of program properties that are dispersed across different classes and other program units, i.e. aspects. This implies an automated process of assembling components and interconnecting them to build applications, according to the model offered in this paper, which also supports generic components. The benefits of this hybrid process are higher flexibility achieved through the automated connection process, optimization through selective feature inclusion, and easier application maintenance and development. In this paper we present an approach to application development based on a hybrid component/metacomponent model. The component model is given by UML diagrams, while the metacomponent model is given by a generator scripting model. We illustrate this hybrid approach on an example of Java web application development.

  7. Multi-dimensional rheology-based two-phase model for sediment transport and applications to sheet flow and pipeline scour

    Science.gov (United States)

    Lee, Cheng-Hsien; Low, Ying Min; Chiew, Yee-Meng

    2016-05-01

    Sediment transport is fundamentally a two-phase phenomenon involving fluid and sediments; however, many existing numerical models are one-phase approaches, which are unable to capture the complex fluid-particle and inter-particle interactions. In the last decade, two-phase models have gained traction; however, there are still many limitations in these models. For example, several existing two-phase models are confined to one-dimensional problems; in addition, the existing two-dimensional models simulate only the region outside the sand bed. This paper develops a new three-dimensional two-phase model for simulating sediment transport in the sheet flow condition, incorporating recently published rheological characteristics of sediments. The enduring-contact, inertial, and fluid viscosity effects are considered in determining sediment pressure and stresses, enabling the model to be applicable to a wide range of particle Reynolds number. A k - ɛ turbulence model is adopted to compute the Reynolds stresses. In addition, a novel numerical scheme is proposed, thus avoiding numerical instability caused by high sediment concentration and allowing the sediment dynamics to be computed both within and outside the sand bed. The present model is applied to two classical problems, namely, sheet flow and scour under a pipeline with favorable results. For sheet flow, the computed velocity is consistent with measured data reported in the literature. For pipeline scour, the computed scour rate beneath the pipeline agrees with previous experimental observations. However, the present model is unable to capture vortex shedding; consequently, the sediment deposition behind the pipeline is overestimated. Sensitivity analyses reveal that model parameters associated with turbulence have strong influence on the computed results.

  8. Applications and extensions of degradation modeling

    Energy Technology Data Exchange (ETDEWEB)

    Hsu, F.; Subudhi, M.; Samanta, P.K. (Brookhaven National Lab., Upton, NY (United States)); Vesely, W.E. (Science Applications International Corp., Columbus, OH (United States))

    1991-01-01

    Component degradation modeling, developed to understand the aging process, can have many applications with potential advantages. Previous work focused on the basic concepts and mathematical development of a simple degradation model. Using this simple model, the times of degradation and failure occurrences were analyzed for standby components to detect indications of aging and to infer the effectiveness of maintenance in preventing age-related degradations from turning into failures. Degradation modeling approaches can have broader applications in aging studies, and in this paper we discuss some of the extensions and applications of degradation modeling. These cover two aspects: (1) application to a continuously operating component, and (2) extension of the approach to analyze the degradation-failure rate relationship. The application of the modeling approach to a continuously operating component (namely, air compressors) shows the usefulness of this approach in studying aging effects and the role of maintenance in this type of component. In this case, aging effects in air compressors are demonstrated by the increase in both the degradation and failure rates, and the faster increase of the failure rate compared to the degradation rate shows the ineffectiveness of the existing maintenance practices. The degradation-failure rate relationship was analyzed using data from residual heat removal system pumps. A simple linear model with a time lag between these two parameters was studied; the application showed a time lag of 2 years for degradations to affect failure occurrences. 2 refs.
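
The lagged degradation-failure analysis can be sketched as a scan over candidate lags, picking the one that maximizes the correlation between the degradation-rate series and the lag-shifted failure-rate series (the yearly data below are synthetic, not the pump data from the study):

```python
def best_lag(degradation, failures, max_lag):
    """Estimate the lag (in periods) at which the degradation-rate
    series best predicts the failure-rate series, by scanning the
    correlation of failures[t] with degradation[t - lag]."""
    def corr(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        vx = sum((x - mx) ** 2 for x in xs)
        vy = sum((y - my) ** 2 for y in ys)
        return cov / (vx * vy) ** 0.5 if vx and vy else 0.0
    scores = {}
    for lag in range(max_lag + 1):
        xs = degradation[:len(degradation) - lag]
        ys = failures[lag:]
        scores[lag] = corr(xs, ys)
    return max(scores, key=scores.get), scores

# Synthetic yearly rates where failure peaks echo degradation
# peaks two years later.
deg  = [1.0, 3.0, 1.2, 1.1, 2.8, 1.3, 1.0, 2.9]
fail = [0.1, 0.1, 0.2, 1.5, 0.3, 0.2, 1.4, 0.3]
lag, scores = best_lag(deg, fail, max_lag=3)
```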

  10. APPLICATION OF FRF ESTIMATOR BASED ON ERRORS-IN-VARIABLES MODEL IN MULTI-INPUT MULTI-OUTPUT VIBRATION CONTROL SYSTEM

    Institute of Scientific and Technical Information of China (English)

    GUAN Guangfeng; CONG Dacheng; HAN Junwei; LI Hongren

    2007-01-01

    The FRF estimator based on the errors-in-variables (EV) model of a multi-input multi-output (MIMO) system is presented to reduce the bias error of the FRF H1 estimator. The H1 estimator is influenced by noise in the inputs of the system and produces an under-estimate of the true FRF. The estimator based on the EV model takes into account the errors in both the inputs and outputs of the system and leads to more accurate FRF estimation. The EV-based estimator is applied to waveform replication on a 6-DOF (degree-of-freedom) hydraulic vibration table. The result shows that it improves the control precision of the MIMO vibration control system.
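
The bias mechanism generalizes the familiar least-squares attenuation problem. The scalar sketch below (a simplification of the frequency-domain MIMO case, with made-up noise levels) shows an H1-style least-squares gain biased low by input noise, while a total-least-squares (errors-in-variables) estimate that treats input and output noise symmetrically recovers the true gain:

```python
import random

def ls_gain(x, y):
    """Least-squares gain (analogous to the H1 estimator): biased
    low when the measured input x contains noise."""
    sxx = sum(v * v for v in x)
    sxy = sum(a * b for a, b in zip(x, y))
    return sxy / sxx

def tls_gain(x, y):
    """Errors-in-variables (total least squares) gain through the
    origin: treats noise on input and output symmetrically,
    consistent here because both noise variances are equal."""
    sxx = sum(v * v for v in x)
    syy = sum(v * v for v in y)
    sxy = sum(a * b for a, b in zip(x, y))
    return (syy - sxx + ((syy - sxx) ** 2 + 4 * sxy ** 2) ** 0.5) / (2 * sxy)

random.seed(0)
true_gain = 2.0
u = [random.gauss(0.0, 1.0) for _ in range(20000)]          # true input
x = [v + random.gauss(0.0, 0.5) for v in u]                 # noisy input
y = [true_gain * v + random.gauss(0.0, 0.5) for v in u]     # noisy output

h_ls, h_tls = ls_gain(x, y), tls_gain(x, y)
```

With input-noise variance 0.25 on a unit-variance input, the least-squares gain is attenuated toward 2.0/1.25 = 1.6, while the EV estimate stays near 2.0.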

  11. Agent-based modeling and network dynamics

    CERN Document Server

    Namatame, Akira

    2016-01-01

    The book integrates agent-based modeling and network science. It is divided into three parts, namely, foundations, primary dynamics on and of social networks, and applications. The book begins with the network origin of agent-based models, known as cellular automata, and introduces a number of classic models, such as Schelling's segregation model and Axelrod's spatial game. The essence of the foundation part is the network-based agent-based models in which agents follow network-based decision rules. Under the influence of the substantial progress in network science in the late 1990s, these models have been extended from using lattices to using small-world networks, scale-free networks, etc. The book also shows that modern network science, mainly driven by game theorists and sociophysicists, has inspired agent-based social scientists to develop alternative formation algorithms, known as agent-based social networks. The book reviews a number of pioneering and representative models in this family. Upon the gi...

  12. Performance-degradation model for Li{sub 4}Ti{sub 5}O{sub 12}-based battery cells used in wind power applications

    Energy Technology Data Exchange (ETDEWEB)

    Stroe, D.I.; Swierczynski, M.; Stan, A.I.; Teodorescu, R.; Andreasen, S.J. [Aalborg Univ. (Denmark). Dept. of Energy Technology

    2012-07-01

    Energy storage systems based on Lithium-ion batteries have the potential to mitigate the negative impact of wind power grid integration on the power system stability, which is caused by the characteristics of the wind. This paper presents a performance model for a Li{sub 4}Ti{sub 5}O{sub 12}/LiMO{sub 2} battery cell. For developing the performance model an EIS-based electrical modelling approach was followed. The obtained model is able to predict with high accuracy charge and discharge voltage profiles for different ages of the battery cell and for different charging/discharging current rates. Moreover, the ageing behaviour of the battery cell was analysed for the case of accelerated cycling ageing with a certain mission profile.
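
EIS-based performance models are commonly reduced to equivalent-circuit form. The sketch below simulates a first-order Thevenin cell model (an assumed structure with made-up parameters, not the paper's fitted LTO model): terminal voltage is the open-circuit voltage minus the ohmic drop and one RC-branch polarization:

```python
import math

def simulate_thevenin(i_load, dt, ocv, r0, r1, c1):
    """First-order Thevenin equivalent-circuit cell model: terminal
    voltage = OCV - i*R0 - v1, where v1 is the RC-branch voltage.
    Discharge current is positive; OCV is held constant for brevity."""
    v1 = 0.0
    alpha = math.exp(-dt / (r1 * c1))     # RC-branch decay per step
    terminal = []
    for i in i_load:
        v1 = alpha * v1 + r1 * (1.0 - alpha) * i
        terminal.append(ocv - i * r0 - v1)
    return terminal

# Illustrative parameters (not fitted to any real LTO cell):
# 1 minute of 10 A discharge followed by 1 minute of rest.
current = [10.0] * 60 + [0.0] * 60
v = simulate_thevenin(current, dt=1.0, ocv=2.4,
                      r0=0.005, r1=0.003, c1=2000.0)
```

During discharge the voltage settles toward OCV - I*(R0 + R1); at rest it relaxes back toward OCV as the RC branch discharges, reproducing the qualitative shape of measured charge/discharge profiles.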

  13. Geometric Modeling Application Interface Program

    Science.gov (United States)

    1990-11-01

    [No abstract available; the record contains only report front-matter fragments. It cites the IDEF-Extended (IDEF1X) manual of the Integrated Information Support System (IISS), ICAM Project 6201, Contract F33615-80-C-5155, December 1985, and do Carmo's Differential Geometry of Curves and Surfaces (Prentice-Hall, 1976), and lists the IDEF family of modeling methods: IDEF0 (function modeling), IDEF1 (information modeling), IDEF1X (extended information modeling), IDEF2 (dynamics modeling), and IDSS (Integrated Decision Support System).]

  14. The Application Model of Moving Objects in Cargo Delivery System

    Institute of Scientific and Technical Information of China (English)

    ZHANG Feng-li; ZHOU Ming-tian; XU Bo

    2004-01-01

    The development of spatio-temporal database systems is primarily motivated by applications that track and present mobile objects. In this paper, solutions for establishing a moving-object database based on a GPS/GIS environment are presented, a data model of moving objects is given by using temporal logic to extend the query language, and finally the application model in a cargo delivery system is shown.

  15. Model-based consensus

    NARCIS (Netherlands)

    Boumans, Marcel

    2014-01-01

    The aim of the rational-consensus method is to produce “rational consensus”, that is, “mathematical aggregation”, by weighing the performance of each expert on the basis of his or her knowledge and ability to judge relevant uncertainties. The measurement of the performance of the experts is based on

  17. Towards an MDA-based development methodology for distributed applications

    NARCIS (Netherlands)

    Gavras, A.; Belaunde, M.; Ferreira Pires, L.; Andrade Almeida, J.P.; van Sinderen, M.J.; Ferreira Pires, L.

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  18. Markov chains models, algorithms and applications

    CERN Document Server

    Ching, Wai-Ki; Ng, Michael K; Siu, Tak-Kuen

    2013-01-01

    This new edition of Markov Chains: Models, Algorithms and Applications has been completely reformatted as a text, complete with end-of-chapter exercises, a new focus on management science, new applications of the models, and new examples with applications in financial risk management and modeling of financial data.This book consists of eight chapters.  Chapter 1 gives a brief introduction to the classical theory on both discrete and continuous time Markov chains. The relationship between Markov chains of finite states and matrix theory will also be highlighted. Some classical iterative methods
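
For instance, the long-run behavior of a finite discrete-time chain can be approximated by repeatedly applying the row-stochastic transition matrix to an initial distribution (a minimal sketch, not an example taken from the book):

```python
def stationary(P, iters=200):
    """Approximate the stationary distribution of a finite
    discrete-time Markov chain by repeated multiplication of the
    distribution with the row-stochastic transition matrix P."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Two-state weather chain: state 0 = sunny, state 1 = rainy.
P = [[0.9, 0.1],
     [0.5, 0.5]]
pi = stationary(P)
```

For this chain the balance equation 0.1*pi[0] = 0.5*pi[1] gives the exact stationary distribution (5/6, 1/6), which the iteration converges to quickly.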

  19. Advances in Application of Models in Soil Quality Evaluation

    Institute of Scientific and Technical Information of China (English)

    SI Zhi-guo; WANG Ji-jie; YU Yuan-chun; LIANG Guan-feng; CHEN Chang-ren; SHU Hong-lan

    2012-01-01

    Soil quality is a comprehensive reflection of soil properties. Since the soil quality concept was put forward in the 1970s, the quality of different types of soils in different regions has been evaluated through a variety of evaluation methods, but universal soil quality evaluation models and methods are still lacking. In this paper, the applications and prospects of the grey relevancy comprehensive evaluation model, the attribute hierarchical model, the fuzzy comprehensive evaluation model, the matter-element model, the RAGA-based PPC/PPE model and GIS models in soil quality evaluation are reviewed.

  20. Surrogate Model Application to the Identification of Optimal Groundwater Exploitation Scheme Based on Regression Kriging Method—A Case Study of Western Jilin Province

    Directory of Open Access Journals (Sweden)

    Yongkai An

    2015-07-01

    This paper introduces a surrogate model to identify an optimal exploitation scheme, with western Jilin Province selected as the study area. A numerical simulation model of groundwater flow was established first, and four exploitation wells were set in Tongyu County and Qian Gorlos County respectively so as to supply water to Daan County. Second, the Latin Hypercube Sampling (LHS) method was used to collect data in the feasible region for the input variables. A surrogate of the numerical groundwater-flow simulation model was developed using the regression kriging method. An optimization model was established to search for an optimal groundwater exploitation scheme, using the minimum average drawdown of the groundwater table and the minimum cost of groundwater exploitation as multi-objective functions. Finally, the surrogate model was invoked by the optimization model in the process of solving the optimization problem. Results show that the relative error and root mean square error of the groundwater table drawdown between the simulation model and the surrogate model for 10 validation samples are both lower than 5%, a high approximation accuracy. A comparison between the surrogate-based simulation optimization model and the conventional simulation optimization model for solving the same optimization problem shows that the former needs only 5.5 hours while the latter needs 25 days. These results indicate that the surrogate model developed in this study can considerably reduce the computational burden of the simulation optimization process while maintaining high computational accuracy, and can thus provide an effective method for identifying an optimal groundwater exploitation scheme quickly and accurately.
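
The LHS step can be sketched directly: each input variable's range is divided into as many equal strata as there are samples, one point is drawn per stratum, and the strata are randomly paired across variables (the bounds below are hypothetical pumping-rate ranges, not values from the study):

```python
import random

def latin_hypercube(n_samples, bounds, seed=42):
    """Latin Hypercube Sampling: each variable's range is split into
    n_samples equal strata and exactly one point is drawn per
    stratum, with strata randomly paired across variables."""
    rng = random.Random(seed)
    dims = len(bounds)
    samples = [[0.0] * dims for _ in range(n_samples)]
    for d, (lo, hi) in enumerate(bounds):
        strata = list(range(n_samples))
        rng.shuffle(strata)                 # random pairing across dims
        for k, s in enumerate(strata):
            u = (s + rng.random()) / n_samples   # point inside stratum s
            samples[k][d] = lo + u * (hi - lo)
    return samples

# e.g. two well pumping rates (m^3/day) feeding a surrogate model.
pts = latin_hypercube(10, bounds=[(0.0, 5000.0), (0.0, 8000.0)])
```

Unlike plain random sampling, every stratum of every variable is covered exactly once, which is why a small LHS design can train an accurate surrogate.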

  2. Application of the Polynomial-Based Least Squares and Total Least Squares Models for the Attenuated Total Reflection Fourier Transform Infrared Spectra of Binary Mixtures of Hydroxyl Compounds.

    Science.gov (United States)

    Shan, Peng; Peng, Silong; Zhao, Yuhui; Tang, Liang

    2016-03-01

    An analysis of binary mixtures of hydroxyl compounds by attenuated total reflection Fourier transform infrared spectroscopy (ATR FT-IR) and classical least squares (CLS) yields large model errors due to the presence of unmodeled components such as H-bonded species. To accommodate these spectral variations, polynomial-based least squares (LSP) and polynomial-based total least squares (TLSP) are proposed to capture the nonlinear absorbance-concentration relationship. LSP assumes that only absorbance noise exists, while TLSP takes both absorbance noise and concentration noise into consideration. In addition, based on different solving strategies, two optimization algorithms (the limited-memory Broyden-Fletcher-Goldfarb-Shanno (LBFGS) algorithm and the Levenberg-Marquardt (LM) algorithm) are combined with TLSP, forming two TLSP variants (termed TLSP-LBFGS and TLSP-LM). The optimum order of each nonlinear model is determined by cross-validation. The four models are compared in two respects: absorbance prediction and concentration prediction. The results for water-ethanol and ethanol-ethyl lactate solutions show that LSP, TLSP-LBFGS, and TLSP-LM obtain, for both absorbance and concentration prediction, smaller root mean square errors of prediction than CLS. Additionally, they greatly enhance the accuracy of the estimated pure-component spectra. However, from the viewpoint of concentration prediction, the Wilcoxon signed rank test shows no statistically significant difference between each nonlinear model and CLS.
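
Under the LSP assumption (noise on absorbance only), fitting a polynomial absorbance-concentration relation reduces to ordinary least squares on the normal equations. A self-contained sketch with synthetic, noise-free data (not the paper's spectra):

```python
def polyfit_ls(x, y, order):
    """Ordinary polynomial least squares: fit y ~ c0 + c1*x + ... + cn*x^n
    by solving the normal equations (X^T X) c = X^T y with Gaussian
    elimination. Assumes noise on y only (the LSP assumption; a TLSP
    variant would also model noise on x)."""
    m = order + 1
    # Build normal-equation matrix A and right-hand side b.
    A = [[sum(xi ** (i + j) for xi in x) for j in range(m)] for i in range(m)]
    b = [sum((xi ** i) * yi for xi, yi in zip(x, y)) for i in range(m)]
    # Gaussian elimination with partial pivoting.
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            f = A[r][col] / A[col][col]
            for c in range(col, m):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back-substitution.
    coef = [0.0] * m
    for r in range(m - 1, -1, -1):
        coef[r] = (b[r] - sum(A[r][c] * coef[c]
                              for c in range(r + 1, m))) / A[r][r]
    return coef

# Synthetic quadratic absorbance-concentration relation.
conc = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5]
absorb = [0.1 + 0.8 * c - 0.05 * c * c for c in conc]
c0, c1, c2 = polyfit_ls(conc, absorb, order=2)
```

With noise-free data the fit recovers the generating coefficients; in practice the polynomial order would be chosen by cross-validation, as in the paper.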

  3. Potential Teachers' Appropriate and Inappropriate Application of Pedagogical Resources in a Model-Based Physics Course: A "Knowledge in Pieces" Perspective on Teacher Learning

    Science.gov (United States)

    Harlow, Danielle B.; Bianchini, Julie A.; Swanson, Lauren H.; Dwyer, Hilary A.

    2013-01-01

    We used a "knowledge in pieces" perspective on teacher learning to document undergraduates' pedagogical resources in a model-based physics course for potential teachers. We defined pedagogical resources as small, discrete ideas about teaching science that are applied appropriately or inappropriately in specific contexts. Neither…

  4. Photocell modelling for thermophotovoltaic applications

    Energy Technology Data Exchange (ETDEWEB)

    Mayor, J.-C.; Durisch, W.; Grob, B.; Panitz, J.-C. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1999-08-01

    The goal of the modelling described here is the extrapolation of the performance characteristics of solar photocells to TPV working conditions. The model accounts for the higher flux of radiation and for the higher temperatures reached in TPV converters. (author) 4 figs., 1 tab., 2 refs.

  5. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    The paper demonstrates that a wide variety of event-based modeling approaches are based on special cases of the same general event concept, and that the general event concept can be used to unify the otherwise unrelated fields of information modeling and process modeling. A set of event-based modeling approaches are analyzed and the results are used to formulate a general event concept that can be used for unifying the seemingly unrelated event concepts. Events are characterized as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The general event concept can be used to guide systems analysis and design and to improve modeling approaches.

  6. Flower solid modeling based on sketches

    Institute of Scientific and Technical Information of China (English)

    Zhan DING; Shu-chang XU; Xiu-zi YE; Yin ZHANG; San-yuan ZHANG

    2008-01-01

    In this paper we propose a method to model flowers of solid shape. Based on the method of Ijiri et al. (2005), we separate individual flower modeling and inflorescence modeling into structure and geometry modeling. We incorporate interactive editing gestures that allow the user to edit structure parameters freely on the structure diagram. Furthermore, we use free-hand sketching techniques that allow users to create and edit 3D geometrical elements freely and easily. The final step is to automatically merge all independent 3D geometrical elements into a single watertight mesh. Our experiments show that this solid modeling approach is promising. Using our approach, novice users can create vivid flower models easily and freely. The generated flower model is watertight, and can find applications in visualization, animation, and gaming, and in toys and decorations if printed on 3D rapid prototyping devices.

  7. Application of a Microstructure-Based ISV Plasticity Damage Model to Study Penetration Mechanics of Metals and Validation through Penetration Study of Aluminum

    Directory of Open Access Journals (Sweden)

    Yangqing Dou

    2017-01-01

    A microstructure-based internal state variable (ISV) plasticity damage model is used for the first time to simulate the penetration mechanics of aluminum and determine its penetration properties. The ISV damage model explains the interplay between physics at different length scales that governs the failure and damage mechanisms of materials by linking their macroscopic failure and damage behavior with their micromechanical performance, such as void nucleation, growth, and coalescence. Within the continuum modeling framework, microstructural features of the materials are represented by a set of ISVs, and rate equations are employed to depict the damage history and evolution of the materials. For experimental calibration of the damage model, compression, tension, and torsion straining conditions are considered to distinguish damage evolution under different stress states. To demonstrate the reliability of the presented ISV model, it is applied to the penetration mechanics of aluminum, and the numerical results are validated by comparison with simulation results from the Johnson-Cook model as well as analytical results calculated from an existing theoretical model.

  8. A nonlinear-elastic constitutive model for soft connective tissue based on a histologic description: Application to female pelvic soft tissue.

    Science.gov (United States)

    Brieu, Mathias; Chantereau, Pierre; Gillibert, Jean; de Landsheere, Laurent; Lecomte, Pauline; Cosson, Michel

    2016-05-01

    To understand the mechanical behavior of soft tissues, two fields of science are essential: biomechanics and histology. Nonetheless, those two fields have not yet been studied together often enough to be unified by a comprehensive model. This study attempts to produce such a model. Biomechanical uniaxial tension tests were performed on vaginal tissues from 7 patients undergoing surgery. In parallel, vaginal tissue from the same patients was histologically assessed to determine the elastic fiber ratio. These observations demonstrated a relationship between the stiffness of the tissue and its elastin content. To extend this study, a mechanical model based on a histologic description was developed to quantitatively correlate the mechanical behavior of vaginal tissue with its elastic fiber content. A satisfactory single-parameter model was developed assuming that the mechanical behavior of collagen and elastin is the same for all patients and that the tissues are composed only of collagen and elastin. This single-parameter model showed good correlation with experimental results. The single-parameter mechanical model described here, based on a histological description, could be very useful in helping to understand and better describe soft tissues with a view to their characterization. The mechanical behavior of a tissue can thus be determined from its elastin content without introducing too many unidentified parameters.
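
A two-constituent model of this kind can be sketched as a rule of mixtures (an illustrative simplification with assumed constituent laws, not the paper's identified single-parameter model): total stress is the elastin-fraction-weighted sum of the elastin and collagen responses at a given strain:

```python
import math

def tissue_stress(strain, elastin_fraction, stress_elastin, stress_collagen):
    """Rule-of-mixtures sketch: the tissue response is the
    elastin-fraction-weighted sum of the two constituent responses,
    assuming the constituent laws are the same for all patients."""
    return (elastin_fraction * stress_elastin(strain)
            + (1.0 - elastin_fraction) * stress_collagen(strain))

# Assumed constituent laws (hypothetical parameters, units of MPa):
# soft linear elastin, exponentially stiffening collagen.
elastin  = lambda e: 0.05 * e
collagen = lambda e: 0.5 * (math.exp(4.0 * e) - 1.0)

# Elastin-rich tissue is more compliant than collagen-dominated tissue.
soft  = tissue_stress(0.2, 0.6, elastin, collagen)
stiff = tissue_stress(0.2, 0.2, elastin, collagen)
```

The single free quantity per patient is the histologically measured elastin fraction, which is the spirit of the paper's single-parameter approach.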

  9. Road Quick Modeling and Application Based on Parameter Design

    Institute of Scientific and Technical Information of China (English)

    王阳生; 何兴富

    2014-01-01

    This paper introduces a method for rapidly integrating and simulating road design schemes in real time on a high-accuracy three-dimensional terrain model. Based on domain knowledge of road design, a space-semantic integrated road information model is designed to achieve rapid integration of road design results, and the 3D road model is constructed rapidly through parametric design techniques and cross-section extrusion modeling. On this basis, roadbed earthwork calculation, land acquisition statistics, and planning management support are carried out. The method provides a technical means for visualizing road schemes, assisting analysis and decision-making, improving design efficiency, and making schemes more scientifically sound.

  10. Model-based testing for embedded systems

    CERN Document Server

    Zander, Justyna; Mosterman, Pieter J

    2011-01-01

    What the experts have to say about Model-Based Testing for Embedded Systems: "This book is exactly what is needed at the exact right time in this fast-growing area. From its beginnings over 10 years ago of deriving tests from UML statecharts, model-based testing has matured into a topic with both breadth and depth. Testing embedded systems is a natural application of MBT, and this book hits the nail exactly on the head. Numerous topics are presented clearly, thoroughly, and concisely in this cutting-edge book. The authors are world-class leading experts in this area and teach us well-used

  11. Research on a Novel Application Protection Model Based on Chinese Wall Policy

    Institute of Scientific and Technical Information of China (English)

    夏少君; 魏玲玲

    2012-01-01

      Aiming at the problem that application programs in the same application class may have conflicts of interest, which threaten application security through information leakage and unauthorized modification of information, this paper proposes a novel application program protection model based on the Chinese Wall security policy. The model explicitly distinguishes the easily confused concepts of "subject" and "user" and assigns sensitivity labels to entities based on a lattice structure. Through seven access control rules, it gives equal weight to protecting the confidentiality and the integrity of information. Using first-order predicate logic, the basic elements and access control rules of the model are formally described, and the application of the model in practice is discussed. The paper contributes to research on the protection of application programs.
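
    The core Chinese Wall conflict check underlying such a model can be sketched as follows: a subject may access an object only if it has not already accessed a competing object in the same conflict-of-interest class. The seven-rule model in the paper is richer; the class names below are assumed example data.

```python
# conflict-of-interest class -> competing datasets it contains (assumed examples)
COI_CLASSES = {"banks": {"bank_a", "bank_b"}, "oil": {"oil_x", "oil_y"}}

class Subject:
    def __init__(self):
        self.history = set()  # datasets this subject has already accessed

    def can_access(self, dataset):
        for members in COI_CLASSES.values():
            if dataset in members:
                # Denied if the history holds a *different* member
                # of the same conflict-of-interest class.
                if self.history & (members - {dataset}):
                    return False
        return True

    def access(self, dataset):
        if not self.can_access(dataset):
            raise PermissionError(f"conflict of interest: {dataset}")
        self.history.add(dataset)

s = Subject()
s.access("bank_a")                 # first access in the class: allowed
s.access("oil_x")                  # different class: allowed
assert not s.can_access("bank_b")  # competitor of bank_a: denied
```

    Extending this toward the paper's model would mean adding the subject/user distinction, lattice-based sensitivity labels, and write rules protecting integrity as well as confidentiality.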

  12. Research and Application of an Extended RBAC Model Based on Organization Structure

    Institute of Scientific and Technical Information of China (English)

    范志; 顾治波

    2013-01-01

    To address the problems of the traditional role-based access control model, an extended RBAC model based on organization structure is proposed. The model introduces the organization structure and user groups, effectively solving the problems in current RBAC permission management and meeting large enterprises' requirements for fine-grained permission management.
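
    One way such an extension can work is to grant roles to user groups that mirror the organization hierarchy, so a user inherits the roles of every unit from their own up to the root. The hierarchy, roles and permissions below are assumed example data, not the paper's.

```python
# Organization tree: unit -> parent unit (assumed example structure)
ORG_PARENT = {"sales_team": "sales_dept", "sales_dept": "company", "company": None}
GROUP_ROLES = {"company": {"employee"}, "sales_dept": {"crm_user"},
               "sales_team": {"order_entry"}}
ROLE_PERMS = {"employee": {"read_intranet"}, "crm_user": {"read_customers"},
              "order_entry": {"create_order"}}

def effective_permissions(unit):
    """Union of permissions along the path from `unit` to the root."""
    perms = set()
    while unit is not None:
        for role in GROUP_ROLES.get(unit, ()):
            perms |= ROLE_PERMS.get(role, set())
        unit = ORG_PARENT[unit]
    return perms

assert effective_permissions("sales_team") == {
    "read_intranet", "read_customers", "create_order"}
```

    Moving a user between units then re-scopes their permissions automatically, which is what makes the scheme attractive for large organizations.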

  13. Moving objects management models, techniques and applications

    CERN Document Server

    Meng, Xiaofeng; Xu, Jiajie

    2014-01-01

    This book describes the topics of moving objects modeling and location tracking, indexing and querying, clustering, location uncertainty, traffic aware navigation and privacy issues as well as the application to intelligent transportation systems.

  14. Registry of EPA Applications, Models, and Databases

    Data.gov (United States)

    U.S. Environmental Protection Agency — READ is EPA's authoritative source for information about Agency information resources, including applications/systems, datasets and models. READ is one component of...

  15. The application of GIS based decision-tree models for generating the spatial distribution of hydromorphic organic landscapes in relation to digital terrain data

    DEFF Research Database (Denmark)

    Kheir, Rania Bou; Bøcher, Peder Klith; Greve, Mette Balslev

    2010-01-01

    Accurate information about organic/mineral soil occurrence is a prerequisite for many land resources management applications (including climate change mitigation). This paper aims at investigating the potential of using geomorphometrical analysis and decision tree modeling to predict the geographic distribution of hydromorphic organic landscapes in unsampled areas in Denmark. Nine primary (elevation, slope angle, slope aspect, plan curvature, profile curvature, tangent curvature, flow direction, flow accumulation, and specific catchment area) and one secondary (steady-state topographic wetness index) topographic parameters were generated from Digital Elevation Models (DEMs) acquired using airborne LIDAR (Light Detection and Ranging) systems. They were used along with existing digital data collected from other sources (soil type, geological substrate and landscape type) to explain organic/mineral field observations.
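
    The secondary parameter in the record, the steady-state topographic wetness index, is TWI = ln(a / tan β), with a the specific catchment area and β the local slope. The decision rule below is a toy stand-in for a fitted tree, with made-up thresholds, not the paper's actual model.

```python
import math

def twi(specific_catchment_area, slope_deg):
    """Steady-state topographic wetness index, TWI = ln(a / tan(beta))."""
    beta = math.radians(slope_deg)
    return math.log(specific_catchment_area / math.tan(beta))

def predict_organic(sca, slope_deg, twi_threshold=8.0, slope_threshold=2.0):
    # Hypothetical two-split tree: flat, wet positions are classed as organic.
    if slope_deg < slope_threshold and twi(sca, slope_deg) > twi_threshold:
        return "organic"
    return "mineral"

assert predict_organic(5000.0, 0.5) == "organic"   # flat valley bottom
assert predict_organic(50.0, 12.0) == "mineral"    # steep hillslope
```

    In the study itself the splits and thresholds are learned from the field observations rather than chosen by hand, and the fitted tree is then applied cell by cell across the DEM-derived grids.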

  16. Dynamic programming models and applications

    CERN Document Server

    Denardo, Eric V

    2003-01-01

    Introduction to sequential decision processes covers use of dynamic programming in studying models of resource allocation, methods for approximating solutions of control problems in continuous time, production control, more. 1982 edition.
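
    A classic resource-allocation model of the kind the book covers can be sketched as a sequential decision process: allocate B indivisible units of budget across projects to maximise total return, solving stage by stage with memoisation. The return table is an assumed toy example.

```python
from functools import lru_cache

RETURNS = [  # RETURNS[project][units allocated] (assumed example data)
    [0, 3, 5, 6],
    [0, 2, 6, 7],
    [0, 4, 5, 8],
]

def best_return(budget):
    @lru_cache(maxsize=None)
    def solve(project, remaining):
        # Stage = project index; state = remaining budget.
        if project == len(RETURNS):
            return 0
        max_units = min(remaining, len(RETURNS[project]) - 1)
        return max(RETURNS[project][units] + solve(project + 1, remaining - units)
                   for units in range(max_units + 1))
    return solve(0, budget)

assert best_return(3) == 10  # optimum: 2 units to project 2, 1 unit to project 3
```

    The recursion is the Bellman equation for this problem: the value of a state is the best immediate return plus the value of the successor state.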

  17. Dynamic Frames Based Generation of 3D Scenes and Applications

    OpenAIRE

    Kvesić, Anton; Radošević, Danijel; Orehovački, Tihomir

    2015-01-01

    Modern graphic/programming tools like Unity enable the creation of 3D scenes as well as 3D-scene-based program applications, including a full physical model, motion, sound, lighting effects, etc. This paper deals with the use of a dynamic-frames-based generator in the automatic generation of a 3D scene and the related source code. The suggested model makes it possible to specify features of the 3D scene in the form of a textual specification, as well as to export such features ...
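
    The generator idea can be sketched as expanding a textual scene specification through a template (a "frame") into source code. The specification syntax and the target scene API below are hypothetical illustrations, not the paper's actual dynamic-frames notation or Unity's API.

```python
SPEC = """\
cube pos=0,0,0 scale=2
sphere pos=1,0,3 scale=1
"""

# One frame per object kind could be used; here a single generic frame suffices.
FRAME = 'scene.add("{kind}", position=({pos}), scale={scale})'

def generate(spec):
    """Expand each specification row into a line of scene-construction code."""
    lines = []
    for row in spec.strip().splitlines():
        kind, *attrs = row.split()
        params = dict(a.split("=") for a in attrs)
        lines.append(FRAME.format(kind=kind, pos=params["pos"], scale=params["scale"]))
    return "\n".join(lines)

print(generate(SPEC))
```

    A real dynamic-frames generator would support nested frames and conditional expansion; the point here is only the specification-to-code pipeline.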

  18. Rule-based transformations for geometric modelling

    Directory of Open Access Journals (Sweden)

    Thomas Bellet

    2011-02-01

    The context of this paper is the use of formal methods for topology-based geometric modelling. Topology-based geometric modelling deals with objects of various dimensions and shapes. Usually, objects are defined by a graph-based topological data structure and by an embedding that associates each topological element (vertex, edge, face, etc.) with relevant data such as its geometric shape (position, curve, surface, etc.) or application dedicated data (e.g. molecule concentration level in a biological context). We propose to define topology-based geometric objects as labelled graphs. The arc labelling defines the topological structure of the object, whose topological consistency is then ensured by labelling constraints. Nodes have as many labels as there are different data kinds in the embedding. Labelling constraints then ensure that the embedding is consistent with the topological structure. Thus, topology-based geometric objects constitute a particular subclass of a category of labelled graphs in which nodes have multiple labels.
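
    The labelled-graph formulation can be sketched as follows: arcs carry a topological label and nodes carry one label per kind of embedding data, with a constraint checking that every node has the embedding it needs. This is a simplified stand-in for the paper's labelling constraints, with assumed names.

```python
class LabelledGraph:
    def __init__(self):
        self.nodes = {}   # node id -> {label kind: value}
        self.arcs = []    # (src, dst, arc label)

    def add_node(self, nid, **labels):
        self.nodes[nid] = labels

    def add_arc(self, src, dst, label):
        self.arcs.append((src, dst, label))

    def embedding_consistent(self, required=("position",)):
        # Constraint: every node carries every required embedding label.
        return all(all(k in labels for k in required)
                   for labels in self.nodes.values())

g = LabelledGraph()
g.add_node("v1", position=(0.0, 0.0))
g.add_node("v2", position=(1.0, 0.0))
g.add_arc("v1", "v2", "alpha0")  # arc label encodes the topological relation
assert g.embedding_consistent()
g.add_node("v3")                 # node missing its geometric embedding
assert not g.embedding_consistent()
```

    In the paper the constraints also enforce topological consistency of the arc labelling itself (e.g. involution properties); only the embedding side is shown here.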

  19. Rule-based transformations for geometric modelling

    CERN Document Server

    Bellet, Thomas; Gall, Pascale Le; 10.4204/EPTCS.48.5

    2011-01-01

    The context of this paper is the use of formal methods for topology-based geometric modelling. Topology-based geometric modelling deals with objects of various dimensions and shapes. Usually, objects are defined by a graph-based topological data structure and by an embedding that associates each topological element (vertex, edge, face, etc.) with relevant data such as its geometric shape (position, curve, surface, etc.) or application dedicated data (e.g. molecule concentration level in a biological context). We propose to define topology-based geometric objects as labelled graphs. The arc labelling defines the topological structure of the object, whose topological consistency is then ensured by labelling constraints. Nodes have as many labels as there are different data kinds in the embedding. Labelling constraints then ensure that the embedding is consistent with the topological structure. Thus, topology-based geometric objects constitute a particular subclass of a category of labelled graphs in which nodes have multiple labels.

  20. Research and Application of the MVC Pattern in JAVA-based WEB Development

    Institute of Scientific and Technical Information of China (English)

    杨静

    2014-01-01

    The MVC (Model-View-Controller) design pattern is a very important pattern in modern software design and a classic pattern commonly used in WEB systems; it separates interface display from business logic. To improve code reusability, maintainability and portability in WEB-based systems, as well as the stability of system performance, this paper proposes an improved MVC pattern from the perspective of database access technology, XML technology and the DAO design pattern, and describes the techniques and principles of development based on this pattern.
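
    The MVC-with-DAO separation the record describes can be sketched in a few lines: the controller talks to a data access object instead of embedding database calls, and the view only renders. All class names here are illustrative assumptions; the paper's version targets JAVA WEB systems.

```python
import sqlite3

class UserDAO:                      # data access object: all SQL lives here
    def __init__(self, conn):
        self.conn = conn
        conn.execute("CREATE TABLE IF NOT EXISTS users (name TEXT)")

    def add(self, name):
        self.conn.execute("INSERT INTO users VALUES (?)", (name,))

    def all(self):
        return [r[0] for r in self.conn.execute("SELECT name FROM users")]

class UserView:                     # view: display only, no business logic
    @staticmethod
    def render(names):
        return "Users: " + ", ".join(names)

class UserController:               # controller: coordinates model and view
    def __init__(self, dao):
        self.dao = dao

    def register(self, name):
        self.dao.add(name)

    def show(self):
        return UserView.render(self.dao.all())

dao = UserDAO(sqlite3.connect(":memory:"))
ctl = UserController(dao)
ctl.register("alice")
ctl.register("bob")
assert ctl.show() == "Users: alice, bob"
```

    Swapping the storage backend now only touches the DAO, which is the maintainability and portability gain the abstract claims for the improved pattern.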