WorldWideScience

Sample records for application models based

  1. Modelling Gesture Based Ubiquitous Applications

    CERN Document Server

    Zacharia, Kurien; Varghese, Surekha Mariam

    2011-01-01

    A cost-effective, gesture-based modelling technique called Virtual Interactive Prototyping (VIP) is described in this paper. Prototyping is implemented by projecting a virtual model of the equipment to be prototyped. Users can interact with the virtual model as if it were the original working equipment. Image and sound processing techniques are used for capturing and tracking the user's interactions with the model. VIP is a flexible and interactive prototyping method with many applications in ubiquitous computing environments. Various commercial and socio-economic applications of VIP, as well as its extension to interactive advertising, are also discussed.

  2. Application software development via model based design

    OpenAIRE

    Haapala, O. (Olli)

    2015-01-01

    This thesis studied the utilization of the MathWorks’ Simulink® program in model-based application software development and its compatibility with the Vacon 100 inverter. The target was to identify all the problems related to everyday usage of this method and to create a white paper on how to execute a model-based design to create Vacon 100 compatible system software. Before this thesis was started, there was very little knowledge of the compatibility of this method. However durin...

  3. Measurement-based load modeling: Theory and application

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    The load model is one of the most important elements in power system operation and control. However, owing to its complexity, load modeling is still an open and very difficult problem. Summarizing our work on measurement-based load modeling in China for more than twenty years, this paper systematically introduces the mathematical theory and applications of load modeling. The flow chart and algorithms for measurement-based load modeling are presented. A composite load model structure with 13 parameters is also proposed. Analysis results based on trajectory sensitivity theory indicate the importance of the load model parameters for the identification. Case studies show the accuracy of the presented measurement-based load model. The load model thus built has been validated by field measurements all over China. Future working directions on measurement-based load modeling are also discussed in the paper.
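
    To illustrate the family of models the abstract surveys, here is a minimal sketch of a static ZIP load model, the simplest measurement-based load representation (the paper's 13-parameter composite model adds an induction-motor dynamic part on top of a static part like this). All coefficient values below are illustrative assumptions, not parameters from the paper.

```python
# Static ZIP load model sketch: active power as a mix of
# constant-impedance (a), constant-current (b) and constant-power (c)
# components, with a + b + c = 1. Coefficients are illustrative.

def zip_load(V, V0=1.0, P0=100.0, a=0.4, b=0.3, c=0.3):
    """Active power [MW] drawn at voltage V (per unit of nominal V0)."""
    v = V / V0
    return P0 * (a * v**2 + b * v + c)

# At nominal voltage the load draws exactly P0; a 5% voltage dip
# reduces the drawn power through the Z and I components.
print(zip_load(1.0))
print(zip_load(0.95))
```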

  4. Model Checking-Based Testing of Web Applications

    Institute of Scientific and Technical Information of China (English)

    ZENG Hongwei; MIAO Huaikou

    2007-01-01

    A formal model representing the navigation behavior of a Web application as a Kripke structure is proposed, and an approach that applies model checking to test case generation is presented. The Object Relation Diagram, as the object model, is employed to describe the object structure of a Web application design and can be translated into the behavior model. A key problem of model checking-based test generation for a Web application is how to construct a set of trap properties that are intended to cause violations of model checking against the behavior model and thus to output counterexamples used to construct the test sequences. We give an algorithm that derives trap properties from the object model with respect to node and edge coverage criteria.

  5. WWW Business Applications Based on the Cellular Model

    Institute of Scientific and Technical Information of China (English)

    Toshio Kodama; Tosiyasu L. Kunii; Yoichi Seki

    2008-01-01

    A cellular model based on the Incrementally Modular Abstraction Hierarchy (IMAH) is a novel model that can represent the architecture of, and changes in, cyberworlds, preserving invariants from a general level to a specific one. We have developed a data processing system called the Cellular Data System (CDS). In the development of business applications, CDS can prevent combinatorial explosion in the process of business design and testing. In this paper, we first designed and implemented a wide-use algebra on the presentation level. Next, we developed and verified the effectiveness of two general business applications using CDS: 1) a customer information management system, and 2) an estimate system.

  6. Data Warehouse Model For Mobile-Based Applications

    Directory of Open Access Journals (Sweden)

    Muhammad Shahbani Abu Bakar

    2016-06-01

    Analysis and design play very important roles in Data Warehouse (DW) system development and form the backbone of any success or failure of a DW project. The emerging trend of analytics-based applications requires the DW system to be implemented in the mobile environment. However, current analysis and design approaches are based on existing DW environments that focus on the deployment of the DW system in traditional web-based applications. This creates limitations on user access and on the use of analytical information by decision makers. Consequently, this prolongs the adoption of analytics-based applications by users and organizations. This research aims to suggest an approach for modeling the DW and designing the DW system for mobile environments. A variant-dimension modeling technique was used to enhance the DW schemas in order to accommodate the requirements of mobile characteristics in the DW design. The proposed mobile DW system was evaluated by expert review and supports the success of mobile DW-based application implementation.

  7. Intelligent control based on intelligent characteristic model and its application

    Institute of Scientific and Technical Information of China (English)

    吴宏鑫; 王迎春; 邢琰

    2003-01-01

    This paper presents a new intelligent control method based on an intelligent characteristic model for a kind of complicated plant with nonlinearities and uncertainties, whose controlled output variables cannot be measured on line continuously. The basic idea of this method is to utilize intelligent techniques to form the characteristic model of the controlled plant according to the principle of combining the characteristics of the plant with the control requirements, and then to present a new design method for an intelligent controller based on this characteristic model. First, the modeling principles and expression of the intelligent characteristic model are presented. Then, based on the description of the intelligent characteristic model, the design principles and methods of the intelligent controller, composed of several open-loop and closed-loop sub-controllers with qualitative and quantitative information, are given. Finally, the application of this method to alumina concentration control in a real aluminum electrolytic process is introduced. It is proved in practice that the above methods not only are easy to implement in engineering design but also avoid the trial-and-error of general intelligent controllers. The method has shown good results in the application: achieving long-term stable control of low alumina concentration and greatly increasing the controlled ratio of anode effect from 60% to 80%.

  8. Medical applications of model-based dynamic thermography

    Science.gov (United States)

    Nowakowski, Antoni; Kaczmarek, Mariusz; Ruminski, Jacek; Hryciuk, Marcin; Renkielska, Alicja; Grudzinski, Jacek; Siebert, Janusz; Jagielak, Dariusz; Rogowski, Jan; Roszak, Krzysztof; Stojek, Wojciech

    2001-03-01

    The proposal to use active thermography in medical diagnostics is promising in some applications concerning investigation of directly accessible parts of the human body. The combination of dynamic thermograms with thermal models of the investigated structures gives an attractive possibility of internal structure reconstruction based on the different thermal properties of biological tissues. Measurements of temperature distribution synchronized with external light excitation allow registration of dynamic changes of local temperature dependent on heat exchange conditions. Preliminary results of active thermography applications in medicine are discussed. For skin and under-skin tissues an equivalent thermal model may be determined. For the assumed model, its effective parameters may be reconstructed based on the results of transient thermal processes. For known thermal diffusivity and conductivity of specific tissues, the local thickness of a two- or three-layer structure may be calculated. Results of some medical cases as well as reference data from in vivo studies on animals are presented. The method was also applied to evaluate the state of the human heart during open-chest cardio-surgical interventions. Reference studies of evoked heart infarct in pigs are reported, too. We see the proposed technique, new in medical applications, as a promising diagnostic tool. It is a fully non-invasive, clean, handy, fast and affordable method giving not only a qualitative view of investigated surfaces but also an objective quantitative measurement result, accurate enough for many applications including fast screening of affected tissues.

  9. Application of Z-Number Based Modeling in Psychological Research.

    Science.gov (United States)

    Aliev, Rafik; Memmedova, Konul

    2015-01-01

    Pilates exercises have been shown to have a beneficial impact on the physical, physiological, and mental characteristics of human beings. In this paper, a Z-number based fuzzy approach is applied to modeling the effect of Pilates exercises on motivation, attention, anxiety, and educational achievement. The measurement of psychological parameters is performed using internationally recognized instruments: the Academic Motivation Scale (AMS), the Test of Attention (D2 Test), and Spielberger's Anxiety Test completed by students. The GPA of students was used as the measure of educational achievement. Application of Z-information modeling allows us to increase the precision and reliability of data processing results in the presence of uncertainty in the input data created from completed questionnaires. The basic steps of Z-number based modeling with numerical solutions are presented. PMID:26339231

  10. Application of Z-Number Based Modeling in Psychological Research

    Directory of Open Access Journals (Sweden)

    Rafik Aliev

    2015-01-01

    Pilates exercises have been shown to have a beneficial impact on the physical, physiological, and mental characteristics of human beings. In this paper, a Z-number based fuzzy approach is applied to modeling the effect of Pilates exercises on motivation, attention, anxiety, and educational achievement. The measurement of psychological parameters is performed using internationally recognized instruments: the Academic Motivation Scale (AMS), the Test of Attention (D2 Test), and Spielberger’s Anxiety Test completed by students. The GPA of students was used as the measure of educational achievement. Application of Z-information modeling allows us to increase the precision and reliability of data processing results in the presence of uncertainty in the input data created from completed questionnaires. The basic steps of Z-number based modeling with numerical solutions are presented.

  11. Real-time application of the drag based model

    Science.gov (United States)

    Žic, Tomislav; Temmer, Manuela; Vršnak, Bojan

    2016-04-01

    The drag-based model (DBM) is an analytical model usually used for calculating the kinematics of coronal mass ejections (CMEs) in interplanetary space and for predicting CME arrival times and impact speeds at arbitrary targets in the heliosphere. The main assumption of the model is that beyond a distance of about 20 solar radii from the Sun, drag is dominant in interplanetary space. The previous version of DBM relied on the rough assumption of averaged, unperturbed and constant environmental conditions as well as constant CME properties throughout the entire interplanetary CME propagation. The continuation of our work consists of enhancing the model into a form which uses a time-dependent and perturbed environment without constraints on CME properties and forecasting distance. The extension provides the possibility of application in various scenarios, such as automatic least-squares fitting on initial CME kinematic data suitable for real-time forecasting of CME kinematics, or embedding the DBM into pre-calculated interplanetary ambient conditions provided by advanced numerical simulations (for example, the ENLIL and EUHFORIA codes). A demonstration of the enhanced DBM is available on the website: http://www.geof.unizg.hr/~tzic/dbm.html. We acknowledge the support of the European Social Fund under the "PoKRet" project.
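
    The basic (constant-environment) DBM kinematics the abstract builds on can be sketched numerically: CME acceleration is a quadratic drag toward the ambient solar-wind speed, dv/dt = -gamma (v - w)|v - w|. The drag parameter, wind speed and launch speed below are illustrative values, not the forecasts from the enhanced model.

```python
# Drag-based model (DBM) sketch: integrate CME kinematics from 20 solar
# radii out to 1 AU under quadratic drag toward the solar-wind speed w.
# gamma, w and v0 are assumed illustrative values.

R_SUN_KM = 6.957e5
AU_KM = 1.496e8

def dbm_propagate(v0_kms, w_kms=400.0, gamma_per_km=0.2e-7,
                  r0_km=20 * R_SUN_KM, r_target_km=AU_KM, dt_s=60.0):
    """Return (transit time [s], impact speed [km/s]) at r_target_km."""
    r, v, t = r0_km, v0_kms, 0.0
    while r < r_target_km:
        a = -gamma_per_km * (v - w_kms) * abs(v - w_kms)  # drag accel, km/s^2
        v += a * dt_s
        r += v * dt_s
        t += dt_s
    return t, v

t_arr, v_arr = dbm_propagate(v0_kms=1000.0)
print(f"arrival after {t_arr / 3600:.1f} h at {v_arr:.0f} km/s")
```

A fast CME launched above the wind speed decelerates toward w, so the impact speed always lies between the two; the enhanced model described in the abstract replaces the constant gamma and w with time-dependent, perturbed profiles.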

  12. Extending EMMS-based models to CFB boiler applications

    Institute of Scientific and Technical Information of China (English)

    Bona Lu; Nan Zhang; Wei Wang; Jinghai Li

    2012-01-01

    Recently, EMMS-based models have been widely applied in simulations of high-throughput circulating fluidized beds (CFBs) with fine particles. Their use for low-flux systems, such as CFB boilers (CFBB), still remains unexplored. In this work, it has been found that the original definition of cluster diameter in the EMMS model is unsuitable for simulations of CFB boilers with low solids flux. To remedy this, we propose a new model of cluster diameter. The EMMS-based drag model (EMMS/matrix model) with this revised cluster definition is validated through computational fluid dynamics (CFD) simulation of a CFB boiler.

  13. Numerical modeling in electroporation-based biomedical applications

    OpenAIRE

    Pavšelj, Nataša; Miklavčič, Damijan

    2015-01-01

    Background. Numerous experiments have to be performed before a biomedical application is put to practical use in a clinical environment. As complementary work to in vitro, in vivo and medical experiments, we can use analytical and numerical models to represent, as realistically as possible, the real biological phenomena of, in our case, electroporation. In this way we can evaluate different electrical parameters in advance, such as pulse amplitude, duration, number of pulses, or different electrod...

  14. Numerical modeling in electroporation-based biomedical applications:

    OpenAIRE

    Miklavčič, Damijan; Pavšelj, Nataša

    2008-01-01

    Background. Numerous experiments have to be performed before a biomedical application is put to practical use in a clinical environment. As complementary work to in vitro, in vivo and medical experiments, we can use analytical and numerical models to represent, as realistically as possible, the real biological phenomena of, in our case, electroporation. In this way we can evaluate different electrical parameters in advance, such as pulse amplitude, duration, number of pulses, or different electrod...

  15. Physical based Schottky barrier diode modeling for THz applications

    DEFF Research Database (Denmark)

    Yan, Lei; Krozer, Viktor; Michaelsen, Rasmus Schandorph;

    2013-01-01

    In this work, a physical Schottky barrier diode model is presented. The model is based on physical parameters such as anode area, Ohmic contact area, doping profile of the epitaxial (EPI) and substrate (SUB) layers, layer thicknesses, barrier height, specific contact resistance, and device temperature. The effects of barrier height lowering, nonlinear resistance from the EPI layer, and hot electron noise are all included for accurate characterization of the Schottky diode. To verify the diode model, measured I-V and C-V characteristics are compared with the simulation results. Due to the lack...

  16. An application-semantics-based relaxed transaction model for internetware

    Institute of Scientific and Technical Information of China (English)

    HUANG Tao; DING Xiaoning; WEI Jun

    2006-01-01

    An internetware application is composed of existing individual services, while transaction processing is a key mechanism to make the composition reliable. Existing research on transactional composite services (TCS) depends on the analysis of the composition structure and the exception handling mechanism in order to guarantee relaxed atomicity. However, this approach cannot handle some application-specific requirements and causes many unnecessary failure recoveries or even aborts. In this paper, we propose a relaxed transaction model, including a system model, a relaxed atomicity criterion, a static checking algorithm and a dynamic enforcement algorithm. Users are able to define different relaxed atomicity constraints for different TCS according to application-specific requirements, including acceptable configurations and their preference order. The checking algorithm determines whether the constraint can be guaranteed to be satisfied. The enforcement algorithm monitors the execution and performs transaction management work according to the constraint. Compared to existing work, our approach can handle complex application requirements, avoid unnecessary failure recoveries and perform transaction management work automatically.

  17. Model Based Fault Detection in a Centrifugal Pump Application

    DEFF Research Database (Denmark)

    Kallesøe, Carsten; Cocquempot, Vincent; Izadi-Zamanabadi, Roozbeh

    2006-01-01

    A model-based approach for fault detection in a centrifugal pump, driven by an induction motor, is proposed in this paper. The fault detection algorithm is derived using a combination of structural analysis, observer design and Analytical Redundancy Relation (ARR) design. Structural considerations...

  18. Application of Search Algorithms for Model Based Regression Testing

    Directory of Open Access Journals (Sweden)

    Sidra Noureen

    2014-04-01

    UML models have gained significance, as reported in the literature. The use of a model to describe the behavior of a system is a proven and major advantage for testing. With the help of Model Based Testing (MBT), it is possible to automatically generate test cases. When MBT is applied to large industrial systems, there is a problem in sampling test cases from the entire test suite, because it is difficult to execute the huge number of test cases being generated. The motivation of this study is to design a multi-objective genetic algorithm based test case selection technique which can select the most appropriate subset of test cases. NSGA (Non-dominated Sorting Genetic Algorithm) is used as the optimization algorithm, and its fitness function is improved for selecting test cases from the dataset. It is concluded that there is room to improve the performance of the NSGA algorithm by tailoring its fitness function.
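
    The multi-objective core behind NSGA-style test-case selection can be sketched as Pareto dominance over competing objectives, here maximizing coverage while minimizing execution cost. The data and the two objectives are hypothetical illustrations; a full NSGA-II adds non-dominated sorting into ranked fronts, crowding distance, and genetic operators on top of this dominance check.

```python
# Pareto-dominance sketch for multi-objective test-case selection.
# Each candidate is (coverage %, cost in minutes): maximize the first,
# minimize the second. Values are hypothetical.

def dominates(a, b):
    """True if a is no worse than b in both objectives and strictly
    better in at least one."""
    cov_a, cost_a = a
    cov_b, cost_b = b
    no_worse = cov_a >= cov_b and cost_a <= cost_b
    strictly_better = cov_a > cov_b or cost_a < cost_b
    return no_worse and strictly_better

def pareto_front(candidates):
    """Candidates not dominated by any other candidate."""
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates if o is not c)]

subsets = [(90, 30), (85, 10), (90, 40), (60, 5), (95, 50)]
print(pareto_front(subsets))  # (90, 40) drops out: (90, 30) dominates it
```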

  19. GIS application on spatial landslide analysis using statistical based models

    Science.gov (United States)

    Pradhan, Biswajeet; Lee, Saro; Buchroithner, Manfred F.

    2009-09-01

    This paper presents the assessment results of three spatially based probabilistic models using Geoinformation Techniques (GIT) for landslide susceptibility analysis at Penang Island in Malaysia. Landslide locations within the study area were identified by interpreting aerial photographs and satellite images, supported by field surveys. Maps of the topography, soil type, lineaments and land cover were constructed from the spatial data sets. Ten landslide-related factors were extracted from the spatial database, and the frequency ratio, fuzzy logic, and bivariate logistic regression coefficients of each factor were computed. Finally, landslide susceptibility maps were drawn for the study area using the frequency ratio, fuzzy logic and bivariate logistic regression models. For verification, the results of the analyses were compared with actual landslide locations in the study area. The verification results show that the bivariate logistic regression model provides slightly higher prediction accuracy than the frequency ratio and fuzzy logic models.
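
    Of the three models compared, the frequency ratio is the simplest to reproduce: for each class of a factor map, it is the proportion of landslide pixels in the class divided by the proportion of all pixels in the class. The slope-angle classes and pixel counts below are illustrative, not the Penang Island data.

```python
# Frequency-ratio (FR) sketch for statistical landslide-susceptibility
# mapping. FR > 1 marks classes with above-average landslide density.
# Counts are hypothetical for a 1,000,000-pixel study area.

def frequency_ratio(landslide_px, class_px, total_landslide_px, total_px):
    """(landslide pixels in class / all landslide pixels)
       / (class pixels / all pixels)."""
    return (landslide_px / total_landslide_px) / (class_px / total_px)

classes = {                # (landslide pixels, class pixels)
    "0-15 deg":  (50,  600_000),
    "15-30 deg": (250, 300_000),
    ">30 deg":   (200, 100_000),
}
total_ls = sum(ls for ls, _ in classes.values())
total_px = sum(px for _, px in classes.values())

for name, (ls, px) in classes.items():
    print(name, round(frequency_ratio(ls, px, total_ls, total_px), 3))
```

The susceptibility index at each map pixel is then the sum of the FR values of the classes that pixel falls into, one per factor map.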

  20. A Systematic Review of Agent-Based Modelling and Simulation Applications in the Higher Education Domain

    Science.gov (United States)

    Gu, X.; Blackmore, K. L.

    2015-01-01

    This paper presents the results of a systematic review of agent-based modelling and simulation (ABMS) applications in the higher education (HE) domain. Agent-based modelling is a "bottom-up" modelling paradigm in which system-level behaviour (macro) is modelled through the behaviour of individual local-level agent interactions (micro).…

  1. Application of Physiologically Based Pharmacokinetic Models in Chemical Risk Assessment

    Directory of Open Access Journals (Sweden)

    Moiz Mumtaz

    2012-01-01

    Post-exposure risk assessment of chemical and environmental stressors is a public health challenge. Linking exposure to health outcomes is a 4-step process: exposure assessment, hazard identification, dose-response assessment, and risk characterization. This process is increasingly adopting "in silico" tools such as physiologically based pharmacokinetic (PBPK) models to fine-tune exposure assessments and determine internal doses in target organs/tissues. Many excellent PBPK models have been developed. But most, because of their scientific sophistication, have found limited field application; health assessors rarely use them. Over the years, government agencies, stakeholders/partners, and the scientific community have attempted to use these models or their underlying principles in combination with other practical procedures. During the past two decades, through cooperative agreements and contracts at several research and higher education institutions, ATSDR-funded translational research has encouraged the use of various types of models. Such collaborative efforts have led to the development and use of transparent and user-friendly models. The "human PBPK model toolkit" is one such project. While not necessarily state of the art, this toolkit is sufficiently accurate for screening purposes. Highlighted in this paper are some selected examples of environmental and occupational exposure assessments of chemicals and their mixtures.

  2. Application of Z-Number Based Modeling in Psychological Research

    OpenAIRE

    Rafik Aliev; Konul Memmedova

    2015-01-01

    Pilates exercises have been shown to have a beneficial impact on the physical, physiological, and mental characteristics of human beings. In this paper, a Z-number based fuzzy approach is applied to modeling the effect of Pilates exercises on motivation, attention, anxiety, and educational achievement. The measurement of psychological parameters is performed using internationally recognized instruments: the Academic Motivation Scale (AMS), the Test of Attention (D2 Test), and Spielberger’s Anxiety Test completed by s...

  3. A Model of Cloud Based Application Environment for Software Testing

    CERN Document Server

    Vengattaraman, T; Baskaran, R

    2010-01-01

    Cloud computing is an emerging platform of service computing designed for swift and dynamic delivery of assured computing resources. Cloud computing provides Service-Level Agreements (SLAs) for guaranteed uptime availability, enabling convenient and on-demand network access to distributed and shared computing resources. Though the cloud computing paradigm holds its potential status in the field of distributed computing, cloud platforms have not yet drawn the attention of the majority of researchers and practitioners. More specifically, the researcher and practitioner community still has fragmented and imperfect knowledge of cloud computing principles and techniques. In this context, one of the primary motivations of the work presented in this paper is to reveal the versatile merits of the cloud computing paradigm, and hence the objective of this work is to bring out the remarkable significance of the cloud computing paradigm through an application environment. In this work, a cloud computing model for sof...

  4. Middleware Based Model of Heterogeneous Systems for SCADA Distributed Applications

    Directory of Open Access Journals (Sweden)

    UNGUREAN, I.

    2010-05-01

    The infrastructure underlying distributed information systems is heterogeneous and very complex. Middleware allows the development of distributed information systems without knowing the functioning details of the infrastructure, by abstracting it. An essential issue in designing such systems is choosing the middleware technologies. An architectural model of a SCADA system based on middleware is proposed in this paper. This system is formed of servers that centralize data and clients that receive information from a server, thus allowing the chart displaying of such information. All these components have specific functionality and can exchange information by means of a middleware bus. A middleware bus signifies a software bus where multiple middleware technologies can coexist.

  5. Physiologically Based Pharmacokinetic (PBPK) Modeling and Simulation Approaches: A Systematic Review of Published Models, Applications, and Model Verification.

    Science.gov (United States)

    Sager, Jennifer E; Yu, Jingjing; Ragueneau-Majlessi, Isabelle; Isoherranen, Nina

    2015-11-01

    Modeling and simulation of drug disposition has emerged as an important tool in drug development, clinical study design and regulatory review, and the number of physiologically based pharmacokinetic (PBPK) modeling related publications and regulatory submissions has risen dramatically in recent years. However, the extent of use of PBPK modeling by researchers, and the public availability of models, has not been systematically evaluated. This review evaluates PBPK-related publications to 1) identify the common applications of PBPK modeling; 2) determine ways in which models are developed; 3) establish how model quality is assessed; and 4) provide a list of publicly available PBPK models for sensitive P450 and transporter substrates as well as selective inhibitors and inducers. PubMed searches were conducted using the terms "PBPK" and "physiologically based pharmacokinetic model" to collect published models. Only papers on PBPK modeling of pharmaceutical agents in humans published in English between 2008 and May 2015 were reviewed. A total of 366 PBPK-related articles met the search criteria, with the number of articles published per year rising steadily. Published models were most commonly used for drug-drug interaction predictions (28%), followed by interindividual variability and general clinical pharmacokinetic predictions (23%), formulation or absorption modeling (12%), and predicting age-related changes in pharmacokinetics and disposition (10%). In total, 106 models of sensitive substrates, inhibitors, and inducers were identified. An in-depth analysis of the model development and verification revealed a lack of consistency in model development and quality assessment practices, demonstrating a need for development of best-practice guidelines.

  6. Resilience-based application of state-and-transition models

    Science.gov (United States)

    We recommend that several conceptual modifications be incorporated into the state-and-transition model (STM) framework to: 1) explicitly link this framework to the concept of ecological resilience, 2) direct management attention away from thresholds and toward the maintenance of state resilience, an...

  7. NOVEL COMPONENT-BASED DEVELOPMENT MODEL FOR SIP-BASED MOBILE APPLICATION

    Directory of Open Access Journals (Sweden)

    Ahmed Barnawi

    2012-02-01

    Universities and institutions these days deal with issues related to the assessment of large numbers of students. Various evaluation methods have been adopted by examiners in different institutions to examine the ability of an individual, ranging from manual means using paper and pencil to electronic, from oral to written, practical to theoretical, and many others. There is a need to expedite the process of examination in order to meet the increasing enrolment of students at universities and institutes. The SIP Based Mass Mobile Examination System (SiBMMES) expedites the examination process by automating various activities in an examination, such as exam paper setting, scheduling and allocating examination time, and evaluation (auto-grading for objective questions). SiBMMES uses the IP Multimedia Subsystem (IMS), an IP communications framework providing an environment for the rapid development of innovative and reusable services. The Session Initiation Protocol (SIP) is a signalling (request-response) protocol for this architecture; it is used for establishing sessions in an IP network, making it an ideal candidate for supporting terminal mobility in the IMS to deliver services, with the extended services available in IMS such as open APIs, common network services, and Quality of Service (QoS) features like multiple sessions per call, Push to Talk, etc., often requiring multiple types of media (including voice, video, pictures, and text). SiBMMES is an effective solution for mass education evaluation using mobile and web technology. In this paper, a novel hybrid component-based development (CBD) model is proposed for SiBMMES. A component-based hybrid model is selected due to the fact that IMS takes the concept of layered architecture one step further by defining a horizontal architecture where service enablers and common functions can be reused for multiple applications. This novel model tackles a new domain for IT professionals; it is ideal to start developing services as a small...

  8. Modeling Component-based Bragg gratings Application: tunable lasers

    Directory of Open Access Journals (Sweden)

    Hedara Rachida

    2011-09-01

    The principal function of a Bragg grating is filtering, which can be used in optical-fiber-based components and active or passive semiconductor-based components, as well as telecommunication systems. Their ideal use is with fiber lasers, fiber amplifiers or laser diodes. In this work, we show the principal results obtained during the analysis of various types of Bragg gratings by the coupled-mode method. We then present the operation of tunable DBR lasers. The use of Bragg gratings in a laser provides single-mode, wavelength-agile sources. The use of a sampled grating increases the tuning range.

  9. Spintronic based superlattice structure modelling for photovoltaic application

    Science.gov (United States)

    Ravindiran, M.; Shankar, P.

    2014-11-01

    This paper deals with the modelling and simulation of a heterojunction photovoltaic (PV) device. An extensive comparative analysis of superlattice photovoltaic device structures made of N-Si/FeZnO/N-Si, N-Si/ZnO/N-Si and an N-Si/P-Si model was done. The Materials Studio software package has been used to analyze the performance of the above-mentioned device structures using CASTEP simulation. CASTEP involves the Density Functional Theory (DFT) approach to calculate properties such as absorption, conductivity, density of states (DOS) and band structure. Simulation results reveal that the N-Si/FeZnO/N-Si superlattice structure has improved absorption and conductivity when compared to the other two structures. The same superlattice structure may also be used in electronic devices such as transistors, not only in photovoltaics.

  10. Computational model for refrigerators based on Peltier effect application

    Energy Technology Data Exchange (ETDEWEB)

    Astrain, D.; Vian, J.G.; Albizua, J. [Departamento de Ingenieria Mecanica, Energetica y de Materiales, Universidad Publica de Navarra, UPNa. Pamplona (Spain)

    2005-12-01

    A computational model, which simulates the thermal and electric performance of thermoelectric refrigerators, has been developed. This model solves the non-linear system made up of the thermoelectric equations and the heat conduction equations, providing values for temperature, electric consumption, heat flow and coefficient of performance of the refrigerator. The finite differences method is used to solve the system, along with semi-empirical expressions for the convection coefficients. Subsequently, a thermoelectric refrigerator with an inner volume of 55x10^-3 m^3 has been designed and tested, whose cooling system is composed of a Peltier pellet (50 W maximum power) and a 2 W fan. An experimental analysis of its performance in different conditions has been carried out with this prototype, which, in turn, has been useful for assessing the accuracy of the developed model. The thermoelectric refrigerator prototype offers advantages with respect to classical vapour-compression technology: it is a more ecological system, quieter, more robust and more precise in the control of temperatures, which makes it suitable for camping vehicles, buses, special transports for electro-medicine, etc. (author)
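
    The steady-state thermoelectric relations at the core of such a model can be sketched in a few lines: the heat pumped from the cold side combines the Peltier term, half the Joule heating, and back-conduction through the module. The module parameters below are illustrative datasheet-style values, not those of the prototype in the paper.

```python
# Steady-state Peltier module sketch: cooling power and COP.
# alpha [V/K] Seebeck coefficient, R [ohm] electrical resistance,
# K [W/K] thermal conductance. Values are illustrative assumptions.

def peltier_performance(I, T_c, T_h, alpha=0.05, R=2.0, K=0.5):
    """Return (Qc [W], electrical power P [W], COP) at current I [A]
    for cold/hot side temperatures T_c, T_h [K]."""
    dT = T_h - T_c
    Qc = alpha * T_c * I - 0.5 * I**2 * R - K * dT  # heat pumped from cold side
    P = alpha * dT * I + I**2 * R                   # electrical input power
    return Qc, P, (Qc / P if P > 0 else float("inf"))

Qc, P, cop = peltier_performance(I=3.0, T_c=278.0, T_h=308.0)
print(f"Qc={Qc:.1f} W, P={P:.1f} W, COP={cop:.2f}")
```

The full model in the paper couples equations like these to heat-conduction equations for the enclosure and semi-empirical convection coefficients; this sketch only shows the module-level balance.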

  11. The timing of disability insurance application: a choice-based semiparametric hazard model

    OpenAIRE

    Richard V. Burkhauser; Butler, J. S.; Yang-Woo Kim

    1996-01-01

    We use a choice-based subsample of Social Security Disability Insurance applicants from the 1978 Social Security Survey of Disability and Work to test the importance of policy variables on the timing of application for disability insurance benefits following the onset of a work-limiting health condition. We correct for choice-based sampling by extending the Manski-Lerman (1977) correction to the likelihood function of our continuous time hazard model defined with semiparametric unmeasured het...
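    The core of the Manski-Lerman correction is to weight each observation's log-likelihood by the ratio of its outcome's population share to its share in the choice-based sample. A hedged sketch, using a plain exponential hazard and made-up shares rather than the paper's semiparametric specification:

```python
# Manski-Lerman style weighting for a choice-based sample.  Each observation's
# log-likelihood contribution is weighted by Q(j)/H(j): the population share of
# its outcome divided by its share in the sample.  The exponential hazard and
# the shares below are illustrative, not the paper's model.
import math

def weighted_loglik(rate, durations, applied, Q, H):
    """Weighted log-likelihood of an exponential time-to-application model."""
    ll = 0.0
    for t, d in zip(durations, applied):
        w = Q[d] / H[d]                          # Manski-Lerman weight
        if d:                                    # applied at duration t
            ll += w * (math.log(rate) - rate * t)
        else:                                    # had not applied by t (censored)
            ll += w * (-rate * t)
    return ll

durations = [1.0, 2.0, 3.0, 4.0]                 # years since onset of condition
applied   = [1, 1, 0, 0]                         # 1 = applied, 0 = censored
Q = {1: 0.2, 0: 0.8}                             # population outcome shares
H = {1: 0.5, 0: 0.5}                             # shares in the choice-based sample

# Crude grid search for the weighted MLE of the hazard rate.
rates = [i / 1000.0 for i in range(1, 500)]
best_rate = max(rates, key=lambda r: weighted_loglik(r, durations, applied, Q, H))
```

    Because applicants are over-represented in the sample here (0.5 vs. 0.2), their contributions are down-weighted, pulling the estimated application hazard below the unweighted estimate.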

  12. Applicability of an exposure model for the determination of emissions from mobile phone base stations

    DEFF Research Database (Denmark)

    Breckenkamp, J; Neitzke, H P; Bornkessel, C;

    2008-01-01

    Applicability of a model to estimate radiofrequency electromagnetic field (RF-EMF) strength in households from mobile phone base stations was evaluated with technical data of mobile phone base stations available from the German Net Agency, and dosimetric measurements, performed in an epidemiologi...

  13. Spatial Decision Support Applications Based on Three-Dimensional City Models

    Institute of Scientific and Technical Information of China (English)

    LI Chaokui; ZHU Qing; ZHANG Yeting; HUANG Duo; ZHAO Jie; CHEN Songlin

    2004-01-01

    The basic mathematical models, such as the statistical model, the time-series model, and the spatial dynamic model, together with some typical analysis methods based on 3DCM, are proposed and discussed. A few typical spatial decision-making methods integrating spatial analysis and the basic mathematical models are also introduced, e.g. visual impact assessment, dispersion of noise immissions, and base station planning for wireless communication. In addition, a new perspective on further applications and value-added services of 3DCM is promoted. As an example, sunshine analysis is studied and some helpful conclusions are drawn.

  14. Application for managing model-based material properties for simulation-based engineering

    Science.gov (United States)

    Hoffman, Edward L.

    2009-03-03

    An application for generating a property set associated with a constitutive model of a material includes a first program module adapted to receive test data associated with the material and to extract loading conditions from the test data. A material model driver is adapted to receive the loading conditions and a property set and operable in response to the loading conditions and the property set to generate a model response for the material. A numerical optimization module is adapted to receive the test data and the model response and operable in response to the test data and the model response to generate the property set.
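    The driver-plus-optimizer loop the record describes can be sketched in a few lines. Here a hypothetical power-law hardening law stands in for the constitutive model, and the "test data" are synthetic:

```python
# Sketch of the application's loop: a material-model "driver" maps a property
# set to a model response, and a numerical optimization module adjusts the
# properties until the response matches the test data.  The hardening law and
# the synthetic data are illustrative assumptions, not the patented modules.
import numpy as np
from scipy.optimize import least_squares

def driver(props, strain):
    """Model response for a power-law hardening law: sigma = K * eps**n."""
    K, n = props
    return K * strain ** n

strain = np.linspace(0.01, 0.1, 20)        # loading conditions from the test data
sigma_test = 500.0 * strain ** 0.2         # stand-in for measured stresses

# Numerical optimization module: minimize (model response - test data).
fit = least_squares(lambda p: driver(p, strain) - sigma_test, x0=[300.0, 0.5])
K_fit, n_fit = fit.x
```

    On this noise-free synthetic data the optimizer recovers the generating properties (K = 500, n = 0.2); with real test data the residual would instead quantify model error.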

  15. Developing a model for application of electronic banking based on electronic trust

    Directory of Open Access Journals (Sweden)

    Amir Hooshang Nazarpoori

    2014-05-01

    This study develops a model for the application of electronic banking based on electronic trust among customers of Day Bank in KhoramAbad city. A sample of 150 people was selected based on stratified random sampling, and questionnaires were used for the investigation. Results indicate that technology-based factors, user-based factors, and trust had negative relationships with perceived risk types, including financial, functional, personal, and private risk. Moreover, trust, including trust in the system and trust in the bank, had a positive relationship with the tendency to use and the real application of electronic banking.

  16. The Model and Design for COM-Based e-Commerce Application System

    Institute of Scientific and Technical Information of China (English)

    TANG Xiao-mei; SUN Li

    2002-01-01

    From the standpoint of constructing an e-commerce application system, and based on structured analysis and the Object-Oriented Design method, a combined modeling method called Business-Process Driven (BPD) is proposed. This method keeps the focus on the business process throughout the development of the system. First, the business model of the system is built; then the commercial object model is introduced according to the business model; finally, the COM model for the system is established. The system is implemented in an iterative and incremental way, and the design and analysis results of each stage are illustrated by a series of views using the modeling tool UML.

  17. SOFT SENSING MODEL BASED ON SUPPORT VECTOR MACHINE AND ITS APPLICATION

    Institute of Scientific and Technical Information of China (English)

    Yan Weiwu; Shao Huihe; Wang Xiaofan

    2004-01-01

    Soft sensors are widely used in industrial process control. They play an important role in improving product quality and assuring production safety. The core of a soft sensor is the construction of a soft sensing model. A new soft sensing modeling method based on the support vector machine (SVM) is proposed. SVM is a machine learning method based on statistical learning theory and is powerful for problems characterized by small samples, nonlinearity, high dimension and local minima. The proposed method is applied to the estimation of the freezing point of light diesel oil in a distillation column. The estimated outputs of the SVM-based soft sensing model match the real freezing-point values and follow their varying trend very well. Experimental results show that SVM provides an effective method for soft sensing modeling and has promising prospects in industrial process applications.
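    The soft-sensor idea — regress a hard-to-measure quality variable on easily measured process variables — can be sketched with support vector regression. The data and the assumed input-output relation below are synthetic placeholders, not the distillation-column measurements:

```python
# Soft-sensor sketch: estimate a quality variable (the paper's example is the
# freezing point of light diesel oil) from routine process measurements using
# support vector regression.  Inputs and the "true" relation are synthetic.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 3))     # e.g. temperatures, flow, pressure
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.5 * np.sin(6.0 * X[:, 2])

# Train on 150 samples, evaluate tracking accuracy on the remaining 50.
model = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X[:150], y[:150])
pred = model.predict(X[150:])
rmse = float(np.sqrt(np.mean((pred - y[150:]) ** 2)))
```

    An RBF kernel handles the nonlinearity without the model having to be specified in advance, which is the property the abstract highlights for small-sample, nonlinear process data.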

  18. A Comparative Study of Relational and Non-Relational Database Models in a Web-Based Application

    OpenAIRE

    Cornelia Gyorödi; Robert Gyorödi; Roxana Sotoc

    2015-01-01

    The purpose of this paper is to present a comparative study between relational and non-relational database models in a web-based application, by executing various operations on both relational and on non-relational databases thus highlighting the results obtained during performance comparison tests. The study was based on the implementation of a web-based application for population records. For the non-relational database, we used MongoDB and for the relational database, we used MSSQL 2014. W...

  19. Supporting Seamful Development of Positioning Applications through Model Based Translucent Middleware

    DEFF Research Database (Denmark)

    Jensen, Jakob Langdal

    middleware, and how that middleware can provide developers with methods for controlling application qualities that are related to the positioning process. One key challenge is to understand how to support application development in a heterogeneous domain like that of positioning. Recent trends in application...... with sufficient control over the positioning technologies while maximizing the number of responsibilities that can be delegated to the middleware. We address these challenges by proposing model based translucency as a technique for building middleware that supports run-time inspection, is open to adaptation...... and extension, and can be used to realize a seamfully designed middleware. For position based applications, overall application quality often depends on properties that are orthogonal to the core positioning functionality; there- fore, quality management tend to cross-cut various abstractions of the positioning...

  20. Security Model for Microsoft Based Mobile Sales Management Application in Private Cloud Computing

    Directory of Open Access Journals (Sweden)

    Kuan Chee Houng

    2013-05-01

    The Microsoft-based mobile sales management application is a sales force management application that currently runs on Windows Mobile 6.5. It handles sales-related activity and cuts down the administrative tasks of sales representatives. Microsoft then launched a new mobile operating system, Windows Phone, and stopped providing support for Windows Mobile, which has become an obstacle for Windows Mobile development. Over time, Windows Mobile will be eliminated from the market because Microsoft no longer supports it. Besides that, Windows Mobile applications cannot run on the Windows Phone operating system due to a lack of compatibility. Therefore, applications that run on Windows Mobile need a solution to this problem. The rise of cloud computing technology for delivering software as a service offers such a solution: the Microsoft-based mobile sales management application is delivered as a service that runs in a web browser, rather than being limited to devices running Windows Mobile. However, some security issues must be addressed in order to deliver the application as a service in private cloud computing, and a security model is needed to answer them. This research proposes a security model for the Microsoft-based mobile sales management application in private cloud computing. Lastly, a User Acceptance Test (UAT) is carried out to test the compatibility of the proposed security model between the application in a private cloud and tablet computers.

  1. Application of soft computing based hybrid models in hydrological variables modeling: a comprehensive review

    Science.gov (United States)

    Fahimi, Farzad; Yaseen, Zaher Mundher; El-shafie, Ahmed

    2016-02-01

    Since the middle of the twentieth century, artificial intelligence (AI) models have been used widely in engineering and science problems. Water resource variable modeling and prediction are the most challenging issues in water engineering. The artificial neural network (ANN) is a common approach used to tackle this problem with viable and efficient models, and numerous ANN models have been successfully developed to achieve more accurate results. In the current review, different ANN models in water resource applications and hydrological variable predictions are reviewed and outlined. In addition, recent hybrid models and their structures, input preprocessing, and optimization techniques are discussed and the results are compared with similar previous studies. Moreover, to achieve a comprehensive view of the literature, many articles that applied ANN models together with other techniques are included. Consequently, coupling procedures, model evaluation, and the performance of hybrid models relative to conventional ANN models are assessed, as well as the taxonomy and structures of hybrid ANN models. Finally, current challenges and recommendations for future research are indicated and new hybrid approaches are proposed.

  2. Application-Oriented Confidentiality and Integrity Dynamic Union Security Model Based on MLS Policy

    Science.gov (United States)

    Xue, Mingfu; Hu, Aiqun; He, Chunlong

    We propose a new security model based on MLS policy to achieve better security performance in confidentiality, integrity and availability. First, it realizes a combination of the BLP model and the Biba model through a two-dimensional independent adjustment of integrity and confidentiality, and the subject's access range is adjusted dynamically according to the security labels of related objects and the subject's access history. Second, the security level of the trusted subject is extended to separate writing and reading privilege ranges, following the principle of least privilege. Third, the model adjusts objects' security levels after confidential information is added, to prevent information disclosure. Fourth, it uses application-oriented logic to protect specific applications and avoid the degradation of security levels, ensuring that those applications operate smoothly. Lastly, examples are presented to show the effectiveness and usability of the proposed model.
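    The two policies being combined can be shown with a toy access check: BLP for confidentiality (no read up, no write down) and Biba for integrity (no read down, no write up). Levels here are bare integers; the paper's dynamic adjustment of access ranges and trusted-subject privileges is not modelled:

```python
# Toy check of the two lattice policies the model combines.  A request is
# allowed only if it satisfies BOTH the BLP confidentiality rules and the
# Biba integrity rules.  Labels are simplified to single integers per axis.

def can_read(subj, obj):
    # BLP simple security: subject confidentiality dominates the object's;
    # Biba simple integrity: object integrity dominates the subject's.
    return subj["conf"] >= obj["conf"] and obj["integ"] >= subj["integ"]

def can_write(subj, obj):
    # BLP *-property: no write down; Biba *-property: no write up.
    return obj["conf"] >= subj["conf"] and subj["integ"] >= obj["integ"]

subject    = {"conf": 2, "integ": 1}
public_log = {"conf": 1, "integ": 2}     # low secrecy, high integrity
secret_doc = {"conf": 3, "integ": 0}     # high secrecy, low integrity
```

    The strictness of this conjunction is exactly why the paper introduces dynamic label adjustment: a static BLP-plus-Biba intersection often leaves subjects with very few legal operations.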

  3. Model-Based Instruction: Theory and Application in a Teacher Education Program.

    Science.gov (United States)

    Steinley, Gary; Reisetter, Marcy; Penrod, Kathryn; Haar, Jean; Ellingson, Janna

    Model-Based Instruction (MBI) plays a significant role in the undergraduate teacher education program at South Dakota State University. Integrated into the program 8 years ago, the understandings and applications of MBI have evolved into a powerful and comprehensive framework that leads to rich and varied instruction with students directly in the…

  4. A Comparative Study of Relational and Non-Relational Database Models in a Web-Based Application

    Directory of Open Access Journals (Sweden)

    Cornelia Gyorödi

    2015-11-01

    The purpose of this paper is to present a comparative study between relational and non-relational database models in a web-based application, by executing various operations on both relational and non-relational databases and highlighting the results obtained during performance comparison tests. The study was based on the implementation of a web-based application for population records. For the non-relational database we used MongoDB, and for the relational database we used MSSQL 2014. We also present the advantages of using a non-relational database compared to a relational database integrated in a web-based application that needs to manipulate a large amount of data.

  5. CACM: A New Coordination Model in Mobile Agent-Based Information Retrieval Applications

    Institute of Scientific and Technical Information of China (English)

    TANG Xinhuai; ZHANG Yaying; YAO Yinxiong; YOU Jinyuan

    2005-01-01

    In mobile agent systems, an application may be composed of several mobile agents that cooperatively perform a task, and multiple mobile agents need to communicate and interact with each other to accomplish their cooperative goal. A coordination model aims to provide solutions to interactions between concurrent activities, hiding the computing details and focusing on the interaction between activities. A Context-aware coordination model (CACM), which combines mobility and coordination, is proposed for mobile agent applications such as mobile agent based information retrieval. The context-aware coordination model transfers interactions between agents from globally coupled interactions to locally uncoupled tuple-space interactions. In addition, a programmable tuple space is adopted to solve the problems of context-aware coordination introduced by mobility and data heterogeneity in mobile agent systems. Furthermore, environment-specific and application-specific coordination policies can be integrated into the programmable tuple space for customized requirements. Finally, a sample application system for information retrieval with mobile agents is implemented to test the performance of the proposed model.
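    The uncoupled tuple-space interaction the CACM builds on can be illustrated with a minimal Linda-style sketch. Agents publish tuples and retrieve them by pattern, so sender and receiver never reference each other directly; this single-threaded sketch omits the blocking, programmability, and distribution a real middleware would add:

```python
# Minimal Linda-style tuple space: out() publishes a tuple, rd() reads one by
# pattern without removing it, inp() removes a matching tuple.  None in a
# pattern matches any field.  Illustrative only; not the CACM implementation.

class TupleSpace:
    def __init__(self):
        self.tuples = []

    def out(self, tup):                       # an agent publishes a tuple
        self.tuples.append(tup)

    def _match(self, pattern, tup):
        return len(pattern) == len(tup) and all(
            p is None or p == t for p, t in zip(pattern, tup))

    def rd(self, pattern):                    # non-destructive read by pattern
        return next((t for t in self.tuples if self._match(pattern, t)), None)

    def inp(self, pattern):                   # destructive read ("in")
        t = self.rd(pattern)
        if t is not None:
            self.tuples.remove(t)
        return t

space = TupleSpace()
space.out(("result", "agent-A", 42))          # agent A publishes a retrieval result
hit = space.rd(("result", None, None))        # another agent reads it by pattern
```

    Because matching is by content rather than by agent identity, an agent can migrate between hosts and still pick up pending tuples — the "locally uncoupled" property the abstract describes.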

  6. Design and implementation of space physics multi-model application integration based on web

    Science.gov (United States)

    Jiang, Wenping; Zou, Ziming

    With the development of research on the space environment and space science, building a networked online computing environment for space weather, space environment and space physics models has become increasingly important for the Chinese scientific community in recent years. Currently, there are two software modes for a space physics multi-model application integrated system (SPMAIS): C/S and B/S. The C/S mode, which is traditional and stand-alone, demands that a team or workshop from many disciplines and specialties build its own multi-model application integrated system, and requires the client to be deployed in different physical regions when users visit the integrated system. This requirement brings two shortcomings: it reduces the efficiency of the researchers who use the models for computation, and it makes accessing the data inconvenient. Therefore, it is necessary to create a shared network resource access environment that helps users quickly reach the computing resources of space physics models through a terminal, for conducting space science research and forecasting the space environment. The SPMAIS develops high-performance, first-principles computational models of the space environment in B/S mode and uses these models to predict "space weather", to understand space mission data and to further our understanding of the solar system. The main goal of the SPMAIS is to provide an easy and convenient user-driven online model operating environment. Up to now, the SPMAIS has contained dozens of space environment models, including the international AP8/AE8, IGRF and T96 models, as well as a solar proton prediction model, a geomagnetic transmission model, etc., developed by Chinese scientists. Another function of the SPMAIS is to integrate space observation data sets, which offer input data for online high-speed model computation. In this paper, the service-oriented architecture (SOA) concept that divides

  7. A bootstrap based space-time surveillance model with an application to crime occurrences

    Science.gov (United States)

    Kim, Youngho; O'Kelly, Morton

    2008-06-01

    This study proposes a bootstrap-based space-time surveillance model. Designed to find emerging hotspots in near-real time, the bootstrap based model is characterized by its use of past occurrence information and bootstrap permutations. Many existing space-time surveillance methods, using population at risk data to generate expected values, have resulting hotspots bounded by administrative area units and are of limited use for near-real time applications because of the population data needed. However, this study generates expected values for local hotspots from past occurrences rather than population at risk. Also, bootstrap permutations of previous occurrences are used for significant tests. Consequently, the bootstrap-based model, without the requirement of population at risk data, (1) is free from administrative area restriction, (2) enables more frequent surveillance for continuously updated registry database, and (3) is readily applicable to criminology and epidemiology surveillance. The bootstrap-based model performs better for space-time surveillance than the space-time scan statistic. This is shown by means of simulations and an application to residential crime occurrences in Columbus, OH, year 2000.
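    The significance test at the heart of the method — compare a cell's current count against bootstrap resamples of its own past occurrences, with no population-at-risk data — can be sketched directly. The weekly counts below are made up for illustration:

```python
# Sketch of the bootstrap surveillance test: the expected level for a cell
# comes from its past occurrence counts, and the current count is judged
# against bootstrap resamples of that history.  Counts are illustrative.
import random

def bootstrap_pvalue(history, current, n_boot=2000, seed=1):
    """One-sided p-value: share of bootstrap means >= the current count."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(n_boot):
        sample = [rng.choice(history) for _ in history]   # resample with replacement
        if sum(sample) / len(sample) >= current:
            exceed += 1
    return exceed / n_boot

history = [2, 3, 1, 2, 4, 2, 3, 2, 1, 3]     # past weekly counts in one cell
p_hot = bootstrap_pvalue(history, 6.0)        # unusually high -> emerging hotspot
p_norm = bootstrap_pvalue(history, 2.0)       # ordinary count -> not significant
```

    A count of 6 is flagged (small p-value) while a count of 2 is not, mirroring how the method raises near-real-time alarms without administrative-area population data.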

  8. Integrated knowledge-based modeling and its application for classification problems

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Knowledge discovery directly from data can hardly avoid being biased towards the collected experimental data, whereas expert systems are always baffled by the manual knowledge-acquisition bottleneck. It is therefore plausible that integrating the knowledge embedded in data with that possessed by experts can lead to a superior modeling approach. Aiming at classification problems, a novel integrated knowledge-based modeling methodology, oriented by experts and driven by data, is proposed. It starts with experts identifying the modeling parameters; the input space is then partitioned, followed by fuzzification. Afterwards, single rules are generated and aggregated to form a rule base, on which a fuzzy inference mechanism is proposed. The experts are allowed to make necessary changes to the rule base to improve the model accuracy. A real-world application, welding fault diagnosis, is presented to demonstrate the effectiveness of the methodology.

  9. Application of Holdridge life-zone model based on the terrain factor in Xinjiang Autonomous Region

    Institute of Scientific and Technical Information of China (English)

    NI Yong-ming; OUYANG Zhi-yun; WANG Xiao-ke

    2005-01-01

    This study improved the application of the Holdridge life-zone model for simulating the distribution of desert vegetation in China, providing statistics to support eco-recovery and ecosystem reconstruction in desert areas. The study classified the desert vegetation into four types: (1) LAD: little arbor desert; (2) SD: shrub desert; (3) HLHSD: half-shrub and little half-shrub desert; (4) LHSCD: little half-shrub cushion desert. Based on this classification of Xinjiang desert vegetation, the classical Holdridge life-zone model was used to simulate its distribution, and the resulting Kappa coefficient was compared with a standard table of accuracy represented by Kappa values. The Kappa value of the model was only 0.19, meaning the simulation result was poor. To improve the model's applicability to Xinjiang desert vegetation types, a set of plot standards for terrain factors was developed, using the plot standard as the reclassification criterion for each climate sub-regime, and the desert vegetation in Xinjiang was simulated again. The average Kappa value of the second simulation with respect to the climate regimes was 0.45, and the Kappa value of the final modeling result was 0.64, a considerably better value. The modification thus made the model applicable to more regions. Finally, the model's ecological relevance to the Xinjiang desert vegetation types was studied.
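    The agreement scores quoted (0.19, 0.45, 0.64) are Cohen's kappa values comparing simulated against observed vegetation classes. A self-contained sketch with made-up class labels shows the computation:

```python
# Cohen's kappa: agreement between observed and simulated class maps,
# corrected for chance agreement.  The ten sample cells below are invented
# purely to exercise the formula; they are not the study's data.
from collections import Counter

def kappa(observed, simulated):
    n = len(observed)
    po = sum(o == s for o, s in zip(observed, simulated)) / n   # raw agreement
    obs_c, sim_c = Counter(observed), Counter(simulated)
    pe = sum(obs_c[c] * sim_c[c] for c in obs_c) / n ** 2       # chance agreement
    return (po - pe) / (1 - pe)

observed  = ["SD", "SD", "SD", "SD", "SD", "LAD", "LAD", "LAD", "LAD", "LAD"]
simulated = ["SD", "SD", "SD", "SD", "LAD", "LAD", "LAD", "LAD", "LAD", "SD"]
score = kappa(observed, simulated)    # 0.6 for this toy map
```

    Dividing out the chance-agreement term `pe` is what makes kappa a stricter yardstick than raw accuracy, which is why the study reports it rather than percent agreement.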

  10. Application of chaotic prediction model based on wavelet transform on water quality prediction

    Science.gov (United States)

    Zhang, L.; Zou, Z. H.; Zhao, Y. F.

    2016-08-01

    Dissolved oxygen (DO) is closely related to water self-purification capacity. In order to better forecast its concentration, the chaotic prediction model, based on the wavelet transform, is proposed and applied to a certain monitoring section of the Mentougou area of the Haihe River Basin. The result is compared with the simple application of the chaotic prediction model. The study indicates that the new model aligns better with the real data and has a higher accuracy. Therefore, it will provide significant decision support for water protection and water environment treatment.

  11. [Study on the Application of NAS-Based Algorithm in the NIR Model Optimization].

    Science.gov (United States)

    Geng, Ying; Xiang, Bing-ren; He, Lan

    2015-10-01

    In this paper, net analysis signal (NAS)-based concept was introduced to the analysis of multi-component Ginkgo biloba leaf extracts. NAS algorithm was utilized for the preprocessing of spectra, and NAS-based two-dimensional correlation analysis was used for the optimization of NIR model building. Simultaneous quantitative models for three flavonol aglycones: quercetin, keampferol and isorhamnetin were established respectively. The NAS vectors calculated using two algorithms introduced from Lorber and Goicoechea and Olivieri (HLA/GO) were applied in the development of calibration models, the reconstructed spectra were used as input of PLS modeling. For the first time, NAS-based two-dimensional correlation spectroscopy was used for wave number selection. The regions appeared in the main diagonal were selected as useful regions for model building. The results implied that two NAS-based preprocessing methods were successfully used for the analysis of quercetin, keampferol and isorhamnetin with a decrease of factor number and an improvement of model robustness. NAS-based algorithm was proven to be a useful tool for the preprocessing of spectra and for optimization of model calibration. The above research showed a practical application value for the NIRS in the analysis of complex multi-component petrochemical medicine with unknown interference. PMID:26904808

  12. Model-Based Evaluation Of System Scalability: Bandwidth Analysis For Smartphone-Based Biosensing Applications

    DEFF Research Database (Denmark)

    Patou, François; Madsen, Jan; Dimaki, Maria;

    2016-01-01

    -engineering efforts for scaling a system specification efficaciously. We demonstrate the value of our methodology by investigating a smartphone-based biosensing instrumentation platform. Specifically, we carry out scalability analysis for the system’s bandwidth specification: the maximum analog voltage waveform...... excitation frequency the system could output while allowing continuous acquisition and wireless streaming of bioimpedance measurements. We rely on several SysML modelling tools, including dependency matrices, as well as a fault-detection Simulink Stateflow executable model to conclude on how the successive...

  13. Artificial neural networks: Principle and application to model based control of drying systems -- A review

    Energy Technology Data Exchange (ETDEWEB)

    Thyagarajan, T.; Ponnavaikko, M. [Crescent Engineering Coll., Madras (India); Shanmugam, J. [Madras Inst. of Tech. (India); Panda, R.C.; Rao, P.G. [Central Leather Research Inst., Madras (India)

    1998-07-01

    This paper reviews developments in the model-based control of drying systems using Artificial Neural Networks (ANNs). A survey of current research reveals growing interest in the application of ANNs to the modeling and control of non-linear, dynamic and time-variant systems. Over 115 articles published in this area are reviewed. All landmark papers are systematically classified in chronological order into three distinct categories: conventional feedback controllers, model-based controllers using conventional methods, and model-based controllers using ANNs for the drying process. The principles of ANNs are presented in detail, and the problems and issues of drying systems and the features of various ANN models are treated in an up-to-date manner. ANN-based controllers lead to smoother controller outputs, which increases actuator life. The paper concludes with suggestions for improving existing modeling techniques as applied to predicting the performance characteristics of dryers. The hybridization techniques presented, namely neural networks combined with fuzzy logic and genetic algorithms, provide directions for further research on the implementation of appropriate control strategies. The authors opine that the information presented here will be highly beneficial for pursuing research in the modeling and control of the drying process using ANNs. 118 refs.

  14. Development of Design Procedures for Flexural Applications of Textile Composite Systems Based on Tension Stiffening Models

    OpenAIRE

    Mobasher, Barzin

    2011-01-01

    The Aveston-Cooper-Kelly (ACK) method has been routinely used to estimate the efficiency of the bond between the textile and the cementitious matrix. This method, however, has limited applicability due to simplifying assumptions such as perfect bond. A numerical model for simulating the tensile behavior of reinforced cement-based composites is presented to capture the inefficiency of the bond mechanisms. In this approach the role of the interface properties, which are instrumental in the simu...

  15. Application of a CFD based containment model to different large-scale hydrogen distribution experiments

    International Nuclear Information System (INIS)

    Highlights: • A CFD based model developed in ANSYS-FLUENT for simulating the distribution of hydrogen in the containment of a nuclear power plant during a severe accident is validated against four large-scale experiments. • The successive formation and mixing of a stratified gas-layer in experiments performed in the THAI and PANDA facilities are predicted well by the CFD model. • The pressure evolution and related condensation rate during different mixed convection flow conditions in the TOSQAN facility are predicted well by the CFD model. • The results give confidence in the general applicability of the CFD model and model settings. - Abstract: In the event of core degradation during a severe accident in water-cooled nuclear power plants (NPPs), large amounts of hydrogen are generated that may be released into the reactor containment. As the hydrogen mixes with the air in the containment, it can form a flammable mixture. Upon ignition it can damage relevant safety systems and put the integrity of the containment at risk. Despite the installation of mitigation measures, it has been recognized that the temporary existence of combustible or explosive gas clouds cannot be fully excluded during certain postulated accident scenarios. The distribution of hydrogen in the containment and mitigation of the risk are, therefore, important safety issues for NPPs. Complementary to lumped parameter code modelling, Computational Fluid Dynamics (CFD) modelling is needed for the detailed assessment of the hydrogen risk in the containment and for the optimal design of hydrogen mitigation systems in order to reduce this risk as far as possible. The CFD model applied by NRG makes use of the well-developed basic features of the commercial CFD package ANSYS-FLUENT. This general purpose CFD package is complemented with specific user-defined sub-models required to capture the relevant thermal-hydraulic phenomena in the containment during a severe accident as well as the effect of

  16. GRace: a MATLAB-based application for fitting the discrimination-association model.

    Science.gov (United States)

    Stefanutti, Luca; Vianello, Michelangelo; Anselmi, Pasquale; Robusto, Egidio

    2014-10-28

    The Implicit Association Test (IAT) is a computerized two-choice discrimination task in which stimuli have to be categorized as belonging to target categories or attribute categories by pressing, as quickly and accurately as possible, one of two response keys. The discrimination association model has been recently proposed for the analysis of reaction time and accuracy of an individual respondent to the IAT. The model disentangles the influences of three qualitatively different components on the responses to the IAT: stimuli discrimination, automatic association, and termination criterion. The article presents General Race (GRace), a MATLAB-based application for fitting the discrimination association model to IAT data. GRace has been developed for Windows as a standalone application. It is user-friendly and does not require any programming experience. The use of GRace is illustrated on the data of a Coca Cola-Pepsi Cola IAT, and the results of the analysis are interpreted and discussed.

  17. An emission source inversion model based on satellite data and its application in air quality forecasts

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    This paper aims at constructing an emission source inversion model using a variational processing method and an adaptive nudging scheme for the Community Multiscale Air Quality Model (CMAQ), based on satellite data, to investigate the applicability of high-resolution OMI (Ozone Monitoring Instrument) column concentration data for air quality forecasts over North China. The results show a reasonable consistency and good correlation between the spatial distributions of NO2 from surface and OMI satellite measurements in both winter and summer, so such OMI products may be used for integrated variational analysis together with ground observation data. With linear and variational corrections applied, the spatial distribution of OMI NO2 clearly revealed more localized characteristics of NO2 concentration. With this information, emission sources in the southwest and southeast of North China are found to have greater impacts on air quality in Beijing. When the retrieved emission source inventory based on high-resolution OMI NO2 data was used, the coupled Weather Research and Forecasting CMAQ model (WRF-CMAQ) performed significantly better in forecasting the NO2 concentration level and its tendency, as reflected by the greater consistency between surface-observed NO2 concentrations and model results. In conclusion, satellite data are particularly important for simulating NO2 concentrations at urban and street-block scales. High-resolution OMI NO2 data are applicable for inverting the NOx emission source inventory, assessing regional pollution status and pollution control strategies, and improving model forecasts on the urban scale.

  18. Model Test Based Soil Spring Model and Application in Pipeline Thermal Buckling Analysis

    Institute of Scientific and Technical Information of China (English)

    GAO Xi-feng; LIU Run; YAN Shu-wang

    2011-01-01

    Buckling of submarine pipelines may occur under the axial soil frictional force caused by relative movement of soil and pipeline, which is induced by thermal load and internal pressure. The likelihood of this buckling phenomenon is largely determined by soil resistance. A series of large-scale model tests were carried out to establish a substantial database covering a variety of buried pipeline-soil resistance relationships. Based on the test data, nonlinear soil springs can be adopted to simulate the soil behavior during pipeline movement. For uplift resistance, an ideal elastic-plastic model is recommended for H/D (depth-to-diameter ratio) > 5 and an elastic-softening model for H/D ≤ 5. The soil resistance along the pipeline axial direction can be simulated by an ideal elastic-plastic model. The numerical results show that the capacity of a pipeline against thermal buckling decreases as its initial imperfection grows and increases with burial depth.
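
The recommended spring laws can be sketched as a resistance-displacement function. This is a minimal sketch with illustrative stiffness and resistance values; the paper's test-derived parameters are not reproduced here:

```python
def uplift_spring(disp, h_over_d, k=1000.0, r_ult=10.0, d_soft=0.05, r_res=6.0):
    """Nonlinear uplift soil-spring resistance vs. displacement (units arbitrary).
    Hypothetical parameters: k initial stiffness, r_ult peak resistance,
    d_soft displacement where softening ends, r_res residual resistance.
    H/D > 5 -> ideal elastic-plastic; H/D <= 5 -> elastic with softening."""
    d_el = r_ult / k                      # displacement at peak resistance
    if disp <= d_el:
        return k * disp                   # elastic branch
    if h_over_d > 5:
        return r_ult                      # perfectly plastic plateau (deep burial)
    if disp >= d_soft:
        return r_res                      # residual resistance (shallow burial)
    # linear softening from peak to residual
    frac = (disp - d_el) / (d_soft - d_el)
    return r_ult + frac * (r_res - r_ult)
```

The same elastic-perfectly-plastic branch (the `h_over_d > 5` path) also serves for the axial spring described in the abstract.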

  19. Application of geometry based hysteresis modelling in compensation of hysteresis of piezo bender actuator

    Science.gov (United States)

    Milecki, Andrzej; Pelic, Marcin

    2016-10-01

    This paper presents the results of a study applying a new method of modelling piezo bender actuators. A special hysteresis simulation model was developed and is presented. The model is based on geometrical deformation of the main hysteresis loop. The piezoelectric effect is described and the history of hysteresis modelling is briefly reviewed. First, a simple model of the main loop is proposed. Then a geometrical description of the non-saturated hysteresis is presented and its modelling method is introduced. The modelling uses functions describing the geometrical shape of the two main hysteresis curves, which can be defined theoretically or obtained by measurement. These main curves are stored in memory and transformed geometrically in order to obtain the minor curves. The model was prepared in Matlab-Simulink, but it can be easily implemented in any programming language and applied in an on-line controller. In comparison with other known simulation methods, the one presented in the paper is easy to understand and uses simple arithmetic equations, allowing the inverse hysteresis model to be obtained quickly. The inverse model was then used to compensate the non-saturated hysteresis of a piezo bender actuator, and those results are also presented in the paper.
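
The idea of deriving minor curves by geometric transformation of a stored main branch can be illustrated with a minimal sketch. The branch shape `main_up` and the linear blending rule below are assumptions for illustration, not the paper's actual transform:

```python
def main_up(x):
    """Hypothetical ascending main-loop branch on [0, 1] (would normally be
    a measured lookup table stored in memory)."""
    return x + 0.1 * x * (1.0 - x)

def minor_up(x, x_r, y_r, x_max=1.0):
    """Ascending minor curve after a reversal at (x_r, y_r): shift the stored
    main branch so it passes through the reversal point, with the shift
    decaying to zero at the loop extreme x_max so the curves rejoin there."""
    offset = y_r - main_up(x_r)
    w = (x_max - x) / (x_max - x_r)   # weight: 1 at reversal, 0 at x_max
    return main_up(x) + offset * w
```

By construction the minor curve starts exactly at the reversal point and lands back on the main branch at `x_max`, which is the geometric behavior the abstract describes.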

  20. Dynamic Modeling and Performance Analysis of PMSG based Wind Generation System for Residential Applications

    Directory of Open Access Journals (Sweden)

    Rashmi S

    2014-03-01

    This paper proposes dynamic modeling and performance analysis of a permanent magnet synchronous generator (PMSG) based wind generation system (WGS). The system consists of a wind turbine, PMSG, diode rectifier, buck-boost converter, and voltage source inverter (VSI). The PMSG and buck-boost converter are employed in the WGS to obtain efficient output matching the load requirement without damaging the system. The output of the VSI is injected into the grid and used for home applications. Dynamic simulation results for the proposed model are obtained in MATLAB Simulink.

  1. Constructing a raster-based spatio-temporal hierarchical data model for marine fisheries application

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Marine information has been increasing quickly. Traditional database technologies have disadvantages in manipulating large amounts of marine information, which relates to position in 3-D and time. Recently, greater emphasis has been placed on GIS (geographical information system) to deal with marine information. GIS has shown great success in terrestrial applications in recent decades, but its use in marine fields has been far more restricted. One of the main reasons is that most GIS systems or their data models are designed for land applications and cannot cope well with the nature of the marine environment and marine information. This poses a fundamental challenge to traditional GIS and its data structures. This work designed a data model, the raster-based spatio-temporal hierarchical data model (RSHDM), for marine information systems and for knowledge discovery from spatio-temporal data; it is based on the nature of marine data and overcomes the shortcomings of current spatio-temporal models when they are used in this field. As an experiment, a marine fishery data warehouse (FDW) for marine fishery management was set up based on the RSHDM. The experiment proved that the RSHDM handles the data well and easily extracts the aggregations that management needs at different levels.

  2. Development of web-based reliability data analysis algorithm model and its application

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Seok-Won, E-mail: swhwang@khnp.co.k [Korea Hydro and Nuclear Power Co. Ltd., Jang-Dong 25-1, Yuseong-Gu, 305-343 Daejeon (Korea, Republic of); Oh, Ji-Yong [Korea Hydro and Nuclear Power Co. Ltd., Jang-Dong 25-1, Yuseong-Gu, 305-343 Daejeon (Korea, Republic of); Moosung-Jae [Department of Nuclear Engineering Hanyang University 17 Haengdang, Sungdong, Seoul (Korea, Republic of)

    2010-02-15

    For this study, a database model of plant reliability was developed for the effective acquisition and management of plant-specific data that can be used in various plant programs as well as in Probabilistic Safety Assessment (PSA). Through the development of a web-based reliability data analysis algorithm, this approach systematically gathers specific plant data such as component failure history, maintenance history, and shift diary. First, for the application of the developed algorithm, this study reestablished the raw data types, data deposition procedures, and features of the Enterprise Resource Planning (ERP) system process. The component codes and system codes were standardized to make statistical analysis between different types of plants possible. This standardization contributes to the establishment of a flexible database model that allows the customization of reliability data for various applications depending on component types and systems. In addition, this approach makes it possible for users to perform trend analyses and data comparisons for significant plant components and systems. The algorithm is validated through a comparison of the importance measure value (Fussell-Vesely) from mathematical calculation with that from the algorithm application. The development of a reliability database algorithm is one of the best approaches for providing systematic management of plant-specific reliability data with transparency and continuity. The proposed algorithm reinforces the relationships between raw data and application results so that it can provide a comprehensive database that offers everything from basic plant-related data to final customized data.
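
The Fussell-Vesely importance measure used for validation can be sketched as the fraction of the top-event probability contributed by minimal cut sets containing a given component, under the rare-event approximation (sum of cut-set probabilities). The cut sets and failure probabilities below are hypothetical:

```python
def fussell_vesely(cut_sets, probs, component):
    """Fussell-Vesely importance of `component`.
    cut_sets: list of sets of component names (minimal cut sets);
    probs: dict mapping component name -> failure probability."""
    def cs_prob(cs):
        p = 1.0
        for c in cs:
            p *= probs[c]   # independent basic events assumed
        return p
    top = sum(cs_prob(cs) for cs in cut_sets)             # rare-event approx.
    with_comp = sum(cs_prob(cs) for cs in cut_sets if component in cs)
    return with_comp / top
```

For example, with cut sets {A,B} and {C} and probabilities 0.1, 0.2, 0.01, component A contributes 0.02 of the 0.03 top-event probability, i.e. FV(A) = 2/3.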

  3. A Multitarget Land Use Change Simulation Model Based on Cellular Automata and Its Application

    Directory of Open Access Journals (Sweden)

    Jun Yang

    2014-01-01

    Based on analysis of existing land use change simulation models, combining macro-level land use change driving factors with micro-level local land use competition, and integrating technical approaches such as CA, GIS, AHP, and Markov chains in the Python language, a multitarget land use change simulation model based on cellular automata (CA) is established. This model was applied to scenario simulation of land use/cover change in the Jinzhou New District, based on 1:10000 scale land use, planning, topography, statistics, and other data collected for the years 1988, 2003, and 2012. The simulation results indicate the following: (1) the model can simulate the mutual transformation of multiple land use types in a relatively satisfactory way; it treats the land use system as a whole and simultaneously accounts for land use demand at the macro level and land use suitability at the local scale; and (2) the simulation accuracy of the model reaches 72%, indicating high credibility. The model is capable of providing auxiliary decision-making support for coastal regions through analysis of the land use change driving mechanism, prediction of land use change tendencies, and establishment of sustainable land resource utilization policies.
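
A minimal sketch of the kind of CA transition rule such models combine with macro-level demand: a cell converts when its local suitability times its urban-neighbor fraction exceeds a threshold, capped by a macro demand quota. The rule and parameters are illustrative assumptions, not the paper's calibrated model:

```python
def ca_step(grid, suitability, threshold=0.2, demand=2):
    """One CA step: grid cells are 0 (non-urban) or 1 (urban).
    `demand` is the macro-level cap on conversions this step."""
    rows, cols = len(grid), len(grid[0])
    converted = 0
    new = [row[:] for row in grid]                 # update synchronously
    for i in range(rows):
        for j in range(cols):
            if grid[i][j] == 1 or converted >= demand:
                continue
            # Moore neighborhood, clipped at the grid edges
            nbrs = [grid[a][b] for a in range(max(0, i - 1), min(rows, i + 2))
                               for b in range(max(0, j - 1), min(cols, j + 2))
                               if (a, b) != (i, j)]
            frac_urban = sum(nbrs) / len(nbrs)
            if suitability[i][j] * frac_urban > threshold:
                new[i][j] = 1
                converted += 1
    return new
```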

  4. A Stress Vector-Based Constitutive Model for Cohesionless Soil( Ⅱ )-Application

    Institute of Scientific and Technical Information of China (English)

    史宏彦; 谢定义; 白琳

    2002-01-01

    The stress vector-based constitutive model for cohesionless soil, proposed by SHI Hong-yan et al., was applied to analyze the deformation behaviors of materials subjected to various stress paths. The analysis shows that the constitutive model captures well the main deformation behaviors of cohesionless soil, such as stress-strain nonlinearity, hardening, dilatancy, stress path dependency, non-coaxiality between the principal stress and principal strain increment directions, and the coupling of mean effective and deviatoric stress with deformation. In addition, the model can simultaneously take into account the rotation of the principal stress axes and the influence of the intermediate principal stress on the deformation and strength of soil. The excellent agreement between predicted and measured behavior indicates the comprehensive applicability of the model.

  5. Physiologically based pharmacokinetic modeling using microsoft excel and visual basic for applications.

    Science.gov (United States)

    Marino, Dale J

    2005-01-01

    Physiologically based pharmacokinetic (PBPK) models are mathematical descriptions depicting the relationship between external exposure and internal dose. These models have found great utility for interspecies extrapolation. However, specialized computer software packages, which are not widely distributed, have typically been used for model development and utilization. A few physiological models have been reported using more widely available software packages (e.g., Microsoft Excel), but these tend to include less complex processes and dose metrics. To ascertain the capability of Microsoft Excel and Visual Basic for Applications (VBA) for PBPK modeling, models for styrene, vinyl chloride, and methylene chloride were coded in Advanced Continuous Simulation Language (ACSL), Excel, and VBA, and simulation results were compared. For styrene, differences between ACSL and Excel or VBA compartment concentrations and rates of change were less than +/-7.5E-10 using the same numerical integration technique and time step. Differences using the VBA fixed-step or ACSL Gear's methods were generally small; Excel and VBA PBPK model dose metrics differed by no more than -0.013% or -0.23%, respectively, from ACSL results. These differences are likely attributable to different step sizes rather than different numerical integration techniques. These results indicate that Microsoft Excel and VBA can be useful tools for utilizing PBPK models, and given the availability of these software programs, it is hoped that this effort will help facilitate the use and investigation of PBPK modeling. PMID:20021074
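
The step-size sensitivity the abstract points to can be illustrated with a toy one-compartment kinetic equation, dC/dt = k_in - k_out*C, integrated by fixed-step Euler, the kind of scheme that can be coded identically in a spreadsheet, VBA, or ACSL. The parameters are illustrative, not from the styrene model:

```python
def euler(k_in, k_out, c0, dt, t_end):
    """Fixed-step explicit Euler integration of dC/dt = k_in - k_out * C."""
    c, t = c0, 0.0
    while t < t_end - 1e-12:
        c += dt * (k_in - k_out * c)
        t += dt
    return c

# Same model, two step sizes: the small difference between the results is a
# step-size effect, not a difference in the integration technique.
coarse = euler(1.0, 0.5, 0.0, 0.1, 10.0)
fine = euler(1.0, 0.5, 0.0, 0.01, 10.0)
```

Both runs approach the analytic value 2(1 - e^-5) ≈ 1.9865; their mutual difference is on the order of 0.1%, mirroring the sub-percent dose-metric differences reported in the abstract.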

  6. Web-based Services for Earth Observing and Model Data in National Applications and Hazards

    Science.gov (United States)

    Kafatos, M.; Boybeyi, Z.; Cervone, G.; di, L.; Sun, D.; Yang, C.; Yang, R.

    2005-12-01

    The ever-growing volumes of Earth system science data, collected by Earth observing platforms, in situ stations, and as model output, are increasingly being used by discipline scientists and by wider classes of users. In particular, applications of Earth system science data to environmental and hazards problems, as well as other national applications, require tailored or specialized data and web-based tools and infrastructure. The latter are driven by applications and usage drivers which include ease of access, visualization of complex data, ease of producing value-added data, GIS and open source analysis usage, metadata, etc. Here we present different aspects of such web-based services and access, and discuss several applications in the hazards and environmental areas, including earthquake signatures and observations and model runs of hurricanes. Examples and lessons learned from the Mid-Atlantic Geospatial Information Consortium will be presented. We discuss a NASA-funded, open source on-line data analysis system that is being applied to climate studies for the ESIP Federation. Since enhanced, this project and the next-generation Metadata Integrated Data Analysis System allow users not only to identify data but also to generate new data products on-the-fly. The functionalities extend from limited predefined functions to sophisticated functions described by general-purpose GrADS (Grid Analysis and Display System) commands. The Federation system also allows third-party data products to be combined with local data. Software components are available for converting the output from MIDAS (OPeNDAP) into OGC-compatible software. The on-going Grid efforts at CEOSR and LAITS in the School of Computational Sciences (SCS) include enhancing the functions of Globus to provide support for a geospatial system so the system can share computing power to handle problems with different peak access times and improve the stability and flexibility of a rapid

  7. Copula based prediction models: an application to an aortic regurgitation study

    Directory of Open Access Journals (Sweden)

    Shoukri Mohamed M

    2007-06-01

    Background: An important issue in prediction modeling of multivariate data is the measure of dependence structure. The use of Pearson's correlation as a dependence measure has several pitfalls, and hence regression prediction models based on this correlation may not be an appropriate methodology. As an alternative, a copula-based methodology for prediction modeling and an algorithm to simulate data are proposed. Methods: The method consists of introducing copulas as an alternative to the correlation coefficient commonly used as a measure of dependence. An algorithm based on the marginal distributions of the random variables is applied to construct Archimedean copulas. Monte Carlo simulations are carried out to replicate datasets, estimate prediction model parameters, and validate them using Lin's concordance measure. Results: We carried out a correlation-based regression analysis on data from 20 patients aged 17-82 years on pre-operative and post-operative ejection fractions after surgery and estimated the prediction model: Post-operative ejection fraction = -0.0658 + 0.8403 (Pre-operative ejection fraction); p = 0.0008; 95% confidence interval of the slope coefficient (0.3998, 1.2808). From the exploratory data analysis, it is noted that both the pre-operative and post-operative ejection fraction measurements have slight departures from symmetry and are skewed to the left. It is also noted that the measurements tend to be widely spread and have shorter tails compared to the normal distribution. Therefore predictions made from the correlation-based model for pre-operative ejection fraction measurements in the lower range may not be accurate. Further, the best approximating marginal distributions of pre-operative and post-operative ejection fractions (using q-q plots) are found to be gamma distributions. The copula-based prediction model is estimated as: Post-operative ejection fraction = -0.0933 + 0
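
Archimedean copula simulation of the kind described can be sketched with conditional sampling from a Clayton copula (one common Archimedean family; the abstract does not state which family was fitted). Uniform pairs drawn this way would then be pushed through the fitted gamma marginals via their inverse CDFs:

```python
import random

def clayton_pair(theta, rng):
    """Draw one (u, v) pair from a Clayton copula, theta > 0, by conditional
    inversion: v = ((t^(-theta/(1+theta)) - 1) * u^(-theta) + 1)^(-1/theta)."""
    u = rng.random()
    t = rng.random()
    v = ((t ** (-theta / (1.0 + theta)) - 1.0) * u ** (-theta) + 1.0) ** (-1.0 / theta)
    return u, v

rng = random.Random(42)
pairs = [clayton_pair(5.0, rng) for _ in range(2000)]  # strong lower-tail dependence
```

Unlike a Pearson-correlation model, the Clayton family concentrates dependence in the lower tail, which is where the abstract notes the correlation-based predictions are least reliable.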

  8. Model-based geostatistics

    CERN Document Server

    Diggle, Peter J

    2007-01-01

    Model-based geostatistics refers to the application of general statistical principles of modeling and inference to geostatistical problems. This volume provides a treatment of model-based geostatistics with an emphasis on statistical methods and applications. It also features analyses of datasets from a range of scientific contexts.

  9. UNDERSTANDING THE APPLICABILITY OF LINEAR & NON-LINEAR MODELS USING A CASE-BASED STUDY

    OpenAIRE

    Gaurav Singh Thakur; Anubhav Gupta; Ankur Bhardwaj; Biju R Mohan

    2014-01-01

    This paper uses a case based study – “product sales estimation” on real-time data to help us understand the applicability of linear and non-linear models in machine learning and data mining. A systematic approach has been used here to address the given problem statement of sales estimation for a particular set of products in multiple categories by applying both linear and non-linear machine learning techniques on a data set of selected features from the original data set. Feature ...

  10. Agent Based Fuzzy T-S Multi-Model System and Its Applications

    Directory of Open Access Journals (Sweden)

    Xiaopeng Zhao

    2015-11-01

    Based on the basic concepts of agents and the fuzzy T-S model, an agent-based fuzzy T-S multi-model (ABFT-SMM) system is proposed in this paper. Different from the traditional method, the parameters and the membership value of an agent can be adjusted along with the process. In this system, each agent can be described as a dynamic equation, which can be seen as the local part of the multi-model, and it can execute its task alone or collaborate with other agents to accomplish a fixed goal. It is proved in this paper that the agent-based fuzzy T-S multi-model system can approximate any linear or nonlinear system to arbitrary accuracy. Applications to the benchmark problem of chaotic time series prediction, a water heater system, and a waste-heat utilization process illustrate the viability and efficiency of the approach. At the same time, the method can easily be applied in a number of engineering fields, including identification, nonlinear control, fault diagnostics, and performance analysis.

  11. Establishment of Winter Wheat Regional Simulation Model Based on Remote Sensing Data and Its Application

    Institute of Scientific and Technical Information of China (English)

    MA Yuping; WANG Shili; ZHANG Li; HOU Yingyu; ZHUANG Liwei; WANG Futang

    2006-01-01

    Accurate crop growth monitoring and yield forecasting are significant for food security and the sustainable development of agriculture. Crop yield estimation by remote sensing and crop growth simulation models both have high application potential in crop growth monitoring and yield forecasting, but they have limitations in mechanism and in regional application, respectively. Therefore, approaches and methodologies for combining remote sensing data with crop growth simulation models have attracted many researchers. In this paper, WOFOST (World Food Study), adjusted and regionalized for North China, and SAIL-PROSPECT (Scattering by Arbitrarily Inclined Leaves coupled with a model of leaf optical PROperties SPECTra) were coupled through LAI to simulate the Soil Adjusted Vegetation Index (SAVI) of the crop canopy, by which the crop model was re-initialized by minimizing differences between simulated SAVI and SAVI synthesized from remote sensing data using an optimization software package (FSEOPT). Thus, a regional remote-sensing crop-simulation framework model (WSPFRS) was established under the potential production level (optimal soil water condition). The results were as follows: after re-initializing the regional emergence date using remote sensing data, the anthesis and maturity dates simulated by the WSPFRS model were closer to measured values than the WOFOST results; by re-initializing the regional biomass weight at the turn-green stage, the spatial distribution of simulated storage organ weight was more consistent with measured yields, and the area with high values was nearly consistent with the actual high-yield area. This research is a basis for developing regional crop models under water-stress production levels based on remote sensing data.

  12. A constriction resistance model of conjugated polymer based piezoresistive sensors for electronic skin applications.

    Science.gov (United States)

    Khalili, N; Naguib, H E; Kwon, R H

    2016-05-14

    Human intervention can be replaced through the development of tools based on sensing devices, which have a wide range of applications including humanoid robots and remote or minimally invasive surgery. Similar to the five human senses, sensors interface with their surroundings to stimulate a suitable response or action. The sense of touch, which arises in human skin, is among the most challenging senses to emulate due to its ultra-high sensitivity. This has brought forth novel challenging issues in the field of biomimetic robotics. In this work, using a multiphase reaction, a polypyrrole (PPy) based hydrogel is developed as a resistive-type pressure sensor with an intrinsically elastic microstructure stemming from three-dimensional hollow spheres. It is shown that the electrical conductivity of the fabricated PPy-based piezoresistive sensors is enhanced by adding conductive fillers, endowing the sensors with higher sensitivity. A semi-analytical constriction-resistance-based model is developed that accounts for the real contact area between the PPy hydrogel sensors and the electrode, along with the dependency of the contact resistance change on the applied load. The model is then solved using a Monte Carlo technique and its corresponding sensitivity is obtained. Compared with their experimental counterparts, the results show that the proposed modeling methodology offers good tracking ability. PMID:27035514

  13. Autonomous Reactor Control Using Model Based Predictive Control for Space Propulsion Applications

    International Nuclear Information System (INIS)

    Reliable reactor control is important to reactor safety, in both terrestrial and space systems. For a space system, where the communication time to Earth is significant, autonomous control is imperative. Based on feedback from reactor diagnostics, a controller must be able to automatically adjust to changes in reactor temperature and power level to maintain nominal operation without user intervention. Model-based predictive control (MBPC) (Clarke 1994; Morari 1994) is investigated as a potential control methodology for reactor start-up and transient operation in the presence of an external source. Bragg-Sitton and Holloway (2004) assessed the applicability of MBPC to reactor start-up from a cold, zero-power condition in the presence of a time-varying external radiation source, where large fluctuations in the external radiation source can significantly impact a reactor during start-up operations. The MBPC algorithm applied the point kinetics model to describe the reactor dynamics, using a single group of delayed neutrons; the initial application considered a neutron lifetime of 10^-3 s to simplify calculations during initial controller analysis. The present study specifies the dynamics of a fast reactor more accurately, using a more appropriate fast neutron lifetime (10^-7 s) than in the previous work. Controller stability will also be assessed by carefully considering the dependencies of each component in the defined cost (objective) function and its subsequent effect on the selected 'optimal' control maneuvers.
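
The one-delayed-group point kinetics model named in the abstract can be sketched as a single explicit integration step. Parameter values are illustrative, and the short neutron lifetime (Lambda = 10^-7 s) makes the system stiff, which is exactly why the choice of lifetime matters for the controller analysis:

```python
def point_kinetics_step(n, c, rho, s, dt, beta=0.0065, lam=0.08, Lam=1e-7):
    """One explicit-Euler step of one-delayed-group point kinetics:
        dn/dt = ((rho - beta)/Lambda) * n + lambda * C + S
        dC/dt = (beta/Lambda) * n - lambda * C
    n: neutron density, c: precursor concentration, rho: reactivity,
    s: external source. NOTE: with Lam = 1e-7 the equations are stiff, so an
    explicit step needs dt << Lam; a practical controller would use an
    implicit or analytic update instead."""
    dn = ((rho - beta) / Lam) * n + lam * c + s
    dc = (beta / Lam) * n - lam * c
    return n + dt * dn, c + dt * dc
```

At zero reactivity and zero source, the precursor level C = beta*n/(lambda*Lambda) makes both derivatives vanish, so the state should remain at equilibrium.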

  14. Application of uncertainty reasoning based on cloud model in time series prediction

    Institute of Scientific and Technical Information of China (English)

    张锦春; 胡谷雨

    2003-01-01

    Time series prediction has been successfully used in several application areas, such as meteorological forecasting, market prediction, and network traffic forecasting, and a number of techniques have been developed for modeling and predicting time series. In the traditional exponential smoothing method, a fixed weight is assigned to the data history, and trend changes of the time series are ignored. In this paper, an uncertainty reasoning method based on the cloud model is employed in time series prediction; it uses a cloud logic controller to adjust the smoothing coefficient of the simple exponential smoothing method dynamically to fit the current trend of the time series. The validity of this solution was proved by experiments on various data sets.
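
The idea of dynamically adjusting the smoothing coefficient can be sketched as follows; the sign-of-error adaptation rule below is a crude stand-in for the paper's cloud logic controller:

```python
def adaptive_ses(series, alpha=0.3, lo=0.05, hi=0.95):
    """Simple exponential smoothing with an adaptive coefficient: consecutive
    same-sign forecast errors suggest a trend, so alpha is raised to track
    faster; alternating errors suggest noise, so alpha is lowered.
    Returns the one-step-ahead forecasts."""
    level = series[0]
    forecasts = []
    prev_err = 0.0
    for x in series:
        forecasts.append(level)
        err = x - level
        alpha = min(hi, alpha * 1.2) if err * prev_err > 0 else max(lo, alpha * 0.8)
        level += alpha * err
        prev_err = err
    return forecasts
```

On a flat series the forecasts stay put; on a trending series the adaptive alpha lets the level catch up faster than a fixed small coefficient would.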

  16. Research on application of intelligent computation based LUCC model in urbanization process

    Science.gov (United States)

    Chen, Zemin

    2007-06-01

    Global change study is an interdisciplinary and comprehensive research activity with international cooperation, which arose in the 1980s and has one of the largest scopes. The interaction between land use and cover change (LUCC), as a research field crossing natural science and social science, has become one of the core subjects of global change study as well as its front edge and hot spot. It is necessary to develop research on land use and cover change in the urbanization process and to build an analog model of urbanization in order to describe, simulate, and analyze dynamic behaviors in urban development change as well as to understand the basic characteristics and rules of the urbanization process. This has positive practical and theoretical significance for formulating urban and regional sustainable development strategies. The effect of urbanization on land use and cover change is mainly embodied in changes of the quantity structure and space structure of urban space, and the LUCC model of the urbanization process has been an important research subject of urban geography and urban planning. In this paper, based upon previous research achievements, the writer systematically analyzes research on land use/cover change in the urbanization process with the theories of complexity science and intelligent computation; builds a model for simulating and forecasting the dynamic evolution of urban land use and cover change on the basis of the cellular automata model and multi-agent theory; expands the Markov model, the traditional CA model, and the Agent model, introducing complexity science theory and intelligent computation theory into LUCC research to build an intelligent-computation-based LUCC model for analog research on land use and cover change in urbanization research; and performs case research. The concrete contents are as follows: 1. Complexity of LUCC research in urbanization process.
Analyze urbanization process in combination with the contents

  17. A Novel Application of Agent-based Modeling: Projecting Water Access and Availability Using a Coupled Hydrologic Agent-based Model in the Nzoia Basin, Kenya

    Science.gov (United States)

    Le, A.; Pricope, N. G.

    2015-12-01

    Projections indicate that increasing population density, food production, and urbanization in conjunction with changing climate conditions will place stress on water resource availability. As a result, a holistic understanding of current and future water resource distribution is necessary for creating strategies to identify the most sustainable means of accessing this resource. Currently, most water resource management strategies rely on the application of global climate predictions to physically based hydrologic models to understand potential changes in water availability. However, the need to focus on understanding community-level social behaviors that determine individual water usage is becoming increasingly evident, as predictions derived only from hydrologic models cannot accurately represent the coevolution of basin hydrology and human water and land usage. Models that are better equipped to represent the complexity and heterogeneity of human systems and satellite-derived products in place of or in conjunction with historic data significantly improve preexisting hydrologic model accuracy and application outcomes. We used a novel agent-based sociotechnical model that combines the Soil and Water Assessment Tool (SWAT) and Agent Analyst and applied it in the Nzoia Basin, an area in western Kenya that is becoming rapidly urbanized and industrialized. Informed by a combination of satellite-derived products and over 150 household surveys, the combined sociotechnical model provided unique insight into how populations self-organize and make decisions based on water availability. In addition, the model depicted how population organization and current management alter water availability currently and in the future.

  18. A Decision Model for Supplier Selection based on Business System Management and Safety Criteria and Application of the Model

    Directory of Open Access Journals (Sweden)

    Semih Coşkun

    2015-08-01

    In modern market conditions, sustainable and effective management of the relationships among manufacturers, suppliers, and customers is a necessity for competitiveness. Suppliers must satisfy customer expectations such as cost minimization, quality maximization, improved flexibility, and met deadlines; systematic management of product, work, and environmental safety is also required. The supplier selection process is becoming steadily more complicated as the number of suppliers and of supplier selection criteria increases. Supplier selection decisions, which play an important role in efficient supplier management, become more consistent through the application of decision-making models that integrate quantitative and qualitative evaluation factors. In this study, a dynamic process is designed and modeled for supplier selection. For this purpose, evaluation criteria are established according to the Balanced Scorecard perspectives and to system sustainability and safety requirements. The fuzzy Analytic Hierarchy Process method is used to evaluate the importance of the supplier selection criteria, and a utility range-based interactive group decision-making method is used to select the best supplier. To test the proposed model, a representative from the airport operations sector is selected. The application shows that the proposed model generates consistent results for supplier selection decisions.
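
The criterion-weighting step of crisp AHP, which fuzzy AHP builds on, can be sketched via the geometric-mean (logarithmic least squares) approximation; the pairwise-comparison matrix in the test is hypothetical:

```python
def ahp_weights(matrix):
    """Criterion weights from an AHP pairwise-comparison matrix, where
    matrix[i][j] states how much more important criterion i is than j.
    Geometric-mean approximation of the principal eigenvector, normalized
    to sum to 1."""
    n = len(matrix)
    gmeans = []
    for row in matrix:
        p = 1.0
        for a in row:
            p *= a
        gmeans.append(p ** (1.0 / n))   # geometric mean of the row
    total = sum(gmeans)
    return [g / total for g in gmeans]
```

For a consistent 2x2 matrix saying criterion 1 is twice as important as criterion 2, this yields weights 2/3 and 1/3. Fuzzy AHP replaces the crisp entries with fuzzy numbers before this aggregation step.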

  19. UNDERSTANDING THE APPLICABILITY OF LINEAR & NON-LINEAR MODELS USING A CASE-BASED STUDY

    Directory of Open Access Journals (Sweden)

    Gaurav Singh Thakur

    2014-11-01

    This paper uses a case-based study, "product sales estimation" on real-time data, to help us understand the applicability of linear and non-linear models in machine learning and data mining. A systematic approach has been used to address the given problem of sales estimation for a particular set of products in multiple categories by applying both linear and non-linear machine learning techniques on a data set of selected features from the original data set. Feature selection is a process that reduces the dimensionality of the data set by excluding those features which contribute minimally to the prediction of the dependent variable. The next step is training the model, done using multiple techniques from the linear and non-linear domains, among the best in their respective areas. Data remodeling has then been done to extract new features from the data set by changing its structure, and the performance of the models is checked again. Data remodeling often plays a crucial role in boosting classifier accuracy by changing the properties of the given dataset. We then explore and analyze the reasons why one model performs better than the other, and thereby develop an understanding of the applicability of linear and non-linear machine learning models. Besides this primary goal, we also aim to find the classifier with the best possible accuracy for product sales estimation in the given scenario.

  20. Model-based Reinforcement Learning with Model Error and Its Application

    OpenAIRE

    Tajima, Yoshiyuki; Onisawa, Takehisa

    2010-01-01

    In this chapter, the learning algorithm ME-FPRL is discussed and applied to a target-pursuit task. Application results show that ME-FPRL is more efficient than RL or model-based RL alone. As a result, ME-FPRL is found to be applicable to practical tasks. Our future work is to construct a more efficient learning system by using advice and communication.

  1. A web-based federated neuroinformatics model for surgical planning and clinical research applications in epilepsy.

    Science.gov (United States)

    Cao, Xinhua; Wong, Stephen T C; Hoo, Kent Soo; Tjandra, Donny; Fu, J C; Lowenstein, Daniel H

    2004-01-01

    There is an increasing need to efficiently share diverse clinical and image data among different clinics, labs, and departments of a medical center enterprise to facilitate better quality care and more effective clinical research. In this paper, we describe a web-based, federated information model as a viable technical solution with applications in medically refractory epilepsy and other neurological disorders. We describe four such online applications developed in a federated system prototype: surgical planning, image analysis, statistical data analysis, and dynamic extraction, transforming, and loading (ETL) of data from a heterogeneous collection of data sources into an epilepsy multimedia data warehouse (EMDW). The federated information system adopts a three-tiered architecture, consisting of a user-interface layer, an application logic layer, and a data service layer. We implemented two complementary federated information technologies, i.e., XML (eXtensible Markup Language) and CORBA (Common Object Request Broker Architecture), in the prototype to enable multimedia data exchange and brain image transmission. The preliminary results show that the federated prototype system provides a uniform interface, heterogeneous information integration and efficient data sharing for users in our institution who are concerned with the care of patients with epilepsy and who pursue research in this area.

  2. Applicability of an exposure model for the determination of emissions from mobile phone base stations

    International Nuclear Information System (INIS)

    Applicability of a model to estimate radiofrequency electromagnetic field (RF-EMF) strength in households from mobile phone base stations was evaluated with technical data of mobile phone base stations available from the German Net Agency and with dosimetric measurements performed in an epidemiological study. Estimated exposure and exposure measured with dosemeters in 1322 participating households were compared. For that purpose, the upper 10th percentiles of both outcomes were defined as the 'higher exposed' groups. To assess the agreement of the defined 'higher exposed' groups, the kappa coefficient, sensitivity and specificity were calculated. The present results show only a weak agreement of calculations and measurements (kappa values between -0.03 and 0.28, sensitivity between 7.1 and 34.6). A higher agreement was found only in some of the sub-analyses, e.g. when measured instead of interpolated geo-coordinates were used to calculate the distance between households and base stations, an important parameter in modelling exposure. During the development of the exposure model, more precise input data were available for its internal validation, which yielded kappa values between 0.41 and 0.68 and sensitivity between 55 and 76 for different types of housing areas. In contrast, the calculation of exposure on the basis of the available imprecise data from the epidemiological study is associated with a relatively high degree of uncertainty. Thus, the model can only be applied in epidemiological studies when the uncertainty of the input data is considerably reduced. Otherwise, the use of dosemeters to determine the exposure from RF-EMF in epidemiological studies is recommended. (authors)

  3. Constitutive modeling of a nickel base superalloy -with a focus on gas turbine applications

    Energy Technology Data Exchange (ETDEWEB)

    Almroth, Per

    2003-05-01

    Gas turbines are used where large amounts of energy are needed, typically as engines in aircraft, ferries and power plants. From an efficiency point of view it is desirable to increase the service temperature as much as possible. One of the limiting factors is the maximum allowable metal temperature in the turbine stages, primarily in the blades of the first stage, which are exposed to the highest gas temperatures. Specially designed materials, such as the nickel base superalloy IN792, are used to cope with these severe conditions. In order to design the components for higher temperatures and tighter tolerances, a detailed understanding and computational models of the material behaviour are needed. The models presented in this work have been developed with the objective of being physically well motivated and with the intention of avoiding excessive numbers of parameters; the influence of the parameters should also be as easy as possible to interpret. The models describe the behaviour of IN792 under conditions typically found for a gas turbine blade; specifically, the high- and intermediate-temperature isothermal modelling of IN792 has been addressed. One main issue when characterising the material and calibrating the models is the use of relevant tests that are representative of component conditions. Therefore, isothermal tests with an eye on the typical environment of a turbine blade have been planned and performed. Using numerical optimization techniques, the material parameters for the isothermal behaviour of IN792 at 650 deg and 850 deg have been estimated. The good overall calibration results for these specific temperatures, using the presented modelling concept and non-standard constitutive tests, suggest that the model can describe the behaviour of IN792 in gas turbine hot-part applications.

  4. Anisotropic Sheet Forming Simulations Based on the ALAMEL Model: Application on Cup Deep Drawing and Ironing

    Science.gov (United States)

    Eyckens, P.; Gawad, J.; Xie, Q.; Van Bael, A.; Roose, D.; Samaey, G.; Moerman, J.; Vegter, H.; Van Houtte, P.

    2011-08-01

    The grain interaction ALAMEL model [1] allows predicting the evolution of the crystallographic texture and the accompanying evolution in plastic anisotropy. A FE constitutive law, based on this multilevel model, is presented and assessed for a cup deep drawing process followed by an ironing process. A Numisheet2011 benchmark (BM-1) is used for the application. The FE material model makes use of the Facet plastic potential [2] for a relatively fast evaluation of the yield locus. A multi-scale approach [3] has been recently developed in order to adaptively update the constitutive law by accommodating it to the evolution of the crystallographic texture. The identification procedure of the Facet coefficients, which describe instantaneous plastic anisotropy, is accomplished through virtual testing by means of the ALAMEL model, as described in more detail in the accompanying conference paper [4]. Texture evolution during deformation is included explicitly by re-identification of Facet coefficients in the course of the FE simulation. The focus of this paper lies on the texture-induced anisotropy and the resulting earing profile during both stages of the forming process. For the considered AKDQ steel material, it is seen that texture evolution during deep drawing is such that the anisotropic plastic flow evolves towards a more isotropic flow in the course of deformation. Texture evolution only slightly influences the obtained cup height for this material. The ironing step enlarges the earing height.

  5. Physiologically-based toxicokinetic modeling of zearalenone and its metabolites: application to the Jersey girl study.

    Directory of Open Access Journals (Sweden)

    Dwaipayan Mukherjee

    Full Text Available Zearalenone (ZEA, a fungal mycotoxin, and its metabolite zeranol (ZAL are known estrogen agonists in mammals, and are found as contaminants in food. Zeranol, which is more potent than ZEA and comparable in potency to estradiol, is also added as a growth additive in beef in the US and Canada. This article presents the development and application of a Physiologically-Based Toxicokinetic (PBTK model for ZEA and ZAL and their primary metabolites, zearalenol, zearalanone, and their conjugated glucuronides, for rats and for human subjects. The PBTK modeling study explicitly simulates critical metabolic pathways in the gastrointestinal and hepatic systems. Metabolic events such as dehydrogenation and glucuronidation of the chemicals, which have direct effects on the accumulation and elimination of the toxic compounds, have been quantified. The PBTK model considers urinary and fecal excretion and biliary recirculation and compares the predicted biomarkers of blood, urinary and fecal concentrations with published in vivo measurements in rats and human subjects. Additionally, the toxicokinetic model has been coupled with a novel probabilistic dietary exposure model and applied to the Jersey Girl Study (JGS, which involved measurement of mycoestrogens as urinary biomarkers, in a cohort of young girls in New Jersey, USA. A probabilistic exposure characterization for the study population has been conducted and the predicted urinary concentrations have been compared to measurements considering inter-individual physiological and dietary variability. The in vivo measurements from the JGS fall within the high and low predicted distributions of biomarker values corresponding to dietary exposure estimates calculated by the probabilistic modeling system. The work described here is the first of its kind to present a comprehensive framework developing estimates of potential exposures to mycotoxins and linking them with biologically relevant doses and biomarker

  6. Genetic Modeling of GIS-Based Cell Clusters and Its Application in Mineral Resources Prediction

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    This paper presents a synthetic analysis method for multi-sourced geological data from geographic information systems (GIS). In previous practice of mineral resources prediction, the usual methodology has been statistical analysis of cells delimited on the basis of random sampling. That can lead to insufficient utilization of local spatial information, since a cell is treated as a point without internal structure. We instead take "cell clusters", i.e. spatial associations of cells, as the basic units of statistics, so that the spatial configuration information of geological variables is easier to detect and utilize, and the accuracy and reliability of prediction are improved. We build a linear multi-discriminating model for the clusters via a genetic algorithm; both the rates of correct judgment and the in-class versus between-class distance ratios are considered in forming the evolutionary fitness values of the population. An application of the method to gold mineral resources prediction in east Xinjiang, China is presented.

  7. Modeling dependence based on mixture copulas and its application in risk management

    Institute of Scientific and Technical Information of China (English)

    OUYANG Zi-sheng; LIAO Hui; YANG Xiang-qun

    2009-01-01

    This paper is concerned with statistical modeling of the dependence structure of multivariate financial data using copulas, and with the application of copula functions in VaR valuation. After introducing the pure copula method and the maximum and minimum mixture copula method, the authors present a new algorithm based on more generalized mixture copula functions and the dependence measure, and apply the method to a portfolio of the Shanghai stock composite index and the Shenzhen stock component index. Comparing the results from the various methods, one finds that the mixture copula method outperforms the pure Gaussian copula method and the maximum and minimum mixture copula method at different VaR levels.
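As a minimal illustration of the copula-to-VaR step (not the paper's mixture construction), the sketch below pushes a two-index portfolio through a plain Gaussian copula with assumed logistic marginals and reads off the 99% value-at-risk from the simulated loss distribution. The correlation and marginal parameters are invented for illustration only.

```python
import math
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
rho = 0.6                                   # assumed index dependence
cov = np.array([[1.0, rho], [rho, 1.0]])

# step 1: sample the Gaussian copula -> correlated uniforms
z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
phi = np.vectorize(lambda t: 0.5 * (1.0 + math.erf(t / math.sqrt(2.0))))
u = phi(z)                                  # standard normal CDF of z

# step 2: push the uniforms through assumed logistic marginal return
# distributions, using the closed-form logistic quantile function
def logistic_ppf(u, loc, scale):
    return loc + scale * np.log(u / (1.0 - u))

r1 = logistic_ppf(u[:, 0], 0.0005, 0.010)   # "composite index" returns
r2 = logistic_ppf(u[:, 1], 0.0003, 0.012)   # "component index" returns

# step 3: equal-weight portfolio loss distribution and 99% VaR
port = 0.5 * r1 + 0.5 * r2
var_99 = -np.quantile(port, 0.01)
```

A mixture copula method would replace step 1 with a weighted combination of copula samplers; the marginal and VaR steps are unchanged.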

  8. Novel Component Based Development Model For Sip-Based Mobile Application

    CERN Document Server

    Barnawi, Ahmed; Qureshi, M Rizwan Jameel; Khan, Asif Irshad; 10.5121/ijsea.2012.3107

    2012-01-01

    Universities and institutions these days deal with issues related to the assessment of large numbers of students. Various evaluation methods have been adopted by examiners in different institutions to examine the ability of an individual, ranging from manual means using paper and pencil to electronic, from oral to written, from practical to theoretical, and many others. There is a need to expedite the examination process in order to meet the increasing enrolment of students at universities and institutes. The SIP Based Mass Mobile Examination System (SiBMMES) expedites the examination process by automating various activities in an examination, such as exam paper setting, scheduling and allocating examination time, and evaluation (auto-grading for objective questions). SiBMMES uses the IP Multimedia Subsystem (IMS), an IP communications framework providing an environment for the rapid development of innovative and reusable services. Session Initiation Protocol (SIP) is a signalling (request-response)...

  10. Development, fabrication, and modeling of highly sensitive conjugated polymer based piezoresistive sensors in electronic skin applications

    Science.gov (United States)

    Khalili, Nazanin; Naguib, Hani E.; Kwon, Roy H.

    2016-04-01

    Human intervention can be replaced by tools built on sensing devices, with a wide range of applications including humanoid robots and remote or minimally invasive surgery. Similar to the five human senses, sensors interface with their surroundings to stimulate a suitable response or action. The sense of touch, which arises in human skin, is among the most challenging senses to emulate due to its ultra-high sensitivity, and it brings forth challenging new issues in the field of biomimetic robotics. In this work, using a multiphase reaction, a polypyrrole (PPy) based hydrogel is developed as a resistive-type pressure sensor with an intrinsically elastic microstructure of three-dimensional hollow spheres. Furthermore, a semi-analytical constriction resistance model is developed that accounts for the real contact area between the PPy hydrogel sensors and the electrode, along with the dependency of the contact resistance change on the applied load. The model is then solved using a Monte Carlo technique and the sensitivity of the sensor is obtained. The experimental results show the good tracking ability of the proposed model.

  11. Applications For Real Time NOMADS At NCEP To Disseminate NOAA's Operational Model Data Base

    Science.gov (United States)

    Alpert, J. C.; Wang, J.; Rutledge, G.

    2007-05-01

    A wide range of environmental information, in digital form, with metadata descriptions and supporting infrastructure, is contained in the NOAA Operational Modeling Archive Distribution System (NOMADS) and its Real Time (RT) project prototype at the National Centers for Environmental Prediction (NCEP). NOMADS is now delivering on its goal of a seamless framework, from archival to real-time data dissemination, for NOAA's operational model data holdings. A process is under way to make NOMADS part of NCEP's operational production of products. A goal is to foster collaborations among the research and education communities, value-added retailers, and public access for science and development. In the National Research Council's "Completing the Forecast", Recommendation 3.4 states: "NOMADS should be maintained and extended to include (a) long-term archives of the global and regional ensemble forecasting systems at their native resolution, and (b) re-forecast datasets to facilitate post-processing." As one of many participants in NOMADS, NCEP serves the operational model data base using the data access protocol (OPeNDAP) and other services for participants to serve their data sets and users to obtain them. Using the NCEP global ensemble data as an example, we show an OPeNDAP (also known as DODS) client application that provides a request-and-fulfill mechanism for access to the complex ensemble matrix of holdings. As an example of the DAP service, we show a client application that accesses the global or regional ensemble data set to produce user-selected weather-element event probabilities. The event probabilities are easily extended over model forecast time to show probability histograms defining the future trend of user-selected events. This approach ensures an efficient use of computer resources because users transmit only the data necessary for their tasks. Data sets are served by OPeNDAP, allowing commercial clients such as MATLAB or IDL as well as freeware clients

  12. Central Puget Sound Ecopath/Ecosim model biological parameters - Developing food web models for ecosystem-based management applications in Puget Sound

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This project is developing food web models for ecosystem-based management applications in Puget Sound. It is primarily being done by NMFS FTEs and contractors, in...

  13. Agent-based land-use models: a review of applications

    OpenAIRE

    Matthews, RB; Gilbert, NG; Roach, A.; Polhill, JG; Gotts, NM

    2007-01-01

    Agent-based modelling is an approach that has been receiving attention by the land use modelling community in recent years, mainly because it offers a way of incorporating the influence of human decision-making on land use in a mechanistic, formal, and spatially explicit way, taking into account social interaction, adaptation, and decision-making at different levels. Specific advantages of agent-based models include their ability to model individual decision-making entities and their interact...

  14. SAR Imagery Simulation of Ship Based on Electromagnetic Calculations and Sea Clutter Modelling for Classification Applications

    International Nuclear Information System (INIS)

    Ship detection and classification with space-borne SAR has many potential applications within maritime surveillance, fishery activity management, ship traffic monitoring, and military security. While ship detection techniques with SAR imagery are well established, ship classification is still an open issue. One of the main reasons may be ascribed to the difficulty of acquiring the required quantities of real data of vessels under different observation and environmental conditions with precise ground truth. Therefore, simulation of SAR images with high scenario flexibility and reasonable computation costs is essential for the development of ship classification algorithms. However, the simulation of SAR imagery of a ship over the sea surface is challenging; though great efforts have been devoted to tackling this difficult problem, it is far from being conquered. This paper proposes a novel scheme for SAR imagery simulation of a ship over the sea surface. The simulation is implemented with the high-frequency electromagnetic calculation methods PO, MEC, PTD and GO. SAR imagery of sea clutter is modelled by the representative K-distribution clutter model. The simulated SAR imagery of the ship is then produced by inserting simulated SAR imagery chips of the ship into the SAR imagery of sea clutter. The proposed scheme has been validated with canonical and complex ship targets over a typical sea scene

  15. Enhancement Factors in Ozone Absorption Based on the Surface Renewal Model and its Application

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Based on the Danckwerts surface renewal model, a simple explicit expression of the enhancement factor in ozone absorption with a first order ozone self-decomposition and parallel second order ozonation reactions has been derived. The results are compared with our previous work based on the film theory. The 2,4-dichlorophenol destruction rate by ozonation is predicted using the enhancement factor model in this paper.

  16. Multi-Agent Application System Model Based on UML

    Institute of Scientific and Technical Information of China (English)

    孙华志

    2003-01-01

    In order to guarantee quality and raise the reliability and maintainability of the system, support is needed for designing Agent-based software systems. In view of the consistency of the Agent concept with the Object concept, we analyze the modeling approach of UML. This paper makes a helpful attempt to build a Multi-Agent application system model based on UML, covering descriptions of both static structure and dynamic behaviour. It also lists the major steps and methods of system modeling based on extending UML.

  17. Gradient-based Kriging approximate model and its application research to optimization design

    Institute of Scientific and Technical Information of China (English)

    XUAN Ying; XIANG JunHua; ZHANG WeiHua; ZHANG YuLin

    2009-01-01

    In the process of multidisciplinary design optimization, a computational cost problem arises from frequent calls to high-fidelity system analysis models. High-fidelity system analysis models can be surrogated by approximate models, with which sensitivity analysis and numerical noise filtering can be done easily when they are coupled to optimization. Approximate models reduce the number of executions of the problem's simulation code during optimization, improving the solution efficiency of the multidisciplinary design optimization problem. Most optimization methods are gradient-based, and the gradients of the objective and constraint functions are obtained easily. The gradient-based Kriging (GBK) approximate model can be constructed using system response values and their gradients; the gradients can greatly improve the prediction precision of the system response. A hybrid optimization method is constructed by coupling GBK approximate models to gradient-based optimization methods. An aircraft aerodynamic shape optimization design example indicates the feasibility and validity of these methods.
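A minimal one-dimensional sketch of a gradient-enhanced Kriging predictor, under the usual zero-mean Gaussian-process and squared-exponential kernel assumptions: the joint covariance of the responses and their gradients is assembled, and the gradient observations sharpen the prediction. This is an illustrative toy, not the paper's GBK formulation; the test function and hyperparameters are invented.

```python
import numpy as np

def gek_predict(x_tr, y_tr, g_tr, x_star, length=1.0, nugget=1e-10):
    """Predict f(x_star) from responses y_tr and gradients g_tr at x_tr,
    using a zero-mean GP with squared-exponential kernel."""
    d = x_tr[:, None] - x_tr[None, :]
    k = np.exp(-d**2 / (2.0 * length**2))
    K_yy = k                                          # cov(f_i, f_j)
    K_yg = (d / length**2) * k                        # cov(f_i, f'_j)
    K_gg = (1.0 / length**2 - d**2 / length**4) * k   # cov(f'_i, f'_j)
    K = np.block([[K_yy, K_yg], [K_yg.T, K_gg]])
    K += nugget * np.eye(K.shape[0])                  # numerical jitter
    obs = np.concatenate([y_tr, g_tr])
    ds = x_star - x_tr
    ks = np.exp(-ds**2 / (2.0 * length**2))
    cross = np.concatenate([ks, (ds / length**2) * ks])
    return cross @ np.linalg.solve(K, obs)

# toy usage: surrogate of sin(x) from 4 samples plus exact gradients
x_tr = np.array([0.0, 1.0, 2.0, 3.0])
pred = gek_predict(x_tr, np.sin(x_tr), np.cos(x_tr), 1.5)
```

With the gradient rows removed, the same code reduces to ordinary Kriging; including them typically cuts the prediction error substantially at the same number of response samples.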

  18. A Model for Protein Sequence Evolution Based on Selective Pressure for Protein Stability: Application to Hemoglobins

    OpenAIRE

    Lorraine Marsh

    2009-01-01

    Negative selection against protein instability is a central influence on evolution of proteins. Protein stability is maintained over evolution despite changes in underlying sequences. An empirical all-site stability-based model of evolution was developed to focus on the selection of residues arising from their contributions to protein stability. In this model, site rates could vary. A structure-based method was used to predict stationary frequencies of hemoglobin residues based on their prope...

  19. Security Model for Microsoft Based Mobile Sales Management Application in Private Cloud Computing

    OpenAIRE

    Kuan Chee Houng; Bharanidharan Shanmugam; Ganthan Narayana Samy; Sameer Hasan Albakri; Azuan Ahmad

    2013-01-01

    The Microsoft-based mobile sales management application is a sales force management application currently running on Windows Mobile 6.5. It handles sales-related activity and cuts down the administrative tasks of sales representatives. Microsoft then launched a new mobile operating system, Windows Phone, and stopped providing support for Windows Mobile. This has become an obstacle for Windows Mobile development; over time, Windows Mobile will be eliminated from the market due to no support...

  20. Finite mathematics models and applications

    CERN Document Server

    Morris, Carla C

    2015-01-01

    Features step-by-step examples based on actual data and connects fundamental mathematical modeling skills and decision making concepts to everyday applicability Featuring key linear programming, matrix, and probability concepts, Finite Mathematics: Models and Applications emphasizes cross-disciplinary applications that relate mathematics to everyday life. The book provides a unique combination of practical mathematical applications to illustrate the wide use of mathematics in fields ranging from business, economics, finance, management, operations research, and the life and social sciences.

  1. Proceedings First Workshop on Applications of Membrane computing, Concurrency and Agent-based modelling in POPulation biology

    CERN Document Server

    Milazzo, Paolo; 10.4204/EPTCS.33

    2010-01-01

    This volume contains the papers presented at the first International Workshop on Applications of Membrane Computing, Concurrency and Agent-based Modelling in Population Biology (AMCA-POP 2010) held in Jena, Germany on August 25th, 2010 as a satellite event of the 11th Conference on Membrane Computing (CMC11). The aim of the workshop is to investigate whether formal modelling and analysis techniques could be applied with profit to systems of interest for population biology and ecology. The considered modelling notations include membrane systems, Petri nets, agent-based notations, process calculi, automata-based notations, rewriting systems and cellular automata. Such notations enable the application of analysis techniques such as simulation, model checking, abstract interpretation and type systems to study systems of interest in disciplines such as population biology, ecosystem science, epidemiology, genetics, sustainability science, evolution and other disciplines in which population dynamics and interactions...

  2. Methodology and applications in non-linear model-based geostatistics

    DEFF Research Database (Denmark)

    Christensen, Ole Fredslund

    Today geostatistics is used in a number of research areas, among others agricultural and environmental sciences. This thesis concerns data and applications where the classical Gaussian spatial model is not appropriate. A transformation could be used in an attempt to obtain data that are approximately Gaussian. Conditioned by an underlying and unobserved Gaussian process, the observations at the measured locations follow a generalised linear model. Concerning inference, Markov chain Monte Carlo methods are used. The study of these models is the main topic of the thesis. Construction of priors, and the use of flat...

  3. Inverse Modeling of Human Knee Joint Based on Geometry and Vision Systems for Exoskeleton Applications

    Directory of Open Access Journals (Sweden)

    Eduardo Piña-Martínez

    2015-01-01

    Full Text Available Current trends in robotics aim to close the gap that separates technology and humans, bringing novel robotic devices to improve human performance. Although robotic exoskeletons represent a breakthrough in mobility enhancement, there are design challenges related to the forces exerted on the users' joints that can result in severe injuries. This occurs because most current developments consider the joints as invariant rotational axes. This paper proposes the use of commercial vision systems in order to perform biomimetic joint design for robotic exoskeletons. It proposes a kinematic model based on irregularly shaped cams as the joint mechanism that emulates the bone-to-bone joints of the human body, and follows a geometric approach for determining the location of the instantaneous center of rotation in order to design the cam contours. Furthermore, a commercial vision system is proposed as the main measurement tool, owing to its noninvasive character and because it allows the subjects under measurement to move freely. The application of this method yielded relevant information about the displacements of the instantaneous center of rotation at the human knee joint.
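The geometric core of locating an instantaneous center of rotation (ICR) from tracked marker positions can be sketched as follows: for a planar rigid motion x' = Rx + t, the ICR is the fixed point c = (I - R)^-1 t. The marker coordinates below are synthetic, not data from the paper's vision system.

```python
import numpy as np

def icr_from_markers(p_before, p_after):
    """Estimate the planar rigid transform (R, t) that maps two tracked
    markers p_before -> p_after, then return its fixed point (the ICR)."""
    v0 = p_before[1] - p_before[0]
    v1 = p_after[1] - p_after[0]
    theta = np.arctan2(v1[1], v1[0]) - np.arctan2(v0[1], v0[0])
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    t = p_after[0] - R @ p_before[0]
    # fixed point of x -> R x + t (exists whenever theta != 0)
    return np.linalg.solve(np.eye(2) - R, t)

# synthetic check: rotate two "markers" by 10 degrees about (1.0, 2.0)
center = np.array([1.0, 2.0])
th = np.deg2rad(10.0)
Rot = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
markers = np.array([[0.0, 0.0], [0.5, 1.5]])
moved = (markers - center) @ Rot.T + center
icr = icr_from_markers(markers, moved)   # recovers (1.0, 2.0)
```

Applied frame-by-frame to knee markers, the recovered fixed point traces how the rotation center migrates during flexion, which is the displacement information the study extracts.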

  4. Convergence rates for rank-based models with applications to portfolio theory

    CERN Document Server

    Ichiba, Tomoyuki; Shkolnikov, Mykhaylo

    2011-01-01

    We determine rates of convergence of rank-based interacting diffusions and semimartingale reflecting Brownian motions to equilibrium. Convergence rate for the total variation metric is derived using Lyapunov functions. Sharp fluctuations of additive functionals are obtained using Transportation Cost-Information inequalities for Markov processes. We work out various applications to the rank-based abstract equity markets used in Stochastic Portfolio Theory. For example, we produce quantitative bounds, including constants, for fluctuations of market weights and occupation times of various ranks for individual coordinates. Another important application is the comparison of performance between symmetric functionally generated portfolios and the market portfolio. This produces estimates of probabilities of "beating the market".

  5. Tensor Product Model Transformation-based Controller Design for Gantry Crane Control System – An Application Approach

    Directory of Open Access Journals (Sweden)

    Fetah Kolonic

    2006-10-01

    Full Text Available The Tensor Product (TP) model transformation is a recently proposed technique for transforming given Linear Parameter Varying (LPV) state-space models into polytopic model form, namely, into a parameter-varying convex combination of Linear Time Invariant (LTI) systems. The main advantage of the TP model transformation is that it is executable in a few minutes, and Linear Matrix Inequality (LMI) based control design frameworks can immediately be applied to the resulting polytopic models to yield controllers with tractable and guaranteed performance. Various applications of TP model transformation-based design have been studied via academic complex and benchmark problems, but no study based on a real experimental environment has been published. Thus, the main objective of this paper is to study how the TP model transformation performs in a real-world problem and control setup. The laboratory concept for TP model-based controller design, simulation and real-time execution on an electromechanical system is presented. The development system for the TP model-based controller, with one hardware/software platform, and the target system, with real-time hardware/software support, are connected in a unified system. The proposed system is based on the microprocessor of a personal computer (PC) for simulation and software development as well as for real-time control. The control algorithm, designed and simulated in the MATLAB/SIMULINK environment, uses a graphically oriented software interface for real-time code generation. Some specific conflicting industrial tasks in a real industrial crane application, such as fast load positioning control and load swing angle minimization, are considered and compared with other controller types.
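The numerical core of the TP model transformation, sampling the LPV system over a parameter grid and extracting LTI vertex systems via singular value decomposition, can be sketched for a single parameter as follows. The full method uses a higher-order SVD over several parameters plus a convexity (SN/NN) normalization of the weighting functions, both omitted here; the LPV system itself is an invented toy, not the crane model.

```python
import numpy as np

# toy 1-parameter LPV system A(p), sampled on a grid
p_grid = np.linspace(-1.0, 1.0, 50)
def A(p):
    return np.array([[0.0, 1.0],
                     [-1.0 - 0.5 * p, -0.2 + 0.1 * p]])

# stack vec(A(p)) row-wise; the number of significant singular values
# is the number of LTI vertex systems needed to span the LPV family
S = np.array([A(p).ravel() for p in p_grid])        # shape (50, 4)
U, sv, Vt = np.linalg.svd(S, full_matrices=False)
r = int((sv > 1e-10 * sv[0]).sum())                 # numerical rank

# LTI "vertex" systems and parameter-dependent weighting functions
vertices = [Vt[i].reshape(2, 2) for i in range(r)]
weights = U[:, :r] * sv[:r]                         # shape (50, r)

# reconstruction check at one grid point: A(p) is recovered exactly
k = 17
A_rec = sum(weights[k, i] * vertices[i] for i in range(r))
err = np.abs(A_rec - A(p_grid[k])).max()
```

Since this A(p) is affine in p, two vertex systems suffice (r = 2); the SN/NN step of the real transformation would then reshape the weights into a convex combination so that LMI-based design applies directly.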

  6. Development and application of a model for analysis and design phases of Web-based system development

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    Despite the short history of Web development, Web-related technologies are rapidly developing. However, Web application quality is improving slowly, which calls for efficient methods of developing Web systems. This study presents a model for the analysis and design phases of Web-based software development, based on the ISO/IEC 12207 standard. It describes the methods used to define processes and entities in order to reflect the contents of Web applications. The Web-Road Map methodology of KCC Information and Technology, which uses this model, was applied to a public project. As a result, Web-Road Map proved to be an efficient model for analyzing and designing Web applications.

  7. Recent approaches to quadrupole collectivity: models, solutions and applications based on the Bohr hamiltonian

    Science.gov (United States)

    Buganu, Petricǎ; Fortunato, Lorenzo

    2016-09-01

    We review and discuss several recent approaches to quadrupole collectivity and developments of collective models and their solutions with many applications, examples and references. We focus in particular on analytic and approximate solutions of the Bohr hamiltonian of the last decade, because most of the previously published material has been already reviewed in other publications.

  8. Enhancement Factors in Ozone Absorption Based on the Surface Renewal Model and its Application

    Institute of Scientific and Technical Information of China (English)

    程江; 杨卓如; 陈焕钦; C.H.Kuo; M.E.Zappi

    2000-01-01

    Based on the Danckwerts surface renewal model, a simple explicit expression of the enhancement factor in ozone absorption with a first order ozone self-decomposition and parallel second order ozonation reactions has been derived. The results are compared with our previous work based on the film theory. The 2,4-dichlorophenol destruction rate by ozonation is predicted using the enhancement factor model in this paper.
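The record does not reproduce the paper's explicit expression, but the classical Danckwerts surface renewal result for a pseudo-first-order reaction gives an enhancement factor E = sqrt(1 + Ha²), with squared Hatta number Ha² = k₁·D/k_L². A sketch under that standard assumption (the numerical inputs are illustrative):

```python
import math

def enhancement_factor(k1, D, kL):
    """Danckwerts surface renewal enhancement factor for a
    pseudo-first-order reaction: E = sqrt(1 + k1*D/kL**2)."""
    ha_sq = k1 * D / kL ** 2  # squared Hatta number
    return math.sqrt(1.0 + ha_sq)

# No reaction (k1 = 0) reduces to pure physical absorption, E = 1
E_slow = enhancement_factor(0.0, 2.0e-9, 1.0e-4)
# A fast self-decomposition reaction enhances absorption
E_fast = enhancement_factor(50.0, 2.0e-9, 1.0e-4)
```

The paper's parallel second-order ozonation terms add to this baseline expression; they are omitted here.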

  9. Application of the mass-based UNIQUAC model to membrane systems: A critical revision

    OpenAIRE

    Chovau, Simon; Van der Bruggen, Bart; Luis, Patricia

    2012-01-01

    The UNIQUAC model is very suitable in describing (liquid + liquid) as well as (vapor + liquid) equilibrium for a wide range of systems. It can be extended to (solvent + polymer) systems for describing sorption equilibria. The original model is expressed in molar-based terms, but requires knowledge of structural parameters and molar masses of all components. Since these cannot always be easily determined for membranes, a conversion to mass-based terms is often performed, which eliminates this ...

  10. Measuring Model-Based High School Science Instruction: Development and Application of a Student Survey

    Science.gov (United States)

    Fulmer, Gavin W.; Liang, Ling L.

    2013-02-01

    This study tested a student survey to detect differences in instruction between teachers in a modeling-based science program and comparison group teachers. The Instructional Activities Survey measured teachers' frequency of modeling, inquiry, and lecture instruction. Factor analysis and Rasch modeling identified three subscales, Modeling and Reflecting, Communicating and Relating, and Investigative Inquiry. As predicted, treatment group teachers engaged in modeling and inquiry instruction more than comparison teachers, with effect sizes between 0.55 and 1.25. This study demonstrates the utility of student report data in measuring teachers' classroom practices and in evaluating outcomes of a professional development program.
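The reported effect sizes (0.55 to 1.25) are presumably standardized mean differences of the Cohen's d type, i.e., the group mean difference scaled by a pooled standard deviation. A sketch of that standard computation (the sample scores are invented, not the study's data):

```python
import math

def cohens_d(group1, group2):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

treatment = [4.0, 5.0, 6.0, 5.0]   # hypothetical modeling-frequency scores
comparison = [3.0, 4.0, 4.0, 3.0]
d = cohens_d(treatment, comparison)
```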

  11. An Empirically Based Method of Q-Matrix Validation for the DINA Model: Development and Applications

    Science.gov (United States)

    de la Torre, Jimmy

    2008-01-01

    Most model fit analyses in cognitive diagnosis assume that a Q matrix is correct after it has been constructed, without verifying its appropriateness. Consequently, any model misfit attributable to the Q matrix cannot be addressed and remedied. To address this concern, this paper proposes an empirically based method of validating a Q matrix used…

  12. Modeller subjectivity and calibration impacts on hydrological model applications: an event-based comparison for a road-adjacent catchment in south-east Norway.

    Science.gov (United States)

    Kalantari, Zahra; Lyon, Steve W; Jansson, Per-Erik; Stolte, Jannes; French, Helen K; Folkeson, Lennart; Sassner, Mona

    2015-01-01

    Identifying a 'best' performing hydrologic model in a practical sense is difficult due to the potential influences of modeller subjectivity on, for example, calibration procedure and parameter selection. This is especially true for model applications at the event scale where the prevailing catchment conditions can have a strong impact on apparent model performance and suitability. In this study, two lumped models (CoupModel and HBV) and two physically-based distributed models (LISEM and MIKE SHE) were applied to a small catchment upstream of a road in south-eastern Norway. All models were calibrated to a single event representing typical winter conditions in the region and then applied to various other winter events to investigate the potential impact of calibration period and methodology on model performance. Peak flow and event-based hydrographs were simulated differently by all models leading to differences in apparent model performance under this application. In this case-study, the lumped models appeared to be better suited for hydrological events that differed from the calibration event (i.e., events when runoff was generated from rain on non-frozen soils rather than from rain and snowmelt on frozen soil) while the more physically-based approaches appeared better suited during snowmelt and frozen soil conditions more consistent with the event-specific calibration. This was due to the combination of variations in subsurface conditions over the eight events considered, the subsequent ability of the models to represent the impact of the conditions (particularly when subsurface conditions varied greatly from the calibration event), and the different approaches adopted to calibrate the models. These results indicate that hydrologic models may not only need to be selected on a case-by-case basis but also have their performance evaluated on an application-by-application basis since how a model is applied can be equally important as inherent model structure.
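Event-scale comparisons of simulated and observed hydrographs like these are commonly scored with the Nash-Sutcliffe efficiency (the record does not state which metric the study used). A sketch of that standard score, with an invented hydrograph:

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means the model
    does no better than predicting the observed mean."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

obs = [0.2, 0.8, 1.5, 0.9, 0.4]                  # hypothetical event hydrograph
perfect = nash_sutcliffe(obs, obs)               # perfect simulation
mean_model = nash_sutcliffe(obs, [sum(obs) / len(obs)] * len(obs))
```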

  13. Variable Selection and Updating In Model-Based Discriminant Analysis for High Dimensional Data with Food Authenticity Applications*

    OpenAIRE

    Murphy, Thomas Brendan; Dean, Nema; Raftery, Adrian E.

    2010-01-01

    Food authenticity studies are concerned with determining if food samples have been correctly labeled or not. Discriminant analysis methods are an integral part of the methodology for food authentication. Motivated by food authenticity applications, a model-based discriminant analysis method that includes variable selection is presented. The discriminant analysis model is fitted in a semi-supervised manner using both labeled and unlabeled data. The method is shown to give ...

  14. A temperature-precipitation based leafing model and its application in Northeast China.

    Directory of Open Access Journals (Sweden)

    Rong-Ping Li

    Full Text Available Plant phenology models, especially leafing models, play critical roles in evaluating the impact of climate change on the primary production of temperate plants. Existing models based on temperature alone could not accurately simulate plant leafing in arid and semi-arid regions. The objective of the present study was to test the suitability of the existing temperature-based leafing models in arid and semi-arid regions, and to develop a temperature-precipitation based leafing model (TP), based on the long-term (i.e., 12-27 years) ground leafing observation data and meteorological data in Northeast China. The better simulation of leafing for all the plant species in Northeast China was given by TP with the fixed starting date (TPn) than with the parameterized starting date (TPm), which gave the smallest average root mean square error (RMSE) of 4.21 days. Tree leafing models were validated with independent data, and the coefficient of determination (R2) was greater than 0.60 in 75% of the estimates by TP and the spring warming model (SW) with the fixed starting date. The average RMSE of herb leafing simulated by TPn was 5.03 days, much lower than that of the other models (>9.51 days), while the average R2 of TPn and TPm were 0.68 and 0.57, respectively, much higher than those of the other models (<0.22). This indicates that TPn is a universal model and more suitable for simulating leafing of trees and herbs than the prior models. Furthermore, water is an important factor determining herb leafing in arid and semi-arid temperate regions.
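The RMSE values quoted (e.g., 4.21 days for TPn) compare predicted and observed leafing dates in the usual way. A sketch with invented day-of-year values:

```python
import math

def rmse(observed, predicted):
    """Root mean square error between observed and predicted values."""
    n = len(observed)
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n)

obs_doy = [110, 115, 121]    # hypothetical observed leafing dates (day of year)
pred_doy = [112, 113, 124]   # hypothetical model predictions
error_days = rmse(obs_doy, pred_doy)
```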

  15. Advances In Global Aerosol Modeling Applications Through Assimilation of Satellite-Based Lidar Measurements

    Science.gov (United States)

    Campbell, James; Hyer, Edward; Zhang, Jianglong; Reid, Jeffrey; Westphal, Douglas; Xian, Peng; Vaughan, Mark

    2010-05-01

    Modeling the instantaneous three-dimensional aerosol field and its downwind transport represents an endeavor with many practical benefits foreseeable to air quality, aviation, military and science agencies. The recent proliferation of multi-spectral active and passive satellite-based instruments measuring aerosol physical properties has served as an opportunity to develop and refine the techniques necessary to make such numerical modeling applications possible. Spurred by high-resolution global mapping of aerosol source regions, and combined with novel multivariate data assimilation techniques designed to consider these new data streams, operational forecasts of visibility and aerosol optical depths are now available in near real-time. Active satellite-based aerosol profiling, accomplished using lidar instruments, represents a critical element for accurate analysis and transport modeling. Aerosol source functions, alone, can be limited in representing the macrophysical structure of injection scenarios within a model. Two-dimensional variational (2D-VAR; x, y) assimilation of aerosol optical depth from passive satellite observations significantly improves the analysis of the initial state. However, this procedure can not fully compensate for any potential vertical redistribution of mass required at the innovation step. The expense of an inaccurate vertical analysis of aerosol structure is corresponding errors downwind, since trajectory paths within successive forecast runs will likely diverge with height. In this paper, the application of a newly-designed system for 3D-VAR (x,y,z) assimilation of vertical aerosol extinction profiles derived from elastic-scattering lidar measurements is described [Campbell et al., 2009]. Performance is evaluated for use with the U. S. Navy Aerosol Analysis and Prediction System (NAAPS) by assimilating NASA/CNES satellite-borne Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) 0.532 μm measurements [Winker et al., 2009].

  16. Application of a hazard-based visual predictive check to evaluate parametric hazard models.

    Science.gov (United States)

    Huh, Yeamin; Hutmacher, Matthew M

    2016-02-01

    Parametric models used in time to event analyses are typically evaluated by survival-based visual predictive checks (VPC): Kaplan-Meier survival curves for the observed data are compared with those estimated using model-simulated data. Because the derivative of the log of the survival curve is related to the hazard (the typical quantity modeled in parametric analyses), isolating, interpreting and correcting deficiencies in the hazard model by inspection of survival-based VPCs is indirect and thus more difficult. The purpose of this study is to assess the performance of nonparametric estimators of hazard functions to evaluate their viability as VPC diagnostics. Histogram-based and kernel-smoothing estimators were evaluated in terms of bias in estimating the hazard for Weibull and bathtub-shape hazard scenarios. After the evaluation of bias, these nonparametric estimators were assessed as a method for VPC evaluation of the hazard model. The results showed that the nonparametric hazard estimators performed reasonably at the sample sizes studied, with greater bias near the boundaries (time equal to 0 and last observation) as expected. Flexible bandwidth and boundary correction methods reduced these biases. All the nonparametric estimators indicated a misfit of the Weibull model when the true hazard was a bathtub shape. Overall, hazard-based VPC plots enabled more direct interpretation of the VPC results compared to survival-based VPC plots. PMID:26563504
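A histogram-type hazard estimator of the kind evaluated here divides, in each time bin, the number of events by the at-risk exposure in that bin. A minimal sketch for uncensored event times (the interface is illustrative; it crudely counts everyone at risk at the bin start, which is one source of the boundary bias the paper discusses):

```python
def histogram_hazard(event_times, bin_edges):
    """Piecewise-constant hazard estimate: events per unit at-risk time
    in each bin, for uncensored event-time data."""
    at_risk = len(event_times)
    hazards = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        deaths = sum(1 for t in event_times if lo <= t < hi)
        hazards.append(deaths / (at_risk * (hi - lo)) if at_risk else 0.0)
        at_risk -= deaths
    return hazards

# 3 subjects: two events in [0, 2), one in [2, 4)
h = histogram_hazard([1.0, 1.0, 3.0], [0.0, 2.0, 4.0])
```

Kernel smoothing with flexible bandwidths, as studied in the paper, replaces the hard bins with weighted averages of nearby events.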

  17. Engine Modelling for Control Applications

    DEFF Research Database (Denmark)

    Hendricks, Elbert

    1997-01-01

    In earlier work published by the author and co-authors, a dynamic engine model called a Mean Value Engine Model (MVEM) was developed. This model is physically based and is intended mainly for control applications. In its newer form, it is easy to fit to many different engines and requires little engine data for this purpose. It is especially well suited to embedded model applications in engine controllers, such as nonlinear observer based air/fuel ratio and advanced idle speed control. After a brief review of this model, it will be compared with other similar models which can be found...

  18. A Danger Theory Based Mobile Virus Detection Model and Its Application in Inhibiting Virus

    Directory of Open Access Journals (Sweden)

    Tianliang Lu

    2012-08-01

    Full Text Available According to the propagation and destruction characteristics of mobile phone viruses, a virus detection model based on the Danger Theory is proposed. This model includes four phases: danger capture, antigen presentation, antibody generation and antibody distribution. In this model, local knowledge of mobile phones is exploited by the agents that are running in mobile phones to discover danger caused by viruses. The Antigen Presenting Cells (APCs) present the antigen from mobile phones in the danger zone, and the Decision Center confirms the infection of viruses. After the antibody is generated by self-tolerating using the negative selection algorithm, the Decision Center distributes the antibody to mobile phones. Due to the distributed and cooperative mechanism of artificial immune system, the proposed model lowers the storage and computing consumption of mobile phones. The simulation results show that based on the mobile phone virus detection model, the proposed virus immunization strategy can effectively inhibit the propagation of mobile phone viruses.
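The antibody-generation step relies on the negative selection algorithm: candidate detectors that match any "self" pattern are discarded, so the surviving detectors react only to non-self (potentially viral) behavior. A toy sketch with a simple matching rule (the bit-string encoding and threshold are illustrative, not the paper's):

```python
def negative_selection(self_set, candidates, threshold):
    """Keep only candidate detectors that do NOT match any self pattern.
    Two equal-length bit strings 'match' when they agree in at least
    `threshold` positions."""
    def matches(a, b):
        return sum(x == y for x, y in zip(a, b)) >= threshold

    return [c for c in candidates if not any(matches(c, s) for s in self_set)]

self_patterns = ["0000", "0011"]        # normal phone behavior (toy encoding)
candidates = ["0001", "1111", "1100"]   # randomly generated detector pool
detectors = negative_selection(self_patterns, candidates, threshold=3)
```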

  19. X-parameter Based GaN Device Modeling and its Application to a High-efficiency PA Design

    DEFF Research Database (Denmark)

    Wang, Yelin; Nielsen, Troels Studsgaard; Jensen, Ole Kiel;

    2014-01-01

    X-parameters are supersets of S-parameters and applicable to both linear and nonlinear system modeling. In this paper, a packaged 6 W Gallium Nitride (GaN) RF power transistor is modeled using load-dependent X-parameters by simulations. During the device characterization the load impedance is tuned only up to the 2nd-order harmonic. However, it proves that the model can still accurately approximate the behavior of the transistor under impedance tuning up to the 3rd-order harmonic. The simulation results preliminarily validate the concept of utilizing the X-parameter based modeling technique...

  20. Model Theory and Applications

    CERN Document Server

    Mangani, P

    2011-01-01

    This title includes: Lectures - G.E. Sacks - Model theory and applications, and H.J. Keisler - Constructions in model theory; and, Seminars - M. Servi - SH formulas and generalized exponential, and J.A. Makowski - Topological model theory.

  1. Object-oriented modelling with unified modelling language 2.0 for simple software application based on agile methodology

    CERN Document Server

    Warnars, Spits

    2010-01-01

    Unified modelling language (UML) 2.0, introduced in 2002, has been developing and influencing object-oriented software engineering, and has become a standard and reference for information system analysis and design modelling. There are many concepts and theories for modelling an information system or software application with UML 2.0, which can create ambiguities and inconsistencies for a novice learning how to model a system with UML, especially with UML 2.0. This article discusses how to model a simple software application by using some of the diagrams of UML 2.0, rather than the whole set of diagrams, as suggested by agile methodology. Agile methodology is considered convenient for novices because it can deliver the information technology environment to the end-user quickly and adaptively with minimal documentation. It also has the ability to deliver the best-performing software application according to the customer's needs. Agile methodology will make a simple model with simple documentation, simple team and si...

  2. Modeling and Deployment of Model-Based Decentralized Embedded Diagnosis inside Vehicles: Application to Smart Distance Keeping Function

    Directory of Open Access Journals (Sweden)

    Othman Nasri

    2012-01-01

    Full Text Available The deployment of a fault diagnosis strategy in the Smart Distance Keeping (SDK) system with a decentralized architecture is presented. The SDK system is an advanced Adaptive Cruise Control (ACC) system implemented in a Renault-Volvo Trucks vehicle to increase safety by overcoming some ACC limitations. One of the main differences between this new system and the classical ACC is the choice of the safe distance, i.e., the distance between the vehicle equipped with the ACC or the SDK system and the obstacle-in-front (which may be another vehicle). This distance is assumed fixed in the case of the ACC, but variable in the case of the SDK; its variation depends essentially on the relative velocity between the vehicle and the obstacle-in-front. The main goal of this work is to analyze measurements issued from the SDK elements in order to detect, localize, and identify faults that may occur. Our main contribution is the proposition of a decentralized approach that carries out on-line diagnosis without computing the global model and achieves most of the work locally, avoiding heavy extra diagnostic-information traffic between components. After a detailed description of the SDK system, this paper explains the model-based decentralized solution and its application to the embedded diagnosis of the SDK system inside a Renault-Volvo truck with five control units connected via a CAN-bus, using the "Hardware in the Loop" (HIL) technique. We also discuss the constraints that must be fulfilled.

  3. FANN-Based Surface Water Quality Evaluation Model and Its Application in the Shaoguan Area

    Institute of Scientific and Technical Information of China (English)

    YANG Meini; LI Dingfang; YANG Jinbo; XIONG Wei

    2007-01-01

    A fuzzy neural network model is proposed to evaluate water quality. The model contains two parts: first, fuzzy mathematics theory is used to standardize the samples; second, the RBF neural network and the BP neural network are used to train the standardized samples. The proposed model was applied to assess the water quality of 16 sections in 9 rivers in the Shaoguan area in 2005. The evaluation result was compared with that of the RBF neural network method and the reported results in the Shaoguan area in 2005. It indicated that the performance of the proposed fuzzy neural network model is practically feasible in the application of water quality assessment and its operation is simple.
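The RBF network in such an evaluation model scores a sample by summing Gaussian basis responses centered on prototype patterns. A minimal one-dimensional forward-pass sketch (the centers, widths and weights are invented, not fitted to the Shaoguan data):

```python
import math

def rbf_predict(x, centers, widths, weights):
    """Forward pass of a 1-D RBF network with Gaussian basis functions."""
    return sum(w * math.exp(-((x - c) ** 2) / (2.0 * s ** 2))
               for c, s, w in zip(centers, widths, weights))

# A basis function centred exactly at the query point responds with its
# full weight; distant centres contribute almost nothing.
y = rbf_predict(0.5, centers=[0.5, 10.0], widths=[1.0, 1.0], weights=[2.0, 3.0])
```

In the paper's pipeline, the inputs would first be standardized with the fuzzy-membership step before reaching the network.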

  4. Application of nuclear models

    International Nuclear Information System (INIS)

    The development of extensive experimental nuclear data base over the past three decades has been accompanied by parallel advancement of nuclear theory and models used to describe and interpret the measurements. This theoretical capability is important because of many nuclear data requirements that are still difficult, impractical, or even impossible to meet with present experimental techniques. Examples of such data needs are neutron cross sections for unstable fission products, which are required for neutron absorption corrections in reactor calculations; cross sections for transactinide nuclei that control production of long-lived nuclear wastes; and the extensive dosimetry, activation, and neutronic data requirements to 40 MeV that must accompany development of the Fusion Materials Irradiation Test (FMIT) facility. In recent years systematic improvements have been made in the nuclear models and codes used in data evaluation and, most importantly, in the methods used to derive physically based parameters for model calculations. The newly issued ENDF/B-V evaluated data library relies in many cases on nuclear reaction theory based on compound-nucleus Hauser-Feshbach, preequilibrium and direct reaction mechanisms as well as spherical and deformed optical-model theories. The development and applications of nuclear models for data evaluation are discussed with emphasis on the 1 to 40 MeV neutron energy range

  5. Scenario-based, closed-loop model predictive control with application to emergency vehicle scheduling

    Science.gov (United States)

    Goodwin, Graham. C.; Medioli, Adrian. M.

    2013-08-01

    Model predictive control has been a major success story in process control. More recently, the methodology has been used in other contexts, including automotive engine control, power electronics and telecommunications. Most applications focus on set-point tracking and use single-sequence optimisation. We consider an alternative class of problems, motivated by the scheduling of emergency vehicles, in which disturbances are the dominant feature. We develop a novel closed-loop model predictive control strategy aimed at this class of problems, and we motivate and illustrate the ideas via the problem of fluid deployment of ambulance resources.
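The receding-horizon idea underlying all of this: at each step, optimize a control sequence over a finite horizon, apply only the first move, then re-optimize from the new state. A toy closed-loop sketch on a scalar integrator (the dynamics, cost weights and control options are illustrative, not the paper's scenario-based formulation):

```python
import itertools

def mpc_first_move(x, setpoint, horizon, u_options, u_penalty=0.01):
    """Brute-force receding-horizon step: search all control sequences of
    length `horizon` and return the first move of the cheapest one."""
    best_cost, best_u = None, None
    for seq in itertools.product(u_options, repeat=horizon):
        state, cost = x, 0.0
        for u in seq:
            state += u                                   # integrator dynamics
            cost += (state - setpoint) ** 2 + u_penalty * u * u
        if best_cost is None or cost < best_cost:
            best_cost, best_u = cost, seq[0]
    return best_u

# Closed loop: re-optimize at every step, apply only the first move
x = 0.0
for _ in range(8):
    x += mpc_first_move(x, setpoint=5.0, horizon=3, u_options=(-1.0, 0.0, 1.0))
```

The paper's scenario-based variant would evaluate each candidate sequence against an ensemble of disturbance scenarios rather than a single nominal trajectory.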

  6. Multi-scale Modelling of bcc-Fe Based Alloys for Nuclear Applications

    International Nuclear Information System (INIS)

    …advanced techniques to fit interatomic potentials consistent with thermodynamics are proposed and the results of their application to the mentioned alloys are presented. Next, the development of advanced methods, based on the use of artificial intelligence, to improve both the physical reliability and the computational efficiency of kinetic Monte Carlo codes for the study of point-defect clustering and phase changes beyond the scale of MD, is reported. This recent progress promises to produce, in the near future, reliable tools for describing the microstructure evolution of realistic model alloys under irradiation. (author)

  7. Gradient-based Kriging approximate model and its application research to optimization design

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    In the process of multidisciplinary design optimization, there exists a computational complexity problem due to the frequent calls to high-fidelity system analysis models. High-fidelity system analysis models can be surrogated by approximate models, with which sensitivity analysis and numerical noise filtering can be done easily. Approximate models can reduce the number of executions of the problem's simulation code during optimization, so the solution efficiency of the multidisciplinary design optimization problem can be improved. Most optimization methods are gradient-based, and the gradients of the objective and constraint functions are obtained easily. The gradient-based Kriging (GBK) approximate model can be constructed by using system response values and their gradients; the gradients can greatly improve the prediction precision of the system response. A hybrid optimization method is constructed by coupling GBK approximate models to gradient-based optimization methods. An aircraft aerodynamic shape optimization design example demonstrates the feasibility and validity of the proposed methods.
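GBK augments the usual response-value samples with gradient samples. As a much simpler stand-in for the Kriging machinery, the sketch below blends first-order Taylor expansions around each sample with inverse-distance weights, just to show how gradient information enters a surrogate; this is an illustrative scheme, not the paper's GBK formulation:

```python
def taylor_blend(x, samples):
    """Gradient-using surrogate: inverse-distance-weighted blend of
    first-order Taylor expansions (xi, f(xi), f'(xi)) around each sample."""
    eps = 1e-12
    num = den = 0.0
    for xi, fi, gi in samples:
        d = abs(x - xi)
        if d < eps:
            return fi                       # exact interpolation at samples
        w = 1.0 / d ** 2
        num += w * (fi + gi * (x - xi))     # local Taylor prediction
        den += w
    return num / den

# Samples of f(x) = x**2 with exact gradients 2*x
samples = [(0.0, 0.0, 0.0), (2.0, 4.0, 4.0)]
y_mid = taylor_blend(1.0, samples)
```

Like GBK, this interpolates the sampled responses exactly and uses gradients to shape the prediction between samples; unlike GBK, it carries no statistical error model.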

  8. Developments in model-based optimization and control distributed control and industrial applications

    CERN Document Server

    Grancharova, Alexandra; Pereira, Fernando

    2015-01-01

    This book deals with optimization methods as tools for decision making and control in the presence of model uncertainty. It is oriented to the use of these tools in engineering, specifically in automatic control design with all its components: analysis of dynamical systems, identification problems, and feedback control design. Developments in Model-Based Optimization and Control takes advantage of optimization-based formulations for such classical feedback design objectives as stability, performance and feasibility, afforded by the established body of results and methodologies constituting optimal control theory. It makes particular use of the popular formulation known as predictive control or receding-horizon optimization. The individual contributions in this volume are wide-ranging in subject matter but coordinated within a five-part structure covering material on: · complexity and structure in model predictive control (MPC); · collaborative MPC; · distributed MPC; · optimization-based analysis and desi...

  9. Quality-Driven Model-Based Design of MultiProcessor Embedded Systems for Highlydemanding Applications

    DEFF Research Database (Denmark)

    Jozwiak, Lech; Madsen, Jan

    2013-01-01

    …optimization, adequate resolution of numerous complex design tradeoffs, reduction of the design productivity gap for the increasingly complex and sophisticated systems, reduction of the time-to-market and development costs without compromising the system quality, etc. These challenges cannot be well addressed… MPSoC technology is introduced, and situation, trends and problems are discussed in the area of heterogeneous MPSoCs based on adaptable ASIPs and hardware accelerators for highly-demanding applications.

  10. Application of Transfer Matrix Approach to Modeling and Decentralized Control of Lattice-Based Structures

    Science.gov (United States)

    Cramer, Nick; Swei, Sean Shan-Min; Cheung, Kenny; Teodorescu, Mircea

    2015-01-01

    This paper presents the modeling and control of an aerostructure developed from lattice-based cellular materials/components. The proposed aerostructure concept leverages a building-block strategy for lattice-based components which provides great adaptability to varying flight scenarios, the needs of which are essential for in-flight wing shaping control. A decentralized structural control design is proposed that utilizes the discrete-time lumped mass transfer matrix method (DT-LM-TMM). The objective is to develop an effective reduced order model through DT-LM-TMM that can be used to design a decentralized controller for the structural control of a wing. The proposed approach developed in this paper shows that, as far as the performance of the overall structural system is concerned, the reduced order model can be as effective as the full order model in designing an optimal stabilizing controller.
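The transfer matrix idea behind DT-LM-TMM chains small element matrices so that a state vector (displacement, force) propagates from one end of the structure to the other. A minimal static sketch for springs in series (the lumped-mass, discrete-time machinery of the paper is not reproduced):

```python
from functools import reduce

def matmul2(a, b):
    """2x2 matrix product."""
    return [[a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]],
            [a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]]]

def spring_field_matrix(k):
    """Maps state (displacement, force) across a spring of stiffness k:
    x_out = x_in + F/k, F_out = F_in."""
    return [[1.0, 1.0 / k], [0.0, 1.0]]

# Chain two springs (k = 2 each) and push a unit force through the chain
chain = reduce(matmul2, [spring_field_matrix(2.0), spring_field_matrix(2.0)])
x0, F = 0.0, 1.0
tip_displacement = chain[0][0] * x0 + chain[0][1] * F   # series compliance
```

The dynamic version inserts point matrices for the lumped masses at each station; the chain-of-small-matrices structure is what makes the reduced order model cheap to evaluate.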

  11. Evolving MCDM applications using hybrid expert-based ISM and DEMATEL models: an example of sustainable ecotourism.

    Science.gov (United States)

    Chuang, Huan-Ming; Lin, Chien-Ku; Chen, Da-Ren; Chen, You-Shyang

    2013-01-01

    Ecological degradation is an escalating global threat. Increasingly, people are expressing awareness of and priority for concerns about the environmental problems surrounding them, and environmental protection issues are highlighted. An appropriate information technology tool, the increasingly popular social network system (virtual community, VC), facilitates public education and engagement with applications for existing problems effectively. In particular, the exploration of the involvement behavior of VC member engagement is an interesting topic. Nevertheless, member engagement processes comprise interrelated sub-processes that reflect an interactive experience within VCs as well as the value co-creation model. To address the top-focused ecotourism VCs, this study presents an application of hybrid expert-based ISM and DEMATEL models, based on multi-criteria decision making tools, to investigate the complex multidimensional and dynamic nature of member engagement. Our research findings provide insightful managerial implications and suggest that the viral marketing of ecotourism protection concerns practitioners and academicians alike. PMID:24453902

  13. A new model for the grid size optimization of the finite element method --Based on its application to the water quality modeling of the topographically complicated river

    Institute of Scientific and Technical Information of China (English)

    ZENG Guangming; SU Xiaokang; HUANG Guohe; XIE Gengxin

    2003-01-01

    The finite element method is one of the typical methods used for numerical water quality modeling of topographically complicated rivers. In this paper, based on the principles of probability theory, the probability density of pollutants is introduced. A new model for grid size optimization based on the finite element method is developed, incorporating the maximum information entropy theory, for the case when the length of the grid is given. Combined with the empirical evaluation approach of the flow discharge per unit river width, this model can be used to determine the grid size of the finite element method applied to water quality modeling of a topographically complicated river when the velocity field of the river is not given. The calculation results of applying the model to an ideal river verified the correctness of the model. In a practical case, applying the model to the Xingjian River (the Hengyang section of the Xiangjiang River), the optimized width of the finite element grid was obtained and the influence of parameters was studied, which demonstrated that the model reflected the real situation of the pollutants in the river, and that the model had many excellent characteristics such as stability, credibility and high applicability in practical applications.
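The maximum information entropy principle invoked here can be illustrated directly: for pollutant probability masses p_i over grid cells, the Shannon entropy H = -Σ p_i ln p_i is largest for the uniform distribution, which is the property the grid-size model exploits. A sketch (the link from entropy to the optimized grid width in the paper's model is not reproduced):

```python
import math

def shannon_entropy(p):
    """Shannon entropy H = -sum(p_i * ln p_i) of a discrete distribution."""
    assert abs(sum(p) - 1.0) < 1e-9
    return -sum(pi * math.log(pi) for pi in p if pi > 0.0)

uniform = [0.25] * 4               # pollutant mass spread evenly over 4 cells
peaked = [0.85, 0.05, 0.05, 0.05]  # pollutant mass concentrated in one cell
h_uniform = shannon_entropy(uniform)   # the maximum for 4 cells is ln(4)
h_peaked = shannon_entropy(peaked)
```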

  14. Spatial Rule-Based Modeling: A Method and Its Application to the Human Mitotic Kinetochore

    Directory of Open Access Journals (Sweden)

    Jan Huwald

    2013-07-01

    Full Text Available A common problem in the analysis of biological systems is the combinatorial explosion that emerges from the complexity of multi-protein assemblies. Conventional formalisms, like differential equations, Boolean networks and Bayesian networks, are unsuitable for dealing with the combinatorial explosion, because they are designed for a restricted state space with fixed dimensionality. To overcome this problem, the rule-based modeling language, BioNetGen, and the spatial extension, SRSim, have been developed. Here, we describe how to apply rule-based modeling to integrate experimental data from different sources into a single spatial simulation model and how to analyze the output of that model. The starting point for this approach can be a combination of molecular interaction data, reaction network data, proximities, binding and diffusion kinetics and molecular geometries at different levels of detail. We describe the technique and then use it to construct a model of the human mitotic inner and outer kinetochore, including the spindle assembly checkpoint signaling pathway. This allows us to demonstrate the utility of the procedure, show how a novel perspective for understanding such complex systems becomes accessible and elaborate on challenges that arise in the formulation, simulation and analysis of spatial rule-based models.

  15. Applications of sediment sudden deposition model based on the third-generation numerical model for shallow water wave

    Institute of Scientific and Technical Information of China (English)

    BAI Yuchuan; ZHANG Yinqi; ZHANG Bin

    2007-01-01

    The existing numerical models for nearshore waves are briefly introduced, with particular attention to the third-generation numerical model for shallow water waves, which incorporates the most advanced results of wave research and is well adapted to coastal, lake, and estuarine environments. The applied model yields the significant wave height distribution for different wind directions. By integrating the model with the coastal sediment sudden deposition mechanism, the distribution of average silt content and the change of sediment sudden deposition thickness over time in the nearshore area are simulated. These results offer theoretical guidance for applying the sediment sudden deposition mechanism to stormy waves in coastal areas. Future directions for sediment sudden deposition modeling are also discussed.

  16. A hypergraph-based model for graph clustering: application to image indexing

    OpenAIRE

    Jouili, Salim; Tabbone, Salvatore

    2009-01-01

    In this paper, we introduce a prototype-based clustering algorithm for graphs. We propose a hypergraph-based model for graph data sets that allows clusters to overlap; more precisely, in this representation one graph can be assigned to more than one cluster. Using the concept of the graph median and a given threshold, the proposed algorithm automatically detects the number of classes in the graph database....

  17. Mathematical modeling and simulation in animal health - Part II: principles, methods, applications, and value of physiologically based pharmacokinetic modeling in veterinary medicine and food safety assessment.

    Science.gov (United States)

    Lin, Z; Gehring, R; Mochel, J P; Lavé, T; Riviere, J E

    2016-10-01

    This review provides a tutorial for individuals interested in quantitative veterinary pharmacology and toxicology and offers a basis for establishing guidelines for physiologically based pharmacokinetic (PBPK) model development and application in veterinary medicine. This is important as the application of PBPK modeling in veterinary medicine has evolved over the past two decades. PBPK models can be used to predict drug tissue residues and withdrawal times in food-producing animals, to estimate chemical concentrations at the site of action and target organ toxicity to aid risk assessment of environmental contaminants and/or drugs in both domestic animals and wildlife, as well as to help design therapeutic regimens for veterinary drugs. This review provides a comprehensive summary of PBPK modeling principles, model development methodology, and the current applications in veterinary medicine, with a focus on predictions of drug tissue residues and withdrawal times in food-producing animals. The advantages and disadvantages of PBPK modeling compared to other pharmacokinetic modeling approaches (i.e., classical compartmental/noncompartmental modeling, nonlinear mixed-effects modeling, and interspecies allometric scaling) are further presented. The review finally discusses contemporary challenges and our perspectives on model documentation, evaluation criteria, quality improvement, and offers solutions to increase model acceptance and applications in veterinary pharmacology and toxicology.
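
    As a toy illustration of the PBPK idea (this is not a model from the review, and all parameter values are invented), a flow-limited blood-plus-one-tissue model can be integrated with a simple Euler scheme:

```python
# Toy flow-limited PBPK sketch with one blood and one tissue compartment.
# All parameter values are invented for illustration; they are not from the
# review. Units: volumes in L, flows/clearance in L/h, amounts in mg.
def simulate_pbpk(dose=100.0, t_end=24.0, dt=0.001,
                  Vb=5.0, Vt=30.0, Q=60.0, P=2.0, CL=10.0):
    Ab, At = dose, 0.0               # amount in blood, amount in tissue
    for _ in range(int(t_end / dt)):
        Cb, Ct = Ab / Vb, At / Vt    # concentrations
        flux = Q * (Cb - Ct / P)     # net blood -> tissue flux (flow-limited)
        Ab += (-flux - CL * Cb) * dt # tissue exchange plus clearance
        At += flux * dt
    return Ab, At

Ab, At = simulate_pbpk()
```

    Real PBPK models chain many such tissue compartments, with physiologically measured volumes, blood flows, and partition coefficients in place of these invented numbers.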

  18. Physiologically Based Pharmacokinetic Modeling: Methodology, Applications, and Limitations with a Focus on Its Role in Pediatric Drug Development

    Directory of Open Access Journals (Sweden)

    Feras Khalil

    2011-01-01

    Full Text Available The concept of physiologically based pharmacokinetic (PBPK) modeling was introduced years ago, but it has not been practiced widely. However, interest in and implementation of this modeling technique have grown, as evidenced by the increasing number of publications in the field. This paper briefly outlines the methodology, applications, and limitations of PBPK modeling, with special attention to its use in pediatric drug development, and describes some examples in detail. Although PBPK models do have limitations, their potential benefit is substantial. PBPK models can be applied to investigate drug pharmacokinetics under different physiological and pathological conditions or in different age groups, to support decision-making during drug discovery, and, perhaps most importantly, to provide data that can save time and resources, especially in early drug development phases and in pediatric clinical trials, potentially helping clinical trials become more “confirmatory” rather than “exploratory”.

  19. Developing and Evaluating Creativity Gamification Rehabilitation System: The Application of PCA-ANFIS Based Emotions Model

    Science.gov (United States)

    Su, Chung-Ho; Cheng, Ching-Hsue

    2016-01-01

    This study aims to explore the factors in a patient's rehabilitation achievement after a total knee replacement (TKR) patient exercises, using a PCA-ANFIS emotion model-based game rehabilitation system, which combines virtual reality (VR) and motion capture technology. The researchers combine a principal component analysis (PCA) and an adaptive…

  20. Model based fault diagnosis in a centrifugal pump application using structural analysis

    DEFF Research Database (Denmark)

    Kallesøe, C. S.; Izadi-Zamanabadi, Roozbeh; Rasmussen, Henrik;

    2004-01-01

    A model based approach for fault detection and isolation in a centrifugal pump is proposed in this paper. The fault detection algorithm is derived using a combination of structural analysis, Analytical Redundant Relations (ARR) and observer designs. Structural considerations on the system are used...

  2. ARX-NNPLS Model Based Optimization Strategy and Its Application in Polymer Grade Transition Process

    Institute of Scientific and Technical Information of China (English)

    费正顺; 胡斌; 叶鲁彬; 梁军

    2012-01-01

    Since it is often difficult to build differential algebraic equations (DAEs) for chemical processes, a new data-based modeling approach is proposed using ARX (AutoRegressive with eXogenous inputs) combined with a neural network under a partial least squares framework (ARX-NNPLS), in which little specific knowledge of the process is required beyond the input and output data. To represent the dynamic and nonlinear behavior of the process, the ARX model combined with a neural network is used in the partial least squares (PLS) inner model between input and output latent variables. In the proposed dynamic optimization strategy based on the ARX-NNPLS model, neither parameterization nor an iterative solution of DAEs is needed, as the ARX-NNPLS model properly represents the dynamic behavior of the process, and the computing time is greatly reduced compared to the conventional control vector parameterization method. To demonstrate the ARX-NNPLS model based optimization strategy, the polyethylene grade transition in a gas-phase fluidized-bed reactor is considered. The optimization results show that the optimal trajectory of the quality index determined by the new approach moves faster to the target values and the computing time is much shorter.
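
    The ARX building block of this approach can be sketched generically. The example below is a plain first-order ARX fit by least squares, not the paper's ARX-NNPLS structure, which additionally embeds a neural network inside a PLS inner model:

```python
import numpy as np

# Generic first-order ARX fit, y[t] = a*y[t-1] + b*u[t-1], identified by
# least squares on simulated noise-free data. Only the linear ARX
# ingredient of ARX-NNPLS is shown here.
rng = np.random.default_rng(0)
a_true, b_true = 0.8, 0.5
u = rng.standard_normal(200)                # exogenous input sequence
y = np.zeros(200)
for t in range(1, 200):
    y[t] = a_true * y[t - 1] + b_true * u[t - 1]

Phi = np.column_stack([y[:-1], u[:-1]])     # lagged regressors
a_hat, b_hat = np.linalg.lstsq(Phi, y[1:], rcond=None)[0]
```

    With noise-free data the estimates recover the true coefficients exactly; in practice the regression is solved on measured plant data.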

  3. Improvements in mode-based waveform modeling and application to Eurasian velocity structure

    Science.gov (United States)

    Panning, M. P.; Marone, F.; Kim, A.; Capdeville, Y.; Cupillard, P.; Gung, Y.; Romanowicz, B.

    2006-12-01

    We introduce several recent improvements to mode-based 3D and asymptotic waveform modeling and examine how to integrate them with numerical approaches for an improved model of upper-mantle structure under eastern Eurasia. The first step in our approach is to create a large-scale starting model including shear anisotropy using Nonlinear Asymptotic Coupling Theory (NACT; Li and Romanowicz, 1995), which models the 2D sensitivity of the waveform to the great-circle path between source and receiver. We have recently improved this approach by implementing new crustal corrections which include a non-linear correction for the difference between the average structure of several large regions from the global model with further linear corrections to account for the local structure along the path between source and receiver (Marone and Romanowicz, 2006; Panning and Romanowicz, 2006). This model is further refined using a 3D implementation of Born scattering (Capdeville, 2005). We have made several recent improvements to this method, in particular introducing the ability to represent perturbations to discontinuities. While the approach treats all sensitivity as linear perturbations to the waveform, we have also experimented with a non-linear modification analogous to that used in the development of NACT. This allows us to treat large accumulated phase delays determined from a path-average approximation non-linearly, while still using the full 3D sensitivity of the Born approximation. Further refinement of shallow regions of the model is obtained using broadband forward finite-difference waveform modeling. We are also integrating a regional Spectral Element Method code into our tomographic modeling, allowing us to move beyond many assumptions inherent in the analytic mode-based approaches, while still taking advantage of their computational efficiency. Illustrations of the effects of these increasingly sophisticated steps will be presented.

  4. A Patch-Based Structural Masking Model with an Application to Compression

    Directory of Open Access Journals (Sweden)

    Damon M. Chandler

    2009-01-01

    Full Text Available The ability of an image region to hide or mask a given target signal continues to play a key role in the design of numerous image processing and vision systems. However, current state-of-the-art models of visual masking have been optimized for artificial targets placed upon unnatural backgrounds. In this paper, we (1) measure the ability of natural-image patches to mask distortion; (2) analyze the performance of a widely accepted standard masking model in predicting these data; and (3) report optimal model parameters for different patch types (textures, structures, and edges). Our results reveal that the standard model of masking does not generalize across image types; rather, a proper model should be coupled with a classification scheme that can adapt the model parameters based on the type of content in local image patches. The utility of this adaptive approach is demonstrated via a spatially adaptive compression algorithm which employs patch-based classification. Despite the addition of extra side information and the high degree of spatial adaptivity, this approach yields an efficient wavelet compression strategy that can be combined with very accurate rate-control procedures.

  6. A new crossover sine model based on trigonometric model and its application to the crossover lattice equation of state

    Science.gov (United States)

    Lee, Yongjin; Shin, Moon Sam; Kim, Hwayong

    2008-12-01

    In this study, a new crossover sine model, the new sine model (NSM), was developed from a trigonometric model [M. E. Fisher, S. Zinn, and P. J. Upton, Phys. Rev. B 59, 14533 (1999)]. The trigonometric model is a parametric formulation used to represent the thermodynamic variables near a critical point. Although there are other crossover models based on this trigonometric model, such as the crossover sine model (CSM) and the analytical sine model, which is an analytic formulation of the CSM, the NSM takes a different approach to connecting the parametric variables of the trigonometric model with the thermodynamic variables. In order to test the performance of the NSM, the crossover lattice equation of state [M. S. Shin, Y. Lee, and H. Kim, J. Chem. Thermodyn. 40, 174 (2008)] was applied using the NSM to correlate various pure fluids and fluid mixtures. The results showed that over a wide range of states, the crossover lattice fluid (xLF)/NSM yields the saturated properties of pure fluids and the phase behavior of binary mixtures more accurately than the original lattice equation of state. Moreover, a comparison with the crossover lattice equation of state using the CSM (xLF/CSM) showed that the new model gives correlation results of comparable quality.

  7. Risk Evaluation Approach and Application Research on Fuzzy-FMECA Method Based on Cloud Model

    Directory of Open Access Journals (Sweden)

    Zhengjie Xu

    2013-09-01

    Full Text Available In order to safeguard passengers and reduce maintenance costs, it is necessary to analyze and evaluate the security risk of the Railway Signal System. However, the conventional Fuzzy Analytical Hierarchy Process (FAHP) cannot accurately describe the fuzziness and randomness of a judgment, and once fuzzy sets are described by a membership function, the concept of fuzziness is no longer fuzzy. Thus a Fuzzy-FMECA method based on the cloud model is put forward. The Failure Modes, Effects and Criticality Analysis (FMECA) method is used to identify the risk, and FAHP based on the cloud model is used to determine the membership function in the fuzzy method; finally, the group decision is obtained with the synthetically aggregated cloud model. The method's feasibility and effectiveness are shown in practical examples. Finally, Fuzzy-FMECA based on the cloud model and the conventional FAHP are each used to assess the risk; the evaluation results show that introducing the cloud model into the risk assessment of the Railway Signal System realizes the transition between precise values and qualitative values by combining fuzziness and randomness, and provides richer information than the membership function of the conventional FAHP.

  8. Is equine colic seasonal? Novel application of a model based approach

    Directory of Open Access Journals (Sweden)

    Proudman Christopher J

    2006-08-01

    Full Text Available Abstract Background Colic is an important cause of mortality and morbidity in domesticated horses yet many questions about this condition remain to be answered. One such question is: does season have an effect on the occurrence of colic? Time-series analysis provides a rigorous statistical approach to this question but until now, to our knowledge, it has not been used in this context. Traditional time-series modelling approaches have limited applicability in the case of relatively rare diseases, such as specific types of equine colic. In this paper we present a modelling approach that respects the discrete nature of the count data and, using a regression model with a correlated latent variable and one with a linear trend, we explored the seasonality of specific types of colic occurring at a UK referral hospital between January 1995–December 2004. Results Six- and twelve-month cyclical patterns were identified for all colics, all medical colics, epiploic foramen entrapment (EFE, equine grass sickness (EGS, surgically treated and large colon displacement/torsion colic groups. A twelve-month cyclical pattern only was seen in the large colon impaction colic group. There was no evidence of any cyclical pattern in the pedunculated lipoma group. These results were consistent irrespective of whether we were using a model including latent correlation or trend. Problems were encountered in attempting to include both trend and latent serial dependence in models simultaneously; this is likely to be a consequence of a lack of power to separate these two effects in the presence of small counts, yet in reality the underlying physical effect is likely to be a combination of both. Conclusion The use of a regression model with either an autocorrelated latent variable or a linear trend has allowed us to establish formally a seasonal component to certain types of colic presented to a UK referral hospital over a 10 year period. These patterns appeared to coincide
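
    The cyclical-pattern idea can be illustrated with a plain harmonic regression on simulated monthly counts. Note this sketch omits the correlated latent variable and linear trend that the study itself models, and all numbers are invented:

```python
import numpy as np

# Simulated 10 years of monthly case counts with 12- and 6-month cycles,
# then an ordinary least-squares fit of the two harmonics. The study itself
# used count regression with a correlated latent variable or linear trend,
# which this sketch omits.
rng = np.random.default_rng(1)
months = np.arange(120)
true_rate = (10 + 3 * np.cos(2 * np.pi * months / 12)
                + 1.5 * np.cos(2 * np.pi * months / 6))
counts = rng.poisson(true_rate)

X = np.column_stack([
    np.ones(120),
    np.cos(2 * np.pi * months / 12), np.sin(2 * np.pi * months / 12),
    np.cos(2 * np.pi * months / 6),  np.sin(2 * np.pi * months / 6),
])
coef, *_ = np.linalg.lstsq(X, counts.astype(float), rcond=None)
annual_amplitude = np.hypot(coef[1], coef[2])   # strength of 12-month cycle
```

    A negligible fitted amplitude would correspond to the no-seasonality finding for the pedunculated lipoma group, whereas a clearly non-zero amplitude corresponds to the cyclical patterns found for the other colic groups.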

  9. The Application of PPE Model Based on RAGA in Benefit Evaluating of Rice Water Saving

    Institute of Scientific and Technical Information of China (English)

    FU Qiang; YANG Guang-lin; FU Hong

    2003-01-01

    By applying the PPE model based on RAGA to evaluate the benefits of water saving in rice irrigation, the authors project multi-dimensional data into a low-dimensional space, so that the optimal projection direction best represents the influence on the whole data set. The value of the projection function can then be used to rate each item. The PPE model avoids the interference of the weight matrix used in the fuzzy synthetic judgment method and obtains better results. The authors aim to provide a new method and perspective for readers engaged in investment decision-making for water-saving irrigation and related studies.
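
    A sketch of the projection pursuit idea, with an assumed index (spread of projections times a local-density term) that is not taken from the paper. The paper optimizes the projection direction with RAGA (a real-coded accelerated genetic algorithm); for brevity this sketch substitutes plain random search:

```python
import numpy as np

# Projection pursuit sketch: score candidate projection directions by an
# assumed index and rate items along the best direction found. Index form
# and all data are illustrative, not from the paper; random search stands
# in for the paper's RAGA optimizer.
rng = np.random.default_rng(3)
X = rng.standard_normal((50, 4))            # 50 items, 4 indicators

def projection_index(a, X, R=0.1):
    a = a / np.linalg.norm(a)               # unit projection direction
    y = X @ a                               # 1-D projected scores
    spread = y.std()
    d = np.abs(y[:, None] - y[None, :])     # pairwise score distances
    density = np.sum((R - d) * (d < R))     # local clustering within window R
    return spread * density

candidates = [rng.standard_normal(4) for _ in range(200)]
best = max(candidates, key=lambda a: projection_index(a, X))
scores = X @ (best / np.linalg.norm(best))  # item ratings along best direction
```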

  10. An algorithm of multi-model spatial overlay based on three-dimensional terrain model TIN and its application

    Institute of Scientific and Technical Information of China (English)

    王少安; 张子平; 龚健雅

    2001-01-01

    3D-GIS spatial overlay analysis is of broad concern in international academia and is a research focus; it is one of the important functions of spatial analysis using GIS technology. An algorithm of multi-model spatial overlay based on the three-dimensional terrain model TIN is introduced in this paper, which can be used to solve the TIN-based three-dimensional overlay operation in spatial analysis. The feasibility and validity of this algorithm are demonstrated, and the algorithm has been used successfully in three-dimensional overlay and region variation overlay analysis.

  12. Alternatives to accuracy and bias metrics based on percentage errors for radiation belt modeling applications

    Energy Technology Data Exchange (ETDEWEB)

    Morley, Steven Karl [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-01

    This report reviews existing literature describing forecast accuracy metrics, concentrating on those based on relative errors and percentage errors. We then review how the most common of these metrics, the mean absolute percentage error (MAPE), has been applied in recent radiation belt modeling literature. Finally, we describe metrics based on the ratios of predicted to observed values (the accuracy ratio) that address the drawbacks inherent in using MAPE. Specifically, we define and recommend the median log accuracy ratio as a measure of bias and the median symmetric accuracy as a measure of accuracy.
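
    Assuming the standard definitions of these ratio-based metrics (the sketch below is not code from the report), both are straightforward to compute:

```python
import numpy as np

# Ratio-based alternatives to MAPE, assuming the standard definitions of
# the median log accuracy ratio (bias) and median symmetric accuracy.
def median_log_accuracy_ratio(pred, obs):
    """Median of log(prediction/observation); 0 indicates no bias."""
    q = np.log(np.asarray(pred, float) / np.asarray(obs, float))
    return float(np.median(q))

def median_symmetric_accuracy(pred, obs):
    """100 * (exp(median(|log Q|)) - 1), a symmetric percentage-style error."""
    q = np.abs(np.log(np.asarray(pred, float) / np.asarray(obs, float)))
    return float(100.0 * (np.exp(np.median(q)) - 1.0))

# A uniform factor-of-2 over-prediction gives a bias of log(2) and a
# median symmetric accuracy of 100%, regardless of the data's scale.
obs = np.array([1.0, 10.0, 100.0])
pred = 2.0 * obs
```

    Unlike MAPE, these metrics treat over- and under-prediction symmetrically and are robust to the order-of-magnitude spreads typical of radiation belt fluxes.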

  13. Research and application of mineral resources assessment by weights of evidence model based on SIG

    Institute of Scientific and Technical Information of China (English)

    Yuanyuan Chuai; Keyan Xiao; Yihua Xuan; Shaobin Zhan

    2006-01-01

    Geological data are usually characterized by multiple sources, large volume, and multiple scales. The construction of a Spatial Information Grid overcomes the limitations of personal computers when dealing with such data. The authors introduce the definition, architecture, and workflow of mineral resources assessment by the weights of evidence model based on the Spatial Information Grid (SIG). A case study on the prediction of copper mineral occurrence in the Middle-Lower Yangtze metallogenic belt is given. The results show that mineral resources assessment based on SIG is an effective new method which provides a way of sharing and integrating distributed geospatial information and greatly improves efficiency.

  16. Radioactive Threat Detection with Scattering Physics: A Model-Based Application

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J V; Chambers, D H; Breitfeller, E F; Guidry, B L; Verbeke, J M; Axelrod, M A; Sale, K E; Meyer, A M

    2010-01-21

    The detection of radioactive contraband is a critical problem in maintaining national security for any country. Emissions from threat materials challenge both detection and measurement technologies, especially when concealed by various types of shielding, which complicates the transport physics significantly. The development of a model-based sequential Bayesian processor that captures the underlying transport physics, including scattering, offers a physics-based approach to this challenging problem. It is shown that this processor can be used to develop an effective detection technique.
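
    A heavily simplified sequential Bayesian update for source detection from Poisson photon counts, a toy rather than the report's processor (which models the full transport physics); the count rates here are hypothetical:

```python
from math import exp, factorial

# Toy sequential Bayesian detector: update the posterior probability that a
# radioactive source is present from a stream of Poisson photon counts.
# Rates b (background) and s (source + background) are hypothetical.
def poisson_pmf(k, lam):
    return lam ** k * exp(-lam) / factorial(k)

def sequential_detect(counts, b=2.0, s=6.0, prior=0.5):
    p = prior
    for k in counts:
        l1 = poisson_pmf(k, s)                  # likelihood: source present
        l0 = poisson_pmf(k, b)                  # likelihood: background only
        p = p * l1 / (p * l1 + (1.0 - p) * l0)  # Bayes update after each count
    return p

p_high = sequential_detect([7, 5, 8, 6])  # counts well above background
p_low = sequential_detect([1, 2, 1, 2])   # counts consistent with background
```

    Each measurement sharpens the posterior, so evidence for or against a concealed source accumulates sequentially rather than being judged from a single reading.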

  17. [The application of cybernetic modeling methods for the forensic medical personality identification based on the voice and sounding speech characteristics].

    Science.gov (United States)

    Kaganov, A Sh; Kir'yanov, P A

    2015-01-01

    The objective of the present publication is to discuss the possible application of cybernetic modeling methods to overcome the apparent discrepancy between two kinds of speech records, viz. initial ones (e.g., obtained in the course of special investigation activities) and the voice prints obtained from persons subjected to criminalistic examination. The paper is based on literature sources and on materials from original criminalistic examinations performed by the authors. PMID:26245103

  18. LINKING SATELLITE REMOTE SENSING BASED ENVIRONMENTAL PREDICTORS TO DISEASE: AN APPLICATION TO THE SPATIOTEMPORAL MODELLING OF SCHISTOSOMIASIS IN GHANA

    OpenAIRE

    Wrable, M.; Liss, A.; Kulinkina, A.; Koch, M.; Biritwum, N. K.; Ofosu, A.; Kosinski, K. C.; Gute, D M; Naumova, E. N.

    2016-01-01

    90% of the worldwide schistosomiasis burden falls on sub-Saharan Africa. Control efforts are often based on infrequent, small-scale health surveys, which are expensive and logistically difficult to conduct. The use of satellite imagery to predictively model infectious disease transmission has great potential for public health applications. Transmission of schistosomiasis requires specific environmental conditions to sustain freshwater snails, yet it has unknown seasonality and is difficult to s...

  19. REST based mobile applications

    Science.gov (United States)

    Rambow, Mark; Preuss, Thomas; Berdux, Jörg; Conrad, Marc

    2008-02-01

    Simplicity is the major advantage of REST based webservices. Whereas SOAP is widespread in complex, security-sensitive business-to-business applications, REST is widely used for mashups and end-user centric applications. In that context we give an overview of REST and compare it to SOAP. Furthermore we apply the GeoDrawing application as an example for REST based mobile applications and emphasize pros and cons for the use of REST in mobile application scenarios.

  20. Modelling of the Relaxation Least Squares-Based Neural Networks and Its Application

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    A relaxation least squares-based learning algorithm for neural networks is proposed. Not only does it have a fast convergence rate, it also involves less computation. It is therefore suitable for cases where the network is large but the number of training data is very limited. It has been used in modelling a converting furnace process, and impressive results have been obtained.
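
    The abstract does not spell out the algorithm, but its least-squares ingredient can be illustrated generically: with hidden activations fixed, a network's linear output weights are solved in one least-squares step instead of by iterative gradient descent, which keeps computation low when training data are limited. A minimal sketch under those assumptions:

```python
import numpy as np

# Generic illustration (not the paper's algorithm): fit the linear output
# layer of a network by least squares on fixed random hidden activations.
rng = np.random.default_rng(2)
X = rng.standard_normal((100, 3))        # inputs
W_hidden = rng.standard_normal((3, 8))   # fixed hidden-layer weights
H = np.tanh(X @ W_hidden)                # hidden activations
y = np.sin(X[:, 0])                      # targets

w_out = np.linalg.lstsq(H, y, rcond=None)[0]   # one-shot output weights
fit_error = np.linalg.norm(H @ w_out - y) / np.linalg.norm(y)
```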

  1. Model-based Measures of Output Gap: Application to the Thai Economy

    OpenAIRE

    Vimut Vanitcharearnthum

    2012-01-01

    In this paper we compare two model-based measures of the output gap. The first measure, as proposed by Gali (2011), defines the output gap as the difference between actual output and the level of output that would prevail if the economy operated in a perfectly competitive market without price or wage stickiness. We use annual data on the relevant variables for Thailand to compute the output gap under this approach. The calculated output gap for Thailand shows that the Thai economy performs consistently...

  2. An engineering process for security patterns application in component based models

    OpenAIRE

    Bouaziz, Rahma; Kallel, Slim; Coulette, Bernard

    2013-01-01

    Security engineering with patterns is currently a very active area of research. Security patterns - an adaptation of Design Patterns to security - capture experts' experience in order to solve recurrent security problems in a structured and reusable way. In this paper, our objective is to describe an engineering process, called SCRIP (SeCurity patteRn Integration Process), which provides guidelines for integrating security patterns into component-based models. SCRIP defines activities and pro...

  3. Modeling of Soft sensor based on Artificial Neural Network for Galactic Cosmic Rays Application

    International Nuclear Information System (INIS)

    To design a successful model of the space radiation environment due to Galactic Cosmic Rays (GCRs), we develop a soft sensor based on an Artificial Neural Network (ANN) model. In the first step, the ANN-based soft sensor was constructed as an alternative means of modeling the space radiation environment. The network is a Multilayer Perceptron (MLP) trained with the Levenberg-Marquardt algorithm, with 3 inputs and 2 outputs. As input variables, we use 12 years of GCR particle data (Corr, Uncorr and Press) obtained from the Neutron Monitor of Bartol University (Fort Smith area); the target outputs are Corr and Press from the same source but for the Inuvik area in the polar regions. In the validation step, we obtained Root Mean Square Error (RMSE) values of 3.8670e-004 (Corr) and 1.3414e-004 (Press) and Variance Accounted For (VAF) values of 99.9839% (Corr) and 99.9831% (Press) during the training section. All results were then applied in a Matlab GUI simulation (soft sensor simulation), which displays the estimated output values from the inputs (Corr and Press). Testing showed errors of 0.133% and 0.014% for Corr and Press, respectively.
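
    RMSE and VAF are standard fit metrics; assuming their usual definitions (the sketch is not taken from the record), they can be computed as:

```python
import numpy as np

# Standard definitions assumed: RMSE and Variance Accounted For (VAF),
# the two fit metrics quoted in the record above.
def rmse(y, y_hat):
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    return float(np.sqrt(np.mean((y - y_hat) ** 2)))

def vaf(y, y_hat):
    """VAF in percent: 100 * (1 - var(y - y_hat) / var(y))."""
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    return float(100.0 * (1.0 - np.var(y - y_hat) / np.var(y)))

y = np.array([1.0, 2.0, 3.0, 4.0])   # illustrative targets
```

    A perfect prediction gives RMSE 0 and VAF 100%, matching the near-100% VAF values reported for the trained soft sensor.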

  4. Model-based analysis for qualitative data: an application in Drosophila germline stem cell regulation.

    Directory of Open Access Journals (Sweden)

    Michael Pargett

    2014-03-01

    Full Text Available Discovery in developmental biology is often driven by intuition that relies on the integration of multiple types of data such as fluorescent images, phenotypes, and the outcomes of biochemical assays. Mathematical modeling helps elucidate the biological mechanisms at play as the networks become increasingly large and complex. However, the available data is frequently under-utilized due to incompatibility with quantitative model tuning techniques. This is the case for stem cell regulation mechanisms explored in the Drosophila germarium through fluorescent immunohistochemistry. To enable better integration of biological data with modeling in this and similar situations, we have developed a general parameter estimation process to quantitatively optimize models with qualitative data. The process employs a modified version of the Optimal Scaling method from social and behavioral sciences, and multi-objective optimization to evaluate the trade-off between fitting different datasets (e.g. wild type vs. mutant). Using only published imaging data in the germarium, we first evaluated support for a published intracellular regulatory network by considering alternative connections of the same regulatory players. Simply screening networks against wild type data identified hundreds of feasible alternatives. Of these, five parsimonious variants were found and compared by multi-objective analysis including mutant data and dynamic constraints. With these data, the current model is supported over the alternatives, but support for a biochemically observed feedback element is weak (i.e. these data do not measure the feedback effect well). When also comparing new hypothetical models, the available data do not discriminate. To begin addressing the limitations in data, we performed a model-based experiment design and provide recommendations for experiments to refine model parameters and discriminate increasingly complex hypotheses.
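The multi-objective comparison step in this record can be illustrated with a Pareto-dominance filter: candidate models are scored against two datasets (e.g. wild-type and mutant errors) and only non-dominated candidates are kept. The model names and scores below are invented; this is a sketch of the trade-off idea, not the paper's estimation pipeline.

```python
# Hypothetical (wild-type error, mutant error) pairs for four candidates.
candidates = {
    "model_A": (0.12, 0.40),
    "model_B": (0.15, 0.22),
    "model_C": (0.30, 0.21),
    "model_D": (0.16, 0.45),   # worse than model_A on both objectives
}

def pareto_front(scored):
    """Keep candidates not dominated on every objective by another."""
    def dominates(a, b):
        return all(x <= y for x, y in zip(a, b)) and a != b
    return {name for name, s in scored.items()
            if not any(dominates(other, s) for other in scored.values())}

front = pareto_front(candidates)
```

Here `model_D` is dominated by `model_A` and drops out, while the remaining three expose the fit trade-off between the two datasets.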

  5. Monitoring of rainfall by ground-based passive microwave systems: models, measurements and applications

    Directory of Open Access Journals (Sweden)

    F. S. Marzano

    2005-01-01

    Full Text Available A large set of ground-based multi-frequency microwave radiometric simulations and measurements during different precipitation regimes are analysed. Simulations are performed for a set of frequencies from 22 to 60 GHz, representing the channels currently available on an operational ground-based radiometric system. Results are illustrated in terms of comparisons between measurements and model data in order to show that the observed radiometric signatures can be attributed to rainfall scattering and absorption. An inversion algorithm, based on the simulated data, has been developed to retrieve rain rate from passive radiometric observations. As a validation of the approach, we have analyzed radiometric measurements during rain events that occurred in Boulder, Colorado, and at the Atmospheric Radiation Measurement (ARM) Program's Southern Great Plains (SGP) site in Lamont, Oklahoma, USA, comparing rain rate estimates with available simultaneous rain gauge data.

  6. Downsizer - A Graphical User Interface-Based Application for Browsing, Acquiring, and Formatting Time-Series Data for Hydrologic Modeling

    Science.gov (United States)

    Ward-Garrison, Christian; Markstrom, Steven L.; Hay, Lauren E.

    2009-01-01

    The U.S. Geological Survey Downsizer is a computer application that selects, downloads, verifies, and formats station-based time-series data for environmental-resource models, particularly the Precipitation-Runoff Modeling System. Downsizer implements the client-server software architecture. The client presents a map-based, graphical user interface that is intuitive to modelers; the server provides streamflow and climate time-series data from over 40,000 measurement stations across the United States. This report is the Downsizer user's manual and provides (1) an overview of the software design, (2) installation instructions, (3) a description of the graphical user interface, (4) a description of selected output files, and (5) troubleshooting information.

  7. Application of model-based spectral analysis to wind-profiler radar observations

    Directory of Open Access Journals (Sweden)

    E. Boyer

    Full Text Available A classical way to reduce a radar's data is to compute the spectrum using FFT and then to identify the different peak contributions. But when an overlap between the different echoes (atmospheric echo, clutter, hydrometeor echo, etc.) exists, Fourier-like techniques provide poor frequency resolution, and then sophisticated peak-identification may not be able to detect the different echoes. In order to improve the number of reduced data and their quality relative to Fourier spectrum analysis, three different methods are presented in this paper and applied to actual data. Their approach consists of predicting the main frequency components, which avoids the development of very sophisticated peak-identification algorithms. The first method is based on cepstrum properties, generally used to determine the shift between two close identical echoes. We will see in this paper that this method cannot provide a better estimate than Fourier-like techniques in operational use. The second method consists of an autoregressive estimation of the spectrum. Since the tests were promising, this method was applied to reduce the radar data obtained during two thunderstorms. The autoregressive method, which is very simple to implement, improved the Doppler-frequency data reduction relative to the FFT spectrum analysis. The third method exploits the MUSIC algorithm, one of the numerous subspace-based methods, which is well adapted to estimating spectra composed of pure lines. A statistical study of the performance of this method is presented, and points out the very good resolution of this estimator in comparison with Fourier-like techniques. Application to actual data confirms the good qualities of this estimator for reducing radar data.

    Key words. Meteorology and atmospheric dynamics (tropical meteorology) - Radio science (signal processing) - General (techniques applicable in three or more fields)
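The second method in this record, autoregressive spectrum estimation, can be sketched with a Yule-Walker fit on a synthetic two-tone signal. The sampling rate, tone frequencies, and model order below are invented for illustration; they are not the paper's radar parameters.

```python
import numpy as np

fs = 100.0
t = np.arange(512) / fs
rng = np.random.default_rng(1)
# Two close spectral lines plus noise, loosely mimicking overlapping echoes.
x = (np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)
     + 0.1 * rng.standard_normal(t.size))

def yule_walker(x, order):
    """Fit AR coefficients via the Yule-Walker equations."""
    x = x - x.mean()
    r = np.correlate(x, x, 'full')[x.size - 1:] / x.size   # biased autocorrelation
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:order + 1])                 # AR coefficients
    sigma2 = r[0] - a @ r[1:order + 1]                     # innovation variance
    return a, sigma2

def ar_spectrum(a, sigma2, freqs, fs):
    """AR power spectrum: sigma^2 / |1 - sum_k a_k e^{-i 2 pi f k / fs}|^2."""
    z = np.exp(-2j * np.pi * np.outer(freqs / fs, np.arange(1, a.size + 1)))
    return sigma2 / np.abs(1 - z @ a) ** 2

a, s2 = yule_walker(x, order=8)
freqs = np.linspace(0.0, fs / 2, 500)
psd = ar_spectrum(a, s2, freqs, fs)
peak = freqs[np.argmax(psd)]
```

The estimated spectrum peaks near the dominant tone, which is the sharp-line behavior that makes parametric estimators attractive relative to the FFT here.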

  8. Modeling non-saturated ferrite-based devices: Application to twin toroid ferrite phase shifters

    Science.gov (United States)

    Le Gouellec, A.; Vérissimo, G.; Laur, V.; Queffelec, P.; Albert, I.; Girard, T.

    2016-08-01

    This article describes a new set of tools developed to improve the conception and modeling of non-saturated ferrite-based devices such as twin toroid phase shifters. These new simulation tools benefit from a generalized permeability tensor model able to describe the permeability tensor of a ferrite sample whatever its magnetization state. This model is coupled to a homemade 3D multi-scale magnetostatic analysis program, which describes the evolution of the magnetization through the definition of a hysteresis loop in every mesh cell. These computed spectra are then integrated into 3D electromagnetic simulation software that retains the spatial variations of the ferrite properties by using freshly developed macro programming functions. This new approach allows the designers to accurately model complex ferrite devices such as twin toroid phase shifters. In particular, we demonstrated a good agreement between simulated and measured phase shifts as a function of applied current values with a predicted maximum phase shift of 0.96 times the measured value.

  9. RECURRENT NEURAL NETWORK MODEL BASED ON PROJECTIVE OPERATOR AND ITS APPLICATION TO OPTIMIZATION PROBLEMS

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    The recurrent neural network (RNN) model based on a projective operator was studied. Unlike earlier studies, the value region of the projective operator in the neural network considered here is a general closed convex subset of n-dimensional Euclidean space and not, in general, a compact convex set; that is, the value region of the projective operator may be unbounded. It was proved that the network has a global solution and that its solution trajectory converges to some equilibrium set whenever the objective function satisfies certain conditions. The model was then applied to continuously differentiable optimization and to nonlinear or implicit complementarity problems. In addition, simulation experiments confirm the efficiency of the RNN.
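A discrete-time sketch of such a projection-based network: the state repeatedly takes a gradient step on a convex objective and is projected back onto a closed convex set (a box here, though the paper's sets need not be bounded). The objective and set below are illustrative only.

```python
import numpy as np

def grad(x):
    """Gradient of f(x) = ||x - c||^2 with target c outside the feasible box."""
    c = np.array([2.0, -3.0])
    return 2 * (x - c)

def project(x, lo, hi):
    """Projective operator onto the box [lo, hi]^n."""
    return np.clip(x, lo, hi)

# Discrete analogue of the network dynamics: x <- P(x - a * grad f(x)).
x = np.zeros(2)
for _ in range(200):
    x = project(x - 0.1 * grad(x), -1.0, 1.0)
```

The trajectory settles at the projection of the unconstrained minimizer onto the box, i.e. the constrained optimum (1, -1), matching the convergence-to-equilibrium behavior the record describes.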

  10. New Combined Weighting Model Based on Maximizing the Difference in Evaluation Results and Its Application

    Directory of Open Access Journals (Sweden)

    Bin Meng

    2015-01-01

    Full Text Available This paper presents an approach for weighting indices in the comprehensive evaluation. In accordance with the principle that the entire difference of various evaluation objects is to be maximally differentiated, an adjusted weighting coefficient is introduced. Based on the idea of maximizing the difference between the adjusted evaluation scores of each evaluation object and their mean, an objective programming model is established with more obvious differentiation between evaluation scores and the combined weight coefficient determined, thereby avoiding contradictory and less distinguishable evaluation results of single weighting methods. The proposed model is demonstrated using 2,044 observations. The empirical results show that the combined weighting method has the least misjudgment probability, as well as the least error probability, when compared with four single weighting methods, namely, G1, G2, variation coefficient, and deviation methods.
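The weighting idea in this record can be illustrated, very loosely, by grid-searching a combination coefficient that maximizes the spread of composite evaluation scores. The index scores and the two single-method weight vectors below are made up, and the grid search stands in for the paper's objective programming model.

```python
import numpy as np

rng = np.random.default_rng(2)
S = rng.random((50, 4))                # 50 evaluation objects, 4 indices
w1 = np.array([0.4, 0.3, 0.2, 0.1])   # e.g. a subjective (G1-type) weighting
w2 = np.full(4, 0.25)                  # e.g. an equal/objective-type weighting

best_alpha, best_var = 0.0, -np.inf
for alpha in np.linspace(0.0, 1.0, 101):
    w = alpha * w1 + (1 - alpha) * w2  # combined weight vector
    v = (S @ w).var()                  # spread of the evaluation results
    if v > best_var:
        best_alpha, best_var = alpha, v

w = best_alpha * w1 + (1 - best_alpha) * w2
```

Because the grid includes both endpoints, the combined weighting never differentiates the objects less than either single method alone, which is the spirit of the maximizing-difference criterion.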

  11. A Common Reasoning Model and Its Application in Knowledge-Based System

    Institute of Scientific and Technical Information of China (English)

    郑方青

    1991-01-01

    To use reasoning knowledge accurately and efficiently, many reasoning methods have been proposed. However, the differences in form among these methods may obstruct their systematic analysis and harmonious integration. In this paper, a common reasoning model, JUM (Judgement Model), is introduced. Under JUM, a common knowledge representation form is abstracted from different reasoning methods and its limitations are reduced. We also propose an algorithm for transforming one type of JUM into another. In some cases, the algorithm can be used to resolve the key problem of integrating different types of JUM in one system. It is possible that a new architecture of knowledge-based systems can be realized under JUM.

  12. A Critical Review on Wind Turbine Power Curve Modelling Techniques and Their Applications in Wind Based Energy Systems

    Directory of Open Access Journals (Sweden)

    Vaishali Sohoni

    2016-01-01

    Full Text Available Power curve of a wind turbine depicts the relationship between output power and hub height wind speed and is an important characteristic of the turbine. Power curve aids in energy assessment, warranty formulations, and performance monitoring of the turbines. With the growth of wind industry, turbines are being installed in diverse climatic conditions, onshore and offshore, and in complex terrains causing significant departure of these curves from the warranted values. Accurate models of power curves can play an important role in improving the performance of wind energy based systems. This paper presents a detailed review of different approaches for modelling of the wind turbine power curve. The methodology of modelling depends upon the purpose of modelling, availability of data, and the desired accuracy. The objectives of modelling, various issues involved therein, and the standard procedure for power performance measurement with its limitations have therefore been discussed here. Modelling methods described here use data from manufacturers’ specifications and actual data from the wind farms. Classification of modelling methods, various modelling techniques available in the literature, model evaluation criteria, and application of soft computing methods for modelling are then reviewed in detail. The drawbacks of the existing methods and future scope of research are also identified.
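One of the simplest parametric forms covered by such reviews is the piecewise power curve with cut-in, rated, and cut-out regions. The turbine numbers below are illustrative only, not from any warranted curve.

```python
import numpy as np

def power_curve(v, v_in=3.0, v_rated=12.0, v_out=25.0, p_rated=2000.0):
    """Piecewise power curve: zero below cut-in and past cut-out,
    cubic rise (P ~ v^3 physics) up to rated speed, then flat."""
    v = np.asarray(v, dtype=float)
    p = np.zeros_like(v)
    rising = (v >= v_in) & (v < v_rated)
    p[rising] = p_rated * (v[rising]**3 - v_in**3) / (v_rated**3 - v_in**3)
    p[(v >= v_rated) & (v <= v_out)] = p_rated
    return p

v = np.linspace(0.0, 30.0, 301)   # hub-height wind speed, m/s
p = power_curve(v)                # output power, kW
```

Data-driven methods from the review would fit such a curve (or a nonparametric one) to measured wind speed/power pairs instead of assuming the cubic shape.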

  13. Application of Hyperelastic-based Active Mesh Model in Cardiac Motion Recovery.

    Science.gov (United States)

    Yousefi-Banaem, Hossein; Kermani, Saeed; Daneshmehr, Alireza; Saneie, Hamid

    2016-01-01

    Considering the nonlinear hyperelastic or viscoelastic nature of soft tissues has an important effect on modeling results. In medical applications, accounting for nonlinearity begets an ill-posed problem, due to the absence of external force. The myocardium can be considered a hyperelastic material, and variational approaches are proposed to estimate the stiffness matrix, taking into account the linear and nonlinear properties of the myocardium. By estimating the displacement of some points in the four-dimensional cardiac magnetic resonance imaging series using a similarity criterion, the elementary deformations are estimated; then, using the Moore-Penrose inverse matrix approach, all point deformations are obtained. Through this process, cardiac wall motion is quantized to mechanically determine local parameters for investigating cardiac wall functionality. The process was implemented and tested on 10 healthy subjects and 20 patients with myocardial infarction. In all patients, the process was able to precisely determine the affected region. The proposed approach was also compared with a linear one, and the results demonstrated its superiority with respect to the linear model. PMID:27563570
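The Moore-Penrose step in this record can be sketched as recovering a full displacement field from a few tracked points via a linear basis and the pseudoinverse. The affine displacement field and random tracked points below are illustrative; they are not the paper's cardiac mesh model.

```python
import numpy as np

rng = np.random.default_rng(4)

# Displacement modeled as an affine field u(p) = (A - I) p + t in 2D.
pts = rng.uniform(0, 1, (10, 2))              # tracked point positions
A_true = np.array([[1.1, 0.1], [0.0, 0.9]])
t_true = np.array([0.05, -0.02])
disp = pts @ A_true.T + t_true - pts          # observed displacements

def design(p):
    """Design-matrix rows for the unknowns [M11, M12, M21, M22, t1, t2]."""
    x, y = p
    return np.array([[x, y, 0, 0, 1, 0],
                     [0, 0, x, y, 0, 1]])

M = np.vstack([design(p) for p in pts])
b = disp.ravel()
theta = np.linalg.pinv(M) @ b                 # Moore-Penrose solution

# Predict the displacement at an unseen point.
q = np.array([0.3, 0.7])
pred = design(q) @ theta
truth = A_true @ q + t_true - q
```

With more tracked points than unknowns, `pinv` gives the least-squares deformation parameters, from which all point deformations follow, as in the record.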

  14. A Parallel Decision Model Based on Support Vector Machines and Its Application to Fault Diagnosis

    Institute of Scientific and Technical Information of China (English)

    Yan Weiwu(阎威武); Shao Huihe

    2004-01-01

    Many industrial process systems are becoming more and more complex and are characterized by distributed features. To ensure that such a system operates in working order, distributed parameter values are often inspected from subsystems or at different points in order to judge the working condition of the system and make global decisions. In this paper, a parallel decision model based on Support Vector Machines (PDMSVM) is introduced and applied to distributed fault diagnosis in industrial processes. PDMSVM is convenient for information fusion in distributed systems and performs well in fault diagnosis with distributed features. PDMSVM makes decisions based on the synthesized information of subsystems and takes advantage of the Support Vector Machine. Therefore, decisions made by PDMSVM are highly reliable and accurate.
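The parallel decision structure in this record can be sketched as one local classifier per subsystem fused by majority vote. Trivial threshold detectors stand in for the paper's SVMs so the sketch stays self-contained, and the sensor data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)

def make_subsystem_data(n, fault):
    """Healthy readings ~ N(0,1); a fault shifts the sensor mean."""
    return rng.normal(1.5 if fault else 0.0, 1.0, n)

def local_decision(readings, threshold=0.75):
    """Per-subsystem detector: 1 = fault flagged (stand-in for an SVM)."""
    return int(readings.mean() > threshold)

def fused_decision(subsystem_readings):
    """Global decision by majority vote over the subsystem detectors."""
    votes = [local_decision(r) for r in subsystem_readings]
    return int(sum(votes) > len(votes) / 2)

faulty = [make_subsystem_data(100, fault=True) for _ in range(5)]
healthy = [make_subsystem_data(100, fault=False) for _ in range(5)]
```

Fusing local decisions, rather than shipping raw distributed readings to one classifier, is the architectural point the record makes.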

  15. Variable Selection and Updating In Model-Based Discriminant Analysis for High Dimensional Data with Food Authenticity Applications.

    Science.gov (United States)

    Murphy, Thomas Brendan; Dean, Nema; Raftery, Adrian E

    2010-03-01

    Food authenticity studies are concerned with determining if food samples have been correctly labelled or not. Discriminant analysis methods are an integral part of the methodology for food authentication. Motivated by food authenticity applications, a model-based discriminant analysis method that includes variable selection is presented. The discriminant analysis model is fitted in a semi-supervised manner using both labeled and unlabeled data. The method is shown to give excellent classification performance on several high-dimensional multiclass food authenticity datasets with more variables than observations. The variables selected by the proposed method provide information about which variables are meaningful for classification purposes. A headlong search strategy for variable selection is shown to be efficient in terms of computation and achieves excellent classification performance. In applications to several food authenticity datasets, our proposed method outperformed default implementations of Random Forests, AdaBoost, transductive SVMs and Bayesian Multinomial Regression by substantial margins.

  16. Phase-based binarization of ancient document images: model and applications.

    Science.gov (United States)

    Nafchi, Hossein Ziaei; Moghaddam, Reza Farrahi; Cheriet, Mohamed

    2014-07-01

    In this paper, a phase-based binarization model for ancient document images is proposed, as well as a postprocessing method that can improve any binarization method and a ground truth generation tool. Three feature maps derived from the phase information of an input document image constitute the core of this binarization model. These features are the maximum moment of phase congruency covariance, a locally weighted mean phase angle, and a phase preserved denoised image. The proposed model consists of three standard steps: 1) preprocessing; 2) main binarization; and 3) postprocessing. In the preprocessing and main binarization steps, the features used are mainly phase derived, while in the postprocessing step, specialized adaptive Gaussian and median filters are considered. One of the outputs of the binarization step, which shows high recall performance, is used in a proposed postprocessing method to improve the performance of other binarization methodologies. Finally, we develop a ground truth generation tool, called PhaseGT, to simplify and speed up the ground truth generation process for ancient document images. The comprehensive experimental results on the DIBCO'09, H-DIBCO'10, DIBCO'11, H-DIBCO'12, DIBCO'13, PHIBD'12, and BICKLEY DIARY data sets show the robustness of the proposed binarization method on various types of degradation and document images. PMID:24816587

  17. APPLICATION OF A MODIFIED QUICK SCHEME TO A DEPTH-AVERAGED k-ε TURBULENCE MODEL BASED ON UNSTRUCTURED GRIDS

    Institute of Scientific and Technical Information of China (English)

    HUA Zu-lin; XING Ling-hang; GU Li

    2008-01-01

    The modified QUICK scheme on unstructured grids was used to improve the advection flux approximation, and a depth-averaged k-ε turbulence model using this scheme, based on the FVM with a SIMPLE-series algorithm, was established and applied to spur-dike flow computation. In this model, the over-relaxed approach was adopted to estimate the diffusion flux, in view of its advantages in reducing errors and sustaining numerical stability, issues usually encountered in non-orthogonal meshes. Two spur-dike cases with different deflection angles (90° and 135°) were analyzed to validate the model. The computed results show that the predicted velocities and recirculation lengths are in good agreement with the observed data. Moreover, computations on structured and unstructured grids were compared using approximately equal numbers of grid cells. It can be concluded that the precision with unstructured grids is higher than that with structured grids, even though the CPU time required is slightly greater. Thus, it is worthwhile to apply the method to the numerical simulation of practical hydraulic engineering problems.
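The QUICK face interpolation underlying such advection schemes can be shown in one dimension: with flow in the +x direction, the value at the face between nodes i and i+1 blends the upstream (i-1), central (i), and downstream (i+1) nodes. This is the textbook uniform-grid form, not the paper's unstructured-grid modification.

```python
def quick_face(phi_up, phi_c, phi_down):
    """QUICK face value: 6/8*phi_i + 3/8*phi_(i+1) - 1/8*phi_(i-1)."""
    return 0.75 * phi_c + 0.375 * phi_down - 0.125 * phi_up

def phi(x):
    """Arbitrary quadratic test profile."""
    return 2.0 * x**2 + 3.0 * x + 1.0

# On a uniform grid QUICK reproduces quadratic profiles exactly:
h = 0.1                                      # grid spacing; face at x = h/2
face = quick_face(phi(-h), phi(0.0), phi(h))
exact = phi(h / 2)
```

This exactness for quadratics is what gives QUICK its third-order face accuracy, the property the modified scheme carries over to unstructured meshes.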

  18. An Agent-Based Modeling Framework and Application for the Generic Nuclear Fuel Cycle

    Science.gov (United States)

    Gidden, Matthew J.

    Key components of a novel methodology and implementation of an agent-based, dynamic nuclear fuel cycle simulator, Cyclus, are presented. The nuclear fuel cycle is a complex, physics-dependent supply chain. To date, existing dynamic simulators have not treated constrained fuel supply, time-dependent, isotopic-quality based demand, or fuel fungibility particularly well. Utilizing an agent-based methodology that incorporates sophisticated graph theory and operations research techniques can overcome these deficiencies. This work describes a simulation kernel and agents that interact with it, highlighting the Dynamic Resource Exchange (DRE), the supply-demand framework at the heart of the kernel. The key agent-DRE interaction mechanisms are described, which enable complex entity interaction through the use of physics and socio-economic models. The translation of an exchange instance to a variant of the Multicommodity Transportation Problem, which can be solved feasibly or optimally, follows. An extensive investigation of solution performance and fidelity is then presented. Finally, recommendations for future users of Cyclus and the DRE are provided.
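The exchange-to-transportation translation mentioned here can be illustrated, very loosely, by a single-commodity transportation problem. The northwest-corner rule below only builds a feasible (not optimal) flow, and the facility names and quantities are invented; Cyclus's DRE solves a richer multicommodity variant.

```python
# Hypothetical suppliers and consumers of one fuel commodity (balanced totals).
supply = {"fuel_fab_a": 40, "mox_fab_b": 25}
demand = {"reactor_1": 30, "reactor_2": 20, "reactor_3": 15}

def northwest_corner(supply, demand):
    """Build a feasible flow by exhausting supplies/demands in order."""
    s, d, flow = dict(supply), dict(demand), {}
    for i in supply:
        for j in demand:
            q = min(s[i], d[j])
            if q > 0:
                flow[(i, j)] = q
                s[i] -= q
                d[j] -= q
    return flow

flow = northwest_corner(supply, demand)
```

An optimizing solver would then minimize shipment cost over such flows subject to the same supply/demand constraints.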

  19. Modelling Foundations and Applications

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 8th European Conference on Modelling Foundations and Applications, held in Kgs. Lyngby, Denmark, in July 2012. The 20 revised full foundations track papers and 10 revised full applications track papers presented were carefully reviewed and sel...

  20. An overview of current applications, challenges, and future trends in distributed process-based models in hydrology

    Science.gov (United States)

    Fatichi, Simone; Vivoni, Enrique R.; Ogden, Fred L.; Ivanov, Valeriy Y.; Mirus, Benjamin B.; Gochis, David; Downer, Charles W.; Camporese, Matteo; Davison, Jason H.; Ebel, Brian A.; Jones, Norm; Kim, Jongho; Mascaro, Giuseppe; Niswonger, Richard; Restrepo, Pedro; Rigon, Riccardo; Shen, Chaopeng; Sulis, Mauro; Tarboton, David

    2016-01-01

    Process-based hydrological models have a long history dating back to the 1960s. Criticized by some as over-parameterized, overly complex, and difficult to use, a more nuanced view is that these tools are necessary in many situations and, in a certain class of problems, they are the most appropriate type of hydrological model. This is especially the case in situations where knowledge of flow paths or distributed state variables and/or preservation of physical constraints is important. Examples of this include: spatiotemporal variability of soil moisture, groundwater flow and runoff generation, sediment and contaminant transport, or when feedbacks among various Earth’s system processes or understanding the impacts of climate non-stationarity are of primary concern. These are situations where process-based models excel and other models are unverifiable. This article presents this pragmatic view in the context of existing literature to justify the approach where applicable and necessary. We review how improvements in data availability, computational resources and algorithms have made detailed hydrological simulations a reality. Avenues for the future of process-based hydrological models are presented suggesting their use as virtual laboratories, for design purposes, and with a powerful treatment of uncertainty.

  2. Model-Based Testing for Web Applications

    Institute of Scientific and Technical Information of China (English)

    缪淮扣; 陈圣波; 曾红卫

    2011-01-01

    In this paper, a model-based testing approach for Web applications is proposed, covering the key techniques of Web application modeling, test case generation, test execution, and visualization of models and test sequences. The authors design and implement a model-based testing system for Web applications in which an FSM serves as the formal testing model of the Web application under test. The system integrates a model transformer, a test-purpose analyzer, a test-sequence generator, visualization tools for the FSM and test sequences, a test execution engine, and other tools. Besides supporting traditional coverage criteria such as state coverage, transition coverage, and transition-pair coverage, the system also proposes or improves criteria including optimized state and transition coverage, complete message-pass coverage, complete function-interaction coverage, and function-loop-interaction coverage. Finally, the authors demonstrate the system on the Xingning Reservoir Resettlement Management Information System as the Web application under test.
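Transition-coverage test generation from an FSM model, as used in such systems, can be sketched with a greedy walk over the state graph. The pages and user actions below are hypothetical, and the greedy-plus-BFS strategy is a simple stand-in for a real test-sequence generator.

```python
from collections import deque

# FSM of a hypothetical Web application: states are pages,
# (state, action) pairs map to next states.
fsm = {
    ("login", "submit_ok"): "home",
    ("login", "submit_bad"): "login",
    ("home", "open_report"): "report",
    ("report", "back"): "home",
    ("home", "logout"): "login",
}

def shortest_path(fsm, src, goals):
    """BFS over the FSM graph: a transition path from src to any goal state."""
    seen, queue = {src}, deque([(src, [])])
    while queue:
        state, path = queue.popleft()
        if state in goals:
            return path
        for trans, dst in sorted(fsm.items()):
            if trans[0] == state and dst not in seen:
                seen.add(dst)
                queue.append((dst, path + [trans]))
    return None

def transition_coverage_sequence(fsm, start):
    """Greedy walk taking uncovered transitions, bridging via BFS when stuck."""
    remaining, seq, state = set(fsm), [], start
    while remaining:
        fresh = sorted(t for t in remaining if t[0] == state)
        if fresh:
            seq.append(fresh[0])
            remaining.discard(fresh[0])
            state = fsm[fresh[0]]
        else:
            bridge = shortest_path(fsm, state, {t[0] for t in remaining})
            if bridge is None:
                break                  # remaining transitions unreachable
            seq.extend(bridge)         # re-execute already-covered transitions
            state = fsm[bridge[-1]]
    return seq

tests = transition_coverage_sequence(fsm, "login")
```

The resulting sequence is executable from the start page and exercises every transition at least once, i.e. it satisfies transition coverage on this toy model.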

  5. Viscoelastic model based force control for soft tissue interaction and its application in physiological motion compensation.

    Science.gov (United States)

    Moreira, Pedro; Zemiti, Nabil; Liu, Chao; Poignet, Philippe

    2014-09-01

    Controlling the interaction between robots and living soft tissues has become an important issue as the number of robotic systems inside the operating room increases. Much research has been done on force control to help surgeons during medical procedures, such as physiological motion compensation and tele-operation systems with haptic feedback. In order to increase the performance of such controllers, this work presents a novel force control scheme using an Active Observer (AOB) based on a viscoelastic interaction model. The control scheme was shown to be stable through theoretical analysis, and its performance was evaluated by in vitro experiments. In order to evaluate how the force control scheme behaves in the presence of physiological motion, experiments considering breathing and beating-heart disturbances are presented. The proposed control scheme exhibited stable behavior in both static and moving environments. The viscoelastic AOB presented a compensation ratio of 87% for the breathing motion and 79% for the beating heart motion. PMID:24612709
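A common viscoelastic interaction model of the kind this record builds on is the Kelvin-Voigt spring-damper law f = k·x + b·ẋ. The stiffness, damping, and breathing-like motion below are made-up illustrative values, not the paper's identified tissue parameters.

```python
import numpy as np

k, b = 500.0, 8.0                 # stiffness (N/m) and damping (N*s/m), illustrative
t = np.linspace(0.0, 2.0, 2001)
x = 0.005 * np.sin(2 * np.pi * 0.25 * t)   # 5 mm, 0.25 Hz "breathing" motion
xdot = np.gradient(x, t)                   # numerical velocity
f = k * x + b * xdot                       # Kelvin-Voigt contact force
```

A force controller compensating physiological motion would track a force reference against exactly this kind of disturbance profile.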

  6. Supply Chain Vulnerability Analysis Using Scenario-Based Input-Output Modeling: Application to Port Operations.

    Science.gov (United States)

    Thekdi, Shital A; Santos, Joost R

    2016-05-01

    Disruptive events such as natural disasters, loss or reduction of resources, work stoppages, and emergent conditions have potential to propagate economic losses across trade networks. In particular, disruptions to the operation of container port activity can be detrimental for international trade and commerce. Risk assessment should anticipate the impact of port operation disruptions with consideration of how priorities change due to uncertain scenarios and guide investments that are effective and feasible for implementation. Priorities for protective measures and continuity of operations planning must consider the economic impact of such disruptions across a variety of scenarios. This article introduces new performance metrics to characterize resiliency in interdependency modeling and also integrates scenario-based methods to measure economic sensitivity to sudden-onset disruptions. The methods will be demonstrated on a U.S. port responsible for handling $36.1 billion of cargo annually. The methods will be useful to port management, private industry supply chain planning, and transportation infrastructure management.
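The interdependency modeling behind such analyses is typically Leontief input-output: a demand drop in one sector propagates through the others via x = (I - A)^-1 f. The 3-sector technical-coefficient matrix and the port demand shock below are illustrative numbers, not the study's data.

```python
import numpy as np

# Illustrative technical coefficients for port services, transport, manufacturing.
A = np.array([[0.10, 0.20, 0.05],
              [0.15, 0.10, 0.20],
              [0.05, 0.25, 0.10]])
f_base = np.array([100.0, 80.0, 120.0])        # baseline final demand
f_shock = f_base - np.array([30.0, 0.0, 0.0])  # disruption hits the port sector

L = np.linalg.inv(np.eye(3) - A)               # Leontief inverse
x_base, x_shock = L @ f_base, L @ f_shock      # total outputs before/after
total_loss = (x_base - x_shock).sum()
direct_loss = 30.0
```

The total output loss exceeds the direct demand loss, which is the ripple effect across trade networks the record describes.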

  7. The k-nearest neighbour-based GMDH prediction model and its applications

    Science.gov (United States)

    Li, Qiumin; Tian, Yixiang; Zhang, Gaoxun

    2014-11-01

This paper centres on a new GMDH (group method of data handling) algorithm based on the k-nearest neighbour (k-NN) method. Instead of the transfer function used in traditional GMDH, the k-NN kernel function is adopted in the proposed GMDH to characterise relationships between the input and output variables. The proposed method combines the advantages of the k-nearest neighbour (k-NN) algorithm and the GMDH algorithm, and thus improves the predictive capability of the GMDH algorithm. It has been proved that when the bandwidth of the kernel is less than a certain constant C, the predictive capability of the new model is superior to that of the traditional one. As an illustration, it is shown that the new method can accurately forecast the consumer price index (CPI).
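As an illustration of the building block involved, here is a sketch of a k-NN kernel predictor of the kind that could replace a polynomial transfer function; the inverse-distance weighting and the toy data are illustrative assumptions, not the paper's exact kernel.

```python
import numpy as np

# Sketch of a k-NN kernel predictor: predictions are a locally weighted
# average over the k nearest training points (assumed weighting scheme).
def knn_kernel_predict(X_train, y_train, x, k=5):
    d = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(d)[:k]               # indices of the k nearest neighbours
    w = 1.0 / (d[idx] + 1e-9)             # inverse-distance weights
    return np.sum(w * y_train[idx]) / np.sum(w)

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 2))
y = 2.0 * X[:, 0] + X[:, 1]               # noise-free toy relationship

pred = knn_kernel_predict(X, y, np.array([0.5, 0.5]))
print(pred)  # close to 2*0.5 + 0.5 = 1.5
```

In a GMDH setting, such local predictors would be built pairwise over candidate input variables and selected layer by layer by an external criterion.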

  9. Large eddy simulation based fire modeling applications for Indian nuclear power plant

    International Nuclear Information System (INIS)

Full text of publication follows: Nuclear Power Plants (NPPs) are always designed for the highest level of safety against postulated accidents, which may be initiated by internal or external causes. One such cause, which may lead to an accident in the reactor and its associated systems, is fire in certain vital areas of the plant. Conventionally, the fire containment approach and/or the fire confinement approach is used in designing the fire protection systems of NPPs. Indian NPPs (PHWRs) follow the combined approach to ensure plant safety, and all newly designed plants are required to comply with the provisions of the Atomic Energy Regulatory Board (AERB) fire safety guide. For older plants, reassessment of the adequacy of fire safety provisions in the light of current advances has become essential in order to decide upon the steps for retrofitting. Keeping this in mind, a deterministic fire hazard analysis was carried out for the Madras Atomic Power Station (MAPS). As part of this exercise, detailed fire consequence analysis had to be carried out for various critical areas. A CFD-based code was considered appropriate for these studies. A dedicated fire hazard analysis code, Fire Dynamics Simulator (FDS) from NIST, was used to perform these case studies. The code has the option to use advanced fire models based on the Large Eddy Simulation (LES) technique or Direct Numerical Simulation (DNS) to model fire-generated conditions. The LES option has been used extensively in the present studies, which were primarily aimed at estimating the damage time for important safety-related cables. The present paper describes the salient features of the methodology and important results for one of the most critical areas, i.e. the cable bridge area of MAPS. The typical dimensions of the cable bridge area (length x breadth x height) are 12 m x 6 m x 2.5 m, with an opening on one side of the cable bridge area. With almost equal gap, six numbers

  10. A novel Q-based online model updating strategy and its application in statistical process control for rubber mixing

    Institute of Scientific and Technical Information of China (English)

    Chunying Zhang; Sun Chen; Fang Wu; Kai Song

    2015-01-01

To overcome the large time delay in measuring the hardness of mixed rubber, rheological parameters were used to predict the hardness. A novel Q-based model updating strategy was proposed as a universal platform to track time-varying properties. Using a few selected support samples to update the model, the strategy can dramatically reduce storage cost and overcome the adverse influence of low signal-to-noise ratio samples. Moreover, it can be applied to any statistical process monitoring system without drastic changes, which is practical in industry. As examples, the Q-based strategy was integrated with three popular algorithms (partial least squares (PLS), recursive PLS (RPLS), and kernel PLS (KPLS)) to form novel regression ones: QPLS, QRPLS and QKPLS, respectively. Applications to predicting mixed rubber hardness at a large-scale tire plant in east China confirm the theoretical considerations.

  11. Approximation of skewed interfaces with tensor-based model reduction procedures: Application to the reduced basis hierarchical model reduction approach

    Science.gov (United States)

    Ohlberger, Mario; Smetana, Kathrin

    2016-09-01

In this article we introduce a procedure that allows one to recover the potentially very good approximation properties of tensor-based model reduction procedures for the solution of partial differential equations in the presence of interfaces or strong gradients in the solution that are skewed with respect to the coordinate axes. The two key ideas are the location of the interface, either by solving a lower-dimensional partial differential equation or by using data functions, and the subsequent removal of the interface from the solution by choosing the determined interface as the lifting function of the Dirichlet boundary conditions. We demonstrate in numerical experiments for linear elliptic equations and the reduced basis hierarchical model reduction approach that the proposed procedure locates the interface well and yields significantly improved convergence behavior, even in the case when we only consider an approximation of the interface.

  12. Modeling and Application of a Rapid Fluorescence-Based Assay for Biotoxicity in Anaerobic Digestion.

    Science.gov (United States)

    Chen, Jian Lin; Steele, Terry W J; Stuckey, David C

    2015-11-17

The sensitivity of anaerobic digestion metabolism to a wide range of solutes makes it important to be able to monitor toxicants in the feed to anaerobic digesters to optimize their operation. In this study, a rapid fluorescence measurement technique based on resazurin reduction using a microplate reader was developed and applied for the detection of toxicants and/or inhibitors to digesters. A kinetic model was developed to describe the reduction of resazurin to resorufin, and eventually to dihydroresorufin, under anaerobic conditions. By modeling the assay results of resazurin (0.05, 0.1, 0.2, and 0.4 mM) reduction by a pure facultative anaerobic strain, Enterococcus faecalis, and fresh mixed anaerobic sludge, with or without 10 mg L(-1) spiked pentachlorophenol (PCP), it became clear that the pseudo-first-order rate constant for the reduction of resazurin to resorufin, k1, was a good measure of "toxicity". With lower biomass density and the optimal resazurin addition (0.1 mM), the toxicity of 10 mg L(-1) PCP for E. faecalis and fresh anaerobic sludge was detected in 10 min. By using this model, the toxicity differences among seven chlorophenols to E. faecalis and fresh mixed anaerobic sludge were elucidated within 30 min. The toxicity differences determined by this assay were comparable to toxicity sequences of various chlorophenols reported in the literature. These results suggest that the assay developed in this study not only can quickly detect toxicants for anaerobic digestion but also can efficiently detect the toxicity differences among a variety of similar toxicants. PMID:26457928
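The pseudo-first-order step at the heart of the assay can be sketched as follows; the rate constant and sampling times are hypothetical, the point being that k1 can be recovered from a short fluorescence time series and drops in the presence of a toxicant.

```python
import numpy as np

# Kinetic sketch of the assay model: the resazurin -> resorufin step is
# pseudo-first-order with rate constant k1, the quantity used as the
# toxicity measure. k1_true and the sampling grid are assumed values.
k1_true = 0.25                       # 1/min (hypothetical)
t = np.arange(0.0, 10.5, 0.5)        # minutes
resazurin = np.exp(-k1_true * t)     # normalised remaining resazurin

# Log-linear regression recovers k1: ln[resazurin] = -k1 * t
k1_fit = -np.polyfit(t, np.log(resazurin), 1)[0]
print(round(k1_fit, 3))  # -> 0.25
```

With real fluorescence data one would fit the resorufin signal (proportional to 1 - exp(-k1 t) before the second reduction step) and compare k1 between spiked and unspiked samples.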

  13. Application of Finite Element Modeling Methods in Magnetic Resonance Imaging-Based Research and Clinical Management

    Science.gov (United States)

    Fwu, Peter Tramyeon

The medical image is very complex by nature, and modeling built upon medical images is challenging due to the lack of analytical solutions. The finite element method (FEM) is a numerical technique for solving partial differential equations; it transforms a continuous domain into solvable discrete sub-domains. In three-dimensional space, FEM is capable of dealing with complicated structures and heterogeneous interiors, which makes it an ideal tool for medical-image-based modeling problems. In this study, I address three modeling problems: (1) photon transport inside the human breast, by implementing the radiative transfer equation to simulate diffuse optical spectroscopy imaging (DOSI) in order to measure percent density (PD), which has been proven to be a cancer risk factor in mammography. Our goal is to use MRI as the ground truth to optimize the DOSI scanning protocol and obtain a consistent measurement of PD. Our results show that the DOSI measurement is position- and depth-dependent, and that a proper scanning scheme and body configuration are needed; (2) heat flow in the prostate, by implementing Pennes' bioheat equation to evaluate the cooling performance of regional hypothermia during robot-assisted radical prostatectomy for the individual patient, in order to achieve the optimal cooling setting. Four factors are taken into account in the simulation: blood abundance, artery perfusion, cooling balloon temperature, and anatomical distance. The results show that blood abundance, prostate size, and anatomical distance are significant factors in the equilibrium temperature of the neurovascular bundle; (3) shape analysis of the hippocampus, using radial distance mapping and two registration methods to find correlations of sub-regional change with age and cognitive performance, which might not be revealed by volumetric analysis. The result gives a fundamental knowledge of normal distribution in young

  14. Application of mathematical modeling-based algorithms to 'off-carrier' cobalt-60 irradiation processes

    International Nuclear Information System (INIS)

The irradiation of materials and products 'off carrier' has historically been performed using a 'drop-and-read' methodology, whereby the radioisotope source is raised and lowered repeatedly until the desired absorbed dose is achieved. This approach is time-consuming from both a manpower and a process perspective. Static irradiation-based processes can also be costly because of the need for repeated experimental verification of target dose delivery. In this paper we address the methods used for predicting Ethicon Endo-Surgery's (EES's) off-carrier absorbed dose distributions. The scenarios described herein are complex because the on-carrier process stream exhibits a wide range of densities and dose rates. The observed complexity is attributed to the 'just-in-time' production strategy and its related requirements as they apply to the programming of EES's cobalt-60 irradiators. Validation of off-carrier processing methodologies requires sophisticated parametric systems using mathematical algorithms that predict off-carrier absorbed dose rate relative to the on-carrier process stream components. Irradiation process simulation is achieved using a point kernel computer modeling approach, coupled with database generation and maintenance. Dose prediction capabilities are validated via routine and transfer standard dosimetry.
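In its simplest form, a point-kernel dose calculation of the kind mentioned reduces to an inverse-square law with exponential attenuation and a buildup factor. The sketch below uses a textbook gamma constant for Co-60 and invented shield parameters; the actual EES system couples such kernels with validated databases.

```python
import math

# Minimal point-kernel sketch: dose rate from a Co-60 point source at
# distance r, behind an optional slab shield. The gamma constant is an
# approximate textbook value; activity, distance, and shield are illustrative.
GAMMA_CO60 = 0.351   # mSv*m^2/(GBq*h), approximate

def dose_rate(activity_gbq, r_m, mu_per_cm=0.0, thickness_cm=0.0, buildup=1.0):
    """Inverse-square law with exponential attenuation and a buildup factor."""
    return (GAMMA_CO60 * activity_gbq / r_m ** 2
            * math.exp(-mu_per_cm * thickness_cm) * buildup)

d_open = dose_rate(1000.0, 2.0)            # unshielded, 1000 GBq at 2 m
d_shield = dose_rate(1000.0, 2.0, 0.4, 5)  # behind 5 cm of a mu=0.4/cm shield
print(d_open, d_shield)
```

A production code sums such kernels over a discretised source rack and the intervening product densities, which is where the database generation and maintenance mentioned above come in.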

  15. Electron–phonon coupling in Ni-based binary alloys with application to displacement cascade modeling

    International Nuclear Information System (INIS)

Energy transfer between lattice atoms and electrons is an important channel of energy dissipation during displacement cascade evolution in irradiated materials. On the assumption of small atomic displacements, the intensity of this transfer is controlled by the strength of electron–phonon (el–ph) coupling. The el–ph coupling in concentrated Ni-based alloys was calculated using electronic structure results obtained within the coherent potential approximation. It was found that Ni0.5Fe0.5, Ni0.5Co0.5 and Ni0.5Pd0.5 are ordered ferromagnetically, whereas Ni0.5Cr0.5 is nonmagnetic. Since the magnetism in these alloys has a Stoner-type origin, the magnetic ordering is accompanied by a decrease of electronic density of states at the Fermi level, which in turn reduces the el–ph coupling. Thus, the el–ph coupling values for all alloys are approximately 50% smaller in the magnetic state than for the same alloy in a nonmagnetic state. As the temperature increases, the calculated coupling initially increases. After passing the Curie temperature, the coupling decreases. The rate of decrease is controlled by the shape of the density of states above the Fermi level. Introducing a two-temperature model based on these parameters into 10 keV molecular dynamics cascade simulations increases defect production by 10–20% in the alloys under consideration.
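The two-temperature coupling described above can be illustrated with a toy relaxation calculation; all parameter values below are invented, the point being that the el-ph coupling constant g sets the rate at which the electron and lattice baths equilibrate.

```python
# Toy two-temperature model: electron and lattice baths exchange energy
# at a rate set by the el-ph coupling g. All values are illustrative;
# a smaller g (as in the magnetic state) slows this exchange, which in
# cascade simulations changes the predicted defect production.
Ce, Cl = 1.0, 3.0        # electron / lattice heat capacities (arb. units)
g = 0.5                  # el-ph coupling strength (arb. units)
Te, Tl = 300.0, 1000.0   # cold electrons, cascade-heated lattice

dt = 0.01
for _ in range(5000):
    q = g * (Tl - Te)    # energy flow from lattice to electrons
    Te += dt * q / Ce
    Tl -= dt * q / Cl

print(round(Te, 1), round(Tl, 1))  # both relax to the common equilibrium 825.0
```

In a real two-temperature molecular dynamics scheme the lattice side is the atomistic simulation itself and the electronic bath is a continuum field, but the coupling term has the same form.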

  16. An Application in the Ultra Short-term Prediction of UT1--UTC Based on Grey System Model

    Science.gov (United States)

    Lei, Y.; Zhao, D. N.; Cai, H. B.

    2016-05-01

This work presents an application of the grey system model to the prediction of UT1-UTC. The short-term prediction of UT1-UTC is studied up to 30 days ahead by means of the grey system model. The EOP (Earth orientation parameter) C04 time series with daily values from the International Earth Rotation and Reference Systems Service (IERS) serves as the database. The results of the prediction are analyzed and compared with those obtained by the artificial neural network (ANN), the combination of least squares (LS) and autoregressive (AR) model (LS+AR), and the Earth Orientation Parameters Prediction Comparison Campaign (EOP PCC). The accuracies of the ultra short-term (1-10 d) predictions are comparable to those obtained by the other prediction methods, and the presented method is easy to use.
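The basic grey system model behind such schemes is usually the GM(1,1) formulation; a minimal sketch on a toy series (not the EOP C04 data) is shown below.

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """GM(1,1) grey-model forecast (illustrative sketch of the classic scheme)."""
    x0 = np.asarray(x0, float)
    x1 = np.cumsum(x0)                    # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])         # mean sequence of x1
    B = np.column_stack([-z1, np.ones(len(z1))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]   # develop/grey coefficients
    n = len(x0)
    k = np.arange(n + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # response of x1
    x0_hat = np.diff(x1_hat)              # inverse AGO back to the original scale
    return x0_hat[-steps:]

# Toy series with smooth quasi-exponential behaviour (hypothetical values)
series = [2.87, 3.28, 3.34, 3.39, 3.57, 3.70]
out = gm11_forecast(series, steps=2)
print(out)
```

GM(1,1) works best on short, smooth, positive series, which is one reason grey models are attractive for ultra short-term EOP extrapolation.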

  17. Geophysical Applications of Vegetation Modeling

    OpenAIRE

    J. O. Kaplan

    2001-01-01

    This thesis describes the development and selected applications of a global vegetation model, BIOME4. The model is applied to problems in high-latitude vegetation distribution and climate, trace gas production, and isotope biogeochemistry. It demonstrates how a modeling approach, based on principles of plant physiology and ecology, can be applied to interdisciplinary problems that cannot be adequately addressed by direct observations or experiments. The work is relevant to understanding the p...

  18. Sznajd model and its applications

    CERN Document Server

    Sznajd-Weron, K

    2005-01-01

In 2000 we proposed a sociophysics model of opinion formation, which was based on the trade union maxim "United we Stand, Divided we Fall" (USDF) and later, due to Dietrich Stauffer, became known as the Sznajd model (SM). The main difference between the SM and voter or Ising-type models is that information flows outward. In this paper we review the modifications and applications of the SM that have been proposed in the literature.
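The outward information flow can be sketched in a few lines; this minimal one-dimensional version uses only the agreeing-pair rule, a common simplification, and the lattice size and seed are illustrative.

```python
import numpy as np

# Minimal 1D Sznajd model: a pair of agreeing neighbours (i, i+1)
# convinces its outer neighbours i-1 and i+2, so information flows
# outward. This sketch keeps only the agreeing-pair ("United we Stand")
# rule; sizes and seed are arbitrary choices.
rng = np.random.default_rng(42)
N = 30
spins = rng.choice([-1, 1], size=N)

steps = 0
while abs(spins.sum()) != N and steps < 2_000_000:
    i = int(rng.integers(0, N))
    j = (i + 1) % N
    if spins[i] == spins[j]:              # agreeing pair convinces outward
        spins[(i - 1) % N] = spins[i]
        spins[(j + 1) % N] = spins[i]
    steps += 1

print(abs(spins.sum()) == N)  # full consensus is the absorbing state
```

Unlike the voter model, where a site copies a neighbour (inward flow), here the pair imposes its opinion on its surroundings, which is the defining feature the review emphasises.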

  19. Pareto-efficient deployment synthesis for safety-critical applications in seamless model-based development

    OpenAIRE

    Zverlov, Sergey; Khalil, Maged; Chaudhary, Mayank

    2016-01-01

Increasingly complex functionality in automotive applications demands more and more computing power. As room for computing units in modern vehicles dwindles, centralized architectures, with larger and more powerful processing units, are the trend. With this trend, applications no longer run on dedicated hardware but share the same computing resources with others on the centralized platform. Ascertaining efficient deployment and scheduling for co-located applications is c...

  20. A novel physical eco-hydrological model concept for preferential flow based on experimental applications.

    Science.gov (United States)

    Jackisch, Conrad; van Schaik, Loes; Graeff, Thomas; Zehe, Erwin

    2014-05-01

Preferential flow through macropores often determines hydrological characteristics, especially regarding runoff generation and fast transport of solutes. Macropore settings may nevertheless be very different in nature and dynamics, depending on their origin. While biogenic structures follow activity cycles (e.g. earthworms) and population conditions (e.g. roots), pedogenic and geogenic structures may depend on water stress (e.g. cracks) or large events (e.g. flushed voids between skeleton and soil pipes), or simply persist (e.g. bedrock interface). On the one hand, such dynamic site characteristics can be observed as seasonal changes in the site's reaction to precipitation. On the other hand, sprinkling experiments accompanied by tracers or time-lapse 3D ground-penetrating radar are suitable tools to determine infiltration patterns and macropore configuration. However, model representation of the macropore-matrix system is still problematic, because models either rely on effective parameters (assuming a well-mixed state) or on explicit advection that strongly simplifies or neglects interaction with the diffusive flow domain. Motivated by the dynamic nature of macropores, we present a novel model approach for interacting diffusive and advective transport of water, solutes and energy in structured soils. It relies solely on scale- and process-aware observables. A representative set of macropores (data from sprinkling experiments) determines the process model scale through 1D advective domains. These are connected to a 2D matrix domain which is defined by pedo-physical retention properties. Water is represented as particles. Diffusive flow is governed by a 2D random walk of these particles, while advection may take place in the macropore domain. Macropore-matrix interaction is computed as dissipation of the advective momentum of a particle by the drag it experiences from the matrix domain. Through a representation of matrix and macropores as connected diffusive and advective domains for water

  1. Application of Model Based Prognostics to Pneumatic Valves in a Cryogenic Propellant Loading Testbed

    Data.gov (United States)

    National Aeronautics and Space Administration — Pneumatic-actuated valves are critical components in many applications, including cryogenic propellant loading for space operations. For these components, failures...

  2. Modeling Evidence-Based Application: Using Team-Based Learning to Increase Higher Order Thinking in Nursing Research

    Directory of Open Access Journals (Sweden)

    Bridget Moore

    2015-06-01

Full Text Available Nursing practice is comprised of knowledge, theory, and research [1]. Because of its impact on the profession, the appraisal of research evidence is critically important. Future nursing professionals must be introduced to the purpose and utility of nursing research, as early exposure provides an opportunity to embed evidence-based practice (EBP) into clinical experiences. The AACN requires baccalaureate education to include an understanding of the research process, in order to integrate reliable evidence to inform practice and enhance clinical judgments [1]. Although the importance of these knowledge competencies is evident to healthcare administrators and nursing leaders within the field, undergraduate students at the institution under study sometimes have difficulty understanding the relevance of nursing research to the baccalaureate-prepared nurse, and struggle to grasp advanced concepts of qualitative and quantitative research design and methodology. Because undergraduate nursing students generally have not demonstrated an understanding of the relationship between theoretical concepts in the undergraduate nursing curriculum and the practical application of these concepts in the clinical setting, the research team decided to adopt an effective active-learning pedagogy, team-based learning (TBL). Team-based learning shifts the traditional course design to focus on higher thinking skills to integrate desired knowledge [2]. The purpose of this paper is to discuss the impact of course design with the integration of TBL in an undergraduate nursing research course on increasing higher order thinking. [1] American Association of Colleges of Nursing, The Essentials of Baccalaureate Education for Professional Nursing Practice, Washington, DC: American Association of Colleges of Nursing, 2008. [2] B. Bloom, Taxonomy of Educational Objectives, Handbook I: Cognitive Domain, New York: McKay, 1956.

  3. Model-Based Systems Engineering for Capturing Mission Architecture System Processes with an Application Case Study - Orion Flight Test 1

    Science.gov (United States)

    Bonanne, Kevin H.

    2011-01-01

    Model-based Systems Engineering (MBSE) is an emerging methodology that can be leveraged to enhance many system development processes. MBSE allows for the centralization of an architecture description that would otherwise be stored in various locations and formats, thus simplifying communication among the project stakeholders, inducing commonality in representation, and expediting report generation. This paper outlines the MBSE approach taken to capture the processes of two different, but related, architectures by employing the Systems Modeling Language (SysML) as a standard for architecture description and the modeling tool MagicDraw. The overarching goal of this study was to demonstrate the effectiveness of MBSE as a means of capturing and designing a mission systems architecture. The first portion of the project focused on capturing the necessary system engineering activities that occur when designing, developing, and deploying a mission systems architecture for a space mission. The second part applies activities from the first to an application problem - the system engineering of the Orion Flight Test 1 (OFT-1) End-to-End Information System (EEIS). By modeling the activities required to create a space mission architecture and then implementing those activities in an application problem, the utility of MBSE as an approach to systems engineering can be demonstrated.

  4. An Effective Security Mechanism for M-Commerce Applications Exploiting Ontology Based Access Control Model for Healthcare System

    Directory of Open Access Journals (Sweden)

    S.M. Roychoudri

    2016-09-01

Full Text Available Health organizations have begun moving services to mobile commerce in recent years to enhance service quality without investing heavily in IT infrastructure. Medical records are very sensitive and private for any individual, and hence an effective security mechanism is required. The challenges of our research work are to maintain privacy for users and to provide a smart, secure environment for accessing the application. This is achieved with the help of personalization. The Internet has paved the way for personalization, a term which refers to the delivery of information relevant to an individual or group of individuals in a specified format and layout, and within a given time interval. In this paper we propose an Ontology Based Access Control (OBAC) model that can address the permitted access control among service providers and users. Sharing of Personal Health Records is highly expected by users for the acceptance of mobile commerce applications in healthcare systems.

  5. Research of Optical Fiber Coil Winding Model Based on Large-deformation Theory of Elasticity and Its Application

    Institute of Scientific and Technical Information of China (English)

    JIA Ming; YANG Gongliu

    2011-01-01

An optical fiber coil winding model is used to guide proper, high-precision coil winding for fiber optic gyroscope (FOG) applications. Based on the large-deformation theory of elasticity, a stress analysis of the optical fiber free end is made and the balance equation of an infinitesimal fiber element is deduced; the deformation equation is then derived by substituting the terminal conditions. On the condition that only an axial tensile force exists, an approximate curve equation is built within the small-angle deformation scope. Comparing the tangent-point longitudinal coordinate between theory and approximation gives the constant of integration, and the expression with the tangent point as the origin of coordinates is readjusted. Analyzing the winding parameters of an example, it is clear that the horizontal distance from the highest point of the wheel to the fiber tangent point is of millimeter order of magnitude, differs significantly with fiber tension variation, and remains invariant when the wheel radius changes. The height of tension and the accurate position of the tangent point are defined for proper fiber guidance. For application to the fiber optic gyroscope, the spiral-disc winding method and the nonideal deformation of the straddle section are analyzed, and the spiral-disc quadrupole pattern winding method is then introduced and realized by the winding system. The winding results confirm that the winding model is applicable.

  6. Application of a process-based shallow landslide hazard model over a broad area in Central Italy

    Science.gov (United States)

    Gioia, Eleonora; Speranza, Gabriella; Ferretti, Maurizio; Godt, Jonathan W.; Baum, Rex L.; Marincioni, Fausto

    2015-01-01

    Process-based models are widely used for rainfall-induced shallow landslide forecasting. Previous studies have successfully applied the U.S. Geological Survey’s Transient Rainfall Infiltration and Grid-Based Regional Slope-Stability (TRIGRS) model (Baum et al. 2002) to compute infiltration-driven changes in the hillslopes’ factor of safety on small scales (i.e., tens of square kilometers). Soil data input for such models are difficult to obtain across larger regions. This work describes a novel methodology for the application of TRIGRS over broad areas with relatively uniform hydrogeological properties. The study area is a 550-km2 region in Central Italy covered by post-orogenic Quaternary sediments. Due to the lack of field data, we assigned mechanical and hydrological property values through a statistical analysis based on literature review of soils matching the local lithologies. We calibrated the model using rainfall data from 25 historical rainfall events that triggered landslides. We compared the variation of pressure head and factor of safety with the landslide occurrence to identify the best fitting input conditions. Using calibrated inputs and a soil depth model, we ran TRIGRS for the study area. Receiver operating characteristic (ROC) analysis, comparing the model’s output with a shallow landslide inventory, shows that TRIGRS effectively simulated the instability conditions in the post-orogenic complex during historical rainfall scenarios. The implication of this work is that rainfall-induced landslides over large regions may be predicted by a deterministic model, even where data on geotechnical and hydraulic properties as well as temporal changes in topography or subsurface conditions are not available.
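TRIGRS couples transient infiltration to an infinite-slope stability criterion. A sketch of that final factor-of-safety step is given below; the soil parameters are illustrative choices, not the calibrated values of this study.

```python
import math

# Infinite-slope factor of safety at depth z with pressure head psi, the
# stability criterion TRIGRS evaluates as infiltration raises psi:
# FS = tan(phi)/tan(delta) + [c - psi*gamma_w*tan(phi)] / (gamma_s*z*sin(delta)*cos(delta))
# All soil parameters below are illustrative assumptions.
def factor_of_safety(z, psi, c=5.0e3, phi_deg=32.0, slope_deg=35.0,
                     gamma_s=20.0e3, gamma_w=9.81e3):
    """z [m], psi [m], c [Pa], angles [deg], unit weights [N/m^3]."""
    phi = math.radians(phi_deg)
    delta = math.radians(slope_deg)
    return (math.tan(phi) / math.tan(delta)
            + (c - psi * gamma_w * math.tan(phi))
            / (gamma_s * z * math.sin(delta) * math.cos(delta)))

fs_dry = factor_of_safety(1.5, psi=0.0)   # before the rainfall event
fs_wet = factor_of_safety(1.5, psi=1.0)   # positive pressure head after infiltration
print(round(fs_dry, 2), round(fs_wet, 2))  # -> 1.25 0.81
```

The transient infiltration model supplies psi(z, t) for each grid cell and rainfall scenario; instability is flagged wherever FS drops below 1, which is what the ROC analysis above compares against the landslide inventory.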

  7. Reaction invariant-based reduction of the activated sludge model ASM1 for batch applications

    DEFF Research Database (Denmark)

    Santa Cruz, Judith A.; Mussati, Sergio F.; Scenna, Nicolás J.;

    2016-01-01

    to batch activated sludge processes described by the Activated Sludge Model No. 1 (ASM1) for carbon and nitrogen removal. The objective of the model reduction is to describe the exact dynamics of the states predicted by the original model with a lower number of ODEs. This leads to a reduction...

  8. Application of statistical emulation to an agent-based model: assortative mating and the reversal of gender inequality in education in Belgium.

    OpenAIRE

    De Mulder, Wim; Grow, André; Molenberghs, Geert; Verbeke, Geert

    2015-01-01

We describe the application of statistical emulation to the outcomes of an agent-based model. The agent-based model simulates the mechanisms that might have linked the reversal of gender inequality in higher education with observed changes in educational assortative mating in Belgium. Using the statistical emulator as a computationally fast approximation to the expensive agent-based model, it is feasible to use a genetic algorithm in finding the parameter values for which the correspondin...

  9. Application of Nonlinear Predictive Control Based on RBF Network Predictive Model in MCFC Plant

    Institute of Scientific and Technical Information of China (English)

    CHEN Yue-hua; CAO Guang-yi; ZHU Xin-jian

    2007-01-01

This paper describes a nonlinear model predictive controller for regulating a molten carbonate fuel cell (MCFC). A detailed mechanistic model of the output voltage of an MCFC is presented first. However, this model is too complicated to be used in a control system, so an offline radial basis function (RBF) network was introduced to build a nonlinear predictive model. The optimal control sequences were then obtained by applying the golden mean method. The models and controller have been realized in the MATLAB environment. Simulation results indicate that the proposed algorithm exhibits satisfactory control performance even when the current densities vary widely.
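The "golden mean" optimisation step is commonly implemented as a golden-section line search over the candidate control move; the sketch below uses a hypothetical quadratic cost as a stand-in for the RBF-model-based predictive cost.

```python
import math

# Golden-section search: shrink the bracket [a, b] by the golden ratio
# each iteration, keeping the side that contains the smaller cost sample.
# The quadratic cost below is a hypothetical stand-in for the RBF cost.
def golden_section_min(f, a, b, tol=1e-6):
    inv_phi = (math.sqrt(5.0) - 1.0) / 2.0       # ~0.618
    c = b - inv_phi * (b - a)
    d = a + inv_phi * (b - a)
    while b - a > tol:
        if f(c) < f(d):          # minimum lies in [a, d]
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:                    # minimum lies in [c, b]
            a, c = c, d
            d = a + inv_phi * (b - a)
    return 0.5 * (a + b)

cost = lambda u: (u - 0.7) ** 2 + 0.1            # hypothetical predictive cost
u_opt = golden_section_min(cost, 0.0, 1.0)
print(round(u_opt, 3))  # -> 0.7
```

In a model predictive control loop, `cost` would be the predicted tracking error over the horizon as a function of the next control input, re-minimised at every sampling instant.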

  10. Fuzzy Identification Based on T-S Fuzzy Model and Its Application for SCR System

    Science.gov (United States)

    Zeng, Fanchun; Zhang, Bin; Zhang, Lu; Ji, Jinfu; Jin, Wenjing

An improved T-S model was introduced to identify a model of the SCR system. The model structure was selected through physical analysis and mathematical tests. Three different clustering algorithms were introduced to obtain space partitions, which were then amended by mathematical methods. Finally, the model parameters were identified by the least squares method. Training data were sampled from the SCR system of a 1000 MW coal-fired unit, and its T-S model was identified by the three clustering methods. The identification results proved effective, and the merits and demerits of the methods are analyzed in the end.

  11. An Optimal Decision Assessment Model Based on the Acceptable Maximum LGD of Commercial Banks and Its Application

    Directory of Open Access Journals (Sweden)

    Baofeng Shi

    2016-01-01

Full Text Available This paper introduces a novel decision assessment method suitable for customers' credit risk evaluation and credit decisions. First, the paper creates an optimal credit rating model consisting of an objective function and two constraint conditions. The first constraint condition, of strictly increasing LGDs, eliminates the unreasonable phenomenon in which a higher credit rating comes with a higher LGD (loss given default). Second, on the basis of the credit rating results, a credit decision-making assessment model based on measuring the acceptable maximum LGD of commercial banks is established. Third, empirical results using microfinance data on 2817 farmers from a Chinese commercial bank suggest that the proposed approach can accurately identify good customers among all loan applications. Moreover, our approach provides a reference for customer decision assessment in other commercial banks worldwide.

  12. Two-stage robust UC including a novel scenario-based uncertainty model for wind power applications

    International Nuclear Information System (INIS)

    Highlights: • Methodological framework for obtaining Robust Unit Commitment (UC) policies. • Wind-power forecast using a revisited bootstrap predictive inference approach. • Novel scenario-based model for wind-power uncertainty. • Efficient modeling framework for obtaining nearly optimal UC policies in reasonable time. • Effective incorporation of wind-power uncertainty in the UC modeling. - Abstract: The complex processes involved in determining the availability of power from renewable energy sources, such as wind power, impose great challenges on the forecasting processes carried out by transmission system operators (TSOs). Nowadays, many of these TSOs use operation planning tools that take into account the uncertainty of wind power. However, most of these methods typically require strict assumptions about the probabilistic behavior of the forecast error, and usually ignore the dynamic nature of the forecasting process. In this paper a methodological framework to obtain Robust Unit Commitment (UC) policies is presented; the methodology considers a novel scenario-based uncertainty model for wind power applications. The proposed method is composed of three main phases. The first two phases generate a sound wind-power forecast using a bootstrap predictive inference approach. The third phase corresponds to modeling and solving a one-day-ahead Robust UC considering the output of the first two phases. The performance of the proposed approach is evaluated using as a case study a new wind farm to be incorporated into the Northern Interconnected System (NIS) of Chile. A projection of wind-based power installation, as well as different characteristics of the uncertain data, are considered in this study.

  13. Sequential application of ligand and structure based modeling approaches to index chemicals for their hH4R antagonism.

    Directory of Open Access Journals (Sweden)

    Matteo Pappalardo

    Full Text Available The human histamine H4 receptor (hH4R), a member of the G-protein coupled receptor (GPCR) family, is an increasingly attractive drug target. It plays a key role in many cell pathways, and many hH4R ligands are studied for the treatment of several inflammatory, allergic and autoimmune disorders, as well as for analgesic activity. Due to the difficulties in the experimental elucidation of the hH4R structure, virtual screening campaigns are normally run on homology-based models. However, a wealth of information about the chemical properties of GPCR ligands has also accumulated over the last few years, and an appropriate combination of this ligand-based knowledge with structure-based molecular modeling studies emerges as a promising strategy for computer-assisted drug design. Here, two chemoinformatics techniques, the Intelligent Learning Engine (ILE and Iterative Stochastic Elimination (ISE approach, were used to index chemicals for their hH4R bioactivity. An application of the prediction model to an external test set composed of more than 160 hH4R antagonists picked from the chEMBL database gave an enrichment factor of 16.4. A virtual high-throughput screening of the ZINC database was carried out, selecting ∼ 4000 chemicals highly indexed as hH4R antagonist candidates. Next, a series of 3D models of hH4R were generated by molecular modeling and molecular dynamics simulations performed in fully atomistic lipid membranes. The efficacy of the hH4R 3D models in discriminating between actives and non-actives was checked, and the 3D model with the best performance was chosen for further docking studies performed on the focused library. The output of these docking studies was a consensus library of 11 highly active scored drug candidates. Our findings suggest that a sequential combination of ligand-based chemoinformatics approaches with structure-based ones has the potential to improve the success rate in discovering new biologically active GPCR drugs and

  14. An open-source Java-based Toolbox for environmental model evaluation: The MOUSE Software Application

    Science.gov (United States)

    A consequence of environmental model complexity is that the task of understanding how environmental models work and identifying their sensitivities/uncertainties, etc. becomes progressively more difficult. Comprehensive numerical and visual evaluation tools have been developed such as the Monte Carl...

  15. Applications for Mission Operations Using Multi-agent Model-based Instructional Systems with Virtual Environments

    Science.gov (United States)

    Clancey, William J.

    2004-01-01

    This viewgraph presentation provides an overview of past and possible future applications for artificial intelligence (AI) in astronaut instruction and training. AI systems have been used in training simulation for the Hubble Space Telescope repair, the International Space Station, and operations simulation for the Mars Exploration Rovers. In the future, robots may work as partners with astronauts on missions such as planetary exploration and extravehicular activities.

  16. Rigorous model-based uncertainty quantification with application to terminal ballistics, part I: Systems with controllable inputs and small scatter

    Science.gov (United States)

    Kidane, A.; Lashgari, A.; Li, B.; McKerns, M.; Ortiz, M.; Owhadi, H.; Ravichandran, G.; Stalzer, M.; Sullivan, T. J.

    2012-05-01

    This work is concerned with establishing the feasibility of a data-on-demand (DoD) uncertainty quantification (UQ) protocol based on concentration-of-measure inequalities. Specific aims are to establish the feasibility of the protocol and its basic properties, including the tightness of the predictions afforded by the protocol. The assessment is based on an application to terminal ballistics and a specific system configuration consisting of 6061-T6 aluminum plates struck by spherical S-2 tool steel projectiles at ballistic impact speeds. The system's inputs are the plate thickness and impact velocity, and the perforation area is chosen as the sole performance measure of the system. The objective of the UQ analysis is to certify the lethality of the projectile, i.e., that the projectile perforates the plate with high probability over a prespecified range of impact velocities and plate thicknesses. The net outcome of the UQ analysis is an M/U ratio, or confidence factor, of 2.93, indicative of a small probability of non-perforation of the plate over its entire operating range. The high confidence (>99.9%) in the successful operation of the system afforded by the analysis, and the small number of tests (40) required for the determination of the modeling-error diameter, establish the feasibility of the DoD UQ protocol as a rigorous yet practical approach for model-based certification of complex systems.

  17. Application of Novel Rotation Angular Model for 3D Mouse System Based on MEMS Accelerometers

    Institute of Scientific and Technical Information of China (English)

    QIAN Li; CHEN Wen-yuan; XU Guo-ping

    2009-01-01

    A new scheme is proposed to model the 3D angular motion of a revolving regular object with miniature, low-cost micro-electro-mechanical systems (MEMS) accelerometers (instead of gyroscopes), for use in a 3D mouse system. To sense 3D angular motion, the static property of the MEMS accelerometer, its sensitivity to gravitational acceleration, is exploited. With the three outputs of the configured accelerometers, the proposed model is implemented to recover the rotary motion of the rigid object. In order to validate the effectiveness of the proposed model, an input device was developed following the scheme's configuration. Experimental results show that a simulated 3D cube can accurately track the rotation of the input device, indicating the feasibility and effectiveness of the proposed model in the 3D mouse system.
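The static principle, recovering orientation from the gravity component of accelerometer outputs, can be sketched for a single 3-axis sensor; the paper's scheme configures multiple accelerometers, so this single-sensor form and its axis conventions are simplifying assumptions:

```python
import math

def tilt_from_gravity(ax, ay, az):
    """Estimate roll and pitch (radians) from a static 3-axis accelerometer
    reading, using only the gravity component. Rotation about the gravity
    axis (yaw) is unobservable from gravity alone."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch

# Device lying flat: gravity lies entirely on the z-axis, so both angles are zero.
r, p = tilt_from_gravity(0.0, 0.0, 9.81)
```

The yaw blind spot is one reason a multi-accelerometer configuration (as in the paper) is needed to capture full 3D rotation.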

  18. A microcantilever-based alcohol vapor sensor-application and response model

    DEFF Research Database (Denmark)

    Jensenius, Henriette; Thaysen, Jacob; Rasmussen, Anette Alsted;

    2000-01-01

    A recently developed microcantilever probe with integrated piezoresistive readout has been applied as a gas sensor. Resistors, sensitive to stress changes, are integrated on the flexible cantilevers. This makes it possible to monitor the cantilever deflection electrically and with an integrated...... reference cantilever background noise is subtracted directly in the measurement. A polymer coated cantilever has been exposed to vapors of various alcohols and the resulting cantilever response has been interpreted using a simple evaporation model. The model indicates that the cantilever response...

  19. Quality Model and Artificial Intelligence Base Fuel Ratio Management with Applications to Automotive Engine

    Directory of Open Access Journals (Sweden)

    Mojdeh Piran

    2014-01-01

    Full Text Available In this research, Internal Combustion (IC) engine modeling is addressed, and a multi-input multi-output, artificial-intelligence-based, chattering-free sliding mode control scheme is developed with guaranteed stability to simultaneously control fuel ratios to desired levels under various air flow disturbances by regulating the mass flow rates of the engine PFI and DI injection systems. Modeling an entire IC engine is an important and complicated process because engines are nonlinear, multi-input multi-output, and time-variant. One purpose of accurate modeling is to save the development costs of real engines and to minimize the risk of damaging an engine when validating controller designs. Nevertheless, a small model can be developed for specific controller design purposes and then validated on a larger, more complicated model. Analytical dynamic nonlinear modeling of the internal combustion engine is carried out using the Euler-Lagrange method, balancing accuracy and complexity. A baseline estimator with varying parameter gain is designed with guaranteed stability to allow implementation of the proposed state-feedback sliding mode methodology in a MATLAB simulation environment, where the sliding mode strategy is implemented in a model engine control module ("software"). To estimate the dynamic model of the IC engine, a fuzzy inference engine is applied to the baseline sliding mode methodology. The performance of the fuzzy inference baseline sliding methodology was compared with that of a well-tuned baseline multi-loop PID controller through MATLAB simulations and showed improvements; the simulations validate the feasibility of utilizing the developed controller and state estimator for automotive engines. The proposed tracking method is designed to optimally track the desired FR by minimizing the error between the trapped in-cylinder mass and the product of the desired FR and fuel mass over a given time interval.

  20. A real options based model and its application to China's overseas oil investment decisions

    International Nuclear Information System (INIS)

    This paper applies real options theory to overseas oil investment by adding an investment-environment factor to oil-resource valuation. A real options model is developed to illustrate how an investor country (or oil company) can evaluate and compare the critical value of oil-resource investment in different countries under oil-price, exchange-rate, and investment-environment uncertainties. The aim is to establish a broad model that can be used by every oil investor country to value overseas oil resources. The model developed here addresses three key elements: 1) overseas investment (the effects of the investment environment and exchange rates); 2) oil investment (oil price, production decline rate, development cost, etc.); and 3) comparability of results across countries (different countries' oil-investment situations can be compared using the option value index (OVI)). China's overseas oil investment is taken as an example to explain the model by calculating each oil-investee country's critical value per unit of oil reserves and examining the effect of different factors on the critical value. The results show that the model developed here can provide useful advice for China's overseas oil investment program. The research would probably also be helpful to other investor countries looking to invest in overseas oil resources. (author)

  1. Soft Sensing Modelling Based on Optimal Selection of Secondary Variables and Its Application

    Institute of Scientific and Technical Information of China (English)

    Qi Li; Cheng Shao

    2009-01-01

    The composition of the distillation column is a very important quality variable in refineries; unfortunately, few hardware sensors are available to measure distillation compositions on-line. In this paper, a novel method using sensitivity matrix analysis and kernel ridge regression (KRR) to implement on-line soft sensing of distillation compositions is proposed. In this approach, sensitivity matrix analysis is used to select the most suitable secondary variables as the soft sensor's inputs, and KRR is used to build the composition soft sensor. Application to a simulated distillation column demonstrates the effectiveness of the method.
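The KRR step can be sketched in a few lines; the RBF kernel, toy data, and hyper-parameters below are illustrative assumptions, and the paper's sensitivity-matrix variable selection is not reproduced:

```python
import math

def rbf_kernel(a, b, gamma=1.0):
    """Gaussian (RBF) kernel between two feature vectors."""
    return math.exp(-gamma * sum((x - y) ** 2 for x, y in zip(a, b)))

def krr_fit(X, y, lam=1e-3, gamma=1.0):
    """Kernel ridge regression: solve (K + lam*I) alpha = y by
    Gaussian elimination with partial pivoting."""
    n = len(X)
    M = [[rbf_kernel(X[i], X[j], gamma) + (lam if i == j else 0.0)
          for j in range(n)] + [y[i]] for i in range(n)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    alpha = [0.0] * n
    for r in range(n - 1, -1, -1):
        alpha[r] = (M[r][n] - sum(M[r][k] * alpha[k]
                                  for k in range(r + 1, n))) / M[r][r]
    return alpha

def krr_predict(X, alpha, x, gamma=1.0):
    """Predict at x as a kernel-weighted sum over the training points."""
    return sum(a * rbf_kernel(xi, x, gamma) for a, xi in zip(alpha, X))

# Toy soft sensor: predict a "composition" from two secondary variables.
X = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
y = [0.0, 1.0, 1.0, 2.0]
alpha = krr_fit(X, y)
pred = krr_predict(X, alpha, [1.0, 1.0])
```

In practice the columns of X would be the secondary variables chosen by the sensitivity analysis, and `lam`/`gamma` would be tuned by cross-validation.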

  2. Cloud computing models and their application in LTE based cellular systems

    NARCIS (Netherlands)

    Staring, A.J.; Karagiannis, G.

    2013-01-01

    As cloud computing emerges as the next novel concept in computer science, it becomes clear that the model applied in large data storage systems used to resolve issues coming forth from an increasing demand, could also be used to resolve the very high bandwidth requirements on access network, core ne

  3. Life-Stage Physiologically-Based Pharmacokinetic (PBPK) Model Applications to Screen Environmental Hazards.

    Science.gov (United States)

    This presentation discusses methods used to extrapolate from in vitro high-throughput screening (HTS) toxicity data for an endocrine pathway to in vivo for early life stages in humans, and the use of a life stage PBPK model to address rapidly changing physiological parameters. A...

  4. Application of the probability-based covering algorithm model in text classification

    Institute of Scientific and Technical Information of China (English)

    ZHOU; Ying

    2009-01-01

    The probability-based covering algorithm (PBCA) is a new algorithm based on probability distribution. It decides, by voting, the class of tested samples on the border of the coverage area, based on the probability of the training samples. When the original covering algorithm (CA) is used, many tested samples located on the border of the coverage cannot be classified by the spherical neighborhood obtained. The network structure of the PBCA is a mixed structure composed of both a feed-forward network and a feedback network. By adding some heterogeneous samples and enlarging the coverage radius, it is possible to decrease the number of rejected samples and improve recognition accuracy. Computer experiments indicate that the algorithm improves learning precision and achieves reasonably good results in text classification.

  5. A model for requirements traceability in a heterogeneous model-based design process. Application to automotive embedded systems

    OpenAIRE

    Dubois, Hubert; Peraldi-Frati, Marie-Agnès; Lakhal, Fadoi

    2010-01-01

    Requirements traceability modeling is a key issue in the real-time embedded design process. In such systems, requirements are of different natures (software-related, system-related, functional and non-functional) and must be traced through a multi-level design flow which integrates multiple heterogeneous models. Validation and Verification (V&V) activities must be performed on models and on the final product to check whether they match the initial requirements. Results of a design and of V&...

  6. Overview of Emerging Web 2.0-Based Business Models and Web 2.0 Applications in Businesses: An Ecological Perspective

    OpenAIRE

    In Lee

    2011-01-01

    Web 2.0 offers business organizations an array of new ways to interact with customers and partners. Web 2.0 is continuously evolving and offers new business models and supports business processes, customer relationship management, and partner relationship management. This study reviews some of the major business applications of Web 2.0 and identifies Web 2.0-based business models. Six emerging Web 2.0-based business models were identified: (1) Broad Online Community, (2) Focused Online Commun...

  7. Designing Collaborative E-Learning Environments Based upon Semantic Wiki: From Design Models to Application Scenarios

    Science.gov (United States)

    Li, Yanyan; Dong, Mingkai; Huang, Ronghuai

    2011-01-01

    The knowledge society requires life-long learning and flexible learning environment that enables fast, just-in-time and relevant learning, aiding the development of communities of knowledge, linking learners and practitioners with experts. Based upon semantic wiki, a combination of wiki and Semantic Web technology, this paper designs and develops…

  8. Overview of Dioxin Kinetics and Application of Dioxin Physiologically Based Phannacokinetic (PBPK) Models to Risk Assessment

    Science.gov (United States)

    The available data on the pharmacokinetics of 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) in animals and humans have been thoroughly reviewed in literature. It is evident based on these reviews and other analyses that three distinctive features of TCDD play important roles in dete...

  9. Conceptual design and modeling of particle-matter interaction cooling systems for muon based applications

    CERN Document Server

    Stratakis, Diktys; Rogers, Chris T; Alekou, Androula; Pasternak, Jaroslaw

    2014-01-01

    An ionization cooling channel is a tightly spaced lattice containing absorbers for reducing the momentum of the muon beam, rf cavities for restoring the longitudinal momentum, and strong solenoids for focusing. Such a lattice can be an essential feature for fundamental high-energy physics applications. In this paper we design, simulate, and compare four individual cooling schemes that rely on ionization cooling. We establish a scaling characterizing the impact of rf gradient limitations on the overall performance and systematically compare important lattice parameters such as the required magnetic fields and the number of cavities and absorber lengths for each cooling scenario. We discuss approaches for reducing the peak magnetic field inside the rf cavities by either increasing the lattice cell length or adopting a novel bucked-coil configuration. We numerically examine the performance of our proposed channels with two independent codes that fully incorporate all basic particle-matter-interaction physical pr...

  10. Identifying plausible genetic models based on association and linkage results: application to type 2 diabetes.

    Science.gov (United States)

    Guan, Weihua; Boehnke, Michael; Pluzhnikov, Anna; Cox, Nancy J; Scott, Laura J

    2012-12-01

    When planning resequencing studies for complex diseases, previous association and linkage studies can constrain the range of plausible genetic models for a given locus. Here, we explore the combinations of causal risk allele frequency (RAFC) and genotype relative risk (GRRC) consistent with no or limited evidence for affected sibling pair (ASP) linkage and strong evidence for case-control association. We find that significant evidence for case-control association combined with no or moderate evidence for ASP linkage can define a lower bound for the plausible RAFC. Using data from large type 2 diabetes (T2D) linkage and genome-wide association study meta-analyses, we find that under reasonable model assumptions, 23 of 36 autosomal T2D risk loci are unlikely to be due to causal variants with combined RAFC < 0.005, and four of the 23 are unlikely to be due to causal variants with combined RAFC < 0.05.

  11. Interacting Particle-based Model for Missing Data in Sensor Networks: Foundations and Applications

    OpenAIRE

    Koushanfar, Farinaz; Kiyavash, Negar; Potkonjak, Miodrag

    2006-01-01

    Missing data is unavoidable in sensor networks due to sensor faults, communication malfunctions and malicious attacks. There is very little insight into the causes of missing data and into the statistical and pattern properties of missing data in collected data streams. To address this problem, we utilize an interacting-particle model that takes into account both the patterns of missing data in individual sensor data streams and the correlation between occurrences of missing data at other sensor data stream...

  12. Model-Based Statistical Tracking and Decision Making for Collision Avoidance Application

    OpenAIRE

    Karlsson, Rickard; Jansson, Jonas; Gustafsson, Fredrik

    2004-01-01

    A growing research topic within the automotive industry is active safety systems. These systems aim at helping the driver avoid or mitigate the consequences of an accident. In this paper a collision mitigation system that performs late braking is discussed. The brake decision is based on estimates from tracking sensors. We use a Bayesian approach, implementing an extended Kalman filter (EKF) and a particle filter to solve the tracking problem. The two filters are compared for different sensor...

  13. B-SPLINE-BASED SVM MODEL AND ITS APPLICATIONS TO OIL WATER-FLOODED STATUS IDENTIFICATION

    Institute of Scientific and Technical Information of China (English)

    Shang Fuhua; Zhao Tiejun; Yi Xiongying

    2007-01-01

    A method of B-spline transform for signal feature extraction is developed. With the B-spline, the log-signal space is mapped into the vector space. An efficient algorithm based on Support Vector Machines (SVM) to automatically identify the water-flooded status of oil-saturated strata is described. The experiments show that this algorithm can improve identification performance and generalization in the case of a limited set of samples.

  14. Applicability of the technology acceptance model for widget-based personal learning environments

    OpenAIRE

    Wild, Fridolin; Ullmann, Thomas; Scott, Peter; Rebedea, Traian; Hoisl, Bernhard

    2011-01-01

    This contribution presents results from two exploratory studies on technology acceptance and use of widget-based personal learning environments. Methodologically, the investigation carried out applies the unified theory of acceptance and use of technology (UTAUT). With the help of this instrument, the study assesses expert judgments about intentions to use and actual use of the emerging technology of flexibly arranged combinations of use-case-sized mini learning tools. This study aims to expl...

  15. Constraint-based modelling of mixed microbial populations: Application to polyhydroxyalkanoates production

    OpenAIRE

    Pardelha, Filipa Alexandra Guerreiro

    2013-01-01

    The combined use of mixed microbial cultures (MMC) and fermented feedstock as substrate may significantly decrease polyhydroxyalkanoates (PHA) production costs and make them more competitive in relation to conventional petroleum-based polymers. However, there still exists a lack of knowledge at metabolic level that limits the development of strategies to make this process more effective. In this thesis, system biology computational tools were developed and applied to PHA production by MMC fro...

  16. An application to pulmonary emphysema classification based on model of texton learning by sparse representation

    Science.gov (United States)

    Zhang, Min; Zhou, Xiangrong; Goshima, Satoshi; Chen, Huayue; Muramatsu, Chisako; Hara, Takeshi; Yokoyama, Ryojiro; Kanematsu, Masayuki; Fujita, Hiroshi

    2012-03-01

    We aim to use a new texton-based texture classification method for the classification of pulmonary emphysema in computed tomography (CT) images of the lungs. Unlike conventional computer-aided diagnosis (CAD) pulmonary emphysema classification methods, in this paper the dictionary of textons is first learned by applying sparse representation (SR) to image patches in the training dataset. Then the SR coefficients of the test images over the dictionary are used to construct histograms as texture representations. Finally, classification is performed using a nearest-neighbour classifier with a histogram dissimilarity measure as the distance. The proposed approach is tested on 3840 annotated regions of interest consisting of normal tissue and mild, moderate and severe pulmonary emphysema of three subtypes. The accuracy of the proposed system, about 88%, is higher than that of the state-of-the-art method based on basic rotation-invariant local binary pattern histograms and of the texture classification method based on texton learning by k-means, which performs almost the best among other approaches in the literature.
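The final classification step can be sketched as nearest-neighbour matching of texton histograms. The chi-square dissimilarity used below is one common choice for histogram comparison and an assumption here, not necessarily the measure used in the paper:

```python
def chi2_dist(h, g):
    """Chi-square dissimilarity between two (normalized) histograms."""
    return 0.5 * sum((a - b) ** 2 / (a + b) for a, b in zip(h, g) if a + b > 0)

def nn_classify(train_hists, labels, query_hist):
    """Assign the label of the nearest training histogram."""
    dists = [chi2_dist(h, query_hist) for h in train_hists]
    return labels[dists.index(min(dists))]

# Toy 3-bin texton histograms (purely illustrative).
train = [[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]]
labels = ["normal", "emphysema"]
label = nn_classify(train, labels, [0.9, 0.1, 0.0])
```

In the full pipeline the histogram bins would count SR coefficients over the learned texton dictionary rather than raw values.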

  17. Application of PPE Model in Land Adaptability Appraisal in Small Basin Based on RAGA

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    The paper applies the projection pursuit evaluation model, which projects multi-dimensional data to a lower-dimensional sub-space, in the soil and water conservation domain. It optimizes the projection direction using the improved accelerating genetic algorithm (RAGA), transforms the multi-dimensional data into the lower sub-space, and evaluates the land adaptability of the Dongdagou basin in Keshan County by searching for the optimal projection direction and the projection function values. The paper provides a new notion and method for soil and water conservation in small basins.

  18. APPLICATION OF GRAY EVALUATION MODEL BASED ON AHP IN ATM SYSTEM

    Institute of Scientific and Technical Information of China (English)

    Wu Zhijun; Pan Wen

    2008-01-01

    This paper presents a hierarchy model of Air Traffic Management (ATM) according to the security requirements of the ATM system, analyzes it by grey assessment and the Analytic Hierarchy Process (AHP), and evaluates it in detail. It also provides theoretical support for building an effective evaluation system. The basic idea is to use AHP and grey assessment to obtain the weights of the indicators, and to compute grey evaluation coefficients with the whitening function. The composite clustering coefficients are obtained by combining the weights with the grey evaluation coefficients, and the evaluation result is obtained from the composite clustering coefficients.
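The weighting and combination steps can be sketched as follows. The geometric-mean (row) method below is a standard approximation to AHP's principal-eigenvector weights, and the 3-indicator comparison matrix is an invented illustration, not data from the paper:

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights from a pairwise-comparison matrix
    using the geometric mean of each row, normalized to sum to 1."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

def composite_coefficient(weights, grey_coeffs):
    """Composite clustering coefficient for one grey class:
    weighted sum of the indicators' grey evaluation coefficients."""
    return sum(w * g for w, g in zip(weights, grey_coeffs))

# Hypothetical 3-indicator comparison matrix (illustrative judgments only).
A = [[1.0, 3.0, 5.0],
     [1 / 3, 1.0, 3.0],
     [1 / 5, 1 / 3, 1.0]]
w = ahp_weights(A)
```

The evaluated object would then be assigned to the grey class with the largest composite coefficient.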

  19. Linking Satellite Remote Sensing Based Environmental Predictors to Disease: An Application to the Spatiotemporal Modelling of Schistosomiasis in Ghana

    Science.gov (United States)

    Wrable, M.; Liss, A.; Kulinkina, A.; Koch, M.; Biritwum, N. K.; Ofosu, A.; Kosinski, K. C.; Gute, D. M.; Naumova, E. N.

    2016-06-01

    90% of the worldwide schistosomiasis burden falls on sub-Saharan Africa. Control efforts are often based on infrequent, small-scale health surveys, which are expensive and logistically difficult to conduct. The use of satellite imagery to predictively model infectious disease transmission has great potential for public health applications. Transmission of schistosomiasis requires specific environmental conditions to sustain freshwater snails; however, it has unknown seasonality and is difficult to study due to the long lag between infection and clinical symptoms. To overcome this, we employed a comprehensive 8-year time series built from remote sensing feeds. The purely environmental predictor variables (accumulated precipitation, land surface temperature, vegetative growth indices, and climate zones created from a novel climate regionalization technique) were regressed against 8 years of national surveillance data in Ghana. All data were aggregated temporally into monthly observations and spatially at the level of administrative districts. An initial mixed-effects model explained 41% of the variance overall. Stratification by climate zone brought the R2 as high as 50% for major zones and as high as 59% for minor zones. This can lead to a predictive risk model used to develop a decision support framework to design treatment schemes and direct scarce resources to the areas with the highest risk of infection. This framework can be applied to diseases sensitive to climate or to locations where remote sensing would be better suited than health surveys.

  20. Time-frequency representation based on time-varying autoregressive model with applications to non-stationary rotor vibration analysis

    Indian Academy of Sciences (India)

    Long Zhang; Guoliang Xiong; Hesheng Liu; Huijun Zou; Weizhong Guo

    2010-04-01

    A parametric time-frequency representation is presented based on a time-varying autoregressive model (TVAR), followed by applications to non-stationary vibration signal processing. The identification of time-varying model coefficients and the determination of model order are addressed by means of neural networks and genetic algorithms, respectively. Firstly, a simulated signal which mimics rotor vibration during run-up stages was processed for a comparative study of TVAR and non-parametric time-frequency representations such as the Short Time Fourier Transform, Continuous Wavelet Transform, Empirical Mode Decomposition, Wigner–Ville Distribution and Choi–Williams Distribution, in terms of resolution, accuracy, cross-term suppression and noise resistance. Secondly, TVAR was applied to analyse non-stationary vibration signals collected from a rotor test rig during run-up stages, with the aim of extracting fault symptoms under non-stationary operating conditions. Simulation and experimental results demonstrate that TVAR is an effective solution for non-stationary signal analysis and has a strong capability for time-frequency feature extraction.
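The paper identifies the time-varying coefficients with neural networks; as a simpler stand-in for illustration, recursive least squares with a forgetting factor is another common way to track TVAR coefficients (the AR(2) test signal and all parameter values below are invented):

```python
import random

def tvar_rls(signal, order=2, lam=0.98):
    """Track time-varying AR coefficients with recursive least squares and
    forgetting factor lam; returns one coefficient vector per sample."""
    n = order
    theta = [0.0] * n                       # current coefficient estimate
    P = [[(1e4 if i == j else 0.0) for j in range(n)] for i in range(n)]
    coeffs = []
    for t in range(n, len(signal)):
        phi = [signal[t - 1 - i] for i in range(n)]   # past samples (regressor)
        Pphi = [sum(P[i][j] * phi[j] for j in range(n)) for i in range(n)]
        denom = lam + sum(phi[i] * Pphi[i] for i in range(n))
        k = [v / denom for v in Pphi]                 # RLS gain
        err = signal[t] - sum(theta[i] * phi[i] for i in range(n))
        theta = [theta[i] + k[i] * err for i in range(n)]
        P = [[(P[i][j] - k[i] * Pphi[j]) / lam for j in range(n)]
             for i in range(n)]
        coeffs.append(theta[:])
    return coeffs

# Synthetic AR(2) signal with fixed coefficients 0.6 and -0.2.
random.seed(0)
x = [0.0, 0.0]
for _ in range(500):
    x.append(0.6 * x[-1] - 0.2 * x[-2] + random.uniform(-1, 1))
coeffs = tvar_rls(x, order=2, lam=0.99)
```

The instantaneous frequency content of the signal is then read off the frozen-time spectrum of each estimated coefficient vector.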

  1. Application of an ion-packing model based on defect clusters to zirconia solid solutions. 2

    International Nuclear Information System (INIS)

    This paper reports that lattice parameter data of cubic phases and cube roots of unit-cell volumes of tetragonal phases in homogeneous ZrO2-containing solid solutions were compiled to examine the validity of Vegard's law. Except for the ZrO2--CeO2 and ZrO2--UO2 systems, the data for cubic phases were expressed by the equation d = a_s·X + b, where d, a_s, X, and b denote the lattice parameter, a constant depending on dopant species, the dopant content, and a constant independent of dopant species, respectively. For tetragonal phases, the cube roots of unit-cell volumes could be fitted by a similar equation except for the data in the ZrO2--MO2 systems (M = Ge and U). The constant a_s was calculated using an ion-packing model and was independent of the defect cluster models. The calculated a_s is close to the experimentally observed one, although the former is slightly smaller than the latter in the ZrO2--MOu systems (u = 1 and 1.5). This difference was ascribed to the lack of consideration of the ionic distortions from the ideal sites of the fluorite-type structure.
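The Vegard-type relation d = a_s·X + b is a straight-line fit of lattice parameter against dopant content; a minimal least-squares sketch (the numeric data below are invented for illustration, not the paper's measurements):

```python
def vegard_fit(X, d):
    """Least-squares fit of lattice parameters d against dopant contents X
    to the Vegard-type relation d = a_s * X + b."""
    n = len(X)
    mx = sum(X) / n
    md = sum(d) / n
    a_s = (sum((x - mx) * (y - md) for x, y in zip(X, d))
           / sum((x - mx) ** 2 for x in X))
    b = md - a_s * mx
    return a_s, b

# Hypothetical (collinear) lattice-parameter data in angstroms.
X = [0.08, 0.10, 0.12]          # dopant content
d = [5.139, 5.145, 5.151]       # lattice parameter
a_s, b = vegard_fit(X, d)
```

Comparing the fitted slope a_s across dopant systems is what allows the ion-packing-model prediction to be checked against experiment.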

  2. Towards quantum-based modeling of enzymatic reaction pathways: Application to the acetylcholinesterase catalysis

    Science.gov (United States)

    Polyakov, Igor V.; Grigorenko, Bella L.; Moskovsky, Alexander A.; Pentkovski, Vladimir M.; Nemukhin, Alexander V.

    2013-01-01

    We apply computational methods aiming to approach a full quantum mechanical treatment of chemical reactions in proteins. A combination of the quantum mechanical - molecular mechanical methodology for geometry optimization and the fragment molecular orbital approach for energy calculations is examined for an example of acetylcholinesterase catalysis. The codes based on the GAMESS(US) package operational on the 'RSC Tornado' computational cluster are applied to determine that the energy of the reaction intermediate upon hydrolysis of acetylcholine is lower than that of the enzyme-substrate complex. This conclusion is consistent with the experiments and it is free from the empirical force field contributions.

  3. Eye gaze in intelligent user interfaces gaze-based analyses, models and applications

    CERN Document Server

    Nakano, Yukiko I; Bader, Thomas

    2013-01-01

    Remarkable progress in eye-tracking technologies has opened the way to the design of novel attention-based intelligent user interfaces and highlighted the importance of a better understanding of eye gaze in human-computer interaction and human-human communication. For instance, a user's focus of attention is useful in interpreting the user's intentions, their understanding of the conversation, and their attitude towards the conversation. In human face-to-face communication, eye gaze plays an important role in floor management, grounding, and engagement in conversation. Eye Gaze in Intelligent User Interfaces

  4. Application of T2 Control Charts and Hidden Markov Models in Condition-Based Maintenance at Thermoelectric Power Plants

    Directory of Open Access Journals (Sweden)

    Emilija Kisić

    2015-01-01

    Full Text Available An innovative approach to condition-based maintenance of coal grinding subsystems at thermoelectric power plants is proposed in the paper. Coal mill grinding tables become worn over time and need to be replaced through time-based maintenance, after a certain number of service hours. At times such replacement is necessary earlier or later than prescribed, depending on the quality of the coal and of the grinding table itself. Considerable financial losses are incurred when the entire coal grinding subsystem is shut down and the grinding table found to not actually require replacement. The only way to determine whether replacement is necessary is to shut down and open the entire subsystem for visual inspection. The proposed algorithm supports condition-based maintenance and involves the application of T2 control charts to distinct acoustic signal parameters in the frequency domain and the construction of Hidden Markov Models whose observations are coded samples from the control charts. In the present research, the acoustic signals were collected by coal mill monitoring at the thermoelectric power plant “Kostolac” in Serbia. The proposed approach provides information about the current condition of the grinding table.
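
    The T2 control charts used above are based on Hotelling's T2 statistic, which measures how far a multivariate observation lies from an in-control reference sample. A minimal sketch follows; the reference data here are synthetic placeholders, not the Kostolac mill acoustic recordings.

```python
import numpy as np

def hotelling_t2(X_ref, x):
    """T2 distance of observation x from reference sample X_ref (rows = observations)."""
    mu = X_ref.mean(axis=0)
    S = np.cov(X_ref, rowvar=False)
    diff = x - mu
    return float(diff @ np.linalg.inv(S) @ diff)

rng = np.random.default_rng(0)
X_ref = rng.normal(size=(200, 3))              # in-control acoustic features (synthetic)
t2_in = hotelling_t2(X_ref, np.zeros(3))       # observation near the reference mean
t2_out = hotelling_t2(X_ref, np.full(3, 4.0))  # observation far outside the reference
print(t2_in, t2_out)
```

    Samples whose T2 value exceeds a control limit would then be coded as out-of-control observations for the Hidden Markov Model.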

  5. Application of a Collaborative Filtering Recommendation Algorithm Based on Cloud Model in Intrusion Detection

    Directory of Open Access Journals (Sweden)

    Deguang Wang

    2011-02-01

    Full Text Available Intrusion detection is a computer network security technique that collects information at several key points in a network and, drawing on security auditing, monitoring, attack recognition and response, checks for behavior and signs that violate the network security policy. The classification of acquired data is a key part of intrusion detection. In this article, we use the cloud model to classify intrusions, effectively handling the qualitative ambiguity of concepts over continuous data, and in the evaluation phase we apply a collaborative filtering recommendation algorithm, which greatly improves the efficiency of the intrusion detection system when processing massive amounts of suspicious data.
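
    Cloud models of the kind used here are typically produced by the forward normal cloud generator, parameterized by expectation Ex, entropy En and hyper-entropy He. A minimal sketch with illustrative parameter values (not taken from the paper):

```python
import numpy as np

def normal_cloud(Ex, En, He, n, rng=None):
    """Forward normal cloud generator: n cloud drops and their degrees of membership."""
    rng = rng if rng is not None else np.random.default_rng()
    En_i = rng.normal(En, He, n)            # per-drop entropy sample
    x = rng.normal(Ex, np.abs(En_i))        # cloud drops
    mu = np.exp(-(x - Ex) ** 2 / (2 * En_i ** 2))
    return x, mu

x, mu = normal_cloud(Ex=0.5, En=0.1, He=0.01, n=1000, rng=np.random.default_rng(1))
print(x.mean(), mu.min(), mu.max())
```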

  6. Model-based software design

    Science.gov (United States)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael

    1992-01-01

    Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.

  7. Application of damage mechanics modeling to strain based design with respect to ductile crack initiation

    Energy Technology Data Exchange (ETDEWEB)

    Ishikawa, Nobuyuki; Sueyoshi, Hitoshi; Igi, Satoshi [Steel Research Laboratory, JFE Steel Corporation (Japan)

    2010-07-01

    In the oil and gas sector, with the increase in demand, more and more pipelines are now constructed in permafrost and seismic regions. When installed in such harsh environments, pipelines must be resistant to buckling and weld fracture and the strain based design methodology is preferably used. The aim of this paper is to study the critical condition for ductile crack initiation. Both notched round bar and wide plate tests were carried out on X80 and X100 steel pipes and welds; the equivalent plastic strain criterion and Gurson Tvergaard mechanical damage analysis were used. It was found that to determine ductile crack initiation that is not affected by specimen geometry, the critical equivalent plastic strain can be used as the local criterion. In addition, when ductile crack initiation is independent of specimen geometry, the void volume fraction can be used as a criterion. This paper provided useful information on which criterion to use for ductile crack initiation.

  8. WAP - based telemedicine applications

    International Nuclear Information System (INIS)

    Telemedicine refers to the utilization of telecommunication technology for medical diagnosis, treatment, and patient care. Its aim is to provide expert-based health care to remote sites through telecommunication and information technologies. The significant advances in technologies have enabled the introduction of a broad range of telemedicine applications, which are supported by computer networks, wireless communication, and the information superhighway. For example, some hospitals are using tele-radiology for remote consultation. Such a system includes medical imaging devices networked with computers and databases. Another growing area is patient monitoring, in which sensors are used to acquire biomedical signals, such as electrocardiogram (ECG), blood pressure, and body temperature, from a remote patient, who could be in bed or moving freely. The signals are then relayed to remote systems for viewing and analysis. Telemedicine can be divided into two basic modes of operation: real-time mode, in which the patient data can be accessed remotely in real time, and store-and-forward mode, in which the acquired data does not have to be accessed immediately. In recent years, many parties have demonstrated various telemedicine applications based on the Internet and cellular phones, as these two fields have been developing rapidly. A current, recognizable trend in telecommunication is the convergence of wireless communication and computer network technologies. This has been reflected in recently developed telemedicine systems. For example, in 1998 J. Reponen et al. demonstrated transmission and display of computerized tomography (CT) examinations using a remote portable computer wirelessly connected to a computer network through TCP/IP on a GSM cellular phone. Two years later, they carried out the same tests with a GSM-based wireless personal digital assistant (PDA). The WAP (Wireless Application Protocol) Forum was founded in 1997 to create a global protocol

  9. Principles of models based engineering

    Energy Technology Data Exchange (ETDEWEB)

    Dolin, R.M.; Hefele, J.

    1996-11-01

    This report describes a Models Based Engineering (MBE) philosophy and implementation strategy that has been developed at Los Alamos National Laboratory's Center for Advanced Engineering Technology. A major theme in this discussion is that models based engineering is an information management technology enabling the development of information driven engineering. Unlike other information management technologies, models based engineering encompasses the breadth of engineering information, from design intent through product definition to consumer application.

  10. Multiscale enhanced path sampling based on the Onsager-Machlup action: Application to a model polymer

    CERN Document Server

    Fujisaki, Hiroshi; Moritsugu, Kei; Kidera, Akinori

    2013-01-01

    We propose a novel path sampling method based on the Onsager-Machlup (OM) action by generalizing the multiscale enhanced sampling (MSES) technique suggested by Moritsugu and coworkers (J. Chem. Phys. 133, 224105 (2010)). The basic idea of this method is that the system we want to study (for example, some molecular system described by molecular mechanics) is coupled to a coarse-grained (CG) system, which can move more quickly and be computed more efficiently than the original system. We simulate this combined system (original + CG system) using (underdamped) Langevin dynamics, where different heat baths are coupled to the two systems. When the coupling is strong enough, the original system is guided by the CG system and able to sample the configuration and path space more efficiently. We need to correct the bias caused by the coupling, however, by employing Hamiltonian replica exchange, where we prepare many path replicas with different coupling strengths. As a result, an unbiased path ensemble for the original ...
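
    In a Hamiltonian replica exchange of this kind, neighbouring replicas periodically attempt to swap their coupling strengths with a Metropolis criterion. A minimal sketch of the acceptance probability, with hypothetical coupling energies expressed in units of k_B T:

```python
import math

def swap_accept(E_i_ki, E_i_kj, E_j_ki, E_j_kj, beta=1.0):
    """Metropolis probability of exchanging coupling strengths k_i <-> k_j between
    path replicas i and j; E_a_kb is replica a's coupling energy under strength k_b."""
    delta = beta * ((E_i_kj + E_j_ki) - (E_i_ki + E_j_kj))
    return min(1.0, math.exp(-delta))

# A swap that lowers the total coupling energy is always accepted.
print(swap_accept(1.0, 0.5, 0.5, 1.0))
```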

  11. Direct synchronous-asynchronous conversion system for hybrid electrical vehicle applications. An energy-based modeling approach

    OpenAIRE

    Muñoz-Aguilar, Raúl S.; Dòria-Cerezo, Arnau; Puleston, Pablo Federico

    2013-01-01

    This paper presents a proposal for a series hybrid electric vehicle propulsion system. This new configuration is based on a wound-rotor synchronous generator (WRSM) and a doubly-fed induction machine (DFIM). The energy-based model of the whole system is obtained taking advantage of the capabilities of the port-based modeling techniques. From the dq port-controlled Hamiltonian description of the WRSM and DFIM, the Hamiltonian model of the proposed Direct Synchronous-Asynchronous Conversion Sys...

  12. PITBUL: a physics-based modeling package for imaging and tracking of airborne targets for HEL applications including active illumination

    Science.gov (United States)

    Van Zandt, Noah R.; McCrae, Jack E.; Fiorino, Steven T.

    2013-05-01

    Aimpoint acquisition and maintenance is critical to high energy laser (HEL) system performance. This study demonstrates the development by the AFIT/CDE of a physics-based modeling package, PITBUL, for tracking airborne targets for HEL applications, including atmospheric and sensor effects and active illumination, which is a focus of this work. High-resolution simulated imagery of the 3D airborne target in-flight as seen from the laser position is generated using the HELSEEM model, and includes solar illumination, laser illumination, and thermal emission. Both CW and pulsed laser illumination are modeled, including the effects of illuminator scintillation, atmospheric backscatter, and speckle, which are treated at a first-principles level. Realistic vertical profiles of molecular and aerosol absorption and scattering, as well as optical turbulence, are generated using AFIT/CDE's Laser Environmental Effects Definition and Reference (LEEDR) model. The spatially and temporally varying effects of turbulence are calculated and applied via a fast-running wave optical method known as light tunneling. Sensor effects, for example blur, sampling, read-out noise, and random photon arrival, are applied to the imagery. Track algorithms, including centroid and Fitts correlation, as a part of a closed loop tracker are applied to the degraded imagery and scored, to provide an estimate of overall system performance. To gauge performance of a laser system against a UAV target, tracking results are presented as a function of signal to noise ratio. Additionally, validation efforts to date involving comparisons between simulated and experimental tracking of UAVs are presented.
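
    Of the track algorithms mentioned, the centroid tracker is the simplest: the aimpoint estimate is the intensity-weighted mean pixel position of the target image. A minimal sketch on a synthetic image patch (not PITBUL code):

```python
import numpy as np

def centroid(img):
    """Intensity-weighted centroid (row, col) of an image patch."""
    total = img.sum()
    rows, cols = np.indices(img.shape)
    return (rows * img).sum() / total, (cols * img).sum() / total

img = np.zeros((9, 9))
img[2:5, 5:8] = 1.0      # bright 3x3 target centred at row 3, column 6
r, c = centroid(img)
print(r, c)
```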

  13. Modeling Evidence-Based Application: Using Team-Based Learning to Increase Higher Order Thinking in Nursing Research

    OpenAIRE

    Bridget Moore; Jennifer Styron; Kristina Miller

    2015-01-01

    Nursing practice is comprised of knowledge, theory, and research [1]. Because of its impact on the profession, the appraisal of research evidence is critically important. Future nursing professionals must be introduced to the purpose and utility of nursing research, as early exposure provides an opportunity to embed evidence-based practice (EBP) into clinical experiences. The AACN requires baccalaureate education to include an understanding of the research process to integrate reliable eviden...

  14. Optimization by means of an analytical heat transfer model of a thermal insulation for CSP applications based on radiative shields

    Science.gov (United States)

    Gaetano, A.; Roncolato, J.; Montorfano, D.; Barbato, M. C.; Ambrosetti, G.; Pedretti, A.

    2016-05-01

    The employment of new gaseous heat transfer fluids as air or CO2, which are cheaper and environmentally friendly, is drawing more and more attention within the field of Concentrated Solar Power applications. However, despite the advantages, their use requires receivers with a larger heat transfer area and flow cross section with a consequent greater volume of thermal insulation. Solid thermal insulations currently used present high thermal inertia which is energetically penalizing during the daily transient phases faced by the main plant components (e.g. receivers). With the aim of overcoming this drawback a thermal insulation based on radiative shields is presented in this study. Starting from an initial layout comprising a solid thermal insulation layer, the geometry was optimized avoiding the use of the solid insulation keeping performance and fulfilling the geometrical constraints. An analytical Matlab model was implemented to assess the system thermal behavior in terms of heat loss taking into account conductive, convective and radiative contributions. Accurate 2D Computational Fluid Dynamics (CFD) simulations were run to validate the Matlab model which was then used to select the most promising among three new different designs.
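
    The appeal of radiative shields follows from the standard gray-body result that N identical shields between two parallel surfaces cut the net radiative flux by a factor of N + 1. A minimal sketch with illustrative temperatures and emissivity (not the paper's optimized geometry):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def shield_heat_flux(T_hot, T_cold, eps, n_shields):
    """Net radiative flux (W/m^2) between parallel gray plates of emissivity eps
    separated by n identical shields of the same emissivity."""
    gap_resistance = 2.0 / eps - 1.0       # resistance of one plate-to-plate gap
    return SIGMA * (T_hot**4 - T_cold**4) / ((n_shields + 1) * gap_resistance)

q0 = shield_heat_flux(900.0, 300.0, 0.8, 0)   # no shields
q3 = shield_heat_flux(900.0, 300.0, 0.8, 3)   # three shields -> flux cut by 4
print(q0, q3)
```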

  15. Distributed Parameter Modelling Applications

    DEFF Research Database (Denmark)

    Sales-Cruz, Mauricio; Cameron, Ian; Gani, Rafiqul

    2011-01-01

    and the development of a short-path evaporator. The oil shale processing problem illustrates the interplay amongst particle flows in rotating drums, heat and mass transfer between solid and gas phases. The industrial application considers the dynamics of an Alberta-Taciuk processor, commonly used in shale oil and oil...... sands processing. The fertilizer granulation model considers the dynamics of MAP-DAP (mono and diammonium phosphates) production within an industrial granulator, that involves complex crystallisation, chemical reaction and particle growth, captured through population balances. A final example considers...

  16. Exploring formal models of linguistic data structuring. Enhanced solutions for knowledge management systems based on NLP applications

    OpenAIRE

    Marano, Federica

    2012-01-01

    2010 - 2011 The principal aim of this research is describing to which extent formal models for linguistic data structuring are crucial in Natural Language Processing (NLP) applications. In this sense, we will pay particular attention to those Knowledge Management Systems (KMS) which are designed for the Internet, and also to the enhanced solutions they may require. In order to appropriately deal with this topics, we will describe how to achieve computational linguistics applications helpfu...

  17. Recent Advances in Development and Application of Physiologically-Based Pharmacokinetic (PBPK) Models: a Transition from Academic Curiosity to Regulatory Acceptance

    OpenAIRE

    Jamei, Masoud

    2016-01-01

    There is a renewed surge of interest in applications of physiologically-based pharmacokinetic (PBPK) models by the pharmaceutical industry and regulatory agencies. Developing PBPK models within a systems pharmacology context allows separation of the parameters pertaining to the animal or human body (the system) from that of the drug and the study design which is essential to develop generic drug-independent models used to extrapolate PK/PD properties in various healthy and patient populations...

  18. A Model for Water Quality Assessment Based on the Information Entropy and Its Application in the Case of Huiji River

    Institute of Scientific and Technical Information of China (English)

    BingdongZhao; QingliangZhao; JianhuaMa; HuaGuan

    2004-01-01

    Based on the information entropy, a model for water quality assessment is established. Using this model, the paper gives a case study on the water quality assessment of the Huiji River. The space-time variation law of the water quality is also analyzed. The result indicates that the model possesses clear mathematical and physical meaning, and that it is simple, practical and accurate.
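
    Information-entropy water quality models commonly weight each indicator by its entropy (the entropy weight method): the more an indicator varies across samples, the lower its entropy and the larger its weight. A minimal sketch of that step, using hypothetical monitoring data rather than the Huiji River measurements:

```python
import numpy as np

def entropy_weights(M):
    """Entropy weights for an (n samples x m indicators) matrix of positive values."""
    P = M / M.sum(axis=0)                          # normalize each indicator column
    n = M.shape[0]
    E = -(P * np.log(P)).sum(axis=0) / np.log(n)   # information entropy per indicator
    w = (1 - E) / (1 - E).sum()                    # more variable indicator -> larger weight
    return w

# Hypothetical concentrations at 4 sampling sites for 3 indicators.
M = np.array([[1.0, 2.0, 5.0],
              [1.1, 2.1, 1.0],
              [0.9, 1.9, 8.0],
              [1.0, 2.0, 2.0]])
w = entropy_weights(M)
print(w)
```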

  19. PEM Fuel Cells - Fundamentals, Modeling and Applications

    Directory of Open Access Journals (Sweden)

    Maher A.R. Sadiq Al-Baghdadi

    2013-01-01

    Full Text Available Part I: Fundamentals Chapter 1: Introduction. Chapter 2: PEM fuel cell thermodynamics, electrochemistry, and performance. Chapter 3: PEM fuel cell components. Chapter 4: PEM fuel cell failure modes. Part II: Modeling and Simulation Chapter 5: PEM fuel cell models based on semi-empirical simulation. Chapter 6: PEM fuel cell models based on computational fluid dynamics. Part III: Applications Chapter 7: PEM fuel cell system design and applications.

  1. A knowledge- and model-based system for automated weaning from mechanical ventilation: technical description and first clinical application.

    Science.gov (United States)

    Schädler, Dirk; Mersmann, Stefan; Frerichs, Inéz; Elke, Gunnar; Semmel-Griebeler, Thomas; Noll, Oliver; Pulletz, Sven; Zick, Günther; David, Matthias; Heinrichs, Wolfgang; Scholz, Jens; Weiler, Norbert

    2014-10-01

    To describe the principles and the first clinical application of a novel prototype automated weaning system called Evita Weaning System (EWS). EWS allows an automated control of all ventilator settings in pressure controlled and pressure support mode with the aim of decreasing the respiratory load of mechanical ventilation. Respiratory load takes inspired fraction of oxygen, positive end-expiratory pressure, pressure amplitude and spontaneous breathing activity into account. Spontaneous breathing activity is assessed by the number of controlled breaths needed to maintain a predefined respiratory rate. EWS was implemented as a knowledge- and model-based system that autonomously and remotely controlled a mechanical ventilator (Evita 4, Dräger Medical, Lübeck, Germany). In a selected case study (n = 19 patients), ventilator settings chosen by the responsible physician were compared with the settings 10 min after the start of EWS and at the end of the study session. Neither unsafe ventilator settings nor failure of the system occurred. All patients were successfully transferred from controlled ventilation to assisted spontaneous breathing in a mean time of 37 ± 17 min (± SD). Early settings applied by the EWS did not significantly differ from the initial settings, except for the fraction of oxygen in inspired gas. During the later course, EWS significantly modified most of the ventilator settings and reduced the imposed respiratory load. A novel prototype automated weaning system was successfully developed. The first clinical application of EWS revealed that its operation was stable, safe ventilator settings were defined and the respiratory load of mechanical ventilation was decreased.

  2. Methane emissions from floodplains in the Amazon Basin: challenges in developing a process-based model for global applications

    Science.gov (United States)

    Ringeval, B.; Houweling, S.; van Bodegom, P. M.; Spahni, R.; van Beek, R.; Joos, F.; Röckmann, T.

    2014-03-01

    Tropical wetlands are estimated to represent about 50% of the natural wetland methane (CH4) emissions and explain a large fraction of the observed CH4 variability on timescales ranging from glacial-interglacial cycles to the currently observed year-to-year variability. Despite their importance, however, tropical wetlands are poorly represented in global models aiming to predict global CH4 emissions. This publication documents a first step in the development of a process-based model of CH4 emissions from tropical floodplains for global applications. For this purpose, the LPX-Bern Dynamic Global Vegetation Model (LPX hereafter) was slightly modified to represent floodplain hydrology, vegetation and associated CH4 emissions. The extent of tropical floodplains was prescribed using output from the spatially explicit hydrology model PCR-GLOBWB. We introduced new plant functional types (PFTs) that explicitly represent floodplain vegetation. The PFT parameterizations were evaluated against available remote-sensing data sets (GLC2000 land cover and MODIS Net Primary Productivity). Simulated CH4 flux densities were evaluated against field observations and regional flux inventories. Simulated CH4 emissions at Amazon Basin scale were compared to model simulations performed in the WETCHIMP intercomparison project. We found that LPX reproduces the average magnitude of observed net CH4 flux densities for the Amazon Basin. However, the model does not reproduce the variability between sites or between years within a site. Unfortunately, site information is too limited to attest or disprove some model features. At the Amazon Basin scale, our results underline the large uncertainty in the magnitude of wetland CH4 emissions. Sensitivity analyses gave insights into the main drivers of floodplain CH4 emission and their associated uncertainties. 
In particular, uncertainties in floodplain extent (i.e., difference between GLC2000 and PCR-GLOBWB output) modulate the simulated emissions by a

  3. Stochastic fractal-based models of heterogeneity in subsurface hydrology: Origins, applications, limitations, and future research questions

    Science.gov (United States)

    Molz, Fred J.; Rajaram, Harihar; Lu, Silong

    2004-03-01

    Modern measurement techniques have shown that property distributions in natural porous and fractured media appear highly irregular and nonstationary in a spatial statistical sense. This implies that direct statistical analyses of the property distributions are not appropriate, because the statistical measures developed will be dependent on position and therefore will be nonunique. An alternative, which has been explored to an increasing degree during the past 20 years, is to consider the class of functions known as nonstationary stochastic processes with spatially stationary increments. When such increment distributions are described by probability density functions (PDFs) of the Gaussian, Levy, or gamma class or PDFs that converge to one of these classes under additions, then one is also dealing with a so-called stochastic fractal, the mathematical theory of which was developed during the first half of the last century. The scaling property associated with such fractals is called self-affinity, which is more general than geometric self-similarity. Herein we review the application of Gaussian and Levy stochastic fractals and multifractals in subsurface hydrology, mainly to porosity, hydraulic conductivity, and fracture roughness, along with the characteristics of flow and transport in such fields. Included are the development and application of fractal and multifractal concepts; a review of the measurement techniques, such as the borehole flowmeter and gas minipermeameter, that are motivating the use of fractal-based theories; the idea of a spatial weighting function associated with a measuring instrument; how fractal fields are generated; and descriptions of the topography and aperture distributions of self-affine fractures. In a somewhat different vein the last part of the review deals with fractal- and fragmentation-based descriptions of fracture networks and the implications for transport in such networks. 
Broad conclusions include the implication that models

  4. Morse potential-based model for contacting composite rough surfaces: Application to self-assembled monolayer junctions

    Science.gov (United States)

    Sierra-Suarez, Jonatan A.; Majumdar, Shubhaditya; McGaughey, Alan J. H.; Malen, Jonathan A.; Higgs, C. Fred

    2016-04-01

    This work formulates a rough surface contact model that accounts for adhesion through a Morse potential and plasticity through the Kogut-Etsion finite element-based approximation. Compared to the commonly used Lennard-Jones (LJ) potential, the Morse potential provides a more accurate and generalized description for modeling covalent materials and surface interactions. An extension of this contact model to describe composite layered surfaces is presented and implemented to study a self-assembled monolayer (SAM) grown on a gold substrate placed in contact with a second gold substrate. Based on a comparison with prior experimental measurements of the thermal conductance of this SAM junction [Majumdar et al., Nano Lett. 15, 2985-2991 (2015)], the more general Morse potential-based contact model provides a better prediction of the percentage contact area than an equivalent LJ potential-based model.
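
    For reference, the Morse potential underlying the contact model is U(r) = D(1 - e^(-a(r - r0)))^2 - D, with well depth D, stiffness a and equilibrium separation r0. A minimal sketch with illustrative parameters (not the paper's fitted values):

```python
import math

def morse(r, D, a, r0):
    """Morse potential energy, shifted so U(r0) = -D and U(infinity) = 0."""
    x = math.exp(-a * (r - r0))
    return D * (1 - x) ** 2 - D

# Illustrative parameters, not fitted to any SAM junction data.
D, a, r0 = 0.5, 2.0, 1.2
print(morse(r0, D, a, r0))   # potential minimum at the equilibrium separation
```

    Unlike the fixed 12-6 shape of the Lennard-Jones form, the extra parameter a lets the well width be tuned independently of the depth, which is what makes the Morse form more flexible for covalent surface interactions.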

  5. Determining iron oxide nanoparticle heating efficiency and elucidating local nanoparticle temperature for application in agarose gel-based tumor model.

    Science.gov (United States)

    Shah, Rhythm R; Dombrowsky, Alexander R; Paulson, Abigail L; Johnson, Margaret P; Nikles, David E; Brazel, Christopher S

    2016-11-01

    Magnetic iron oxide nanoparticles (MNPs) have been developed for magnetic fluid hyperthermia (MFH) cancer therapy, where cancer cells are treated through the heat generated by application of a high frequency magnetic field. This heat has also been proposed as a mechanism to trigger release of chemotherapy agents. In each of these cases, MNPs with optimal heating performance can be used to maximize therapeutic effect while minimizing the required dosage of MNPs. In this study, the heating efficiencies (or specific absorption rate, SAR) of two types of MNPs were evaluated experimentally and then predicted from their magnetic properties. MNPs were also incorporated in the core of poly(ethylene glycol-b-caprolactone) micelles, co-localized with rhodamine B fluorescent dye attached to polycaprolactone to monitor local, nanoscale temperatures during magnetic heating. Despite a relatively high SAR produced by these MNPs, no significant temperature rise beyond that observed in the bulk solution was measured by fluorescence in the core of the magnetic micelles. MNPs were also incorporated into a macro-scale agarose gel system that mimicked a tumor targeted by MNPs and surrounded by healthy tissues. The agarose-based tumor models showed that targeted MNPs can reach hyperthermia temperatures inside a tumor with a sufficient MNP concentration, while causing minimal temperature rise in the healthy tissue surrounding the tumor. PMID:27523991
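
    The specific absorption rate mentioned above is commonly estimated from the initial slope of the heating curve as SAR = C (dT/dt) / m_Fe, with C the sample heat capacity and m_Fe the iron mass. A minimal sketch with illustrative (not measured) data:

```python
import numpy as np

# Illustrative heating curve of an MNP suspension under an AC field (not measured data).
t = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0])    # time, s
T = np.array([25.0, 25.9, 26.8, 27.7, 28.5, 29.3])   # temperature, deg C
C = 4.18        # J/(g K), approximate heat capacity of the aqueous sample
m_sample = 1.0  # g of suspension
c_Fe = 0.005    # g of Fe per g of suspension

dT_dt = np.polyfit(t, T, 1)[0]                  # K/s, initial linear slope
sar = C * m_sample * dT_dt / (m_sample * c_Fe)  # W per g of Fe
print(f"SAR = {sar:.1f} W/g Fe")
```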

  6. Concrete fracture models and applications

    CERN Document Server

    Kumar, Shailendra

    2011-01-01

    Concrete-Fracture Models and Applications provides a basic introduction to nonlinear concrete fracture models. Readers will find a state-of-the-art review on various aspects of the material behavior and development of different concrete fracture models.

  7. Modelling and simulation of passive Lab-on-a-Chip (LoC) based micromixer for clinical application

    Science.gov (United States)

    Saikat, Chakraborty; Sharath, M.; Srujana, M.; Narayan, K.; Pattnaik, Prasant Kumar

    2016-03-01

    In biomedical applications, the micromixer is an important component because many processes require rapid and efficient mixing. At the micro scale, the flow is laminar due to the small channel size, which enables controlled rapid mixing. A reduction in analysis time along with high throughput can be achieved with the help of rapid mixing. In LoC applications, micromixers are used for mixing fluids, especially in devices that require efficient mixing. A micromixer of this type, providing rapid mixing, is useful in applications such as DNA/RNA synthesis, drug delivery systems and biological agent detection. In this work, we design and simulate a microfluidic-based passive rapid micromixer for lab-on-a-chip applications.

  8. Patterns of Use of an Agent-Based Model and a System Dynamics Model: The Application of Patterns of Use and the Impacts on Learning Outcomes

    Science.gov (United States)

    Thompson, Kate; Reimann, Peter

    2010-01-01

    A classification system that was developed for the use of agent-based models was applied to strategies used by school-aged students to interrogate an agent-based model and a system dynamics model. These were compared, and relationships between learning outcomes and the strategies used were also analysed. It was found that the classification system…

  9. Mathematical modeling with multidisciplinary applications

    CERN Document Server

    Yang, Xin-She

    2013-01-01

    Features mathematical modeling techniques and real-world processes with applications in diverse fields Mathematical Modeling with Multidisciplinary Applications details the interdisciplinary nature of mathematical modeling and numerical algorithms. The book combines a variety of applications from diverse fields to illustrate how the methods can be used to model physical processes, design new products, find solutions to challenging problems, and increase competitiveness in international markets. Written by leading scholars and international experts in the field, the

  10. The effects of composition on glass dissolution rates: The application of four models to a data base

    International Nuclear Information System (INIS)

    Four models have been applied to a data base to relate glass dissolution in distilled water to composition. The data base is used to compare the precisions obtained from the models in fitting actual data. The usefulness of the data base in formulating a model is also demonstrated. Two related models in which the composite or pH-adjusted free energy of hydration of the glass is the correlating parameter are compared with experimental data. In a structural model, the nonbridging oxygen content of the glasses is used to correlate glass dissolution rate to composition. In a model formulated for this report, the cation valence and the oxygen content of the glass are compared with observed dissolution rates. The models were applied to the 28-day normalized silica release at 90 °C for over 285 glass compositions with surface area to volume ratios of 10 m⁻¹ (Materials Characterization Center MCC-1 glass durability test using distilled water). These glasses included the nonradioactive analogs of WV205 and SRL-165, as well as SRL-131, PNL 76-68, and a European glass, UK209. Predicted glass dissolution rates show similar fits to the data for all four models. The predictions of the models were also plotted for two subsets of the glasses: waste glasses and Savannah River Laboratory glasses. The model predictions fit the data for these groups much better than they fit the data for the entire set of glasses. 14 refs., 12 figs., 7 tabs

  11. A Robust Design Applicability Model

    DEFF Research Database (Denmark)

    Ebro, Martin; Krogstie, Lars; Howard, Thomas J.

    2015-01-01

    This paper introduces a model for assessing the applicability of Robust Design (RD) in a project or organisation. The intention of the Robust Design Applicability Model (RDAM) is to provide support for decisions by engineering management considering the relevant level of RD activities. The applic...

  12. Empirical Differences in Omission Tendency and Reading Ability in PISA: An Application of Tree-Based Item Response Models

    Science.gov (United States)

    Okumura, Taichi

    2014-01-01

    This study examined the empirical differences between the tendency to omit items and reading ability by applying tree-based item response (IRTree) models to the Japanese data of the Programme for International Student Assessment (PISA) held in 2009. For this purpose, existing IRTree models were expanded to contain predictors and to handle…

  13. Application of thermodynamics-based rate-dependent constitutive models of concrete in the seismic analysis of concrete dams

    Institute of Scientific and Technical Information of China (English)

    Leng Fei; Lin Gao

    2008-01-01

    This paper discusses the seismic analysis of concrete dams with consideration of material nonlinearity. Based on a consistent rate-dependent model and two thermodynamics-based models, two thermodynamics-based rate-dependent constitutive models were developed with consideration of the influence of the strain rate. They can describe the dynamic behavior of concrete and be applied to nonlinear seismic analysis of concrete dams taking into account the rate sensitivity of concrete. With the two models, a nonlinear analysis of the seismic response of the Koyna Gravity Dam and the Dagangshan Arch Dam was conducted. The results were compared with those of a linear elastic model and two rate-independent thermodynamics-based constitutive models, and the influences of constitutive models and strain rate on the seismic response of concrete dams were discussed. It can be concluded from the analysis that, during seismic response, the tensile stress is the control stress in the design and seismic safety evaluation of concrete dams. In different models, the plastic strain and plastic strain rate of concrete dams show a similar distribution. When the influence of the strain rate is considered, the maximum plastic strain and plastic strain rate decrease.

  14. Application of thermodynamics-based rate-dependent constitutive models of concrete in the seismic analysis of concrete dams

    Directory of Open Access Journals (Sweden)

    Fei LENG

    2008-09-01

    Full Text Available This paper discusses the seismic analysis of concrete dams with consideration of material nonlinearity. Based on a consistent rate-dependent model and two thermodynamics-based models, two thermodynamics-based rate-dependent constitutive models were developed with consideration of the influence of the strain rate. They can describe the dynamic behavior of concrete and be applied to nonlinear seismic analysis of concrete dams taking into account the rate sensitivity of concrete. With the two models, a nonlinear analysis of the seismic response of the Koyna Gravity Dam and the Dagangshan Arch Dam was conducted. The results were compared with those of a linear elastic model and two rate-independent thermodynamics-based constitutive models, and the influences of constitutive models and strain rate on the seismic response of concrete dams were discussed. It can be concluded from the analysis that, during seismic response, the tensile stress is the control stress in the design and seismic safety evaluation of concrete dams. In different models, the plastic strain and plastic strain rate of concrete dams show a similar distribution. When the influence of the strain rate is considered, the maximum plastic strain and plastic strain rate decrease.

  15. Wind-Climate Estimation Based on Mesoscale and Microscale Modeling: Statistical-Dynamical Downscaling for Wind Energy Applications

    DEFF Research Database (Denmark)

    Badger, Jake; Frank, Helmut; Hahmann, Andrea N.;

    2014-01-01

    turbine site. The method is divided into two parts: 1) preprocessing, in which the configurations for the mesoscale model simulations are determined, and 2) postprocessing, in which the data from the mesoscale simulations are prepared for wind energy application. Results from idealized mesoscale modeling...... experiments for a challenging wind farm site in northern Spain are presented to support the preprocessing method. Comparisons of modeling results with measurements from the same wind farm site are presented to support the postprocessing method. The crucial element in postprocessing is the bridging...... of mesoscale modeling data to microscale modeling input data, via a so-called generalization method. With this method, very high-resolution wind resource mapping can be achieved....

  16. Model Based Definition

    Science.gov (United States)

    Rowe, Sidney E.

    2010-01-01

    In September 2007, the Engineering Directorate at the Marshall Space Flight Center (MSFC) created the Design System Focus Team (DSFT). MSFC was responsible for the in-house design and development of the Ares 1 Upper Stage and the Engineering Directorate was preparing to deploy a new electronic Configuration Management and Data Management System with the Design Data Management System (DDMS) based upon a Commercial Off The Shelf (COTS) Product Data Management (PDM) System. The DSFT was to establish standardized CAD practices and a new data life cycle for design data. Of special interest here, the design teams were to implement Model Based Definition (MBD) in support of the Upper Stage manufacturing contract. It is noted that this MBD does use partially dimensioned drawings for auxiliary information to the model. The design data lifecycle implemented several new release states to be used prior to formal release that allowed the models to move through a flow of progressive maturity. The DSFT identified some 17 Lessons Learned as outcomes of the standards development, pathfinder deployments and initial application to the Upper Stage design completion. Some of the high value examples are reviewed.

  17. A Unified Architecture Model of Web Applications

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    With the increasing popularity, scale and complexity of web applications, their design and development are becoming more and more difficult. However, the current state of their design and development is characterized by anarchy and ad hoc methodologies. One of the causes of this chaotic situation is that different researchers and designers have different understandings of web applications. In this paper, based on an explicit understanding of web applications, we present a unified architecture model of web applications, the four-view model, which addresses the analysis and design issues of web applications from four perspectives, namely, logical view, data view, navigation view and presentation view, each addressing a specific set of concerns of web applications. The purpose of the model is to provide a clear picture of web applications to alleviate the chaotic situation and facilitate their analysis, design and implementation.

  18. A Model of Application System for Man-Machine-Environment System Engineering in Vessels Based on IDEF0

    Institute of Scientific and Technical Information of China (English)

    Zhen Shang; Changhua Qiu; Shifan Zhu

    2011-01-01

    Applying man-machine-environment system engineering (MMESE) in vessels is a method to improve the effectiveness of the interaction between equipment, environment, and humans for the purpose of advancing the operating efficiency, performance, safety, and habitability of a vessel and its subsystems. In the following research, the life cycle of vessels was divided into 9 phases, and 15 research subjects were identified from among these phases. The 15 subjects were systemized, and then the man-machine-environment engineering system application model for vessels was developed using the ICAM definition method 0 (IDEF0), which is a systematic modeling method. This system model bridges the gap between the data and information flow of every two associated subjects, with the major basic research methods and approaches included, which brings the formerly relatively independent subjects together as a whole. The application of this systematic model should facilitate the application of man-machine-environment system engineering in vessels, especially at the conceptual and embodiment design phases. Managers and designers can deal with detailed tasks quickly and efficiently while reducing repetitive work.

  19. Roadway management plan based on rockfall modelling calibration and validation. Application along the Ma-10 road in Mallorca (Spain)

    Science.gov (United States)

    Mateos, Rosa Maria; Garcia, Inmaculada; Reichenbach, Paola; Herrera, Gerardo; Sarro, Roberto; Rius, Joan; Aguilo, Raul

    2016-04-01

    The Tramuntana range, in the northwestern sector of the island of Mallorca (Spain), is frequently affected by rockfalls which have caused significant damage, mainly along the road network. The Ma-10 road constitutes the main transportation corridor on the range, with heavy traffic estimated at 7,200 vehicles per day on average. With a length of 111 km and a tortuous path, the road is the connecting track for 12 municipalities and constitutes a strategic road on the island for many tourist resorts. For the period spanning from 1995 to the present, 63 rockfalls have affected the Ma-10 road, with volumes ranging from 0.3 m³ to 30,000 m³. Fortunately, no fatalities occurred, but numerous blockages on the road took place, causing significant economic losses valued at around 11 million euros (Mateos et al., 2013). In this work we present the procedure we have applied to calibrate and validate rockfall modelling in the Tramuntana region, using 103 cases from the available detailed rockfall inventory (Mateos, 2006). We have exploited STONE (Guzzetti et al., 2002), a GIS-based rockfall simulation software which computes 2D and 3D rockfall trajectories starting from a DTM and maps of the dynamic rolling friction coefficient and of the normal and tangential energy restitution coefficients. The appropriate identification of these parameters determines the accuracy of the simulation. To calibrate them, we selected 40 rockfalls along the range which include a wide variety of outcropping lithologies. Coefficient values were changed in numerous attempts in order to select those where the extent and shape of the simulation matched the field mapping. Best results were summarized with the average statistical values for each parameter and for each geotechnical unit, determining that mode values represent the data more precisely. Initially, for the validation stage, 10 well-known rockfalls exploited in the calibration phase were selected. Confidence tests have been applied.
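
    The calibration loop described above, varying friction and restitution coefficients until the simulated runout matches the mapped extent, can be sketched as a simple grid search. Here `simulate_runout` is a toy stand-in for a STONE-like trajectory model, and all numeric values are hypothetical:

```python
import itertools

def simulate_runout(friction, restitution, slope_length=100.0):
    """Toy surrogate for a rockfall trajectory model: lower rolling friction
    and higher energy restitution produce a longer runout distance."""
    return slope_length * restitution / (friction + restitution)

def calibrate(mapped_runout, frictions, restitutions):
    """Grid search: keep the coefficient pair whose simulated runout best
    matches the field-mapped runout distance."""
    best, best_err = None, float("inf")
    for f, r in itertools.product(frictions, restitutions):
        err = abs(simulate_runout(f, r) - mapped_runout)
        if err < best_err:
            best, best_err = (f, r), err
    return best, best_err
```

    In practice the comparison is spatial (overlap of simulated and mapped extents per geotechnical unit) rather than a single scalar distance, but the search structure is the same.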

  20. A Habitat-based Wind-Wildlife Collision Model with Application to the Upper Great Plains Region

    Energy Technology Data Exchange (ETDEWEB)

    Forcey, Greg, M.

    2012-08-28

    compared among species, our model outputs provide a convenient landscape-level tool to quickly screen for siting issues at a high level. The model resolution is suitable for state or multi-county siting, but users are cautioned against using these models for micrositing. The U.S. Fish and Wildlife Service recently released voluntary land-based wind energy guidelines for assessing the impacts of a wind facility on wildlife using a tiered approach: an iterative process that assesses impacts in levels of increasing detail, from landscape-level screening to site-specific field studies. The models presented in this paper are applicable as screening tools at the tier 1 level and are not appropriate for smaller scale tier 2 and tier 3 studies. For smaller scale screening, ancillary field studies should be conducted at the site-specific level to validate collision predictions.

  1. Investigation of the applicability of a functional programming model to fault-tolerant parallel processing for knowledge-based systems

    Science.gov (United States)

    Harper, Richard

    1989-01-01

    In a fault-tolerant parallel computer, a functional programming model can facilitate distributed checkpointing, error recovery, load balancing, and graceful degradation. Such a model has been implemented on the Draper Fault-Tolerant Parallel Processor (FTPP). When used in conjunction with the FTPP's fault detection and masking capabilities, this implementation results in a graceful degradation of system performance after faults. Three graceful degradation algorithms have been implemented and are presented. A user interface has been implemented which requires minimal cognitive overhead by the application programmer, masking such complexities as the system's redundancy, distributed nature, variable complement of processing resources, load balancing, fault occurrence and recovery. This user interface is described and its use demonstrated. The applicability of the functional programming style to the Activation Framework, a paradigm for intelligent systems, is then briefly described.

  2. A Codon-Based Model of Host-Specific Selection in Parasites, with an Application to the Influenza A Virus

    DEFF Research Database (Denmark)

    Forsberg, Ronald; Christiansen, Freddy Bugge

    2003-01-01

    involved in host-specific adaptation. We discuss the applicability of the model to the more general problem of ascertaining whether the selective regime differs between two groups of related organisms. The utility of the model is illustrated on a dataset of nucleoprotein sequences from the influenza A virus......Parasites sometimes expand their host range by acquiring a new host species. Following a host change event, the selective regime acting on a given parasite gene may change due to host-specific adaptive alterations of protein functionality or host-specific immune-mediated selection. We present...... obtained from avian and human hosts....

  3. From situation modelling to a distributed rule-based platform for situation awareness: an ontological framework for disaster management applications

    OpenAIRE

    Moreira, João Luiz Rebelo

    2015-01-01

    Situation-aware (SA) applications are particularly useful for disaster management. The complex nature of emergency scenarios presents challenges to the development of collaborative and distributed SA solutions. These challenges concern the whole lifecycle, from specification to implementation phases, such as how to model the reaction behavior of a detected situation and how to provide an interoperable situation notification service. In addition, treating unforeseen situations, i.e. situations...

  4. Efficient GIS-based model-driven method for flood risk management and its application in central China

    Science.gov (United States)

    Liu, Y.; Zhou, J.; Song, L.; Zou, Q.; Guo, J.; Wang, Y.

    2014-02-01

    In recent years, an important development in flood management has been the focal shift from flood protection towards flood risk management. This change greatly promoted the progress of flood control research in a multidisciplinary way. Moreover, given the growing complexity and uncertainty in many decision situations of flood risk management, traditional methods, e.g., tight-coupling integration of one or more quantitative models, are not enough to provide decision support for managers. Within this context, this paper presents a beneficial methodological framework to enhance the effectiveness of decision support systems, through the dynamic adaptation of support regarding the needs of the decision-maker. In addition, we illustrate a loose-coupling technical prototype for integrating heterogeneous elements, such as multi-source data, multidisciplinary models, GIS tools and existing systems. The main innovation is the application of model-driven concepts, which put the system in a state of continuous iterative optimization. We define the new system as a model-driven decision support system (MDSS). Two characteristics that differentiate the MDSS are as follows: (1) it is made accessible to non-technical specialists; and (2) it has a higher level of adaptability and compatibility. Furthermore, the MDSS was employed to manage the flood risk in the Jingjiang flood diversion area, located in central China near the Yangtze River. Compared with traditional solutions, we believe that this model-driven method is efficient, adaptable and flexible, and thus has bright prospects of application for comprehensive flood risk management.

  5. APPLICATION OF TWO VERSIONS OF A RNG BASED k-ε MODEL TO NUMERICAL SIMULATIONS OF TURBULENT IMPINGING JET FLOW

    Institute of Scientific and Technical Information of China (English)

    Chen Qing-guang; Xu Zhong; Zhang Yong-jian

    2003-01-01

    Two independent versions of the RNG based k-ε turbulence model, in conjunction with the law of the wall, have been applied to the numerical simulation of an axisymmetric turbulent impinging jet flow field. The two model predictions are compared with those of the standard k-ε model and with the experimental data measured by LDV (Laser Doppler Velocimetry). The results show that the original version of the RNG k-ε model with the choice of Cε1=1.063 cannot yield good results; among them, the predicted turbulent kinetic energy profiles in the vicinity of the stagnation region are even worse than those predicted by the standard k-ε model. However, the new version of the RNG k-ε model behaves well. This is mainly due to the corrections to the constants Cε1 and Cε2, along with a modification of the production term to account for non-equilibrium strain rates in the flow.
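
    For readers unfamiliar with the correction mentioned above, one common way to write the RNG strain-rate modification is as a strain-dependent effective Cε1. The constants η0 = 4.38 and β = 0.012 are the usual published RNG values; treat this as an illustrative sketch of the general form, not the exact formulation compared in the paper:

```python
def c_eps1_rng(eta, c_eps1=1.42, eta0=4.38, beta=0.012):
    """Effective C_eps1 in RNG-type k-epsilon models, where eta = S*k/eps is
    the ratio of turbulent to mean-strain time scales.

    At equilibrium (eta == eta0) the correction vanishes; for weak strain
    (eta < eta0) the effective C_eps1 is reduced, weakening the production
    of dissipation relative to the standard model."""
    return c_eps1 - eta * (1.0 - eta / eta0) / (1.0 + beta * eta**3)
```

    This strain dependence is what distinguishes RNG-type models near stagnation regions, where η departs strongly from its equilibrium value.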

  6. Performance-degradation model for Li4Ti5O12-based battery cells used in wind power applications

    DEFF Research Database (Denmark)

    Stroe, Daniel Ioan; Swierczynski, Maciej Jozef; Stan, Ana-Irina;

    2012-01-01

    Energy storage systems based on Lithium-ion batteries have the potential to mitigate the negative impact of wind power grid integration on the power system stability, which is caused by the characteristics of the wind. This paper presents a performance model for a Li4Ti5O12/LiMO2 battery cell....... For developing the performance model an EIS-based electrical modelling approach was followed. The obtained model is able to predict with high accuracy charge and discharge voltage profiles for different ages of the battery cell and for different charging/discharging current rates. Moreover, the ageing behaviour...... of the battery cell was analysed for the case of accelerated cycling ageing with a certain mission profile....

  7. Parametric estimation of covariance function in Gaussian-process based Kriging models. Application to uncertainty quantification for computer experiments

    International Nuclear Information System (INIS)

    The parametric estimation of the covariance function of a Gaussian process is studied, in the framework of the Kriging model. Maximum Likelihood and Cross Validation estimators are considered. The correctly specified case, in which the covariance function of the Gaussian process does belong to the parametric set used for estimation, is first studied in an increasing-domain asymptotic framework. The sampling considered is a randomly perturbed multidimensional regular grid. Consistency and asymptotic normality are proved for the two estimators. It is then shown that strong perturbations of the regular grid are always beneficial to Maximum Likelihood estimation. The incorrectly specified case, in which the covariance function of the Gaussian process does not belong to the parametric set used for estimation, is then studied. It is shown that Cross Validation is more robust than Maximum Likelihood in this case. Finally, two applications of the Kriging model with Gaussian processes are carried out on industrial data. For a validation problem of the friction model of the thermal-hydraulic code FLICA 4, where experimental results are available, it is shown that Gaussian process modeling of the FLICA 4 code model error considerably improves its predictions. For a metamodeling problem of the GERMINAL thermal-mechanical code, the interest of the Kriging model with Gaussian processes, compared to neural network methods, is shown. (author)
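
    A minimal numerical sketch of the Maximum Likelihood estimation discussed above: fit the correlation length of an exponential covariance on a synthetic 1D sample by minimizing the negative log marginal likelihood over a grid. All data here are synthetic; this is not the thesis code nor the FLICA 4 / GERMINAL applications:

```python
import numpy as np

# Synthetic 1D design and a sample drawn from a GP with a known
# exponential covariance (true correlation length = 2.0).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 30)
K_true = np.exp(-np.abs(x[:, None] - x[None, :]) / 2.0)
y = rng.multivariate_normal(np.zeros(len(x)), K_true)

def neg_log_likelihood(length):
    """Negative log marginal likelihood of the sample y under an
    exponential covariance with the given correlation length."""
    K = np.exp(-np.abs(x[:, None] - x[None, :]) / length) + 1e-8 * np.eye(len(x))
    _, logdet = np.linalg.slogdet(K)
    alpha = np.linalg.solve(K, y)
    return 0.5 * (logdet + y @ alpha + len(x) * np.log(2 * np.pi))

# Maximum Likelihood by grid search over candidate correlation lengths.
lengths = np.linspace(0.5, 5.0, 46)
ml_length = lengths[np.argmin([neg_log_likelihood(l) for l in lengths])]
```

    In practice a gradient-based optimizer replaces the grid search, and Cross Validation swaps the likelihood objective for a leave-one-out prediction criterion.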

  8. Structural equation modeling methods and applications

    CERN Document Server

    Wang, Jichuan

    2012-01-01

    A reference guide for applications of SEM using Mplus Structural Equation Modeling: Applications Using Mplus is intended as both a teaching resource and a reference guide. Written in non-mathematical terms, this book focuses on the conceptual and practical aspects of Structural Equation Modeling (SEM). Basic concepts and examples of various SEM models are demonstrated along with recently developed advanced methods, such as mixture modeling and model-based power analysis and sample size estimate for SEM. The statistical modeling program, Mplus, is also featured and provides researchers with a

  9. Study and Application of Safety Risk Evaluation Model for CO2 Geological Storage Based on Uncertainty Measure Theory

    OpenAIRE

    Hujun He; Yaning Zhao; Xingke Yang; Yaning Gao; Xu Wu

    2015-01-01

    Analysis showed that safety risk evaluation for CO2 geological storage is of great significance. Aimed at the characteristics of CO2 geological storage safety risk evaluation, and drawing on previous research results, rank and order models for safety risk evaluation of CO2 geological storage were put forward based on information entropy and uncertainty measure theory. In this model, the uncertainty problems in safety risk evaluation of CO2 geological storage were solved by qualitative anal...

  10. Modelling Foundations and Applications

    DEFF Research Database (Denmark)

    and selected from 81 submissions. Papers on all aspects of MDE were received, including topics such as architectural modelling and product lines, code generation, domain-specific modeling, metamodeling, model analysis and verification, model management, model transformation and simulation. The breadth of topics...

  11. Application of Molecular Interaction Volume Model for Phase Equilibrium of Sn-Based Binary System in Vacuum Distillation

    Science.gov (United States)

    Kong, Lingxin; Yang, Bin; Xu, Baoqiang; Li, Yifu

    2014-09-01

    Based on the molecular interaction volume model (MIVM), the activities of components of Sn-Sb, Sb-Bi, Sn-Zn, Sn-Cu, and Sn-Ag alloys were predicted. The predicted values are in good agreement with the experimental data, which indicate that the MIVM is of better stability and reliability due to its good physical basis. A significant advantage of the MIVM lies in its ability to predict the thermodynamic properties of liquid alloys using only two parameters. The phase equilibria of Sn-Sb and Sn-Bi alloys were calculated based on the properties of pure components and the activity coefficients, which indicates that Sn-Sb and Sn-Bi alloys can be separated thoroughly by vacuum distillation. This study extends previous investigations and provides an effective and convenient model on which to base refining simulations for Sn-based alloys.
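
    As a hedged illustration of how predicted activity coefficients feed a vacuum-distillation feasibility check, the separation coefficient below is a standard screening quantity in vacuum metallurgy; the function and the numeric inputs are illustrative, not values from the paper:

```python
def separation_coefficient(gamma_a, p_sat_a, gamma_b, p_sat_b):
    """Vapor-liquid separation coefficient beta = (gamma_A * p_A) / (gamma_B * p_B).

    gamma_i are activity coefficients (e.g., MIVM-predicted) and p_i the
    pure-component saturated vapor pressures at the distillation temperature.
    A beta far from 1 indicates the components can be separated by vacuum
    distillation: the high-beta component enriches in the vapor phase."""
    return (gamma_a * p_sat_a) / (gamma_b * p_sat_b)
```

    A two-parameter activity model such as the MIVM supplies the gamma values across composition, which is what makes the screening cheap relative to full experimental phase-diagram work.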

  12. Hyaluronic acid algorithm-based models for assessment of liver fibrosis: translation from basic science to clinical application

    Institute of Scientific and Technical Information of China (English)

    Zeinab Babaei; Hadi Parsian

    2016-01-01

    BACKGROUND: The estimation of liver fibrosis is usually dependent on liver biopsy evaluation. Because of its disadvantages and side effects, researchers try to find non-invasive methods for the assessment of liver injuries. Hyaluronic acid has been proposed as an index for scoring the severity of fibrosis, alone or in algorithm models. The algorithm model in which hyaluronic acid was used as a major constituent was more reliable and accurate in diagnosis than hyaluronic acid alone. This review described various hyaluronic acid algorithm-based models for assessing liver fibrosis. DATA SOURCE: A PubMed database search was performed to identify the articles relevant to hyaluronic acid algorithm-based models for estimating liver fibrosis. RESULT: The use of hyaluronic acid in an algorithm model is an extra and valuable tool for assessing liver fibrosis. CONCLUSIONS: Although hyaluronic acid algorithm-based models have good diagnostic power in liver fibrosis assessment, they cannot render the need for liver biopsy obsolete, and it is better to use them in parallel with liver biopsy. They can be used when frequent liver biopsy is not possible in situations such as highlighting the efficacy of treatment protocols for liver fibrosis.

  13. In vitro permeation models for healthy and compromised skin: The Phospholipid Vesicle-based Permeation Assay (PVPA) for skin applications

    OpenAIRE

    Engesland, André

    2015-01-01

    In vitro models with the ability to estimate drug penetration through healthy and compromised skin may reduce animal testing of drugs and cosmetics to a minimum. The phospholipid vesicle based permeation assay (PVPA) is based on a tight barrier composed of liposomes mimicking cells. It was originally made to mimic the intestinal epithelial barrier and in this project further developed to mimic the stratum corneum barrier of the skin. The lipid composition was changed to better mimic the lipid...

  14. Simulation-Based Estimation of the Structural Errors-in-Variables Negative Binomial Regression Model with an Application

    OpenAIRE

    Jie Q. Guo; Tong Li

    2001-01-01

    This paper studies the effects and estimation of the errors-in-variables negative binomial regression model. We prove that in the presence of measurement errors, in general, the maximum likelihood estimator of the overdispersion using the observed data is biased upward. We adopt a structural approach, assuming that the distribution of the latent variables is known, and propose a simulation-based corrected maximum likelihood estimator and a simulation-based corrected score estimator to estimate the erro...

  15. Three-parameter-based streamflow elasticity model: application to MOPEX basins in the USA at annual and seasonal scales

    Science.gov (United States)

    Konapala, Goutam; Mishra, Ashok K.

    2016-07-01

    We present a three-parameter streamflow elasticity model as a function of precipitation, potential evaporation, and change in groundwater storage applicable at both seasonal and annual scales. The model was applied to 245 Model Parameter Estimation Experiment (MOPEX) basins spread across the continental USA. The analysis of the modified equation at annual and seasonal scales indicated that the groundwater and surface water storage change contributes significantly to the streamflow elasticity. Overall, in case of annual as well as seasonal water balances, precipitation has higher elasticity values when compared to both potential evapotranspiration and storage changes. The streamflow elasticities show significant nonlinear associations with the climate conditions of the catchments indicating a complex interplay between elasticities and climate variables with substantial seasonal variations.
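
    The three-parameter elasticity model described above can be sketched as a linear regression of relative streamflow anomalies on the three relative forcing anomalies. The series below are synthetic, generated with known elasticities; they are not MOPEX data:

```python
import numpy as np

# Synthetic relative anomalies (dX/X) for a toy catchment, generated with
# known elasticities: eps_P = 2.0, eps_E = -0.8, eps_S = -0.5.
rng = np.random.default_rng(1)
n = 40
dP = rng.normal(0, 0.1, n)   # relative precipitation anomaly
dE = rng.normal(0, 0.1, n)   # relative potential-evaporation anomaly
dS = rng.normal(0, 0.1, n)   # relative storage-change anomaly
dQ = 2.0 * dP - 0.8 * dE - 0.5 * dS + rng.normal(0, 0.01, n)

# Least-squares fit of dQ/Q = eps_P*(dP/P) + eps_E*(dE/E) + eps_S*(dS/S).
X = np.column_stack([dP, dE, dS])
eps, *_ = np.linalg.lstsq(X, dQ, rcond=None)
eps_P, eps_E, eps_S = eps
```

    The recovered coefficients are the streamflow elasticities: a positive eps_P means streamflow amplifies precipitation anomalies, while negative eps_E and eps_S capture the damping by evaporative demand and storage release.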

  16. An Empirical Polarizable Force Field Based on the Classical Drude Oscillator Model: Development History and Recent Applications.

    Science.gov (United States)

    Lemkul, Justin A; Huang, Jing; Roux, Benoît; MacKerell, Alexander D

    2016-05-11

    Molecular mechanics force fields that explicitly account for induced polarization represent the next generation of physical models for molecular dynamics simulations. Several methods exist for modeling induced polarization, and here we review the classical Drude oscillator model, in which electronic degrees of freedom are modeled by charged particles attached to the nuclei of their core atoms by harmonic springs. We describe the latest developments in Drude force field parametrization and application, primarily in the last 15 years. Emphasis is placed on the Drude-2013 polarizable force field for proteins, DNA, lipids, and carbohydrates. We discuss its parametrization protocol, development history, and recent simulations of biologically interesting systems, highlighting specific studies in which induced polarization plays a critical role in reproducing experimental observables and understanding physical behavior. As the Drude oscillator model is computationally tractable and available in a wide range of simulation packages, it is anticipated that use of these more complex physical models will lead to new and important discoveries of the physical forces driving a range of chemical and biological phenomena.
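
    The core relation of the classical Drude oscillator model, a charge on a harmonic spring yielding polarizability α = q²/k, can be verified in a few lines. This is an illustrative sketch with arbitrary units and values, not Drude-2013 parameters:

```python
def drude_polarizability(q_drude, k_spring):
    """Isotropic atomic polarizability implied by a Drude particle of charge
    q_drude attached to its core atom by a harmonic spring of force
    constant k_spring: alpha = q^2 / k."""
    return q_drude**2 / k_spring

def induced_dipole(q_drude, k_spring, field):
    """Induced dipole in a uniform field. At equilibrium the spring force
    balances the electric force (k*d = q*E), so the Drude displacement is
    d = q*E/k and the dipole is mu = q*d = (q^2/k) * E = alpha * E."""
    return drude_polarizability(q_drude, k_spring) * field
```

    Parametrization then amounts to choosing q and k per atom type so that the resulting α reproduces quantum-mechanical or experimental molecular polarizabilities.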

  17. Generalizability and Applicability of Model-Based Business Process Compliance-Checking Approaches – A State-of-the-Art Analysis and Research Roadmap

    Directory of Open Access Journals (Sweden)

    Jörg Becker

    2012-11-01

    Full Text Available With a steady increase of regulatory requirements for business processes, automation support of compliance management is a field garnering increasing attention in Information Systems research. Several approaches have been developed to support compliance checking of process models. One major challenge for such approaches is their ability to handle different modeling techniques and compliance rules in order to enable widespread adoption and application. Applying a structured literature search strategy, we reflect and discuss compliance-checking approaches in order to provide an insight into their generalizability and evaluation. The results imply that current approaches mainly focus on special modeling techniques and/or a restricted set of types of compliance rules. Most approaches abstain from real-world evaluation which raises the question of their practical applicability. Referring to the search results, we propose a roadmap for further research in model-based business process compliance checking.

  18. RIGID-PLASTIC/RIGID-VISCOPLASTIC FEM BASED ON LINEAR PROGRAMMING—THEORETICAL MODELING AND APPLICATION FOR AXISYMMETRICAL PROBLEMS

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Compared with the traditional rigid-plastic/rigid-viscoplastic (RP/RVP) FEM (based on iteration solutions), RP/RVP FEM based on linear programming (LP) has some remarkable advantages: it is free of convergence problems, and it is convenient in the treatment of contact, rigid zones, and friction forces. The numerical model of RP/RVP FEM based on LP for axisymmetric metal forming simulation is studied, and some related key factors and their treatment methods in the formulation of constraint conditions are proposed. Some solution examples are provided to validate its accuracy and efficiency.

  19. Modelling of Impulsional pH Variations Using ChemFET-Based Microdevices: Application to Hydrogen Peroxide Detection

    Directory of Open Access Journals (Sweden)

    Abdou Karim Diallo

    2014-02-01

    Full Text Available This work presents the modelling of impulsional pH variations in microvolume related to water-based electrolysis and hydrogen peroxide electrochemical oxidation using an Electrochemical Field Effect Transistor (ElecFET) microdevice. This ElecFET device consists of a pH-Chemical FET (pH-ChemFET) with an integrated microelectrode around the dielectric gate area in order to trigger electrochemical reactions. Combining oxidation/reduction reactions on the microelectrode, water self-ionization and diffusion properties of associated chemical species, the model shows that the sensor response depends on the main influential parameters such as: (i) the polarization parameters on the microelectrode, i.e., voltage (Vp) and time (tp); (ii) the distance between the gate sensitive area and the microelectrode (d); and (iii) the hydrogen peroxide concentration ([H2O2]). The model developed can predict the ElecFET response behaviour and creates new opportunities for H2O2-based enzymatic detection of biomolecules.

  20. CORBA Based CIMS Application Integration

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    Common object request broker architecture (CORBA) provides the framework and the mechanism for distributed object operation. It can also be applied to computer integrated manufacturing system (CIMS) application integration. This paper studies the CIMS information service requirement, presents a CORBA based integration approach including the CORBA based CIM information system architecture and the application integration mechanism, and discusses the relationship between CORBA and the CIM application integration platform.

  1. An operational weather radar-based Quantitative Precipitation Estimation and its application in catchment water resources modeling

    DEFF Research Database (Denmark)

    He, Xin; Vejen, Flemming; Stisen, Simon;

    2011-01-01

    of precipitation compared with rain-gauge-based methods, thus providing the basis for better water resources assessments. The radar QPE algorithm called ARNE is a distance-dependent areal estimation method that merges radar data with ground surface observations. The method was applied to the Skjern River catchment...... in western Denmark where alternative precipitation estimates were also used as input to an integrated hydrologic model. The hydrologic responses from the model were analyzed by comparing radar- and ground-based precipitation input scenarios. Results showed that radar QPE products are able to generate...... reliable simulations of stream flow and water balance. The potential of using radar-based precipitation was found to be especially high at a smaller scale, where the impact of spatial resolution was evident from the stream discharge results. Also, groundwater recharge was shown to be sensitive...
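The record above describes the ARNE algorithm only at a high level. As a hypothetical sketch of what a distance-dependent radar/gauge merge can look like (not the operational method), the radar field supplies the spatial pattern while nearby gauges correct the local bias, with gauge influence decaying with distance; all numbers below are invented:

```python
# Hypothetical distance-dependent merge of radar QPE with gauge observations.
# The decay scale and the "+1" prior weight on the radar value are assumptions
# for illustration, not parameters of the ARNE algorithm.
import numpy as np

def merge(radar, radar_xy, gauges, gauge_xy, decay=10.0):
    """Adjust radar estimates toward gauges, weighted by distance to gauges."""
    merged = np.empty_like(radar, dtype=float)
    for i, (r, p) in enumerate(zip(radar, radar_xy)):
        d = np.linalg.norm(gauge_xy - p, axis=1)            # distance to gauges
        w = np.exp(-d / decay)                              # decaying weights
        adj = np.sum(w * (gauges - r)) / (np.sum(w) + 1.0)  # +1 keeps radar prior
        merged[i] = max(r + adj, 0.0)                       # precipitation >= 0
    return merged

radar = np.array([1.0, 2.0, 4.0])                 # radar QPE at 3 cells (mm)
radar_xy = np.array([[0, 0], [5, 0], [20, 0]], dtype=float)
gauges = np.array([1.5])                          # one gauge observation (mm)
gauge_xy = np.array([[0.0, 0.0]])
print(merge(radar, radar_xy, gauges, gauge_xy))
```

Cells close to the gauge are pulled strongly toward it, while distant cells keep essentially the radar value, which is the qualitative behaviour a distance-dependent areal estimator aims for.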

  2. Model for water pollution remote sensing based on double scattering and its application in the Zhujiang River outfall

    Institute of Scientific and Technical Information of China (English)

    DENG Ruru; LIU Qinhuo; KE Ruiping; CHENG Lei; LIU Xiaoping

    2004-01-01

It is a valid route for quantitative remote sensing of water pollution to build a model according to the physical mechanisms of scattering and absorption by suspended substances, pollutants, and water molecules. A remote sensing model for water pollution based on single scattering is simple and easy to use, but its precision is affected by the turbidity of the water. The characteristics of the energy composition of multiple scattering are analyzed, and it is proposed that, starting from the single-scattering model, if the flux of the second scattering is additionally considered, the precision of the model will be remarkably improved while the calculation remains very simple. The factor of the second scattering is deduced to build a double scattering model, and a practical algorithm for the calculation of the model is put forward. The result of applying this model to the water area around the Zhujiang (Pearl) River outfall shows that the precision is obviously improved. The result also shows that the seriously polluted water areas are distributed in the northeast of Lingding Sea, the Victoria Bay of Hong Kong, and Shenzhen Bay.

  3. Model-Based Reasoning

    Science.gov (United States)

    Ifenthaler, Dirk; Seel, Norbert M.

    2013-01-01

    In this paper, there will be a particular focus on mental models and their application to inductive reasoning within the realm of instruction. A basic assumption of this study is the observation that the construction of mental models and related reasoning is a slowly developing capability of cognitive systems that emerges effectively with proper…

  4. A dislocation density based crystal plasticity finite element model: Application to a two-phase polycrystalline HCP/BCC composites

    Science.gov (United States)

    Ardeljan, Milan; Beyerlein, Irene J.; Knezevic, Marko

    2014-05-01

    We present a multiscale model for anisotropic, elasto-plastic, rate- and temperature-sensitive deformation of polycrystalline aggregates to large plastic strains. The model accounts for a dislocation-based hardening law for multiple slip modes and links a single-crystal to a polycrystalline response using a crystal plasticity finite element based homogenization. It is capable of predicting local stress and strain fields based on evolving microstructure including the explicit evolution of dislocation density and crystallographic grain reorientation. We apply the model to simulate monotonic mechanical response of a hexagonal close-packed metal, zirconium (Zr), and a body-centered cubic metal, niobium (Nb), and study the texture evolution and deformation mechanisms in a two-phase Zr/Nb layered composite under severe plastic deformation. The model predicts well the texture in both co-deforming phases to very large plastic strains. In addition, it offers insights into the active slip systems underlying texture evolution, indicating that the observed textures develop by a combination of prismatic, pyramidal, and anomalous basal slip in Zr and primarily {110} slip and secondly {112} slip in Nb.

  5. LSER-based modeling vapor pressures of (solvent+salt) systems by application of Xiang-Tan equation

    Institute of Scientific and Technical Information of China (English)

    Aynur Senol

    2015-01-01

The study deals with modeling the vapor pressures of (solvent + salt) systems on the basis of linear solvation energy relationship (LSER) principles. The LSER-based vapor pressure model clarifies the simultaneous impact of the vapor pressure of the pure solvent estimated by the Xiang-Tan equation, the solubility and solvatochromic parameters of the solvent, and the physical properties of the ionic salt. Two structural forms of the generalized solvation model have been implemented independently, i.e., the unified solvation model with integrated properties (USMIP) containing nine physical descriptors and the reduced property-basis solvation model. The vapor pressure data of fourteen (solvent + salt) systems have been processed to analyze statistically the reliability of the existing models in terms of a log-ratio objective function. The proposed vapor pressure approaches reproduce the observed performance relatively accurately, yielding overall design factors of 1.0643 and 1.0702 for the integrated property-basis and reduced property-basis solvation models, respectively.

  6. Model Predictions and Ground-based Observations for Jupiter's Magnetospheric Environment: Application to the JUICE and Juno Missions

    Science.gov (United States)

    Achilleos, Nicholas; Guio, Patrick; Arridge, Christopher S.; Ray, Licia C.; Yates, Japheth N.; Fossey, Stephen J.; Savini, Giorgio; Pearson, Mick; Fernando, Nathalie; Gerasimov, Roman; Murat, Thomas

    2016-10-01

The advent of new missions to the Jovian system such as Juno (recently arrived) and JUICE (scheduled for 2022 launch) makes timely the provision of model-based predictions of the physical conditions to be encountered by these spacecraft, as well as the planning of simultaneous, ground-based observations of the Jovian system. Using the UCL Jovian magnetodisc model, which calculates magnetic field and plasma distributions according to Caudal's (1986) force-balance formalism, we provide predictions of the following quantities along representative Juno/JUICE orbits through the middle magnetosphere: (i) magnetic field strength and direction; (ii) density and/or pressure of the 'cold' and 'hot' particle populations; (iii) plasma angular velocity. The characteristic variation in these parameters is mainly influenced by the periodic approaches towards, and recessions from, the magnetodisc imposed on the 'synthetic spacecraft' by the planet's rotating, tilted dipole field. We also include corresponding predictions for ionospheric/thermospheric conditions at the magnetic footpoint of the spacecraft, using the JASMIN model (Jovian Atmospheric Simulator with Magnetosphere, Ionosphere and Neutrals). We also present preliminary imaging results from 'IoSpot', a planned ground-based programme of observations based at the University College London Observatory (UCLO), which targets ionized sulphur emissions from the Io plasma torus. Such programmes, conducted simultaneously with the above missions, will provide valuable context for the overall physical conditions within the Jovian magnetosphere, for which Io's volcanoes are the principal source of plasma.

  7. Development of Hierarchical Bayesian Model Based on Regional Frequency Analysis and Its Application to Estimate Areal Rainfall in South Korea

    Science.gov (United States)

    Kim, J.; Kwon, H. H.

    2014-12-01

The existing regional frequency analysis has the disadvantage that it is difficult to consider geographical characteristics in estimating areal rainfall. In this regard, this study aims to develop a hierarchical Bayesian model-based regional frequency analysis in which spatial patterns of the design rainfall with geographical information are explicitly incorporated. This study assumes that the parameters of the Gumbel distribution are a function of geographical characteristics (e.g., altitude, latitude and longitude) within a general linear regression framework. Posterior distributions of the regression parameters are estimated by the Bayesian Markov Chain Monte Carlo (MCMC) method, and the identified functional relationship is used to spatially interpolate the parameters of the Gumbel distribution using digital elevation models (DEM) as inputs. The proposed model is applied to derive design rainfalls over the entire Han River watershed. It was found that the proposed Bayesian regional frequency analysis model showed similar results compared to L-moment-based regional frequency analysis. In addition, the model showed an advantage in terms of quantifying the uncertainty of the design rainfall and estimating the areal rainfall considering geographical information. Acknowledgement: This research was supported by a grant (14AWMP-B079364-01) from the Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.
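The core regression idea can be sketched as follows: the Gumbel location parameter at each site is treated as a linear function of altitude, latitude and longitude, and the fitted relationship is then evaluated at ungauged grid cells. The paper estimates the coefficients by Bayesian MCMC; ordinary least squares stands in here for brevity, and all site data are synthetic:

```python
# Synthetic sketch of "Gumbel parameter as a linear function of geography".
# Covariate ranges loosely mimic South Korea; coefficients are invented.
import numpy as np

rng = np.random.default_rng(0)
n_sites = 50
X = np.column_stack([np.ones(n_sites),
                     rng.uniform(0, 1500, n_sites),    # altitude (m)
                     rng.uniform(36, 38, n_sites),     # latitude (deg)
                     rng.uniform(126, 129, n_sites)])  # longitude (deg)
true_beta = np.array([5.0, 0.01, 2.0, 1.0])
mu_site = X @ true_beta + rng.normal(0, 0.5, n_sites)  # at-site Gumbel locations

beta, *_ = np.linalg.lstsq(X, mu_site, rcond=None)     # OLS stand-in for MCMC

# Spatial interpolation: predict mu at an ungauged cell (e.g. read off a DEM).
new_site = np.array([1.0, 800.0, 37.2, 127.5])
print(new_site @ beta)
```

In the paper's Bayesian version the same linear predictor appears inside the likelihood, so the interpolation step additionally carries posterior uncertainty rather than a single point estimate.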

  8. A model for assessment of telemedicine applications

    DEFF Research Database (Denmark)

    Kidholm, Kristian; Ekeland, Anne Granstrøm; Jensen, Lise Kvistgaard;

    2012-01-01

Telemedicine applications could potentially solve many of the challenges faced by the healthcare sectors in Europe. However, a framework for assessment of these technologies is needed by decision makers to assist them in choosing the most efficient and cost-effective technologies. Therefore in 2009...... the European Commission initiated the development of a framework for assessing telemedicine applications, based on the users' need for information for decision making. This article presents the Model for ASsessment of Telemedicine applications (MAST) developed in this study....

  9. Multilevel Models Applications Using SAS

    CERN Document Server

    Wang, Jichuan; Fisher, James

    2011-01-01

This book covers a broad range of topics about multilevel modeling. The goal is to help readers to understand the basic concepts, theoretical frameworks, and application methods of multilevel modeling. It is at a level also accessible to non-mathematicians, focusing on the methods and applications of various multilevel models and using the widely used statistical software SAS®. Examples are drawn from analysis of real-world research data.

  10. Fundamental Study on Applicability of Powder-Based 3D Printer for Physical Modeling in Rock Mechanics

    Science.gov (United States)

    Fereshtenejad, Sayedalireza; Song, Jae-Joon

    2016-06-01

Applications of 3D printing technology are becoming more widespread in many research fields because of its rapid development and valuable capabilities. In rock mechanics and mining engineering, this technology has the potential to become a useful tool that might help implement a number of research studies previously considered impractical. Most commercial 3D printers cannot print prototypes with mechanical properties that precisely match those of natural rock samples. Therefore, some additional enhancements are required for 3D printers to be effectively utilized for rock mechanics applications. In this study, we printed and studied specimens using a powder-based commercial ZPrinter® 450 with ZP® 150 powder and Zb® 63 binder as raw materials. The specimens printed by this 3D printer exhibited relatively low strength and ductile behavior, implying that further improvements are needed. Hence, we focused on several ways to determine the best combination of printing options and post-processing, including the effects of the printing direction, printing layer thickness, binder saturation level, and heating process on the uniaxial compressive strength (UCS) and stress-strain behavior of the printed samples. The suggested procedures demonstrated their effectiveness by producing printed samples that behave similarly to natural rocks with low UCS. Although our optimization methods were successful, further improvements are required to expand 3D printer applications in the area of rock mechanics.

  11. An efficient algorithm for computing fixed length attractors based on bounded model checking in synchronous Boolean networks with biochemical applications.

    Science.gov (United States)

    Li, X Y; Yang, G W; Zheng, D S; Guo, W S; Hung, W N N

    2015-01-01

Genetic regulatory networks are the key to understanding biochemical systems. One condition of a genetic regulatory network under different living environments can be modeled as a synchronous Boolean network. The attractors of these Boolean networks help biologists to identify determinant and stable factors. Existing methods identify attractors starting from a random initial state or from the entire state space simultaneously; they cannot identify fixed length attractors directly, and their time complexity increases exponentially with the number and length of the attractors. This study used bounded model checking to quickly locate fixed length attractors. Based on a SAT solver, we propose a new algorithm for efficiently computing the fixed length attractors, which is more suitable for large Boolean networks and networks with numerous attractors. After comparison using the tool BooleNet, empirical experiments involving biochemical systems demonstrated the feasibility and efficiency of our approach.
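The paper's contribution is the SAT/bounded-model-checking encoding, which is not reproduced here. For a toy three-node synchronous Boolean network (update rules invented), exhaustive search is enough to illustrate what a fixed length attractor is: a state cycle whose exact period is L.

```python
# Exhaustive stand-in for the SAT-based search: find all attractors of exact
# length L in a toy 3-node synchronous Boolean network.
from itertools import product

def step(state):
    """Synchronous update of an invented 3-node network."""
    a, b, c = state
    return (b and c, not a, a or b)

def attractors_of_length(L):
    """All attractors whose exact period is L, by brute force."""
    found = set()
    for s in product([False, True], repeat=3):
        t = s
        for _ in range(L):
            t = step(t)
        if t != s:                      # s does not return after L steps
            continue
        u, period = step(s), 1          # compute the exact period of s
        while u != s:
            u, period = step(u), period + 1
        if period == L:
            cycle, u = [s], step(s)     # canonical form: set of cycle states
            while u != s:
                cycle.append(u)
                u = step(u)
            found.add(frozenset(cycle))
    return found

print(len(attractors_of_length(1)), len(attractors_of_length(5)))  # 0 1
```

This toy network has no fixed points and a single period-5 attractor. The SAT encoding in the paper answers the same "is there a cycle of exactly length L?" question without enumerating the state space, which is what makes it scale to large networks.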

  12. Modelling of Impulsional pH Variations Using ChemFET-Based Microdevices: Application to Hydrogen Peroxide Detection

    OpenAIRE

    Abdou Karim Diallo; Lyes Djeghlaf; Jerome Launay; Pierre Temple-Boyer

    2014-01-01

    This work presents the modelling of impulsional pH variations in microvolume related to water-based electrolysis and hydrogen peroxide electrochemical oxidation using an Electrochemical Field Effect Transistor (ElecFET) microdevice. This ElecFET device consists of a pH-Chemical FET (pH-ChemFET) with an integrated microelectrode around the dielectric gate area in order to trigger electrochemical reactions. Combining oxidation/reduction reactions on the microelectrode, water self-ionization and...

  13. Methane emissions from floodplains in the Amazon Basin: challenges in developing a process-based model for global applications

    OpenAIRE

    Ringeval, B.; S. Houweling; P. M. van Bodegom; R. Spahni; De Beek, R.; Joos, F.; Röckmann, T.

    2014-01-01

    Tropical wetlands are estimated to represent about 50% of the natural wetland methane (CH4) emissions and explain a large fraction of the observed CH4 variability on timescales ranging from glacial–interglacial cycles to the currently observed year-to-year variability. Despite their importance, however, tropical wetlands are poorly represented in global models aiming to predict global CH4 emissions. This publication documents a first step in the development of a process-base...

  14. Analysis of different model-based approaches for estimating dFRC for real-time application

    OpenAIRE

    Van Drunen, EJ; Chase, JG; Chiew, YS; Shaw, GM; Desaive, Thomas

    2013-01-01

    Background Acute Respiratory Distress Syndrome (ARDS) is characterized by inflammation, filling of the lung with fluid and the collapse of lung units. Mechanical ventilation (MV) is used to treat ARDS using positive end expiratory pressure (PEEP) to recruit and retain lung units, thus increasing pulmonary volume and dynamic functional residual capacity (dFRC) at the end of expiration. However, simple, non-invasive methods to estimate dFRC do not exist. Methods Four model-based methods for est...

  15. Land Use Allocation Based on a Multi-Objective Artificial Immune Optimization Model: An Application in Anlu County, China

    Directory of Open Access Journals (Sweden)

    Xiaoya Ma

    2015-11-01

Full Text Available As the main feature of land use planning, land use allocation (LUA) optimization is an important means of creating a balance between the land-use supply and demand in a region and promoting the sustainable utilization of land resources. In essence, LUA optimization is a multi-objective optimization problem under the land use supply and demand constraints in a region. In order to obtain a better sustainable multi-objective LUA optimization solution, the present study proposes a LUA model based on the multi-objective artificial immune optimization algorithm (MOAIM-LUA model). The main achievements of the present study are as follows: (a) the land-use supply and demand factors are analyzed and the constraint conditions of LUA optimization problems are constructed based on the analysis framework of the balance between the land use supply and demand; (b) the optimization objectives of LUA optimization problems are defined and modeled using ecosystem service value theory and land rent and price theory; and (c) a multi-objective optimization algorithm is designed for solving multi-objective LUA optimization problems based on the novel immune clonal algorithm (NICA). On the basis of the aforementioned achievements, MOAIM-LUA was applied to a real case study of land-use planning in Anlu County, China. Compared to the current land use situation in Anlu County, optimized LUA solutions offer improvements in the social and ecological objective areas. Compared to existing models, such as the non-dominated sorting genetic algorithm-II, experimental results demonstrate that the model designed in the present study can obtain better non-dominated solution sets and is superior in terms of algorithm stability.
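Setting the immune-algorithm machinery aside, the object such multi-objective LUA models compare against NSGA-II is the non-dominated solution set. A minimal sketch of Pareto filtering, with invented two-objective scores (both maximized, e.g. economic benefit and ecosystem service value):

```python
# Extract the Pareto front (non-dominated set) from candidate LUA solutions.
# The candidate scores are invented; real solutions would be allocation maps
# scored by the model's economic and ecological objectives.
def dominates(a, b):
    """a dominates b if it is >= on all objectives and > on at least one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(solutions):
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t is not s)]

candidates = [(3.0, 1.0), (2.0, 2.5), (1.0, 3.0), (2.0, 1.0), (2.5, 2.0)]
print(pareto_front(candidates))  # (2.0, 1.0) is dominated and drops out
```

Comparing algorithms then amounts to comparing the fronts they produce, e.g. by coverage or spread, which is the sense in which the abstract claims "better non-dominated solution sets".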

  16. Model-based real-time control for laser induced thermal therapy with applications to prostate cancer treatment

    Science.gov (United States)

    Feng, Yusheng; Fuentes, David; Stafford, R. Jason; Oden, J. Tinsley

    2009-02-01

In this paper, we present a model-based predictive control system that is capable of capturing physical and biological variations of laser-tissue interaction, as well as tissue heterogeneity, in real time during laser induced thermal therapy (LITT). Using a three-dimensional predictive bioheat transfer model, which is built from a regular magnetic resonance imaging (MRI) anatomic scan and driven by imaging data produced by real-time magnetic resonance temperature imaging (MRTI), the computational system provides rigorous real-time predictive control during the surgical operation. The unique feature of this system is its ability to perform predictive control based on a validated model with high precision in real time, made possible by the implementation of efficient parallel algorithms. The major components of the current computational system involve real-time finite element solution of the bioheat transfer induced by laser-tissue interaction, a solution module for the real-time calibration problem, optimal laser source control, goal-oriented error estimation applied to the bioheat transfer equation, and a state-of-the-art imaging process module to characterize the heterogeneous biological domain. The system was tested in vivo in a canine animal model in which an interstitial laser probe was placed in the prostate region, and the desired treatment outcome in terms of ablation temperature and damage zone was achieved. Using the guidance of the predictive model driven by real-time MRTI data while applying the optimized laser heat source has the potential to provide unprecedented control over the treatment outcome for laser ablation.

  17. Different-source gas emission prediction model of working face based on BP artificial neural network and its application

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, H.; Chang, W.; Zhang, B. [China University of Mining and Technology, Beijing (China)

    2007-05-15

Back-propagation (BP) neural network analysis based on different-source gas emission quantity prediction theory was applied to predict the quantity of gas emitted from the coal seam being mined, the neighbouring coal seam and the goaf of the working face. Three separate gas emission prediction neural network models were established for these sources. The prediction model for the coal seam being mined was made up of three layers and nine parameters; that of the neighbouring coal seam of three layers and eight parameters; and that of the goaf of three layers and four parameters. The different-source gas emission prediction model can greatly improve prediction accuracy. BP neural network analysis using Matlab software was applied in a coal mine. 10 refs., 2 figs., 3 tabs.
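A minimal back-propagation network of the three-layer kind described can be sketched as follows; the inputs are random stand-ins for the nine mined-seam parameters, not the paper's data, and the architecture and learning rate are assumptions:

```python
# Minimal 3-layer BP network (9 inputs, 6 hidden, 1 output) trained by plain
# gradient descent on a synthetic regression target standing in for the
# mined-seam gas emission quantity.
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, (40, 9))            # 9 stand-in input parameters
y = (X @ rng.uniform(0.0, 1.0, 9))[:, None]   # synthetic "gas emission"
y /= y.max()                                  # scale target into (0, 1]

W1 = rng.normal(0.0, 0.5, (9, 6))             # input -> hidden weights
W2 = rng.normal(0.0, 0.5, (6, 1))             # hidden -> output weights
sig = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(3000):                         # back-propagation loop
    H = sig(X @ W1)                           # hidden activations
    out = sig(H @ W2)                         # network output
    d2 = (out - y) * out * (1 - out)          # output-layer delta
    d1 = (d2 @ W2.T) * H * (1 - H)            # hidden-layer delta
    W2 -= 0.5 * H.T @ d2 / len(X)             # gradient-descent updates
    W1 -= 0.5 * X.T @ d1 / len(X)

mse = float(np.mean((sig(sig(X @ W1) @ W2) - y) ** 2))
print(mse)
```

The three separate models in the abstract differ only in the number of inputs (nine, eight and four), so the same training loop applies to each with a different first-layer shape.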

  18. Diagnosis of dynamic systems based on explicit and implicit behavioural models: an application to gas turbines in Esprit Project Tiger

    Energy Technology Data Exchange (ETDEWEB)

    Trave-Massuyes, L. [Centre National de la Recherche Scientifique (CNRS), 31 - Toulouse (France); Milne, R.

    1995-12-31

We are interested in the monitoring and diagnosis of dynamic systems. In our work, we combine explicit temporal models of the behaviour of a dynamic system with implicit behavioural models supporting model-based approaches. This work is driven by the needs of, and applied to, two gas turbines of very different size and power. In this paper we describe the problems of building systems for these domains and illustrate how we have developed a system in which these two approaches complement each other to provide a comprehensive fault detection and diagnosis system. We also explore the strengths and weaknesses of each approach. The system described here is currently working continuously, on line to a gas turbine in a major chemical plant. (author) 24 refs.

  19. GIS-Based (W+-W-) Weight of Evidence Model and Its Application to Gold Resources Assessment in Abitibi, Canada

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

The weight of evidence (WofE) model has been widely used for mineral potential mapping. During the conversion of a multiclass map into a binary map, a lot of mineralization information is artificially added or lost, because the generalization of the classes within a cumulative distance interval to a linear feature is based on a maximum contrast that matches that cumulative distance interval. Additionally, some categorical evidence cannot be generated by this method because a maximum contrast does not exist. In this article, an alternative (W+-W-)-based WofE model is proposed. In this model, the criterion "(W+-W-) greater than zero or not" is used to reclassify the corresponding categorical class into a presence or absence class, thereby converting a multiclass map into a binary map. This model can be applied to both categorical data and successive data; the latter can be treated as categorical data. The W+ and W- of the generated binary maps can then be recalculated, and several binary maps can be integrated on the condition that the reclassified binary evidences are conditionally independent of each other. This method effectively reduces artificial data, and both nominal and categorical data can be handled. A case study of gold potential mapping in the Abitibi area, Ontario, Canada, shows that the gold potential map produced by the (W+-W-) model displays a smaller potential area but a higher posterior probability (POP), whereas the potential map produced by the traditional WofE model exhibits a larger potential area but a lower POP.
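The (W+-W-) criterion itself is simple to compute from deposit and cell counts. A sketch with invented counts (the standard WofE weight definitions, not the paper's case-study data):

```python
# W+ = ln P(B|D)/P(B|notD), W- = ln P(notB|D)/P(notB|notD); a class is kept
# as "presence" in the binary map if (W+ - W-) > 0. All counts are invented.
import math

def weights(n_bd, n_b, n_d, n_total):
    """Weights for one evidence class B given deposit cells D."""
    p_b_d = n_bd / n_d                        # P(B | deposit)
    p_b_nd = (n_b - n_bd) / (n_total - n_d)   # P(B | no deposit)
    w_plus = math.log(p_b_d / p_b_nd)
    w_minus = math.log((1 - p_b_d) / (1 - p_b_nd))
    return w_plus, w_minus

# Hypothetical class of a multiclass map: 12 of 20 deposits fall inside it,
# and it covers 200 of 1000 cells.
wp, wm = weights(n_bd=12, n_b=200, n_d=20, n_total=1000)
contrast = wp - wm
print(wp, wm, contrast, "presence" if contrast > 0 else "absence")
```

Applying this sign test per class is exactly the reclassification rule the abstract describes, and it works even when no maximum-contrast distance interval exists.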

  20. Comprehensive distributed-parameters modeling and experimental validation of microcantilever-based biosensors with an application to ultrasmall biological species detection

    International Nuclear Information System (INIS)

Nanotechnological advancements have made a great contribution to developing label-free and highly sensitive biosensors. The detection of ultrasmall adsorbed masses has been enabled by such sensors, which transduce molecular interactions into detectable physical quantities. More specifically, microcantilever-based biosensors have attracted widespread attention for offering a label-free, highly sensitive and inexpensive platform for biodetection. Although there are many studies investigating microcantilever-based sensors and their biological applications, a comprehensive mathematical model of such devices, experimentally validated and providing a closed-form mathematical framework, is still lacking. In almost all of the studies, a simple lumped-parameters model has been proposed. However, in order to have a precise biomechanical sensor, a comprehensive model is required that is capable of describing all phenomena and dynamics of the biosensor. Therefore, in this study, an extensive distributed-parameters modeling framework is proposed for the piezoelectric microcantilever-based biosensor, using different methodologies, for the purpose of detecting an ultrasmall mass adsorbed on the microcantilever surface. An optimum modeling methodology is identified and verified experimentally. This study includes three main parts. In the first part, the Euler–Bernoulli beam theory is used to model the nonuniform piezoelectric microcantilever, and simulation results are obtained and presented. The same system is then modeled as a nonuniform rectangular plate, with simulation results demonstrating the model's capability in detecting an ultrasmall mass. The last part presents the experimental validation of the modeling results. It was shown that plate modeling predicts the real situation with a degree of precision of 99.57%, whereas modeling the system as an Euler–Bernoulli beam provides a 94.45% degree of precision. The detection of ultrasmall

  1. A Statistically-Based Low-Level Cloud Scheme and Its Tentative Application in a General Circulation Model

    Institute of Scientific and Technical Information of China (English)

    DAI Fushan; YU Rucong; ZHANG Xuehong; YU Yongqiang

    2005-01-01

In this study, a statistical cloud scheme is first introduced and coupled with a first-order turbulence scheme, with second-order turbulence moments parameterized by the timescale of the turbulence dissipation and the vertical turbulent diffusion coefficient. The ability of the scheme to simulate cloud fraction at different relative humidities, vertical temperature profiles, and timescales of the turbulent dissipation is then examined by numerical simulation. It is found that the simulated cloud fraction is sensitive to the parameter used in the statistical cloud scheme and to the timescale of the turbulent dissipation. Based on these analyses, the introduced statistical cloud scheme is modified. By combining the modified statistical cloud scheme with a boundary layer cumulus scheme, a new statistically-based low-level cloud scheme is proposed and tentatively applied in the NCAR (National Center for Atmospheric Research) CCM3 (Community Climate Model version 3). It is found that the simulation of low-level cloud fraction is markedly improved and that the centers with maximum low-level cloud fractions in the cold oceans off the western coasts are well simulated with the statistically-based low-level cloud scheme applied in CCM3. This suggests that the new statistically-based low-level cloud scheme has great potential for improving low-level cloud parameterization in general circulation models.

  2. Models for Dynamic Applications

    DEFF Research Database (Denmark)

    Sales-Cruz, Mauricio; Morales Rodriguez, Ricardo; Heitzig, Martina;

    2011-01-01

    This chapter covers aspects of the dynamic modelling and simulation of several complex operations that include a controlled blending tank, a direct methanol fuel cell that incorporates a multiscale model, a fluidised bed reactor, a standard chemical reactor and finally a polymerisation reactor. T...

  3. Fusion strategies for selecting multiple tuning parameters for multivariate calibration and other penalty based processes: A model updating application for pharmaceutical analysis.

    Science.gov (United States)

    Tencate, Alister J; Kalivas, John H; White, Alexander J

    2016-05-19

New multivariate calibration methods and other processes are being developed that require selection of multiple tuning parameter (penalty) values to form the final model. With one or more tuning parameters, using only one measure of model quality to select final tuning parameter values is not sufficient, and optimization of several model quality measures is challenging. Thus, three fusion ranking methods are investigated for simultaneous assessment of multiple measures of model quality for selecting tuning parameter values. One is a supervised learning fusion rule named sum of ranking differences (SRD). The other two are non-supervised learning processes based on the sum and median operations. The effect of the number of models evaluated on the three fusion rules is also examined using three procedures. One procedure uses all models from all possible combinations of the tuning parameters. To reduce the number of models evaluated, an iterative process (applicable only to SRD) is applied, and thresholding a model quality measure before applying the fusion rules is also used. A near infrared pharmaceutical data set requiring model updating is used to evaluate the three fusion rules. In this case, calibration under the primary conditions is for the active pharmaceutical ingredient (API) of tablets produced in a laboratory, while the secondary conditions for calibration updating are for tablets produced in the full batch setting. Two model updating processes requiring selection of two unique tuning parameter values are studied: one based on Tikhonov regularization (TR) and the other a variation of partial least squares (PLS). The three fusion methods are shown to provide equivalent and acceptable results, allowing automatic selection of the tuning parameter values. Best tuning parameter values are selected when the model quality measures used with the fusion rules are for the small secondary sample set used to form the updated models. In this model updating situation, evaluation of
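The two non-supervised fusion rules can be sketched directly (SRD needs a reference ranking and is omitted). The candidate models and quality measures below are invented, with every column smaller-is-better:

```python
# Sum- and median-rule fusion of several model-quality measures: rank each
# candidate per measure, then fuse the ranks. All numbers are invented.
import numpy as np

# rows: candidate models (tuning-parameter pairs); columns: quality measures
# (e.g. RMSE, model vector norm, ...), smaller is better in each column
Q = np.array([[0.20, 1.5, 0.30],
              [0.25, 0.9, 0.28],
              [0.40, 0.7, 0.45],
              [0.22, 1.1, 0.26]])

ranks = Q.argsort(axis=0).argsort(axis=0)   # per-measure rank, 0 = best
sum_rule = ranks.sum(axis=1)                # sum fusion
median_rule = np.median(ranks, axis=1)      # median fusion

print("sum-rule winner:    model", int(sum_rule.argmin()))
print("median-rule winner: model", int(median_rule.argmin()))
```

Ties are possible under the median rule (here models 1 and 3 tie), which is one reason the paper evaluates the rules on more than one procedure for generating the candidate model set.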

  4. Study and application of monitoring plane displacement of a similarity model based on time-series images

    Institute of Scientific and Technical Information of China (English)

    Xu Jiankun; Wang Enyuan; Li Zhonghui; Wang Chao

    2011-01-01

In order to compensate for the deficiencies of present methods of monitoring plane displacement in similarity model tests, such as inadequate real-time monitoring and excessive manual intervention, an effective monitoring method was proposed in this study. The major steps of the monitoring method are as follows: firstly, time-series images of the similarity model in the test were obtained by a camera; secondly, measuring points marked as artificial targets were automatically tracked and recognized from the time-series images; finally, the real-time plane displacement field was calculated via the fixed magnification between objects and images under the specific conditions. The application device of the method was then designed and tested. At the same time, a sub-pixel location method and a distortion error model were used to improve the measuring accuracy. The results indicate that this method can record the entire test, especially detailed non-uniform deformation and sudden deformation. Compared with traditional methods, this method has a number of advantages, such as greater measurement accuracy and reliability, less manual intervention, higher automation, strong practicality and much richer measurement information.

  5. Marginal regression models for clustered count data based on zero-inflated Conway-Maxwell-Poisson distribution with applications.

    Science.gov (United States)

    Choo-Wosoba, Hyoyoung; Levy, Steven M; Datta, Somnath

    2016-06-01

    Community water fluoridation is an important public health measure to prevent dental caries, but it continues to be somewhat controversial. The Iowa Fluoride Study (IFS) is a longitudinal study on a cohort of Iowa children that began in 1991. The main purposes of this study (http://www.dentistry.uiowa.edu/preventive-fluoride-study) were to quantify fluoride exposures from both dietary and nondietary sources and to associate longitudinal fluoride exposures with dental fluorosis (spots on teeth) and dental caries (cavities). We analyze a subset of the IFS data by a marginal regression model with a zero-inflated version of the Conway-Maxwell-Poisson (ZICMP) distribution for count data exhibiting excessive zeros and a wide range of dispersion patterns. We introduce two estimation methods for fitting a ZICMP marginal regression model. Finite sample behaviors of the estimators and the resulting confidence intervals are studied using extensive simulation studies. We apply our methodologies to the dental caries data. Our novel modeling, incorporating zero inflation, clustering, and overdispersion, sheds new light on the effect of community water fluoridation and other factors. We also include a second application of our methodology to a genomic (next-generation sequencing) dataset that exhibits underdispersion. PMID:26575079
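
    The zero-inflated Conway-Maxwell-Poisson distribution mixes a point mass at zero with a CMP pmf whose normalizing constant has no closed form and is evaluated by a truncated series. A toy sketch of the pmf (the truncation length and parameter values are illustrative only):

```python
import math

def zicmp_pmf(y, lam, nu, p_zero, max_terms=100):
    """Zero-inflated Conway-Maxwell-Poisson pmf (toy illustration).
    nu > 1 gives underdispersion, nu < 1 overdispersion, nu = 1 Poisson;
    p_zero is the extra probability mass placed on zero."""
    # Truncated-series approximation of the CMP normalizing constant.
    Z = sum(lam**j / math.factorial(j)**nu for j in range(max_terms))
    cmp_y = lam**y / math.factorial(y)**nu / Z
    return p_zero + (1 - p_zero) * cmp_y if y == 0 else (1 - p_zero) * cmp_y

# With nu = 1 and p_zero = 0 the pmf reduces to the ordinary Poisson pmf.
p0 = zicmp_pmf(0, lam=2.0, nu=1.0, p_zero=0.0)
```

A marginal regression model would link covariates to lam (and possibly p_zero) through log and logit links; that machinery is omitted here.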

  6. The development and application of a risk-based prioritization model for the Oak Ridge Environmental Restoration Program

    International Nuclear Information System (INIS)

    The Oak Ridge Environmental Restoration (ER) Program developed and implemented the Environmental Restoration Benefit Assessment Matrix (ERBAM) early in 1994 to provide a simple, efficient process for prioritizing and justifying fiscal budget decisions for a diverse set of activities. The decision to develop a methodology for prioritizing sites was necessitated by the large number of buildings and areas managed by the DOE Oak Ridge Field Office and the finite resources available to address these areas. The ERBAM was based on the Integrated Resource Management System prioritization methodology historically used by the United States Department of Energy (DOE) and Lockheed Martin Energy Systems, Inc., to rank compliance and operational activities. To develop the matrix, ER Program management, working with federal and state regulators, agreed on impact criteria that balance the major objectives within the ER Program: protection of public health, protection of the environment, protection of on-site workers, consideration of stakeholder/community preference, achievement of ER mission, and optimization of cost efficiency. Lessons learned from the initial application of the matrix were used to make refinements and improvements in the methodology. A standard set of assumptions (both overall and categoric) and a prioritization board, consisting of top level DOE and Lockheed Martin Energy Systems, Inc., managers along with federal and state regulatory representatives, were established to facilitate consistent application. Current and future improvements include a method to incorporate existing quantitative risk data and facilitate increased efficiency in applying baseline cost data and approved funding levels to the prioritized output. Application of the prioritization methodology yields a prioritized list of all work activities within the programs' work breakdown structure

  7. [Measuring water ecological carrying capacity with the ecosystem-service-based ecological footprint (ESEF) method: Theory, models and application].

    Science.gov (United States)

    Jiao, Wen-jun; Min, Qing-wen; Li, Wen-hua; Fuller, Anthony M

    2015-04-01

    Integrated watershed management based on aquatic ecosystems has been increasingly acknowledged. Such a change in the philosophy of water environment management requires recognizing the carrying capacity of aquatic ecosystems for human society from a more general perspective. The concept of water ecological carrying capacity is therefore put forward, which considers both water resources and the water environment, connects socio-economic development to aquatic ecosystems, and provides strong support for integrated watershed management. In this paper, the authors propose an ESEF-based measure of water ecological carrying capacity and construct ESEF-based models of water ecological footprint and capacity, aiming to evaluate water ecological carrying capacity with footprint methods. A regional model of the Taihu Lake Basin was constructed and applied to evaluate the water ecological carrying capacity of Changzhou City, which is located in the upper reaches of the basin. Results showed that human demand for water ecosystem services in this city had exceeded the supply capacity of local aquatic ecosystems, and the significant gap between demand and supply had jeopardized the sustainability of local aquatic ecosystems. Considering aqua-product provision, water supply and pollutant absorption in an integrated way, the aquatic ecosystems in Changzhou could bear a population and economy of only 54% of the current scale.

  8. Application of physiologically based pharmacokinetic modeling in predicting drug–drug interactions for sarpogrelate hydrochloride in humans

    Directory of Open Access Journals (Sweden)

    Min JS

    2016-09-01

    Full Text Available Jee Sun Min,1 Doyun Kim,1 Jung Bae Park,1 Hyunjin Heo,1 Soo Hyeon Bae,2 Jae Hong Seo,1 Euichaul Oh,1 Soo Kyung Bae1 1Integrated Research Institute of Pharmaceutical Sciences, College of Pharmacy, The Catholic University of Korea, Bucheon, 2Department of Pharmacology, College of Medicine, The Catholic University of Korea, Seocho-gu, Seoul, South Korea Background: Evaluating the potential risk of metabolic drug–drug interactions (DDIs) is clinically important. Objective: To develop a physiologically based pharmacokinetic (PBPK) model for sarpogrelate hydrochloride and its active metabolite, (R,S)-1-{2-[2-(3-methoxyphenyl)ethyl]phenoxy}-3-(dimethylamino)-2-propanol (M-1), in order to predict DDIs between sarpogrelate and the clinically relevant cytochrome P450 (CYP) 2D6 substrates metoprolol, desipramine, dextromethorphan, imipramine, and tolterodine. Methods: The PBPK model was developed, incorporating the physicochemical and pharmacokinetic properties of sarpogrelate hydrochloride and M-1, based on the findings from in vitro and in vivo studies. Subsequently, the model was verified by comparing the predicted concentration-time profiles and pharmacokinetic parameters of sarpogrelate and M-1 to the observed clinical data. Finally, the verified model was used to simulate clinical DDIs between sarpogrelate hydrochloride and sensitive CYP2D6 substrates. The predictive performance of the model was assessed by comparing predicted results to observed data after coadministering sarpogrelate hydrochloride and metoprolol. Results: The developed PBPK model accurately predicted sarpogrelate and M-1 plasma concentration profiles after single or multiple doses of sarpogrelate hydrochloride. The simulated ratios of area under the curve and maximum plasma concentration of metoprolol in the presence of sarpogrelate hydrochloride to baseline were in good agreement with the observed ratios. The predicted fold-increases in the area under the curve ratios of metoprolol

  9. Development of a pyrolysis waste recovery model with designs, test plans, and applications for space-based habitats

    Science.gov (United States)

    Roberson, Bobby J.

    1992-01-01

    Extensive literature searches revealed the numerous advantages of using pyrolysis as a means of recovering usable resources from inedible plant biomass, paper, plastics, other polymers, and human waste. A possible design of a pyrolysis reactor with test plans and applications for use on a space-based habitat are proposed. The proposed system will accommodate the wastes generated by a four-person crew while requiring solar energy as the only power source. Waste materials will be collected and stored during the 15-day lunar darkness periods. Resource recovery will occur during the daylight periods. Usable gases such as methane and hydrogen and a solid char will be produced while reducing the mass and volume of the waste to almost infinitely small levels. The system will be operated economically, safely, and in a non-polluting manner.

  10. Applicability of a noise-based model to estimate in-traffic exposure to black carbon and particle number concentrations in different cultures.

    Science.gov (United States)

    Dekoninck, Luc; Botteldooren, Dick; Int Panis, Luc; Hankey, Steve; Jain, Grishma; S, Karthik; Marshall, Julian

    2015-01-01

    Several studies show that a significant portion of daily air pollution exposure, in particular black carbon (BC), occurs during transport. In previous work, a model for the in-traffic exposure of bicyclists to BC was proposed based on spectral evaluation of mobile noise measurements and validated with BC measurements in Ghent, Belgium. In this paper, the applicability of this model in a different cultural context with a totally different traffic and mobility situation is presented. In addition, a similar modeling approach is tested for particle number (PN) concentration. Indirectly assessing BC and PN exposure through a model based on noise measurements is advantageous because of the availability of very affordable noise monitoring devices. Our previous work showed that a model including specific spectral components of the noise that relate to engine and rolling emission, together with basic meteorological data, could be quite accurate. Moreover, including a background concentration adjustment improved the model considerably. To explore whether this model could also be used in a different context, with or without tuning of the model parameters, a study was conducted in Bangalore, India. Noise measurement equipment, data storage, data processing, continent, country, measurement operators, vehicle fleet, driving behavior, biking facilities, background concentration, and meteorology are all very different from the first measurement campaign in Belgium. More than 24 h of combined in-traffic noise, BC, and PN measurements were collected. It was shown that the noise-based BC exposure model gives good predictions in Bangalore and that the same approach is also successful for PN. Cross validation of the model parameters was used to compare factors that impact exposure across study sites. A pooled model (combining the measurements of the two locations) results in a correlation of 0.84 when fitting the total trip exposure in Bangalore. 
Estimating particulate matter exposure with traffic
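
    The modeling approach, regressing in-traffic exposure on engine- and rolling-related noise levels, meteorology, and a background term, can be sketched with ordinary least squares on purely synthetic stand-in data. Every variable, coefficient, and range below is invented for illustration and does not come from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Synthetic stand-ins for the model inputs: low-frequency (engine-related)
# and high-frequency (rolling-related) noise levels in dB, wind speed, and
# a hypothetical background concentration term.
engine = rng.uniform(55, 75, n)
rolling = rng.uniform(50, 70, n)
wind = rng.uniform(0, 6, n)
background = rng.uniform(1, 3, n)

# Hypothetical "true" exposure relation used to generate the training data.
bc = (0.08 * engine + 0.03 * rolling - 0.20 * wind
      + 1.0 * background + rng.normal(0, 0.1, n))

# Fit the noise-based exposure model by ordinary least squares.
X = np.column_stack([engine, rolling, wind, background, np.ones(n)])
coef, *_ = np.linalg.lstsq(X, bc, rcond=None)
pred = X @ coef
r = np.corrcoef(pred, bc)[0, 1]  # fit quality, analogous to the pooled-model correlation
```

Tuning the model to a new site, as done for Bangalore, would amount to refitting or cross-validating `coef` on that site's measurements.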

  11. State-based modeling of continuous human-integrated systems: An application to air traffic separation assurance

    International Nuclear Information System (INIS)

    A method for modeling the safety of human-integrated systems that have continuous dynamics is introduced. The method is intended to supplement more detailed reliability-based methods. Assumptions for the model are defined such that the model is demonstrably complete, enabling it to yield a set of key agent characteristics. These key characteristics identify a sufficient set of characteristics that can be used to establish the safety of particular system configurations. The method is applied for the analysis of the safety of strategic and tactical separation assurance algorithms for the next generation air transportation system. It is shown that the key characteristics for this problem include the ability of agents (human or automated) to identify configurations that can enable intense transitions from a safe to unsafe state. However, the most technologically advanced algorithm for separation assurance does not currently attempt to identify such configurations. It is also discussed how, although the model is in a form that lends itself to quantitative evaluations, such evaluations are complicated by the difficulty of accurately quantifying human error probabilities.

  12. Computational models of upper-limb motion during functional reaching tasks for application in FES-based stroke rehabilitation.

    Science.gov (United States)

    Freeman, Chris; Exell, Tim; Meadmore, Katie; Hallewell, Emma; Hughes, Ann-Marie

    2015-06-01

    Functional electrical stimulation (FES) has been shown to be an effective approach to upper-limb stroke rehabilitation, where it is used to assist arm and shoulder motion. Model-based FES controllers have recently confirmed significant potential to improve accuracy of functional reaching tasks, but they typically require a reference trajectory to track. Few upper-limb FES control schemes embed a computational model of the task; however, this is critical to ensure the controller reinforces the intended movement with high accuracy. This paper derives computational motor control models of functional tasks that can be directly embedded in real-time FES control schemes, removing the need for a predefined reference trajectory. Dynamic models of the electrically stimulated arm are first derived, and constrained optimisation problems are formulated to encapsulate common activities of daily living. These are solved using iterative algorithms, and results are compared with kinematic data from 12 subjects and found to fit closely (mean fitting between 63.2% and 84.0%). The optimisation is performed iteratively using kinematic variables and hence can be transformed into an iterative learning control algorithm by replacing simulation signals with experimental data. The approach is therefore capable of controlling FES in real time to assist tasks in a manner corresponding to unimpaired natural movement. By ensuring that assistance is aligned with voluntary intention, the controller hence maximises the potential effectiveness of future stroke rehabilitation trials.
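
    The conversion of the iterative optimisation into an iterative learning controller follows the standard ILC update, in which the stimulation applied on the next trial is corrected by the tracking error from the previous one. A toy sketch on a hypothetical first-order linear plant (not the paper's stimulated-arm dynamics), with gains chosen so that the trial-to-trial iteration is contractive:

```python
import numpy as np

# Toy plant standing in for the stimulated arm: x[t+1] = a*x[t] + b*u[t].
# The gains a, b, L are invented; L*b is chosen so learning converges.
a, b, L = 0.3, 0.5, 1.0
T = 30
ref = np.sin(np.linspace(0, np.pi, T))  # hypothetical reaching profile

def run_trial(u):
    """Simulate one trial of the plant under input sequence u."""
    x = np.zeros(T)
    for t in range(T - 1):
        x[t + 1] = a * x[t] + b * u[t]
    return x

u = np.zeros(T)
errors = []
for trial in range(20):
    x = run_trial(u)
    e = ref - x
    errors.append(float(np.max(np.abs(e))))
    # ILC update: u[t] influences x[t+1], hence the one-step shift.
    u[:-1] = u[:-1] + L * e[1:]
```

In the paper's setting, the simulated reference would be replaced by the optimisation output and the simulated trial by experimental kinematic data.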

  13. Application of a Theory and Simulation based Convective Boundary Mixing model for AGB Star Evolution and Nucleosynthesis

    CERN Document Server

    Battino, U; Ritter, C; Herwig, F; Denisenkov, P; Hartogh, J W Den; Trappitsch, R; Hirschi, R; Freytag, B; Thielemann, F; Paxton, B

    2016-01-01

    The s-process nucleosynthesis in Asymptotic Giant Branch (AGB) stars depends on the modeling of convective boundaries. We present models and s-process simulations that adopt a treatment of convective boundaries based on the results of hydrodynamic simulations and on the theory of mixing due to gravity waves in the vicinity of convective boundaries. Hydrodynamics simulations suggest the presence of convective boundary mixing (CBM) at the bottom of the thermal pulse-driven convective zone. Similarly, convection-induced mixing processes are proposed for the mixing below the convective envelope during third dredge-up where the 13C pocket for the s process in AGB stars forms. In this work we apply a CBM model motivated by simulations and theory to models with initial mass M = 2 and M = 3 M⊙, and with initial metal content Z = 0.01 and Z = 0.02. As reported previously, the He-intershell abundances of 12C and 16O are increased by CBM at the bottom of the pulse-driven convection zone. This mixing is affecting the 22Ne(alph...

  14. Establishment and application of drilling sealing model in the spherical grouting mode based on the loosing-circle theory

    Institute of Scientific and Technical Information of China (English)

    Hao Zhiyong; Lin Baiquan; Gao Yabin; Cheng Yanying

    2012-01-01

    Quite a few studies of borehole sealing theory have been carried out both domestically and internationally. Existing research usually treats the drilling surroundings as a dense, homogeneous elastic body, which does not match the characteristics of real fractured drilling surroundings. Based on the loosing-circle theory and analyses of the surrounding rock stress, crack and seepage fields, combined with a Newtonian-fluid spherical grouting model, we deduced the dynamic relationship between the seepage coefficient and the rock and grouting parameters for the borehole sealing mode of spherical fissure grouting. In the experiment, mucus was injected into a simulated coal seam and the permeability coefficient of the sealing body was calculated using the model. To verify the validity of the model, the calculated value for the sealing body was compared with the extreme negative pressure that the sealing body could withstand. The theoretical model reveals the borehole sealing mechanism, provides a method for quantitative calculation of the sealing effect of the grouting mode, and offers a reference for subsequent research on sealing mechanisms.

  15. Estimating daily time series of streamflow using hydrological model calibrated based on satellite observations of river water surface width: Toward real world applications.

    Science.gov (United States)

    Sun, Wenchao; Ishidaira, Hiroshi; Bastola, Satish; Yu, Jingshan

    2015-05-01

    Lacking observation data for calibration constrains applications of hydrological models to estimate daily time series of streamflow. Recent improvements in remote sensing enable detection of river water-surface width from satellite observations, making possible the tracking of streamflow from space. In this study, a method calibrating hydrological models using river width derived from remote sensing is demonstrated through application to the ungauged Irrawaddy Basin in Myanmar. Generalized likelihood uncertainty estimation (GLUE) is selected as a tool for automatic calibration and uncertainty analysis. Of 50,000 randomly generated parameter sets, 997 are identified as behavioral, based on comparing model simulation with satellite observations. The uncertainty band of streamflow simulation can span most of 10-year average monthly observed streamflow for moderate and high flow conditions. Nash-Sutcliffe efficiency is 95.7% for the simulated streamflow at the 50% quantile. These results indicate that application to the target basin is generally successful. Beyond evaluating the method in a basin lacking streamflow data, difficulties and possible solutions for applications in the real world are addressed to promote future use of the proposed method in more ungauged basins.
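
    The GLUE procedure can be sketched at toy scale: Monte Carlo sampling of parameters, scoring each simulation against "observed" river widths, and retaining a behavioral subset. The rating curve, the one-parameter model, and the behavioral threshold below are all invented stand-ins for the study's hydrological model and satellite observations:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy "hydrological model": streamflow Q = k * rain, observed indirectly
# through a hypothetical width rating curve  W = c * sqrt(Q).
rain = rng.uniform(0, 10, 60)
true_k, c = 2.0, 1.5
obs_width = c * np.sqrt(true_k * rain) + rng.normal(0, 0.05, rain.size)

# GLUE: Monte Carlo sampling of the parameter, a likelihood measure from
# the fit to satellite-observed widths, and a behavioral threshold.
candidates = rng.uniform(0.5, 4.0, 5000)

def rmse(k):
    return np.sqrt(np.mean((c * np.sqrt(k * rain) - obs_width) ** 2))

scores = np.array([rmse(k) for k in candidates])
behavioral = candidates[scores < np.quantile(scores, 0.02)]  # best 2%

# Streamflow estimate at the 50% quantile of the behavioral ensemble.
q50 = np.median(behavioral) * rain
```

The behavioral ensemble's spread plays the role of the paper's uncertainty band around the simulated streamflow.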

  16. Application of Large-Scale Database-Based Online Modeling to Plant State Long-Term Estimation

    Science.gov (United States)

    Ogawa, Masatoshi; Ogai, Harutoshi

    Recently, attention has been drawn to the local modeling techniques of a new idea called “Just-In-Time (JIT) modeling”. To apply “JIT modeling” to a large amount of database online, “Large-scale database-based Online Modeling (LOM)” has been proposed. LOM is a technique that makes the retrieval of neighboring data more efficient by using both “stepwise selection” and quantization. In order to predict the long-term state of the plant without using future data of manipulated variables, an Extended Sequential Prediction method of LOM (ESP-LOM) has been proposed. In this paper, the LOM and the ESP-LOM are introduced.
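
    The core of JIT modeling, retrieving neighboring records from the database, fitting a local model on demand, and discarding it after prediction, can be sketched as a k-nearest-neighbor local linear model on synthetic plant data. LOM's stepwise selection and quantization, and ESP-LOM's sequential prediction, are omitted:

```python
import numpy as np

rng = np.random.default_rng(1)

# Historical plant database: inputs X, measured outputs y of a nonlinear plant.
X = rng.uniform(-3, 3, (2000, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.05, 2000)

def jit_predict(xq, k=50):
    """Just-In-Time prediction: retrieve the k records nearest the query,
    fit a local linear model on them, predict, then discard the model."""
    idx = np.argsort(np.abs(X[:, 0] - xq))[:k]   # neighbor retrieval
    A = np.column_stack([X[idx, 0], np.ones(k)])
    coef, *_ = np.linalg.lstsq(A, y[idx], rcond=None)
    return coef[0] * xq + coef[1]

pred = jit_predict(1.0)
```

Because the model is rebuilt per query, accuracy hinges on efficient neighbor retrieval, which is exactly what LOM's stepwise selection and quantization accelerate.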

  17. Modeling of a Membrane Based Humidifier for Fuel Cell Applications Subject to End-Of-Life Conditions

    DEFF Research Database (Denmark)

    Nielsen, Mads Pagh; Olesen, Anders Christian; Menard, Alan

    2014-01-01

    Proton Exchange Membrane (PEM) Fuel Cell Stacks efficiently convert the chemical energy in hydrogen to electricity through electrochemical reactions occurring on either side of a proton conducting electrolyte. This is a promising and very robust energy conversion process which can be used in many......-based water permeable membrane. Results are presented at nominal BOL-conditions and extreme EOL-conditions. A detailed sub-model incorporating the water absorption/desorption kinetics of Nafion and a novel and accurate representation of the diffusion coefficient of water in Nafion was implemented. The...

  18. Ontology-based application integration

    CERN Document Server

    Paulheim, Heiko

    2011-01-01

    Ontology-based Application Integration introduces UI-level (User Interface Level) application integration and discusses current problems which can be remedied by using ontologies. It shows a novel approach for applying ontologies in system integration. While ontologies have been used for integration of IT systems on the database and on the business logic layer, integration on the user interface layer is a novel field of research. This book also discusses how end users, not only developers, can benefit from semantic technologies. Ontology-based Application Integration presents the development o

  19. Modeling and preliminary characterization of passive, wireless temperature sensors for harsh environment applications based on periodic structures

    Science.gov (United States)

    Delfin Manriquez, Diego I.

    Wireless temperature sensing has attracted significant attention in recent years due to the increasing need to develop reliable and affordable sensing solutions for energy conversion systems and other harsh-environment applications. The development of next-generation sensors for energy production process parameters, such as temperature and pressure, can result in better performance of the system. In particular, continuous temperature monitoring in energy conversion systems can yield enhancements such as better system integrity, less pollution and higher thermal efficiencies. However, the conditions experienced in these system components hinder the performance of current solutions due to the presence of semiconductor materials and welded joints. Additionally, the use of wired systems can result in complex wiring networks, increasing the cost of installation, maintenance and sensor replacement. Therefore, next-generation sensing solutions must be developed to overcome current challenges in systems where adverse conditions are present. This research project proposes two novel passive, wireless temperature sensor designs based on concepts of guided mode resonance filters (GMRF) and metamaterials. For the GMRF, a tri-layer structure using a metallic encasing and a circular aperture grating layer was developed to have a resonance frequency of 10 GHz. For the metamaterial-based sensor, a continuation of previous work was presented by utilizing a dielectric substrate and an array of commercially available metallic washers divided into two layers. For both designs, the High Frequency Structure Simulator (HFSS) from ANSYS was employed to assess the feasibility of the sensor as well as to optimize the geometry and guide the fabrication process. A systematic approach consisting of evaluating the unit cell, then assessing the number of periods needed, and finally characterizing the response of the final sensor was followed for each case. 
After the modeling process was

  20. Dual Security Testing Model for Web Applications

    Directory of Open Access Journals (Sweden)

    Singh Garima

    2016-02-01

    Full Text Available In recent years, web applications have evolved from small websites into large multi-tiered applications. The quality of web applications depends on the richness of content, well-structured navigation and, most importantly, security. Web application testing is a new field of research aimed at ensuring the consistency and quality of web applications. Over the last ten years different approaches and models have been developed for testing web applications, but only a few focused on content testing, a few on navigation testing and very few on security testing. There is a need to test the content, navigation and security of an application in one go. The objective of this paper is to propose a Dual Security Testing Model to test the security of web applications using a UML modeling technique that includes a web socket interface. In this research paper we describe how the security testing model is implemented using an activity diagram and activity graph, and how test cases are generated based on these.

  1. Application of a Theory and Simulation-based Convective Boundary Mixing Model for AGB Star Evolution and Nucleosynthesis

    Science.gov (United States)

    Battino, U.; Pignatari, M.; Ritter, C.; Herwig, F.; Denisenkov, P.; Den Hartogh, J. W.; Trappitsch, R.; Hirschi, R.; Freytag, B.; Thielemann, F.; Paxton, B.

    2016-08-01

    The s-process nucleosynthesis in asymptotic giant branch (AGB) stars depends on the modeling of convective boundaries. We present models and s-process simulations that adopt a treatment of convective boundaries based on the results of hydrodynamic simulations and on the theory of mixing due to gravity waves in the vicinity of convective boundaries. Hydrodynamics simulations suggest the presence of convective boundary mixing (CBM) at the bottom of the thermal pulse-driven convective zone. Similarly, convection-induced mixing processes are proposed for the mixing below the convective envelope during third dredge-up (TDU), where the 13C pocket for the s process in AGB stars forms. In this work, we apply a CBM model motivated by simulations and theory to models with initial mass M = 2 and M = 3 M⊙, and with initial metal content Z = 0.01 and Z = 0.02. As reported previously, the He-intershell abundances of 12C and 16O are increased by CBM at the bottom of the pulse-driven convection zone. This mixing affects the 22Ne(α, n)25Mg activation and the s-process efficiency in the 13C-pocket. In our model, CBM at the bottom of the convective envelope during the TDU represents gravity wave mixing. Furthermore, we take into account the fact that hydrodynamic simulations indicate a mixing efficiency that already declines about a pressure scale height from the convective boundaries, compared to mixing-length theory. We obtain the formation of a 13C-pocket with a mass of ≈10−4 M⊙. The final s-process abundances are characterized by 0.36 < [s/Fe] < 0.78 and a heavy-to-light s-process ratio of −0.23 < [hs/ls] < 0.45. Finally, we compare our results with stellar observations, presolar grain measurements and previous work.

  2. Model-based assessment of estuary ecosystem health using the latent health factor index, with application to the richibucto estuary.

    Directory of Open Access Journals (Sweden)

    Grace S Chiu

    Full Text Available The ability to quantitatively assess ecological health is of great interest to those tasked with monitoring and conserving ecosystems. For decades, biomonitoring research and policies have relied on multimetric health indices of various forms. Although indices are numbers, many are constructed based on qualitative procedures, thus limiting the quantitative rigor of the practical interpretations of such indices. The statistical modeling approach to construct the latent health factor index (LHFI) was recently developed. With ecological data that otherwise are used to construct conventional multimetric indices, the LHFI framework expresses such data in a rigorous quantitative model, integrating qualitative features of ecosystem health and preconceived ecological relationships among such features. This hierarchical modeling approach allows unified statistical inference of health for observed sites (along with prediction of health for partially observed sites, if desired) and of the relevance of ecological drivers, all accompanied by formal uncertainty statements from a single, integrated analysis. Thus far, the LHFI approach has been demonstrated and validated in a freshwater context. We adapt this approach to modeling estuarine health, and illustrate it on the previously unassessed system in Richibucto in New Brunswick, Canada, where active oyster farming is a potential stressor through its effects on sediment properties. Field data correspond to health metrics that constitute the popular AZTI marine biotic index and the infaunal trophic index, as well as abiotic predictors preconceived to influence biota. Our paper is the first to construct a scientifically sensible model that rigorously identifies the collective explanatory capacity of salinity, distance downstream, channel depth, and silt-clay content, all regarded a priori as qualitatively important abiotic drivers, towards site health in the Richibucto ecosystem. This suggests the potential

  3. Agent Based Multiviews Requirements Model

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Based on current research in viewpoint-oriented requirements engineering and intelligent agents, we present the concept of a viewpoint agent and its abstract model based on a meta-language for multiview requirements engineering. It provides a basis for consistency checking and integration of different viewpoint requirements; at the same time, this checking and integration can be realized automatically by virtue of an intelligent agent's autonomy, proactiveness and social ability. Finally, we introduce a practical application of the model through a case study of a data flow diagram.

  4. 农作物空间格局变化模拟模型的MATLAB实现及应用%Model application of an agent-based model for simulating crop pattern dynamics at regional scale based on MATLAB

    Institute of Scientific and Technical Information of China (English)

    余强毅; 吴文斌; 陈羊阳; 杨鹏; 孟超英; 周清波; 唐华俊

    2014-01-01

    Crop pattern is a key element in agricultural land systems, alongside land use and land cover. Crop pattern changes take place very frequently, but they are not always easily observable, which complicates analysis. As an effective tool for understanding the drivers, processes and consequences of agricultural land system changes, spatially explicit agent-based land change models have successfully been applied to represent human and natural interactions on agricultural landscapes. With the assumption that the crop pattern at a regional level is the aggregation of crop choices at the field level, we conceptualized an agent-based model to simulate crop pattern dynamics at a regional scale (CroPaDy), intended to represent the frequent but not easily observed crop pattern changes in agricultural land systems. The conceptualization of the CroPaDy model was designed strictly following the standard protocol for agent-based modeling. However, the computational model hinders application because it needs a grid-based representation, and the model itself is complicated, with multiple objectives and three nested, interactive sub-modules. As the CroPaDy model can hardly be developed with common agent-based modeling platforms such as RePast, NetLogo, and Swarm, we instead used MATLAB, taking advantage of its powerful and openly accessible matrix computing ability, to realize an empirically based application in an agricultural region of Northeast China. We coded the three interactive sub-modules in steps: 1) Agent generation module. The Monte Carlo method was used to generate the internal factors (family attributes) for each individual agent across the full coverage of the study region by combining GIS data, statistical data, survey data and individual-based blanket rules. 2) Agent classification module. The back-propagation artificial neural network method was used to automatically classify the generated agents into groups
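
    The agent generation and aggregation steps can be sketched in a few lines, here in Python rather than MATLAB, with a simple threshold rule standing in for the back-propagation neural-network classifier. All attribute distributions, thresholds, and crop labels below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)

# Agent generation module: Monte Carlo draws of family attributes for
# hypothetical farm-household agents (labor force and farm size in ha).
n_agents = 10000
labor = rng.integers(1, 5, n_agents)           # workers per household, 1..4
farm_size = rng.lognormal(0.5, 0.4, n_agents)  # farm size in hectares

# Choice and aggregation: a toy decision rule in place of the BP classifier.
# Labor-rich households with larger farms plant maize, the rest soybean;
# the regional crop pattern is the aggregate of these field-level choices.
plants_maize = (labor >= 3) & (farm_size > 1.5)
maize_share = float(plants_maize.mean())
```

Matrix-style operations like these vectorized comparisons are exactly what makes MATLAB (or NumPy) convenient for a full-coverage grid of agents.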

  5. Constraint Based Modeling Going Multicellular.

    Science.gov (United States)

    Martins Conde, Patricia do Rosario; Sauter, Thomas; Pfau, Thomas

    2016-01-01

    Constraint based modeling has seen applications in many microorganisms. For example, there are now established methods to determine potential genetic modifications and external interventions to increase the efficiency of microbial strains in chemical production pipelines. In addition, multiple models of multicellular organisms have been created including plants and humans. While initially the focus here was on modeling individual cell types of the multicellular organism, this focus recently started to switch. Models of microbial communities, as well as multi-tissue models of higher organisms have been constructed. These models thereby can include different parts of a plant, like root, stem, or different tissue types in the same organ. Such models can elucidate details of the interplay between symbiotic organisms, as well as the concerted efforts of multiple tissues and can be applied to analyse the effects of drugs or mutations on a more systemic level. In this review we give an overview of the recent development of multi-tissue models using constraint based techniques and the methods employed when investigating these models. We further highlight advances in combining constraint based models with dynamic and regulatory information and give an overview of these types of hybrid or multi-level approaches.
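
    At the single-model level, the constraint-based (flux balance) calculation underlying these approaches is a linear program: maximize an objective flux subject to the steady-state mass balance S·v = 0 and flux bounds. For a linear three-reaction toy pathway, the one-dimensional null space lets the optimum be computed directly with NumPy. This is a minimal sketch of the technique, not a multi-tissue model:

```python
import numpy as np

# Toy FBA-style network, a linear pathway:
#   R1: -> A,   R2: A -> B,   R3: B -> biomass
# Steady state S v = 0 forces all three fluxes to be equal, so the maximal
# biomass flux is set by the tightest reaction capacity bound.
S = np.array([[1.0, -1.0, 0.0],    # metabolite A balance
              [0.0, 1.0, -1.0]])   # metabolite B balance
ub = np.array([10.0, 8.0, 15.0])   # hypothetical capacity bounds (lb = 0)

# Null space of S: one degree of freedom for this network.
_, _, Vt = np.linalg.svd(S)
basis = Vt[-1]                     # satisfies S @ basis ≈ 0
basis = basis / basis[2]           # scale so the biomass coefficient is 1

# v = t * basis with t >= 0 (irreversible reactions); take the largest t
# that keeps every flux within its upper bound.
t_max = min(ub[i] / basis[i] for i in range(3) if basis[i] > 0)
v_opt = t_max * basis
biomass_max = float(v_opt[2])
```

Real genome-scale or multi-tissue models have thousands of reactions and many degrees of freedom, so a dedicated LP solver replaces this null-space shortcut.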

  6. Asymptotics-based CI models for atoms: Properties, exact solution of a minimal model for Li to Ne, and application to atomic spectra

    OpenAIRE

    Friesecke, G.; Goddard, B.D.

    2009-01-01

    Configuration-interaction (CI) models are approximations to the electronic Schrödinger equation which are widely used for numerical electronic structure calculations in quantum chemistry. Based on our recent closed-form asymptotic results for the full atomic Schrödinger equation in the limit of fixed electron number and large nuclear charge [SIAM J. Math. Anal., 41 (2009), pp. 631-664], we introduce a class of CI models for atoms which reproduce, at fixed finite model dimension, the correct S...

  7. Collective estimation of multiple bivariate density functions with application to angular-sampling-based protein loop modeling

    KAUST Repository

    Maadooliat, Mehdi

    2015-10-21

    This paper develops a method for simultaneous estimation of density functions for a collection of populations of protein backbone angle pairs using a data-driven, shared basis that is constructed by bivariate spline functions defined on a triangulation of the bivariate domain. The circular nature of angular data is taken into account by imposing appropriate smoothness constraints across boundaries of the triangles. Maximum penalized likelihood is used to fit the model and an alternating blockwise Newton-type algorithm is developed for computation. A simulation study shows that the collective estimation approach is statistically more efficient than estimating the densities individually. The proposed method was used to estimate neighbor-dependent distributions of protein backbone dihedral angles (i.e., Ramachandran distributions). The estimated distributions were applied to protein loop modeling, one of the most challenging open problems in protein structure prediction, by feeding them into an angular-sampling-based loop structure prediction framework. Our estimated distributions compared favorably to the Ramachandran distributions estimated by fitting a hierarchical Dirichlet process model; and in particular, our distributions showed significant improvements on the hard cases where existing methods do not work well.
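
The need for circular smoothness can be illustrated with a simpler estimator than the paper's penalized splines: a von Mises product kernel on the torus, which wraps correctly across the ±π boundary. This is a generic stand-in, and the concentration parameter `kappa` is an arbitrary choice:

```python
import numpy as np
from scipy.special import i0

def vm_kde(phi, psi, data, kappa=20.0):
    """Density estimate at torus point (phi, psi) from angle pairs in
    `data` (shape (n, 2)), using a von Mises product kernel."""
    dphi = phi - data[:, 0]
    dpsi = psi - data[:, 1]
    kern = np.exp(kappa * (np.cos(dphi) - 1.0) + kappa * (np.cos(dpsi) - 1.0))
    # Per-dimension normalizer of the von Mises kernel: 2*pi*I0(kappa)*exp(-kappa)
    norm = (2.0 * np.pi * i0(kappa) * np.exp(-kappa)) ** 2
    return kern.mean() / norm

# Toy sample clustered near the origin of the Ramachandran plane
rng = np.random.default_rng(3)
sample = rng.normal(0.0, 0.2, size=(500, 2))
peak = vm_kde(0.0, 0.0, sample)
tail = vm_kde(np.pi, np.pi, sample)
```

Because the kernel depends only on cosines of angle differences, points just inside -π correctly contribute to density just inside +π.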

  8. A microcomputer-based model of radionuclide spills and discharge plumes for application to the Great Lakes

    International Nuclear Information System (INIS)

    This report describes the implementation of a spill model for a shore-based contamination source on Lake Ontario, and its extensions (implemented and proposed) to situations other than that for which it was originally intended. The original programs were developed to model the transport of radionuclide contaminants in the wind-driven shore currents of the Lake. The information relevant to the simulation consists of depth contours adjacent to the shore, and wind history at half-day intervals from the previous thirty days. The codes have been written in the programming language C, for the IBM PC or PC compatibles, preferably with the arithmetic co-processor. Much of the code has been written using the Halo graphics package. Features of the simulation codes include reasonable processing speed, medium-resolution visual display, variations on the mode of wind history input, and graphical output display. The model is suitable in its current form only for straight or concave shorelines. Extensions to convex contours and to reactive pollutants are under development

  9. A Space-Time Network-Based Modeling Framework for Dynamic Unmanned Aerial Vehicle Routing in Traffic Incident Monitoring Applications

    Directory of Open Access Journals (Sweden)

    Jisheng Zhang

    2015-06-01

    Full Text Available It is essential for transportation management centers to equip and manage a network of fixed and mobile sensors in order to quickly detect traffic incidents and further monitor the related impact areas, especially for high-impact accidents with dramatic traffic congestion propagation. As emerging small Unmanned Aerial Vehicles (UAVs) start to have a more flexible regulation environment, it is critically important to fully explore the potential of using UAVs for monitoring recurring and non-recurring traffic conditions and special events on transportation networks. This paper presents a space-time network-based modeling framework for integrated fixed and mobile sensor networks, in order to provide a rapid and systematic road traffic monitoring mechanism. By constructing a discretized space-time network to characterize not only the speed for UAVs but also the time-sensitive impact areas of traffic congestion, we formulate the problem as a linear integer programming model to minimize the detection delay cost and operational cost, subject to feasible flying route constraints. A Lagrangian relaxation solution framework is developed to decompose the original complex problem into a series of computationally efficient time-dependent and least cost path finding sub-problems. Several examples are used to demonstrate the results of proposed models in UAVs’ route planning for small and medium-scale networks.
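
The time-dependent least-cost path sub-problem that the Lagrangian relaxation produces can be sketched as dynamic programming over a discretized space-time network. The three-node instance below is illustrative only and unrelated to the paper's test networks:

```python
import math

def least_cost_path(cost, origin, destination, horizon, n_nodes):
    """Least-cost path in a time-expanded network.
    cost[t][i][j]: cost of moving i -> j during time step t
    (math.inf if infeasible); waiting (i -> i) is free."""
    INF = math.inf
    # dist[t][i]: least cost to reach node i at time t
    dist = [[INF] * n_nodes for _ in range(horizon + 1)]
    dist[0][origin] = 0.0
    for t in range(horizon):
        for i in range(n_nodes):
            if dist[t][i] == INF:
                continue
            for j in range(n_nodes):
                step = 0.0 if i == j else cost[t][i][j]
                if dist[t][i] + step < dist[t + 1][j]:
                    dist[t + 1][j] = dist[t][i] + step
    return dist[horizon][destination]

# Three nodes, two time steps: fly 0 -> 1 at t=0, then 1 -> 2 at t=1
INF = math.inf
cost = [[[INF] * 3 for _ in range(3)] for _ in range(2)]
cost[0][0][1] = 1.0
cost[1][1][2] = 1.5
best = least_cost_path(cost, origin=0, destination=2, horizon=2, n_nodes=3)
```

In the full framework, the arc costs would carry the Lagrangian penalties for detection delay and operational cost.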

  10. A Space-Time Network-Based Modeling Framework for Dynamic Unmanned Aerial Vehicle Routing in Traffic Incident Monitoring Applications

    Science.gov (United States)

    Zhang, Jisheng; Jia, Limin; Niu, Shuyun; Zhang, Fan; Tong, Lu; Zhou, Xuesong

    2015-01-01

    It is essential for transportation management centers to equip and manage a network of fixed and mobile sensors in order to quickly detect traffic incidents and further monitor the related impact areas, especially for high-impact accidents with dramatic traffic congestion propagation. As emerging small Unmanned Aerial Vehicles (UAVs) start to have a more flexible regulation environment, it is critically important to fully explore the potential of using UAVs for monitoring recurring and non-recurring traffic conditions and special events on transportation networks. This paper presents a space-time network-based modeling framework for integrated fixed and mobile sensor networks, in order to provide a rapid and systematic road traffic monitoring mechanism. By constructing a discretized space-time network to characterize not only the speed for UAVs but also the time-sensitive impact areas of traffic congestion, we formulate the problem as a linear integer programming model to minimize the detection delay cost and operational cost, subject to feasible flying route constraints. A Lagrangian relaxation solution framework is developed to decompose the original complex problem into a series of computationally efficient time-dependent and least cost path finding sub-problems. Several examples are used to demonstrate the results of proposed models in UAVs’ route planning for small and medium-scale networks. PMID:26076404

  11. An ANN-Based Synthesis Model for Parallel Coupled Microstrip Lines with Floating Ground-Plane Conductor and Its Applications

    Directory of Open Access Journals (Sweden)

    Yuan Cao

    2016-01-01

    Full Text Available To directly obtain physical dimensions of parallel coupled microstrip lines with a floating ground-plane conductor (PCMLFGPC, an accurate synthesis model based on an artificial neural network (ANN is proposed. The synthesis model is validated by using the conformal mapping technique (CMT analysis contours. Using the synthesis model and the CMT analysis, the PCMLFGPC having equal even- and odd-mode phase velocities can be obtained by adjusting the width of the floating ground-plane conductor. Applying the method, a 7 dB coupler with the measured isolation better than 27 dB across a wide bandwidth (more than 120%, a 90° Schiffman phase shifter with phase deviation ±2.5° and return loss more than 17.5 dB covering 63.4% bandwidth, and a bandpass filter with completely eliminated second-order spurious band are implemented. The performances of the current designs are superior to those of the previous components configured with the PCMLFGPC.

  12. Formulation and Application of a Physically-Based Rupture Probability Model for Large Earthquakes on Subduction Zones: A Case Study of Earthquakes on Nazca Plate

    Science.gov (United States)

    Mahdyiar, M.; Galgana, G.; Shen-Tu, B.; Klein, E.; Pontbriand, C. W.

    2014-12-01

    Most time dependent rupture probability (TDRP) models are designed for a single-mode rupture, i.e. a single characteristic earthquake on a fault. However, most subduction zones rupture in complex patterns that create overlapping earthquakes of different magnitudes. Additionally, the limited historic earthquake data do not provide sufficient information to estimate reliable mean recurrence intervals for earthquakes. This makes it difficult to identify a single characteristic earthquake for TDRP analysis. Physical models based on geodetic data have been successfully used to obtain information on the state of coupling and slip deficit rates for subduction zones. Coupling information provides valuable insight into the complexity of subduction zone rupture processes. In this study we present a TDRP model that is formulated based on the subduction zone slip deficit rate distribution. A subduction zone is represented by an integrated network of cells. Each cell ruptures multiple times from numerous earthquakes that have overlapping rupture areas. The rate of rupture for each cell is calculated using a moment balance concept that is calibrated against historic earthquake data. This information, in conjunction with estimates of coseismic slip from past earthquakes, is used to formulate time dependent rupture probability models for cells. Earthquakes on the subduction zone and their rupture probabilities are calculated by integrating different combinations of cells. The resulting rupture probability estimates are fully consistent with the state of coupling of the subduction zone and the regional and local earthquake history, as the model takes into account the impact of all large (M>7.5) earthquakes on the subduction zone. The granular rupture model developed in this study allows estimating rupture probabilities for large earthquakes other than just a single characteristic magnitude earthquake. This provides a general framework for formulating physically-based
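
For a single cell, a time-dependent rupture probability can be illustrated with a lognormal renewal model, a common choice in TDRP work; the recurrence parameters below are invented for illustration, not the paper's calibrated values:

```python
import numpy as np
from scipy.stats import lognorm

def conditional_rupture_prob(t_elapsed, dt, mean_ri, cov):
    """P(rupture in (t, t+dt] | no rupture by t) under a lognormal
    renewal model with mean recurrence interval mean_ri (years) and
    coefficient of variation cov; both are assumed inputs."""
    sigma = np.sqrt(np.log(1.0 + cov**2))
    mu = np.log(mean_ri) - 0.5 * sigma**2      # so that E[T] = mean_ri
    dist = lognorm(s=sigma, scale=np.exp(mu))
    return (dist.cdf(t_elapsed + dt) - dist.cdf(t_elapsed)) / dist.sf(t_elapsed)

# Probability of rupture in the next 30 years, 120 years after the
# last event, for a hypothetical 150-year mean recurrence interval
p = conditional_rupture_prob(t_elapsed=120.0, dt=30.0, mean_ri=150.0, cov=0.5)
```

In the granular model, probabilities like `p` would be computed per cell and then combined over the cells spanned by each candidate rupture.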

  13. Simulation of turbulent supersonic separated base flows using enhanced turbulence modeling techniques with application to an X-33 aerospike rocket nozzle system

    Science.gov (United States)

    Papp, John Laszlo

    2000-10-01

    The successful application of CFD and turbulence modeling methods to an aerospike nozzle system first involves the successful simulation of its key flow components. This report addresses the task using the Chien low-Re k-epsilon and the Yakhot et al. high-Re RNG k-epsilon turbulence models. An improved implicit axis of symmetry boundary condition is also developed to increase stability and lower artificial dissipation. Grid adaptation through the SAGE post-processing package is used throughout the study. The RNG model, after low-Re modifications, and the Chien low-Re k-epsilon model are applied to the supersonic axisymmetric base flow problem. Both models predict a peak recirculation velocity almost twice as large as experiment. The RNG model predicts a flatter base pressure and lower recirculation velocity more consistent with experimental data using fewer grid points than a comparable Chien model solution. The turbulent quantities predicted by both models are typical of other numerical results and generally underpredict peak values obtained in experiment, suggesting that too little turbulent eddy viscosity is produced. After several test cases, the full 3-D aerospike nozzle is simulated using both the Chien and modified RNG low-Re models. The Chien model outperforms the RNG model in all circumstances. The surface pressure predicted by the Chien model along the nozzle center-plane is very near experiment, while mid-plane results are not as close but useful for design purposes. The lack of a thick boundary layer along the nozzle surface in RNG simulations is the cause of poor surface pressure comparisons. Although initial base flow comparisons between the model predictions and experiment are poor, the profiles are relatively flat. To accelerate the progress to a steady-state solution, a process involving the artificial lowering of the base pressure and subsequent iteration to a new steady state is undertaken. After several of these steps, the resulting steady

  14. Disaggregation of nation-wide dynamic population exposure estimates in The Netherlands: Applications of activity-based transport models

    Science.gov (United States)

    Beckx, Carolien; Int Panis, Luc; Uljee, Inge; Arentze, Theo; Janssens, Davy; Wets, Geert

    Traditional exposure studies that link concentrations with population data do not always take into account the temporal and spatial variations in both concentrations and population density. In this paper we present an integrated model chain for the determination of nation-wide exposure estimates that incorporates temporally and spatially resolved information about people's location and activities (obtained from an activity-based transport model) and about ambient pollutant concentrations (obtained from a dispersion model). To the best of our knowledge, it is the first time that such an integrated exercise was successfully carried out in a fully operational mode for all models under consideration. The evaluation of population-level exposure in The Netherlands to NO2 at different time periods, locations, for different subpopulations (gender, socio-economic status) and during different activities (residential, work, transport, shopping) is chosen as a case study to point out the new features of this methodology. Results demonstrate that, by neglecting people's travel behaviour, total average exposure to NO2 will be underestimated by 4% and hourly exposure results can be underestimated by more than 30%. A more detailed exposure analysis reveals the intra-day variations in exposure estimates and the presence of large exposure differences between activities (traffic > work > shopping > home) and between subpopulations (men > women, low socio-economic class > high socio-economic class). This kind of exposure analysis, disaggregated by activities or by subpopulations, per time of day, provides useful insight and information for scientific and policy purposes. It demonstrates that policy measures aimed at reducing the overall (average) exposure concentration of the population may impact in a different way depending on the time of day or the subgroup considered. From a scientific point of view, this new approach can be used to reduce exposure misclassification.
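
The difference between a static (home-based) and a dynamic (activity-weighted) exposure estimate reduces to a time-weighted average. A toy daily budget with invented NO2 concentrations per microenvironment:

```python
# Hypothetical hourly activity diary and ambient NO2 concentrations
# (ug/m3) per microenvironment; values are illustrative only.
hours_per_day = {"home": 14, "work": 8, "transport": 1, "shopping": 1}
concentration = {"home": 22.0, "work": 30.0, "transport": 55.0, "shopping": 35.0}

# Dynamic (activity-weighted) daily mean exposure
dynamic = sum(hours_per_day[a] * concentration[a] for a in hours_per_day) / 24.0

# Static estimate that assigns everyone their home concentration all day
static = concentration["home"]

underestimate = (dynamic - static) / dynamic * 100.0
```

Because the higher-concentration microenvironments (transport, work) are ignored, the static estimate undershoots the dynamic one, which is the kind of bias the integrated model chain quantifies.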

  15. Methane emissions from floodplains in the Amazon Basin: towards a process-based model for global applications

    Directory of Open Access Journals (Sweden)

    B. Ringeval

    2013-10-01

    Full Text Available Tropical wetlands are estimated to represent about 50% of the natural wetland emissions and explain a large fraction of the observed CH4 variability on time scales ranging from glacial-interglacial cycles to the currently observed year-to-year variability. Despite their importance, however, tropical wetlands are poorly represented in global models aiming to predict global CH4 emissions. This study documents the first regional-scale, process-based model of CH4 emissions from tropical floodplains. The LPX-Bern Dynamic Global Vegetation Model (LPX hereafter) was modified to represent floodplain hydrology, vegetation and associated CH4 emissions. The extent of tropical floodplains was prescribed using output from the spatially-explicit hydrology model PCR-GLOBWB. We introduced new Plant Functional Types (PFTs) that explicitly represent floodplain vegetation. The PFT parameterizations were evaluated against available remote sensing datasets (GLC2000 land cover and MODIS Net Primary Productivity). Simulated CH4 flux densities were evaluated against field observations and regional flux inventories. Simulated CH4 emissions at Amazon Basin scale were compared to model simulations performed in the WETCHIMP intercomparison project. We found that LPX-simulated CH4 flux densities are in reasonable agreement with observations at the field scale, but with a tendency to overestimate the flux observed at specific sites. In addition, the model did not reproduce between-site variations or between-year variations within a site. Unfortunately, site information is too limited to confirm or refute some model features. At the Amazon Basin scale, our results underline the large uncertainty in the magnitude of wetland CH4 emissions. In particular, uncertainties in floodplain extent (i.e., the difference between GLC2000 and PCR-GLOBWB output) modulate the simulated emissions by a factor of about 2. Our best estimates, using PCR-GLOBWB in combination with GLC2000, lead to

  16. Association models for petroleum applications

    DEFF Research Database (Denmark)

    Kontogeorgis, Georgios

    2013-01-01

    Thermodynamics plays an important role in many applications in the petroleum industry, both upstream and downstream, ranging from flow assurance, (enhanced) oil recovery and control of chemicals to meet production and environmental regulations. There are many different applications in the oil & gas...... thermodynamic models like cubic equations of state have been the dominating tools in the petroleum industry, the focus of this review is on the association models. Association models are defined as the models of SAFT/CPA family (and others) which incorporate hydrogen bonding and other complex interactions....... Such association models have been, especially over the last 20 years, proved to be very successful in predicting many thermodynamic properties in the oil & gas industry. They have not so far replaced cubic equations of state, but the results obtained by using these models are very impressive in many cases, e...

  17. Small signal model parameters analysis of GaN and GaAs based HEMTs over temperature for microwave applications

    Science.gov (United States)

    Alim, Mohammad A.; Rezazadeh, Ali A.; Gaquiere, Christophe

    2016-05-01

    Thermal and small-signal model parameter analyses have been carried out on a 0.5 μm × (2 × 100 μm) AlGaAs/GaAs HEMT grown on a semi-insulating GaAs substrate and a 0.25 μm × (2 × 100 μm) AlGaN/GaN HEMT grown on a SiC substrate. The two technologies are investigated in order to establish a detailed understanding of their capabilities in terms of frequency and temperature, using on-wafer S-parameter measurements over the temperature range from -40 to 150 °C and up to 50 GHz. The equivalent circuit parameters of the two technologies, as well as their temperature-dependent behavior, were analyzed and discussed for the first time. The principal elevation or degradation of transistor parameters with temperature demonstrates the great potential of GaN devices for high frequency and high temperature applications. The results provide valuable insights for future design optimization of advanced GaN and a comparison with GaAs technology.

  18. Sociable interface-based model checking for Web applications

    Institute of Scientific and Technical Information of China (English)

    李决龙; 李亮; 邢建春; 杨启亮

    2011-01-01

    In order to verify the quality of Web applications, this paper adopts, for the first time, a verification method for Web applications in intelligent building systems based on sociable interfaces and the associated tool TICC. A simple energy management Web application system is used to illustrate the whole process of modeling, component composition verification and system property verification. The results show that verification is carried out successfully, indicating that this is an appropriate verification method for Web applications.

  19. The Application of an Activity-Based Costing Allocation Model in Logistics Enterprises

    Institute of Scientific and Technical Information of China (English)

    杨静; 于桂平

    2012-01-01

    In order to solve resource allocation problems in warehousing enterprises, this paper applies the ABC (Activity-Based Costing) approach to inventory cost allocation and analyzes the specific application of the model through a worked example. The model exhibits open outcomes, an easily understood process and meaningful inputs, with the aim of promoting the reasonable allocation of enterprise resources, optimizing the industry and maximizing efficiency.

  20. Model-based fault detection and isolation for intermittently active faults with application to motion-based thruster fault detection and isolation for spacecraft

    Science.gov (United States)

    Wilson, Edward (Inventor)

    2008-01-01

    The present invention is a method for detecting and isolating fault modes in a system having a model describing its behavior and regularly sampled measurements. The models are used to calculate past and present deviations from measurements that would result with no faults present, as well as with one or more potential fault modes present. Algorithms that calculate and store these deviations, along with memory of when said faults, if present, would have an effect on the said actual measurements, are used to detect when a fault is present. Related algorithms are used to exonerate false fault modes and finally to isolate the true fault mode. This invention is presented with application to detection and isolation of thruster faults for a thruster-controlled spacecraft. As a supporting aspect of the invention, a novel, effective, and efficient filtering method for estimating the derivative of a noisy signal is presented.
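
The residual-based detection idea, comparing measurements against a no-fault model prediction and flagging persistent deviations, can be sketched as follows; the bias magnitude, window and threshold are illustrative, not the patent's algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated sensor trace: nominal behavior for 50 samples, then a
# +0.8 bias fault injected from sample 50 onward (hypothetical numbers).
nominal_pred = np.zeros(100)          # model prediction with no fault present
measured = rng.normal(0.0, 0.1, size=100)
measured[50:] += 0.8

# Deviation from the no-fault prediction, smoothed over a sliding window
residual = measured - nominal_pred
window = 10
rolling = np.convolve(np.abs(residual), np.ones(window) / window, mode="valid")

threshold = 0.4                       # tuned above the noise floor
fault_flags = rolling > threshold
first_detection = int(np.argmax(fault_flags))
```

Isolation would repeat this comparison against each candidate fault-mode model and keep the mode whose predicted deviations best match the stored residual history.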

  1. Variability of tsunami inundation footprints considering stochastic scenarios based on a single rupture model: Application to the 2011 Tohoku earthquake

    KAUST Repository

    Goda, Katsuichiro

    2015-06-30

    The sensitivity and variability of spatial tsunami inundation footprints in coastal cities and towns due to a megathrust subduction earthquake in the Tohoku region of Japan are investigated by considering different fault geometry and slip distributions. Stochastic tsunami scenarios are generated based on the spectral analysis and synthesis method with regards to an inverted source model. To assess spatial inundation processes accurately, tsunami modeling is conducted using bathymetry and elevation data with 50 m grid resolutions. Using the developed methodology for assessing variability of tsunami hazard estimates, stochastic inundation depth maps can be generated for local coastal communities. These maps are important for improving disaster preparedness by understanding the consequences of different situations/conditions, and by communicating uncertainty associated with hazard predictions. The analysis indicates that the sensitivity of inundation areas to the geometrical parameters (i.e., top-edge depth, strike, and dip) depends on the tsunami source characteristics and the site location, and is therefore complex and highly nonlinear. The variability assessment of inundation footprints indicates significant influence of slip distributions. In particular, topographical features of the region, such as ria coast and near-shore plain, have major influence on the tsunami inundation footprints.
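
Stochastic slip scenarios of the kind described above are often generated by spectral synthesis: filtering white noise with a power-law wavenumber spectrum and superposing the result on an inverted mean slip model. A minimal sketch with an assumed k^-2 decay and a uniform 5 m mean slip (both placeholders):

```python
import numpy as np

rng = np.random.default_rng(1)

# Radial wavenumber grid for a 64 x 64 fault discretization
n = 64
kx = np.fft.fftfreq(n)[:, None]
ky = np.fft.fftfreq(n)[None, :]
k = np.sqrt(kx**2 + ky**2)
k[0, 0] = 1.0                          # placeholder to avoid divide-by-zero

# Impose a k^-2 amplitude decay on white noise, zeroing the mean (DC) term
white = rng.normal(size=(n, n))
spectrum = np.fft.fft2(white) / k**2
spectrum[0, 0] = 0.0
slip_perturbation = np.real(np.fft.ifft2(spectrum))

# Superpose a unit-variance perturbation on the inverted mean slip (5 m here)
mean_slip = 5.0
field = slip_perturbation / slip_perturbation.std()
slip = np.clip(mean_slip + field, 0.0, None)   # slip cannot be negative
```

Each new random seed yields one stochastic scenario; running the tsunami model over many such fields produces the inundation variability the paper analyses.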

  2. A novel random void model and its application in predicting void content of composites based on ultrasonic attenuation coefficient

    Science.gov (United States)

    Lin, Li; Zhang, Xiang; Chen, Jun; Mu, Yunfei; Li, Ximeng

    2011-06-01

    A novel two-dimensional random void model (RVM) based on random medium theory and a statistical method is proposed to describe random voids in composite materials. The spatial autocorrelation function and statistical parameters are used to describe the large-scale heterogeneity from the composite matrix and the small-scale heterogeneities of elastic fluctuations from random voids, the values of which are determined by statistical data from microscopic observations of void morphology. A RVM for CFRP (carbon fiber reinforced polymer) composite specimens with void content of 0.03-4.62% is presented. It is found that the geometric morphology of voids from the RVM presents good matches to the microscopic images. Calculations of ultrasonic attenuation coefficients from the RVM at 5 MHz are much closer to the experiments than those from the previous deterministic model. Furthermore, the RVM can also cover abnormal coefficients from unusually large voids, which unpredictably occur during the composite preparation and have a detrimental effect on the strength and mechanical properties of the components. The significant enhancements in description of void morphology and quantitative correlation between void content and ultrasonic attenuation coefficient make this method a good candidate for predicting void content of composite materials non-destructively.
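
A simple way to realize a random medium with a target void content, not the authors' calibrated RVM, is to smooth white noise to impose a spatial autocorrelation and then threshold it at a quantile; the correlation length and target content below are illustrative assumptions:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(7)

# Smooth white noise to impose a Gaussian autocorrelation (sigma sets
# the correlation length in pixels), then threshold at a quantile so
# the void fraction matches a target content.
target_void_content = 0.02            # 2% voids
field = gaussian_filter(rng.normal(size=(256, 256)), sigma=3.0)

cutoff = np.quantile(field, 1.0 - target_void_content)
voids = field > cutoff                # True where a void is present
achieved = voids.mean()
```

A wave-propagation or attenuation calculation can then be run over many such realizations to relate void content to the ultrasonic attenuation coefficient statistically.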

  3. Variability of tsunami inundation footprints considering stochastic scenarios based on a single rupture model: Application to the 2011 Tohoku earthquake

    Science.gov (United States)

    Goda, Katsuichiro; Yasuda, Tomohiro; Mori, Nobuhito; Mai, P. Martin

    2015-06-01

    The sensitivity and variability of spatial tsunami inundation footprints in coastal cities and towns due to a megathrust subduction earthquake in the Tohoku region of Japan are investigated by considering different fault geometry and slip distributions. Stochastic tsunami scenarios are generated based on the spectral analysis and synthesis method with regards to an inverted source model. To assess spatial inundation processes accurately, tsunami modeling is conducted using bathymetry and elevation data with 50 m grid resolutions. Using the developed methodology for assessing variability of tsunami hazard estimates, stochastic inundation depth maps can be generated for local coastal communities. These maps are important for improving disaster preparedness by understanding the consequences of different situations/conditions, and by communicating uncertainty associated with hazard predictions. The analysis indicates that the sensitivity of inundation areas to the geometrical parameters (i.e., top-edge depth, strike, and dip) depends on the tsunami source characteristics and the site location, and is therefore complex and highly nonlinear. The variability assessment of inundation footprints indicates significant influence of slip distributions. In particular, topographical features of the region, such as ria coast and near-shore plain, have major influence on the tsunami inundation footprints.

  4. Application of a rule-based model to estimate mercury exchange for three background biomes in the continental United States.

    Science.gov (United States)

    Hartman, Jelena S; Weisberg, Peter J; Pillai, Rekha; Ericksen, Jody A; Kuiken, Todd; Lindberg, Steve E; Zhang, Hong; Rytuba, James J; Gustin, Mae S

    2009-07-01

    Ecosystems that have low mercury (Hg) concentrations (i.e., not enriched or impacted by geologic or anthropogenic processes) cover most of the terrestrial surface area of the earth yet their role as a net source or sink for atmospheric Hg is uncertain. Here we use empirical data to develop a rule-based model implemented within a geographic information system framework to estimate the spatial and temporal patterns of Hg flux for semiarid deserts, grasslands, and deciduous forests representing 45% of the continental United States. This exercise provides an indication of whether these ecosystems are a net source or sink for atmospheric Hg as well as a basis for recommendation of data to collect in future field sampling campaigns. Results indicated that soil alone was a small net source of atmospheric Hg and that emitted Hg could be accounted for based on Hg input by wet deposition. When foliar assimilation and wet deposition are added to the area estimate of soil Hg flux these biomes are a sink for atmospheric Hg.
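
A rule-based net flux estimate of this kind amounts to a per-biome budget: soil emission minus wet deposition minus foliar assimilation, with a positive result indicating a net source to the atmosphere. The numbers below are placeholders, not the paper's measured values:

```python
# Hypothetical annual mercury budget terms per biome (ug m-2 yr-1)
biomes = {
    "desert":    {"soil_emission": 1.2, "wet_deposition": 1.0, "foliar_uptake": 0.3},
    "grassland": {"soil_emission": 1.0, "wet_deposition": 2.5, "foliar_uptake": 1.5},
    "forest":    {"soil_emission": 0.8, "wet_deposition": 6.0, "foliar_uptake": 8.0},
}

def net_flux(b):
    # Positive = net source to the atmosphere, negative = net sink
    return b["soil_emission"] - b["wet_deposition"] - b["foliar_uptake"]

results = {name: net_flux(b) for name, b in biomes.items()}
```

In a GIS implementation, each grid cell would select its rule set by biome and the cell budgets would be summed over the mapped area.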

  5. Fractional order model reduction approach based on retention of the dominant dynamics: application in IMC based tuning of FOPI and FOPID controllers.

    Science.gov (United States)

    Tavakoli-Kakhki, Mahsan; Haeri, Mohammad

    2011-07-01

    Fractional order PI and PID controllers are the most common fractional order controllers used in practice. In this paper, a simple analytical method is proposed for tuning the parameters of these controllers. The proposed method is useful in designing fractional order PI and PID controllers for control of complicated fractional order systems. To achieve the goal, at first a reduction technique is presented for approximating complicated fractional order models. Then, based on the obtained reduced models some analytical rules are suggested to determine the parameters of fractional order PI and PID controllers. Finally, numerical results are given to show the efficiency of the proposed tuning algorithm.
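
The IMC tuning step can be illustrated with the classical integer-order rules for a first-order-plus-dead-time approximation G(s) = K e^(-theta*s) / (tau*s + 1); this is a stand-in for the paper's fractional-order procedure, with lambda the single IMC filter tuning knob:

```python
def imc_pi(K, tau, theta, lam):
    """Classical IMC-based PI tuning for a FOPDT model.
    Returns (Kc, Ti): controller gain and integral time."""
    Kc = tau / (K * (lam + theta))   # smaller lam -> more aggressive control
    Ti = tau                         # integral time cancels the plant pole
    return Kc, Ti

# Hypothetical reduced model: gain 2, time constant 10 s, delay 1 s
Kc, Ti = imc_pi(K=2.0, tau=10.0, theta=1.0, lam=2.0)
```

The paper's contribution is the analogous analytical rules after reducing a complicated fractional-order model to a simple dominant-dynamics approximation.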

  6. A Framework for Next Generation Mobile and Wireless Networks Application Development using Hybrid Component Based Development Model

    CERN Document Server

    Barnawi, Ahmed; Khan, Asif Irshad

    2012-01-01

    The IP Multimedia Subsystem (IMS) featured in Next Generation Networks (NGN) offers the application developer (third party) the ability to map out applications over mobile telecommunication infrastructure. The IMS comes with APIs useful for mobile application developers to create applications that meet end-users' demands and comply with the provider's infrastructure setup at the same time. Session Initiation Protocol (SIP) is the signaling protocol for this architecture. It is used for establishing sessions in IP networks, making it an ideal candidate for supporting terminal mobility to deliver services with improved Quality of Service (QoS). The realization of IMS's virtues as far as software design is concerned is hindered by a lack of standardization and methodology throughout the application development process. In this paper, we report on progress of ongoing research by our group toward putting together a platform as a testbed used for NGN application development. We examine a novel component bas...

  7. Web-based applications for virtual laboratories

    NARCIS (Netherlands)

    Bier, H.H.

    2011-01-01

    Web-based applications for academic education facilitate, usually, exchange of multimedia files, while design-oriented domains such as architectural and urban design require additional support in collaborative real-time drafting and modeling. In this context, multi-user interactive interfaces employ

  8. Application of coating and base material living models to evaluate degradation and estimate the mean local operating temperature of two ex-service 1{sup st} stage blades

    Energy Technology Data Exchange (ETDEWEB)

    Mandelli, M. [Proing Italia, Torbole sul Garda, Trento (Italy); Rinaldi, C. [ERSE, Milan (Italy); Vacchieri, E. [Ansaldo Energia S.p.A., Genoa (Italy)

    2010-07-01

    In the frame of the collaborative program COST 538, a coating life prediction code was implemented by Proing and ERSE with an inverse problem solution routine able to calculate the local mean operating temperature from the operating conditions and the extension of the coating depleted regions. Moreover, base material degradation models were developed by Ansaldo Energia for both equiaxed and single crystal superalloys. This paper describes the application of these methodologies to two ex-service 1st stage gas turbine blades delivered to COST 538 by AEN after operation in two different plants with different operating conditions. The objective of the study was the application and validation of an innovative NDT technique and the estimation of the mean operating temperature at different positions on the components. Destructive metallographic analysis of the blades allowed validation of the non-destructive frequency scanning eddy current technique (F-SECT). Coating life modelling results are compared with those of the base material degradation models. An interesting correlation was found between the temperatures estimated with the two methods, and also with the NDT findings, at the most significant component positions. (orig.)

  9. Survival analysis models and applications

    CERN Document Server

    Liu, Xian

    2012-01-01

    Survival analysis concerns sequential occurrences of events governed by probabilistic laws. Recent decades have witnessed many applications of survival analysis in various disciplines. This book introduces both classic survival models and theories along with newly developed techniques. Readers will learn how to perform analysis of survival data by following numerous empirical illustrations in SAS. Survival Analysis: Models and Applications: Presents basic techniques before leading onto some of the most advanced topics in survival analysis. Assumes only a minimal knowledge of SAS whilst enablin...

  10. Application of hybrid robust three-axis attitude control approach to overactuated spacecraft-A quaternion based model

    Institute of Scientific and Technical Information of China (English)

    A. H. Mazinan

    2016-01-01

    A novel hybrid robust three-axis attitude control approach, namely HRTAC, is considered in the light of recent developments in space systems, where there is a consensus among experts that new insights may serve as decision points for outperforming the available methods. Traditional control approaches can generally be upgraded, provided a number of state-of-the-art modifications are made, in order to deliver high-precision outcomes. First, a robust finite-time sliding mode control approach is designed to handle the three-axis angular rates in the inner control loop, which consists of pulse-width pulse-frequency modulation in line with a control allocation scheme and the system dynamics; the main reason for employing this modulation together with the control allocation scheme is to be able to handle a class of overactuated systems. A proportional-derivative based linear quadratic regulator approach is then designed to handle the three-axis rotational angles in the outer control loop, which consists of the system kinematics and deals with the quaternion-based model. The simultaneous use of the linear and nonlinear terms is the motivation of this research, and the performance results represent an improvement over recently reported outcomes. A stability analysis is provided to verify and guarantee the closed-loop system performance over the whole set of nominal reference commands. Finally, the effectiveness of the proposed approach is highlighted against a number of recent benchmarks.

  11. Applicability of heat and gas transport models in biocover design based on a case study from Denmark

    DEFF Research Database (Denmark)

    Nielsen, A. A. F.; Binning, Philip John; Kjeldsen, Peter

    2015-01-01

    Both models used the heat equation for heat transfer, and the numerical model used an advection-diffusion model with dual Monod kinetics for gas transport. The results were validated with data from a Danish landfill. The models correlated well with the observed data: the coefficient of determination (R²) was 0.95 for the analytic model and 0.91 for the numerical model. The models can be used for different design scenarios (e.g. varying methane inflow, thickness or start of operation), and can also help understand the processes that take place in the system, e.g. how oxygen penetration depends on ambient...

  12. Hydraulic model for multi-sources reclaimed water pipe network based on EPANET and its applications in Beijing, China

    Institute of Scientific and Technical Information of China (English)

    Haifeng JIA; Wei WEI; Kunlun XIN

    2008-01-01

    Water shortage is one of the major water-related problems for many cities in the world, and plans for the utilization of reclaimed water have been or will be drafted in these cities. To use reclaimed water soundly, Beijing planned to build a large-scale reclaimed water pipe network with multiple sources. To support the plan, an integrated hydraulic model of the planned pipe network was developed based on EPANET, supported by a geographic information system (GIS). The complicated pipe network was divided into four weakly connected subzones according to the distribution of reclaimed water plants and the elevation, which provided a better solution to the problem of excessively high pressure in several regions of the network. Through scenario analysis in the different subzones, some of the initial pipe diameters in the network were adjusted. Finally, a pipe network planning scheme for reclaimed water was proposed. The proposed scheme balances reclaimed water requirements against reclaimed water supplies, and provides a scientific basis for reclaimed water utilization in Beijing. The scheme has now been adopted by the Beijing municipal government.
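Hydraulic network models of the kind described in this record compute head losses pipe by pipe and balance them against nodal demands. A common closed-form relation for that per-pipe loss is the Hazen-Williams equation; the sketch below uses its SI form, with pipe dimensions and roughness coefficient that are hypothetical, not taken from the Beijing network.

```python
def hazen_williams_headloss(q, length, diameter, c=130.0):
    """Head loss (m) along a pipe via the Hazen-Williams formula (SI form):
    q in m^3/s, length and diameter in m, c the roughness coefficient."""
    return 10.67 * length * q ** 1.852 / (c ** 1.852 * diameter ** 4.87)

# Hypothetical reclaimed-water main: 1 km long, 300 mm diameter.
low = hazen_williams_headloss(0.05, 1000.0, 0.3)
high = hazen_williams_headloss(0.10, 1000.0, 0.3)
print(f"head loss at 50 L/s: {low:.2f} m, at 100 L/s: {high:.2f} m")
```

In a full network model, such per-pipe losses are solved simultaneously with the nodal mass balances, which is how over-pressurized subzones like those mentioned above show up.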

  13. Study on the Estimation of Groundwater Withdrawals Based on Groundwater Flow Modeling and Its Application in the North China Plain

    Institute of Scientific and Technical Information of China (English)

    Jingli Shao; Yali Cui; Qichen Hao; Zhong Han; Tangpei Cheng

    2014-01-01

    The amount of water withdrawn by wells is one of the quantitative variables that can be applied to estimate groundwater resources and further evaluate the human influence on groundwater systems. The accuracy of the calculated amount of water withdrawal significantly influences regional groundwater resource evaluation and management. However, decentralized groundwater pumping, inefficient management, measurement errors and uncertainties have resulted in considerable errors in groundwater withdrawal estimation. In this study, to improve the estimation of groundwater withdrawal, an innovative approach was proposed using an inversion method based on a regional groundwater flow numerical model, and this method was then applied in the North China Plain. The principle of the method is to match the simulated water levels with the observed ones by adjusting the amount of groundwater withdrawal. In addition, an uncertainty analysis of hydraulic conductivity and specific yield for the estimation of groundwater withdrawal was conducted. Using the proposed inversion method, the estimated annual average groundwater withdrawal in the North China Plain from 2002 to 2008 was approximately 24.92×10⁹ m³. The inversion method also significantly improved the simulation results for both the hydrographs and the flow field. Results of the uncertainty analysis showed that the inversion results were more sensitive to the hydraulic conductivity than to the specific yield.
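The inversion principle in this record (adjust the withdrawal until simulated water levels match observations) can be sketched with a deliberately simple forward model. The linear head response and all numbers below are hypothetical stand-ins for the regional numerical model; only the match-by-adjustment idea carries over.

```python
def simulated_head(q, h0=50.0, response=0.8):
    """Toy forward model: water level (m) declines linearly with withdrawal q."""
    return h0 - response * q

def invert_withdrawal(observed_head, lo=0.0, hi=100.0, tol=1e-9):
    """Find the withdrawal whose simulated head matches the observed one.
    Bisection works here because the forward model is monotone in q."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if simulated_head(mid) > observed_head:
            lo = mid  # simulated level still too high: more withdrawal needed
        else:
            hi = mid
    return 0.5 * (lo + hi)

q_est = invert_withdrawal(30.0)
print(f"estimated withdrawal: {q_est:.3f}")
```

With a distributed numerical model the same loop becomes a calibration problem over many wells and observation points, but the logic of nudging the pumping term until heads agree is unchanged.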

  14. Aviation Safety Modeling and Simulation (ASMM) Propulsion Fleet Modeling: A Tool for Semi-Automatic Construction of CORBA-based Applications from Legacy Fortran Programs

    Science.gov (United States)

    Sang, Janche

    2003-01-01

    Within NASA's Aviation Safety Program, NASA GRC participates in the Modeling and Simulation Project called ASMM. NASA GRC's focus is to characterize propulsion system performance from a fleet management and maintenance perspective by modeling, and through simulation to predict the characteristics of two classes of commercial engines (CFM56 and GE90). In prior years, the High Performance Computing and Communication (HPCC) program funded NASA Glenn to develop large-scale, detailed simulations for the analysis and design of aircraft engines, called the Numerical Propulsion System Simulation (NPSS). Three major aspects of this modeling (the integration of different engine components, the coupling of multiple disciplines, and engine component zooming at appropriate levels of fidelity) require relatively tight coupling of different analysis codes. Most of these codes in aerodynamics and solid mechanics are written in Fortran. Refitting these legacy Fortran codes with distributed objects can increase their reusability. Aviation Safety's modeling and simulation for characterizing fleet management has similar needs: it uses existing Fortran and C codes that are instrumental in determining the performance of the fleet. The research centers on building a CORBA-based development environment in which programmers can easily wrap and couple legacy Fortran codes. This environment consists of a C++ wrapper library that hides the details of CORBA and an efficient remote variable scheme that facilitates data exchange between the client and the server model. Additionally, a Web Service model should also be constructed to evaluate this technology's use over the next two to three years.

  15. From situation modelling to a distributed rule-based platform for situation awareness: an ontological framework for disaster management applications

    NARCIS (Netherlands)

    Moreira, João Luiz Rebelo

    2015-01-01

    Situation-aware (SA) applications are particularly useful for disaster management. The complex nature of emergency scenarios presents challenges to the development of collaborative and distributed SA solutions. These challenges concern the whole lifecycle, from specification to implementation phases

  16. Model-Based Security Testing

    CERN Document Server

    Schieferdecker, Ina; Schneider, Martin; 10.4204/EPTCS.80.1

    2012-01-01

    Security testing aims at validating software system requirements related to security properties such as confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Although security testing techniques have been available for many years, there have been few approaches that allow for the specification of test cases at a higher level of abstraction, for enabling guidance on test identification and specification, as well as for automated test generation. Model-based security testing (MBST) is a relatively new field especially dedicated to the systematic and efficient specification and documentation of security test objectives, security test cases and test suites, as well as to their automated or semi-automated generation. In particular, the combination of security modelling and test generation approaches is still a research challenge and of high interest for industrial applications. MBST includes e.g. security functional testing, model-based fuzzing, risk- and threat-oriented testing,...
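As a toy illustration of model-based fuzzing (one of the MBST techniques listed above), the sketch below generates inputs from a tiny message "model", mutates them, and checks that a parser rejects malformed input without crashing. The grammar and the parser are invented for the example, not taken from any real protocol.

```python
import random

def generate_valid(rng):
    """'Model' of the input space: messages are key=value pairs
    drawn from a tiny, invented protocol grammar."""
    key = rng.choice(["user", "role", "token"])
    value = "".join(rng.choice("abc123") for _ in range(4))
    return f"{key}={value}"

def mutate(message, rng):
    """One fuzzing step: flip, drop, or duplicate a random character."""
    i = rng.randrange(len(message))
    op = rng.choice(["flip", "drop", "dup"])
    if op == "flip":
        return message[:i] + chr(rng.randrange(32, 127)) + message[i + 1:]
    if op == "drop":
        return message[:i] + message[i + 1:]
    return message[:i] + message[i] + message[i:]

def parse(message):
    """System under test: must accept or reject, never crash."""
    if message.count("=") != 1:
        return False
    key, _, value = message.partition("=")
    return key.isalpha() and value.isalnum()

rng = random.Random(0)
for _ in range(1000):
    parse(mutate(generate_valid(rng), rng))  # any exception here = bug found
print("1000 fuzzed inputs parsed without crashing")
```

The point of the model is that mutations start from structurally valid inputs, so the fuzzer spends its budget near the interesting boundary between valid and invalid messages.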

  17. Behavior Modeling -- Foundations and Applications

    DEFF Research Database (Denmark)

    This book constitutes revised selected papers from the six International Workshops on Behavior Modelling - Foundations and Applications, BM-FA, which took place annually between 2009 and 2014. The 9 papers presented in this volume were carefully reviewed and selected from a total of 58 papers...

  18. Applications of Continuum Shell Model

    OpenAIRE

    Volya, Alexander

    2006-01-01

    The nuclear many-body problem at the limits of stability is considered in the framework of the Continuum Shell Model that allows a unified description of intrinsic structure and reactions. Technical details behind the method are highlighted and practical applications combining the reaction and structure pictures are presented.

  19. Cell-Based Biosensors Principles and Applications

    CERN Document Server

    Wang, Ping

    2009-01-01

    Written by recognized experts in the field, this leading-edge resource is the first book to systematically introduce the concept, technology, and development of cell-based biosensors. You will find details on the latest cell-based biosensor models and novel micro-structure biosensor techniques. Taking an interdisciplinary approach, this unique volume presents the latest innovative applications of cell-based biosensors in a variety of biomedical fields. The book also explores future trends of cell-based biosensors, including integrated chips, nanotechnology and microfluidics. Over 140 illustrations help...

  20. Improved version of BTOPMC model and its application in event-based hydrologic simulations

    Institute of Scientific and Technical Information of China (English)

    王国强; 周买春; 竹内邦良; 石平博

    2007-01-01

    In this paper, a grid-based distributed hydrological model, BTOPMC (Block-wise use of TOPMODEL), which was developed from the original TOPMODEL, is introduced. In order to broaden the model's applicability to arid regions, an improvement methodology is also implemented: canopy interception and soil infiltration processes were incorporated into the original BTOPMC to model event-based runoff in large arid regions. An infiltration model applying the time compression approximation method is emphasized and validated for improving the model's performance in event hydrological simulations, with a case study of the Lushi River basin.

  1. The Development of Dynamic Brand Equity Chase Model and Its Application to Digital Industry Based on Scanner Data

    Directory of Open Access Journals (Sweden)

    Nam Yongsik

    2009-12-01

    The purpose of this research is to develop comprehensive modeling for measuring the dynamics of brand power. We define brand power as brand-specific coefficients that yield the sales volume for each period. The modeling consists of a multinomial logit model for each product category, the brand-specific coefficients, mixture modeling, and a fuzzy clustering algorithm. We apply our modeling to TV scanner data from Tianjin, China. The results show that the five brands' power changes 12 to 23 times in a year, and that the lasting time of brand power ranges from 1 week to 12 weeks.
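The core ingredient of this record, brand-specific coefficients inside a multinomial logit, can be sketched in a few lines. The utility specification, coefficient values and prices below are hypothetical illustrations, not estimates from the Tianjin scanner data.

```python
import math

def logit_shares(brand_power, prices, price_sensitivity=1.5):
    """Multinomial logit: share_i = exp(u_i) / sum_j exp(u_j), where the
    utility u_i combines a brand-specific coefficient with a price term."""
    utilities = {brand: power - price_sensitivity * prices[brand]
                 for brand, power in brand_power.items()}
    denom = sum(math.exp(u) for u in utilities.values())
    return {brand: math.exp(u) / denom for brand, u in utilities.items()}

shares = logit_shares(
    brand_power={"A": 2.0, "B": 1.0, "C": 0.5},  # brand-specific coefficients
    prices={"A": 1.0, "B": 0.8, "C": 0.6},       # hypothetical prices
)
print({brand: round(s, 3) for brand, s in shares.items()})
```

Tracking how the brand-specific coefficients drift from one period's data to the next is what the record calls the dynamics of brand power.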

  2. Application of flood risk modelling in a web-based geospatial decision support tool for coastal adaptation to climate change

    Directory of Open Access Journals (Sweden)

    P. J. Knight

    2015-02-01

    A pressing problem facing coastal decision makers is the conversion of "high level" but plausible climate change assessments into an effective basis for climate change adaptation at the local scale. Here, we describe a web-based, geospatial decision-support tool (DST) that provides an assessment of the potential flood risk for populated coastal lowlands arising from future sea-level rise, coastal storms and high river flows. This DST has been developed to support operational and strategic decision making by enabling the user to explore the flood hazard from extreme events, changes in the extent of the flood-prone areas with sea-level rise, and thresholds of sea-level rise where current policy and resource options are no longer viable. The DST is built in an open source GIS that uses freely available geospatial data. Flood risk assessments from a combination of LISFLOOD-FP and SWAB models are embedded within the tool; the user interface enables interrogation of different combinations of coastal and river events under rising sea-level scenarios. Users can readily vary the input parameters (sea level, storms, wave height and river flow) relative to the present-day topography and infrastructure to identify combinations where significant regime shifts or "tipping points" occur. Two case studies are used to demonstrate the attributes of the DST with respect to the wider coastal community and the UK energy sector. Examples report on the assets at risk and illustrate the extent of flooding in relation to infrastructure access. This informs an economic assessment of potential losses due to climate change and thus provides local authorities and energy operators with essential information on the feasibility of investment for building resilience into vulnerable components of their area of responsibility.

  3. Application of wavelet neural network model based on genetic algorithm in the prediction of high-speed railway settlement

    Science.gov (United States)

    Tang, Shihua; Li, Feida; Liu, Yintao; Lan, Lan; Zhou, Conglin; Huang, Qing

    2015-12-01

    With the advantages of high speed, large transport capacity, low energy consumption, good economic benefits and so on, high-speed railways are becoming more and more popular all over the world. Speeds can reach 350 km/h, which demands high safety performance, so research on the prediction of high-speed railway settlement, one of the important factors affecting the safety of high-speed railways, becomes particularly important. This paper takes advantage of genetic algorithms to search the data for the best result, combines this with the strong learning ability and high accuracy of wavelet neural networks, and builds a genetic wavelet neural network model for the prediction of high-speed railway settlement. Experiments with a back-propagation neural network, a wavelet neural network and the genetic wavelet neural network show that the absolute residual errors of the genetic wavelet neural network's predictions are the smallest, which proves that it outperforms the other two methods. The correlation coefficient between predicted and observed values is 99.9%. Furthermore, the maximum absolute residual error, the minimum absolute residual error, the mean relative error and the root mean squared error (RMSE) of the genetic wavelet neural network's predictions are all smaller than those of the other two methods. The genetic wavelet neural network is thus both more stable and more accurate in the prediction of high-speed railway settlement.
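The error metrics used in the comparison above (RMSE and the extreme absolute residuals) are straightforward to compute. The settlement series below are hypothetical illustrations, not the paper's measurements; they only show how one model's residuals can dominate another's on every metric.

```python
import math

def rmse(observed, predicted):
    """Root mean squared error between two equal-length series."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted))
                     / len(observed))

def max_abs_residual(observed, predicted):
    return max(abs(o - p) for o, p in zip(observed, predicted))

observed  = [2.1, 2.4, 2.9, 3.3, 3.8]  # settlement, mm (hypothetical)
pred_bp   = [2.3, 2.2, 3.1, 3.6, 3.5]  # plain BP network (hypothetical)
pred_gwnn = [2.0, 2.5, 2.9, 3.4, 3.7]  # genetic wavelet NN (hypothetical)

for name, pred in [("BP", pred_bp), ("GA-WNN", pred_gwnn)]:
    print(name, round(rmse(observed, pred), 4),
          round(max_abs_residual(observed, pred), 4))
```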

  4. Optimal pricing decision model based on activity-based costing

    Institute of Scientific and Technical Information of China (English)

    王福胜; 常庆芳

    2003-01-01

    In order to find out the applicability of the optimal pricing decision model based on the conventional cost behavior model after activity-based costing has given a strong shock to the conventional cost behavior model and its assumptions, detailed analyses have been made using the activity-based cost behavior and cost-volume-profit analysis model. It is concluded from these analyses that the theory behind the construction of the optimal pricing decision model is still tenable under activity-based costing, but the conventional optimal pricing decision model must be modified as appropriate to the activity-based costing based cost behavior model and cost-volume-profit analysis model; an optimal pricing decision model is really a product pricing decision model constructed by following the economic principle of profit maximization.
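The modification the record argues for can be sketched numerically: profit combines a unit-level margin with a batch-level activity cost (the activity-based costing ingredient that the conventional cost behavior model lacks), and the profit-maximizing price is found by search. The demand curve, costs and batch size below are all hypothetical.

```python
def demand(price, a=100.0, b=4.0):
    """Hypothetical linear demand curve."""
    return max(a - b * price, 0.0)

def profit(price, unit_cost=5.0, batch_cost=50.0, batch_size=20.0):
    """Profit under activity-based costing: a unit-level variable cost plus
    a batch-level activity cost incurred per production batch."""
    q = demand(price)
    batches = -(-q // batch_size) if q > 0 else 0.0  # ceiling division
    return (price - unit_cost) * q - batch_cost * batches

# Grid search over candidate prices at cent precision.
best_profit, best_price = max((profit(p / 100), p / 100) for p in range(0, 2501))
print(f"optimal price ~ {best_price:.2f}, profit {best_profit:.2f}")
```

The batch-level term makes profit a stepped rather than smooth function of price, which is exactly why the conventional closed-form pricing rule has to be revisited under activity-based costing.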

  5. Research and Application of Role-Based Access Control Model in Web Application System

    Institute of Scientific and Technical Information of China (English)

    黄秀文

    2015-01-01

    Access control is the main strategy for security and protection in Web systems, and traditional access control can no longer meet growing security needs. This paper applies the role-based access control (RBAC) model in a web application system. By introducing the concept of a role, a user is mapped to a role within an organization and access permissions are granted to the corresponding roles; access is then authorized and controlled according to the user's role in the organization, which improves the flexibility and security of permission assignment and access control in web systems.
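The RBAC indirection described above (users mapped to roles, permissions granted to roles rather than to users) fits in a few lines; the role and permission names below are hypothetical examples, not part of the paper.

```python
class RBAC:
    """Minimal role-based access control: users -> roles -> permissions."""

    def __init__(self):
        self.role_perms = {}  # role name -> set of permission strings
        self.user_roles = {}  # user name -> set of role names

    def grant(self, role, perm):
        self.role_perms.setdefault(role, set()).add(perm)

    def assign(self, user, role):
        self.user_roles.setdefault(user, set()).add(role)

    def check(self, user, perm):
        """A user holds a permission iff any of their roles grants it."""
        return any(perm in self.role_perms.get(role, ())
                   for role in self.user_roles.get(user, ()))

ac = RBAC()
ac.grant("editor", "page:edit")
ac.grant("viewer", "page:view")
ac.assign("alice", "editor")
ac.assign("bob", "viewer")
print(ac.check("alice", "page:edit"), ac.check("bob", "page:edit"))
```

The flexibility the abstract mentions comes from this indirection: reassigning one role changes a user's entire permission set without touching any per-permission rules.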

  6. Location Based Services and Applications

    Directory of Open Access Journals (Sweden)

    Elenis Gorrita Michel

    2012-05-01

    Location Based Services (LBS) continue to grow in popularity, effectiveness and reliability, to the extent that applications are designed and implemented taking the user's location information into account. In this work, some of the main applications are addressed in order to assess the current importance of LBS as a branch of technology in full swing. In addition, the main techniques for location estimation, information essential to LBS, are studied. Because this is a highly topical issue, ongoing work and research are also discussed.

  7. A Role-Based Fuzzy Assignment Model

    Institute of Scientific and Technical Information of China (English)

    ZUO Bao-he; FENG Shan

    2002-01-01

    It is very important to dynamically assign tasks to the corresponding actors in a workflow management system, especially in complex applications, as this improves the flexibility of workflow systems. In this paper, a role-based workflow model with fuzzy optimized intelligent assignment is proposed and applied in an investment management system. A groupware-based software model is also proposed.

  8. A Gaussian Mixture MRF for Model-Based Iterative Reconstruction with Applications to Low-Dose X-ray CT

    CERN Document Server

    Zhang, Ruoqiao; Pal, Debashish; Thibault, Jean-Baptiste; Sauer, Ken D; Bouman, Charles A

    2016-01-01

    Markov random fields (MRFs) have been widely used as prior models in various inverse problems such as tomographic reconstruction. While MRFs provide a simple and often effective way to model the spatial dependencies in images, they suffer from the fact that parameter estimation is difficult. In practice, this means that MRFs typically have very simple structure that cannot completely capture the subtle characteristics of complex images. In this paper, we present a novel Gaussian mixture Markov random field model (GM-MRF) that can be used as a very expressive prior model for inverse problems such as denoising and reconstruction. The GM-MRF forms a global image model by merging together individual Gaussian-mixture models (GMMs) for image patches. In addition, we present a novel analytical framework for computing MAP estimates using the GM-MRF prior model through the construction of surrogate functions that result in a sequence of quadratic optimizations. We also introduce a simple but effective method to adjust...
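The surrogate-function idea in this record (majorize the GMM prior's negative log-likelihood by a single quadratic, solve the resulting quadratic optimization, and repeat) can be sketched for a scalar denoising problem. The mixture parameters and noise level below are hypothetical, and the scalar case stands in for the patch-based image model.

```python
import math

def gauss(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def map_denoise(y, mixture, noise_var=0.5, iters=50):
    """MAP estimate of x from noisy y under a GMM prior. Each iteration
    majorizes the prior's negative log by one quadratic (weights are the
    posterior responsibilities at the current x) and solves it exactly."""
    x = y
    for _ in range(iters):
        resp = [w * gauss(x, mu, v) for w, mu, v in mixture]
        total = sum(resp)
        resp = [r / total for r in resp]
        num = y / noise_var + sum(r * mu / v
                                  for r, (_, mu, v) in zip(resp, mixture))
        den = 1.0 / noise_var + sum(r / v
                                    for r, (_, _, v) in zip(resp, mixture))
        x = num / den
    return x

mixture = [(0.5, -2.0, 0.2), (0.5, 2.0, 0.2)]  # bimodal prior (hypothetical)
x_hat = map_denoise(1.4, mixture)
print(round(x_hat, 4))
```

Each update is a closed-form quadratic solve, mirroring the sequence of quadratic optimizations the paper derives for the full image-domain GM-MRF.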

  9. A Proposal of B to B Collaboration Process Model based on a Concept of Service and its Application to Energy Saving Service Business

    Science.gov (United States)

    Zhang, Qi; Kosaka, Michitaka; Shirahada, Kunio; Yabutani, Takashi

    This paper proposes a new framework for the B to B collaboration process based on a concept of service. Service value, which gives users satisfaction with provided services, depends on the situation, user characteristics, and the user's objectives in seeking the service. Vargo proposed Service Dominant Logic (SDL), which determines service value according to "value in use". This concept illustrates the importance of the relationship between the service itself and its situation, a relationship analogous to electro-magnetic field theory in physics. We developed the concept of service fields for creating service value based on an analogy with the electro-magnetic field. By applying this concept to B to B collaboration, a model of service value co-creation in the collaboration can be formulated. The collaboration can then be described by the four steps of the KIKI model (knowledge sharing related to the service system, identification of the service field, knowledge creation for a new service idea, and implementation of the service idea). As an application to B to B collaboration, an energy-saving service business is reported to demonstrate the validity of the proposed collaboration model. This concept can be applied to make a collaboration process effective.

  10. Comparison between traditional laboratory tests, permeability measurements and CT-based fluid flow modelling for cultural heritage applications.

    Science.gov (United States)

    De Boever, Wesley; Bultreys, Tom; Derluyn, Hannelore; Van Hoorebeke, Luc; Cnudde, Veerle

    2016-06-01

    In this paper, we examine the possibility of using on-site permeability measurements for cultural heritage applications as an alternative to traditional laboratory tests such as determination of the capillary absorption coefficient. These on-site measurements, performed with a portable air permeameter, were correlated with the pore network properties of the eight sandstones and one granular limestone discussed in this paper. The network properties of the nine materials tested in this study were obtained from micro-computed tomography (μCT) and compared to measurements and calculations of permeability and the capillary absorption rate of the stones under investigation, in order to find the correlation between the pore network characteristics and the fluid management characteristics of these sandstones. Results show a good correlation between capillary absorption, permeability and network properties, opening up the possibility of using on-site permeability measurements as a standard method in cultural heritage applications.

  11. An Effective Security Mechanism for M-Commerce Applications Exploiting Ontology Based Access Control Model for Healthcare System

    OpenAIRE

    S.M. Roychoudri; Dr. M. Aramudhan

    2016-01-01

    Health organizations have begun moving to mobile commerce services in recent years to enhance services and quality without spending much on IT infrastructure. Medical records are very sensitive and private to any individual, so an effective security mechanism is required. The challenges of our research are to maintain privacy for users and to provide a smart and secure environment for accessing the application. This is achieved with the help of personalization. The Internet has pr...

  12. Model of Hot Metal Silicon Content in Blast Furnace Based on Principal Component Analysis Application and Partial Least Square

    Institute of Scientific and Technical Information of China (English)

    SHI Lin; LI Zhi-ling; YU Tao; LI Jiang-peng

    2011-01-01

    In the blast furnace (BF) iron-making process, the hot metal silicon content is usually used to measure the quality of hot metal and to reflect the thermal state of the BF. Principal component analysis (PCA) and partial least-squares (PLS) regression methods were used to predict the hot metal silicon content. Under relatively stable BF conditions, PCA and PLS regression models of hot metal silicon content were established using data from Baotou Steel No. 6 BF, providing accuracies of 88.4% and 89.2%, respectively. The PLS model used fewer variables and less time than the PCA model, and it was simple to calculate. The results show that the models perform well and are helpful in practical production.
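The PCA step in this record can be illustrated with a dependency-free power-iteration sketch for the first principal component; the two-variable data below are invented readings standing in for the multi-variable BF process data.

```python
def first_principal_component(data, iters=200):
    """First principal component via power iteration on the sample
    covariance matrix (pure-Python sketch, no numerical libraries)."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    centered = [[row[j] - means[j] for j in range(d)] for row in data]
    cov = [[sum(r[a] * r[b] for r in centered) / (n - 1) for b in range(d)]
           for a in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Two strongly correlated process variables (hypothetical readings).
data = [[1.0, 1.1], [2.0, 1.9], [3.0, 3.2], [4.0, 3.8]]
pc1 = first_principal_component(data)
print([round(x, 3) for x in pc1])
```

In the silicon-content application, projecting the many correlated furnace variables onto a few such components is what lets the subsequent regression work with a compact, decorrelated input.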

  13. Physically-based 6-DoF Nodes Deformable Models: Application to Connective Tissues Simulation and Soft-Robots Control

    OpenAIRE

    Bosman, Julien

    2015-01-01

    Medical simulation is an increasingly active research field. Yet, despite the promising advances observed over the past years, the complete virtual patient model is yet to come. There are still many avenues for improvement, especially concerning the mechanical modeling of boundary conditions on anatomical structures. So far, most of the work has been dedicated to organ simulation, where organs are generally simulated alone. This raises a real problem, as the role of the surrounding organs in th...

  14. A finite-element-based perturbation model for the rotordynamic analysis of shrouded pump impellers: Part 1: Model development and applications

    Science.gov (United States)

    Baskharone, Erian A.

    1993-01-01

    This study concerns the rotordynamic characteristics of fluid-encompassed rotors, with special emphasis on shrouded pump impellers. The core of the study is a versatile and categorically new finite-element-based perturbation model, which is based on a rigorous flow analysis and what we have generically termed the 'virtually' deformable finite-element approach. The model is first applied to the case of a smooth annular seal for verification purposes. The rotor excitation components, in this sample problem, give rise to a purely cylindrical, purely conical, and a simultaneous cylindrical/conical rotor whirl around the housing centerline. In all cases, the computed results are compared to existing experimental and analytical data involving the same seal geometry and operating conditions. Next, two labyrinth-seal configurations, which share the same tooth-to-tooth chamber geometry but differ in the total number of chambers, were investigated. The results, in this case, are compared to experimental measurements for both seal configurations. The focus is finally shifted to the shrouded-impeller problem, where the stability effects of the leakage flow in the shroud-to-housing secondary passage are investigated. To this end, the computational model is applied to a typical shrouded-impeller pump stage, fabricated and rotordynamically tested by Sulzer Bros., and the results are compared to those of a simplified 'bulk-flow' analysis and to Sulzer Bros.' test data. In addition to assessing the computed rotordynamic coefficients, the shrouded-impeller study also covers a controversial topic, namely that of the leakage-passage inlet swirl, which was previously cited as the origin of highly unconventional (resonance-like) trends of the fluid-exerted forces. In order to validate this claim, a 'microscopic' study of the fluid/shroud interaction mechanism is conducted, with the focus being on the structure of the perturbed flow field associated with the impeller whirl. The conclusions...

  15. Yin-yang of space travel: lessons from the ground-based models of microgravity and their applications to disease and health for life on Earth

    Science.gov (United States)

    Kulkarni, A.; Yamauchi, K.; Hales, N.; Sundaresan, A.; Pellis, N.; Yamamoto, S.; Andrassy, R.

    The space flight environment has numerous clinical effects on human physiology; however, the advances made in the physical and biological sciences have benefited humans on Earth. Space flight induces adverse effects on bone, muscle, cardiovascular, neurovestibular, gastrointestinal, and immune function. Similar pathophysiologic changes are also observed in aging, with debilitating consequences. Anti-orthostatic tail-suspension (AOS) of rodents is an in vivo model for studying many of the effects induced by the microgravity environment of space travel. Over the years, AOS has been used by several researchers to study bone demineralization, muscle atrophy, and neurovestibular and stress-related effects. Recently we employed the AOS model in parallel with an in vitro cell culture microgravity analog (bioreactor) to document the decrease in immune function and its reversal by a nutritional countermeasure. We have modified the rodent model to study nutrient effects and benefits in a short period of time, usually within one to two weeks, in contrast to conventional aging research models, which take several weeks to months to obtain the same results. This model has the potential for further development to study the role of nutrition in other pathophysiologies in an expedited manner. Using this model, it is possible to evaluate the response of space travelers of various ages to microgravity stressors during long-term space travel. Hence this modified model will have a significant impact on research time and budgets. For the first time, our group has documented a true potential immunonutritional countermeasure for the space-flight-induced effects on the immune system (Clinical Nutrition 2002). Based on our nutritional and immunological studies, we propose the application of these microgravity analogs, and their benefits and utility, for studying nutritional effects on other physiologic parameters, especially in aging. (Supported by NASA grant NCC8-168, ADK)

  16. A Brief Introduction of the Achievements of Key Project Image-based Modeling and Rendering for Virtual Reality Applications

    Institute of Scientific and Technical Information of China (English)

    Jiaoying Shi; Zhanyi Hu; Enhua Wu; Qunsheng Peng

    2006-01-01

    1. Background. Virtual reality (VR) technology is now at the frontier of modern information science. VR is based on computer graphics, computer vision, and other emerging topics in today's computer technology.

  17. RIGID-PLASTIC/RIGID-VISCOPLASTIC FEM BASED ON LINEAR PROGRAMMING-THEORETICAL MODELING AND APPLICATION FOR PLANE-STRAIN PROBLEMS

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    A new rigid-plastic/rigid-viscoplastic (RP/RVP) FEM based on linear programming (LP) for plane-strain metal forming simulation is proposed. Compared with traditional RP/RVP FEM based on iterative solution, it has some remarkable advantages: it is free of convergence problems, and it handles contact, the incompressibility constraint, and rigid zones conveniently. Two solution examples are provided to validate its accuracy and efficiency.

  18. Research and application of credit risk prediction model based on wrapper%基于Wrapper的信用风险预测模型研究与应用

    Institute of Scientific and Technical Information of China (English)

    张凯

    2012-01-01

    This paper studies an effective and applicable enterprise credit risk early-warning model. To address the high misjudgment rate of a single BP neural network prediction model caused by improper selection of financial indicators, feature selection is performed first: a genetic algorithm searches for the optimal feature subset, with a BP neural network serving as the evaluation function of the genetic algorithm, yielding a wrapper-based neural network credit risk prediction model. The model is tested on data from manufacturing enterprises listed on the Shanghai and Shenzhen stock markets from 1998 to 2004. The results show that the new model improves prediction accuracy, produces more scientifically sound evaluations, and demonstrates good credit risk prediction capability in practical application.

  19. A model-based approach to adjust microwave observations for operational applications: results of a campaign at Munich Airport in winter 2011/2012

    Directory of Open Access Journals (Sweden)

    J. Güldner

    2013-10-01

    Full Text Available In the frame of the project "LuFo iPort VIS", which focuses on the implementation of a site-specific visibility forecast, a field campaign was organised to offer detailed information to a numerical fog model. As part of additional observing activities, a 22-channel microwave radiometer profiler (MWRP) was operating at the Munich Airport site in Germany from October 2011 to February 2012 in order to provide vertical temperature and humidity profiles as well as cloud liquid water information. Independently of the model-related aims of the campaign, the MWRP observations were used to study their suitability for operational meteorological networks. Over the past decade a growing number of MWRPs has been introduced and a user community (MWRnet) was established to encourage activities directed at the setup of an operational network. On that account, the comparability of observations from different network sites plays a fundamental role in any application in climatology and numerical weather forecasting. In practice, however, systematic temperature and humidity differences (bias) between MWRP retrievals and co-located radiosonde profiles were observed and reported by several authors. This bias can be caused by instrumental offsets and by the absorption model used in the retrieval algorithms, as well as by applying a non-representative training data set. At the Lindenberg observatory, besides a neural network provided by the manufacturer, a measurement-based regression method was developed to reduce the bias. These regression operators are calculated on the basis of coincident radiosonde observations and MWRP brightness temperature (TB) measurements. However, MWRP applications in a network require comparable results at any site, even if no radiosondes are available.
The motivation of this work is directed to a verification of the suitability of the operational local forecast model COSMO-EU of the Deutscher Wetterdienst (DWD for the calculation

  20. Pattern-based Automatic Translation of Structured Power System Data to Functional Models for Decision Support Applications

    DEFF Research Database (Denmark)

    Heussen, Kai; Weckesser, Johannes Tilman Gabriel; Kullmann, Daniel

    2013-01-01

    Improved information and insight for decision support in operations and design are central promises of a smart grid. Well-structured information about the composition of power systems is increasingly becoming available in the domain, e.g. due to standard information models (e.g. CIM or IEC61850...

  1. Early FDI Based on Residuals Design According to the Analysis of Models of Faults: Application to DAMADICS

    Directory of Open Access Journals (Sweden)

    Yahia Kourd

    2011-01-01

    Full Text Available The increased complexity of plants and the development of sophisticated control systems have encouraged the parallel development of efficient rapid fault detection and isolation (FDI) systems. FDI in industrial systems has lately become of great significance. This paper proposes a new technique for short-time fault detection and diagnosis in nonlinear dynamic systems with multiple inputs and multiple outputs. The main contribution of this paper is to develop an FDI scheme based on reference models of fault-free and faulty behaviors designed with neural networks. Fault detection is obtained from residuals that result from the comparison of measured signals with the outputs of the fault-free reference model. Then, the Euclidean distance from the outputs of the models of faults to the measurements leads to fault isolation. The advantage of this method is to provide not only early detection but also early diagnosis, thanks to the parallel computation of the models of faults and to the proposed decision algorithm. The effectiveness of this approach is illustrated with simulations on the DAMADICS benchmark.
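
The detect-then-isolate logic described in this abstract can be sketched compactly. A minimal sketch with toy vectors: the paper's fault-free and fault reference models are neural networks, but here their outputs are supplied directly as arrays, and the threshold is illustrative.

```python
import numpy as np

def detect_and_isolate(measured, fault_free_output, fault_model_outputs, threshold):
    """Residual-based FDI sketch: a fault is detected when the residual against
    the fault-free reference model exceeds a decision threshold; it is then
    isolated by the smallest Euclidean distance to a fault-model output."""
    residual = np.linalg.norm(np.asarray(measured) - np.asarray(fault_free_output))
    if residual <= threshold:
        return None  # consistent with fault-free behaviour
    distances = [np.linalg.norm(np.asarray(measured) - np.asarray(out))
                 for out in fault_model_outputs]
    return int(np.argmin(distances))  # index of the isolated fault model
```

In the paper the decision algorithm runs the fault models in parallel only once the residual threshold is crossed, which is what the early-exit above mimics.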

  2. Automatic deployment of component-based applications

    OpenAIRE

    Lascu, Tudor Alexandru; Mauro, Jacopo; Zavattaro, Gianluigi

    2015-01-01

    In distributed systems like those based on cloud or service-oriented frameworks, applications are typically assembled by deploying and connecting a large number of heterogeneous software components, spanning from fine-grained packages to coarse-grained complex services. Automation techniques and tools have been proposed to ease the deployment process of these complex systems. By relying on a formal model of components, we describe a sound and complete algorithm for co...

  3. A Habitat-based Wind-Wildlife Collision Model with Application to the Upper Great Plains Region

    Energy Technology Data Exchange (ETDEWEB)

    Forcey, Greg M.

    2012-08-28

    Most previous studies on collision impacts at wind facilities have taken place at the site-specific level and have only examined small-scale influences on mortality. In this study, we examine landscape-level influences using a hierarchical spatial model combined with existing datasets and life history knowledge for: Horned Lark, Red-eyed Vireo, Mallard, American Avocet, Golden Eagle, Whooping Crane, red bat, silver-haired bat, and hoary bat. These species were modeled in the central United States within Bird Conservation Regions 11, 17, 18, and 19. For the bird species, we modeled bird abundance from existing datasets as a function of habitat variables known to be preferred by each species to develop a relative abundance prediction for each species. For bats, there are no existing abundance datasets, so we identified preferred habitat in the landscape for each species and assumed that greater amounts of preferred habitat would equate to greater abundance of bats. The abundance predictions for birds and bats were modeled with additional exposure factors known to influence collisions, such as visibility, wind, temperature, precipitation, topography, and behavior, to form a final mapped output of predicted collision risk within the study region. We reviewed published mortality studies from wind farms in our study region and collected data on reported mortality of our focal species to compare to our modeled predictions. We performed a sensitivity analysis evaluating model performance across 6 different scenarios where habitat and exposure factors were weighted differently. We compared the model performance in each scenario by evaluating observed data vs. our model predictions using Spearman's rank correlations. Horned Lark collision risk was predicted to be highest in the northwestern and west-central portions of the study region with lower risk predicted elsewhere.
Red-eyed Vireo collision risk was predicted to be the highest in the eastern portions of the study region and in

  4. Applications of a thermal-based two-source energy balance model using Priestley-Taylor approach for surface temperature partitioning under advective conditions

    Science.gov (United States)

    Song, Lisheng; Kustas, William P.; Liu, Shaomin; Colaizzi, Paul D.; Nieto, Hector; Xu, Ziwei; Ma, Yanfei; Li, Mingsong; Xu, Tongren; Agam, Nurit; Tolk, Judy A.; Evett, Steven R.

    2016-09-01

    In this study, ground-measured soil and vegetation component temperatures and composite temperatures from a high spatial resolution thermal camera and a network of thermal-IR sensors, collected in an irrigated maize field and an irrigated cotton field, are used to assess and refine the component temperature partitioning approach of the Two-Source Energy Balance (TSEB) model. A refinement to TSEB using a non-iterative approach, based on applying the Priestley-Taylor formulation for surface temperature partitioning and estimating soil evaporation from soil moisture observations under advective conditions (TSEB-A), was developed. This modified TSEB formulation improved the agreement between observed and modeled soil and vegetation temperatures. In addition, the TSEB-A model output of evapotranspiration (ET) and its components, evaporation (E) and transpiration (T), showed good agreement with ground observations using the stable isotopic method and the eddy covariance (EC) technique from the HiWATER experiment, and with microlysimeters and a large monolithic weighing lysimeter from the BEAREX08 experiment. Differences between modeled and measured ET were less than 10% and 20% on a daytime basis for the HiWATER and BEAREX08 data sets, respectively. The TSEB-A model was found to accurately reproduce the temporal dynamics of E, T and ET over a full growing season under the advective conditions existing for these irrigated crops located in arid/semi-arid climates. With satellite data, this TSEB-A modeling framework could potentially be used as a tool for improving water use efficiency and conservation practices in water-limited regions. However, TSEB-A requires soil moisture information, which is not currently available routinely from satellite at the field scale.
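
As a rough illustration of the Priestley-Taylor step that TSEB starts from, the sketch below computes a canopy latent-heat estimate LE = alpha * f_g * Delta / (Delta + gamma) * Rn_c, using the standard FAO-56 formula for the slope of the saturation vapour pressure curve. The constant values and the omission of soil/canopy resistances are simplifying assumptions, not the paper's full partitioning scheme.

```python
import math

def delta_svp(t_air_c):
    """Slope of the saturation vapour pressure curve (kPa/degC), FAO-56 form."""
    es = 0.6108 * math.exp(17.27 * t_air_c / (t_air_c + 237.3))
    return 4098.0 * es / (t_air_c + 237.3) ** 2

def canopy_latent_heat(rn_canopy, t_air_c, alpha_pt=1.26, f_green=1.0, gamma=0.066):
    """Priestley-Taylor estimate of canopy latent heat flux (W m-2):
    the initial guess TSEB refines during temperature partitioning."""
    d = delta_svp(t_air_c)
    return alpha_pt * f_green * d / (d + gamma) * rn_canopy
```

With canopy net radiation of 400 W m-2 at 25 degC air temperature, this yields roughly 370 W m-2, i.e. most of the available energy goes to transpiration, as expected for a well-watered canopy.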

  5. Value co-creation model based on IS application capabilities%基于 IS应用能力的价值共同创造模型

    Institute of Scientific and Technical Information of China (English)

    朱树婷; 仲伟俊; 梅姝娥

    2014-01-01

    To illustrate how firms and customers co-create value in business-to-business (B2B) e-commerce, an integrated value co-creation model is proposed based on information systems (IS) application capabilities, from the theoretical perspective of the relational view. The constructs and dimensions of IS application capabilities, relational assets, customer agility, and relational value are analysed, and the theoretical hypotheses are tested empirically. The empirical research verifies the mediating effect of customer agility in value creation, the complementary interaction between IS application capabilities and relational assets, and their effect on relational value. The model extends the value co-creation framework of service-dominant logic, reveals the mechanism by which firms co-create value with customers through IS application capabilities in an e-commerce environment, and provides guidance for further theoretical development and enterprise practice.

  6. Model-based tolerance intervals derived from cumulative historical composition data: application for substantial equivalence assessment of a genetically modified crop.

    Science.gov (United States)

    Hong, Bonnie; Fisher, Tracey L; Sult, Theresa S; Maxwell, Carl A; Mickelson, James A; Kishino, Hirohisa; Locke, Mary E H

    2014-10-01

    Compositional analysis is a requisite component of the substantial equivalence framework utilized to assess genetically modified (GM) crop safety. Statistical differences in composition data between GM and non-GM crops require a context in which to determine biological relevance. This context is provided by surveying the natural variation of key nutrient and antinutrient levels within the crop population with a history of safe use. Data accumulated from various genotypes with a history of safe use cultivated in relevant commercial crop-growing environments over multiple seasons are discussed as the appropriate data representative of this natural variation. A model-based parametric tolerance interval approach, which accounts for the correlated and unbalanced data structure of cumulative historical data collected from multisite field studies conducted over multiple seasons, is presented. This paper promotes the application of this tolerance interval approach to generate reference ranges for evaluation of the biological relevance of statistical differences identified during substantial equivalence assessment of a GM crop.
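
The paper's model-based tolerance intervals account for correlated, unbalanced multisite data; for the simpler i.i.d. normal case the same idea can be sketched with Howe's two-sided tolerance factor. The chi-square quantile below uses the Wilson-Hilferty approximation, an assumption that is adequate for the moderate degrees of freedom typical of composition datasets.

```python
import math
from statistics import NormalDist

def chi2_quantile(p, df):
    """Wilson-Hilferty approximation to the chi-square quantile."""
    z = NormalDist().inv_cdf(p)
    return df * (1.0 - 2.0 / (9.0 * df) + z * math.sqrt(2.0 / (9.0 * df))) ** 3

def tolerance_factor(n, coverage=0.95, confidence=0.95):
    """Howe's two-sided normal tolerance factor k: the interval
    mean +/- k * sd covers `coverage` of the population with `confidence`."""
    z = NormalDist().inv_cdf((1.0 + coverage) / 2.0)
    chi2 = chi2_quantile(1.0 - confidence, n - 1)
    return math.sqrt((n - 1) * (1.0 + 1.0 / n) * z * z / chi2)
```

For n = 30 historical samples this gives k of about 2.55, matching tabulated 95%/95% values; the factor shrinks toward the plain normal quantile as n grows.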

  7. Design of a multi-model observer-based estimator for a Fault Detection and Isolation (FDI) strategy: application to a chemical reactor

    Directory of Open Access Journals (Sweden)

    Y. Chetouani

    2008-12-01

    Full Text Available This study presents an FDI strategy for nonlinear dynamic systems. It shows a methodology for tackling the fault detection and isolation issue by combining a technique based on the residual signal with a technique using multiple Kalman filters. The usefulness of this combination is the on-line implementation of the set of models, which represents the normal mode and all fault dynamics, if the statistical decision threshold on the residuals exceeds a fixed value. In other cases, one Extended Kalman Filter (EKF) is enough to estimate the process state. After describing the system architecture and the proposed FDI methodology, we present a realistic application in order to show the technique's potential. An algorithm is described and applied to a chemical process, a perfectly stirred chemical reactor operating in semi-batch mode. The chemical reaction used is an oxidation-reduction one: the oxidation of sodium thiosulfate by hydrogen peroxide.
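
The multiple-model idea, running a bank of filters, one per hypothesized mode, and favouring the model whose innovations are smallest, can be sketched for a scalar linear system. This is an illustrative reduction: the paper uses Extended Kalman Filters on a nonlinear reactor model, and the candidate parameters here are toy values.

```python
import numpy as np

def scalar_kf_innovations(y, a, q=1e-3, r=1e-2):
    """Innovations of a scalar Kalman filter for the model
    x_k = a * x_{k-1} + w_k,  y_k = x_k + v_k."""
    x, p = y[0], 1.0
    innovations = []
    for yk in y[1:]:
        x_pred, p_pred = a * x, a * a * p + q
        e = yk - x_pred                      # innovation (one-step residual)
        k = p_pred / (p_pred + r)            # Kalman gain
        x, p = x_pred + k * e, (1.0 - k) * p_pred
        innovations.append(e)
    return np.array(innovations)

def best_model(y, a_candidates):
    """Pick the mode (normal or one of the fault modes) whose filter
    produces the smallest mean squared innovation."""
    scores = [np.mean(scalar_kf_innovations(y, a) ** 2) for a in a_candidates]
    return int(np.argmin(scores))
```

On data generated by the first candidate dynamics, the filter bank correctly attributes the measurements to that mode.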

  8. Model-Based Optimization of Scaffold Geometry and Operating Conditions of Radial Flow Packed-Bed Bioreactors for Therapeutic Applications

    Directory of Open Access Journals (Sweden)

    Danilo Donato

    2014-01-01

    Full Text Available Radial flow perfusion of cell-seeded hollow cylindrical porous scaffolds may overcome the transport limitations of pure diffusion and direct axial perfusion in the realization of bioengineered substitutes of failing or missing tissues. Little has been reported on the optimization criteria of such bioreactors. A steady-state model was developed, combining convective and dispersive transport of dissolved oxygen with Michaelis-Menten cellular consumption kinetics. Dimensional analysis was used to combine geometric and operational variables more effectively into the dimensionless groups determining bioreactor performance. The effectiveness of cell oxygenation was expressed in terms of non-hypoxic fractional construct volume. The model permits the optimization of the geometry of hollow cylindrical constructs, and of the direction and magnitude of perfusion flow, to ensure cell oxygenation and culture at controlled oxygen concentration profiles. This may help engineer tissues suitable for therapeutic and drug screening purposes.
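
A one-dimensional sketch of the balance the model combines, convection with Michaelis-Menten uptake, can be written as a plug-flow oxygen profile integrated by explicit Euler. The dimensionless toy parameters are assumptions for illustration; the paper's model is radial and also includes dispersion.

```python
def oxygen_profile(c_in, u, vmax, km, length, n=1000):
    """Plug-flow sketch of convection with Michaelis-Menten consumption:
    u * dC/dx = -vmax * C / (km + C), integrated by explicit Euler."""
    dx = length / n
    c = c_in
    profile = [c]
    for _ in range(n):
        c = max(c + dx * (-vmax * c / (km + c)) / u, 0.0)  # clamp at zero
        profile.append(c)
    return profile
```

The monotonically decreasing profile is what drives the "non-hypoxic fractional volume" criterion: the design question is whether the outlet concentration stays above the hypoxia threshold.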

  9. Struts Application Frame Based on MVC Design Model%基于MVC设计模式的Struts框架应用

    Institute of Scientific and Technical Information of China (English)

    毕磊; 邓忠华

    2007-01-01

    This paper introduces the concept and architecture of Struts and, through program examples, discusses the internal relationships and processing flows of its three main functional modules, Controller, Model, and View, showing how Struts helps Java developers exploit the advantages of J2EE for building large Web applications.

  10. Parametric estimation of covariance function in Gaussian-process based Kriging models. Application to uncertainty quantification for computer experiments

    OpenAIRE

    Bachoc, François

    2013-01-01

    The parametric estimation of the covariance function of a Gaussian process is studied, in the framework of the Kriging model. Maximum Likelihood and Cross Validation estimators are considered. The correctly specified case, in which the covariance function of the Gaussian process does belong to the parametric set used for estimation, is first studied in an increasing-domain asymptotic framework. The sampling considered is a randomly perturbed multidimensional regular grid. Consistency and asym...

  11. A Critical Review on Wind Turbine Power Curve Modelling Techniques and Their Applications in Wind Based Energy Systems

    OpenAIRE

    Sohoni, Vaishali; Gupta, S. C.; R. K. Nema

    2016-01-01

    The power curve of a wind turbine depicts the relationship between output power and hub-height wind speed and is an important characteristic of the turbine. Power curves aid in energy assessment, warranty formulations, and performance monitoring of turbines. With the growth of the wind industry, turbines are being installed in diverse climatic conditions, onshore and offshore, and in complex terrains, causing significant departure of these curves from the warranted values. Accurate models of power...
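
One common parametric family reviewed in this literature is the logistic approximation of the power curve between cut-in and cut-out speeds. A sketch with illustrative (not datasheet) parameters:

```python
import math

def logistic_power_curve(v, rated_power, v_mid, slope, cut_in=3.0, cut_out=25.0):
    """4-parameter logistic approximation of a turbine power curve (kW).
    v_mid is the wind speed at half rated power; slope controls steepness.
    All parameter values here are illustrative assumptions."""
    if v < cut_in or v > cut_out:
        return 0.0  # turbine idle below cut-in, shut down above cut-out
    return rated_power / (1.0 + math.exp(-slope * (v - v_mid)))
```

The logistic form captures the S-shaped transition from cut-in to rated power; more flexible models (splines, neural networks) are used when site data depart from this idealized shape.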

  12. Extraction of Desired Signal Based on AR Model with Its Application to Atrial Activity Estimation in Atrial Fibrillation

    Science.gov (United States)

    Wang, Gang; Rao, Ni-ni; Shepherd, Simon J.; Beggs, Clive B.

    2008-12-01

    The use of electrocardiograms (ECGs) to diagnose and analyse atrial fibrillation (AF) has received much attention recently. When studying AF, it is important to isolate the atrial activity (AA) component of the ECG plot. We present a new autoregressive (AR) model for semiblind source extraction of the AA signal. Previous researchers showed that one could extract the signal with the smallest normalized mean square prediction error (MSPE) as the first output from linear mixtures by minimizing the MSPE. However, the extracted signal will not always be the desired one, even if the AR model parameters of one source signal are known. We introduce a new cost function, which caters for the specific AR model parameters, to extract the desired source. Through theoretical analysis and simulation, we demonstrate that this algorithm can extract any desired signal from mixtures provided that its AR parameters are first obtained. We use this approach to extract the AA signal from 12-lead surface ECG signals for hearts undergoing AF. In our methodology, we roughly estimated the AR parameters from the fibrillatory wave segment in the V1 lead and then used this algorithm to extract the AA signal. We validate our approach using real-world ECG data.
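
The core idea, extracting the mixture direction whose variance-normalized prediction error under the known AR parameters is smallest, can be demonstrated on a toy AR(1)-plus-noise mixture. This is an assumed setup (AR order 1, two channels, grid search over extraction angles), not the paper's 12-lead ECG data or optimization method.

```python
import numpy as np

def normalized_mspe(s, a):
    """Variance-normalized mean squared prediction error of signal s
    under a known AR(1) coefficient a."""
    return np.mean((s[1:] - a * s[:-1]) ** 2) / np.var(s)

# Toy demo: mix an AR(1) source with white noise, then recover the AR
# source by minimizing the AR-parameterized cost over extraction directions.
rng = np.random.default_rng(1)
n = 4000
ar_src = np.zeros(n)
for k in range(1, n):
    ar_src[k] = 0.9 * ar_src[k - 1] + rng.standard_normal()
white_src = rng.standard_normal(n)
X = np.vstack([0.7 * ar_src + 0.3 * white_src,
               0.4 * ar_src - 0.6 * white_src])

angles = np.linspace(0.0, np.pi, 181)
costs = [normalized_mspe(np.cos(t) * X[0] + np.sin(t) * X[1], 0.9) for t in angles]
t_best = angles[int(np.argmin(costs))]
extracted = np.cos(t_best) * X[0] + np.sin(t_best) * X[1]
```

The AR(1) source has the lower normalized prediction error under its own coefficient (1 - a^2 versus 1 + a^2 for white noise), so the cost minimum picks out its direction in the mixture.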

  13. Study and Application of Reinforcement Learning in Cooperative Strategy of the Robot Soccer Based on BDI Model

    Directory of Open Access Journals (Sweden)

    Wu Bo-ying

    2009-11-01

    Full Text Available The dynamic cooperation model of multi-Agent systems is formed by combining reinforcement learning with the BDI model. In this model, the concept of individual optimization loses its meaning, because the payoff of each Agent depends not only on itself but also on the choices of the other Agents. All Agents pursue a common optimal solution and try to realize the united intention as a whole to the maximum extent. The robot moves toward its goal depending on the present positions of the other robots that cooperate with it and the present position of the ball. One of the robots cooperating with it is controlled by a human with a joystick. In this way, the Agent can be ensured to explore each state-action pair as frequently as possible when choosing movements, so as to shorten the time spent searching the movement space and improve the convergence speed of reinforcement learning. The validity of the proposed cooperative strategy for robot soccer has been proved by combining theoretical analysis with a simulated robot soccer match (11 vs. 11).
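
The reinforcement-learning half of the model rests on the standard tabular Q-learning update; a minimal sketch, with the cooperative twist noted in a comment (state and action names are illustrative, not from the paper):

```python
def q_update(Q, state, action, reward, next_state, actions, alpha=0.1, gamma=0.9):
    """One tabular Q-learning step:
    Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_b Q(s',b) - Q(s,a)).
    In the cooperative setting described above, `reward` would be the joint
    payoff shared by the team rather than an individual Agent's payoff."""
    old = Q.get((state, action), 0.0)
    best_next = max(Q.get((next_state, b), 0.0) for b in actions)
    Q[(state, action)] = old + alpha * (reward + gamma * best_next - old)
    return Q[(state, action)]
```

Guided exploration (e.g. the human-controlled teammate) changes which state-action pairs are visited, not the update rule itself.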

  14. Lévy-based growth models

    DEFF Research Database (Denmark)

    Jónsdóttir, Kristjana Ýr; Schmiegel, Jürgen; Jensen, Eva Bjørn Vedel

    2008-01-01

    In the present paper, we give a condensed review, for the nonspecialist reader, of a new modelling framework for spatio-temporal processes, based on Lévy theory. We show the potential of the approach in stochastic geometry and spatial statistics by studying Lévy-based growth modelling of planar objects. The growth models considered are spatio-temporal stochastic processes on the circle. As a by-product, flexible new models for space–time covariance functions on the circle are provided. An application of the Lévy-based growth models to tumour growth is discussed.

  15. DNA computing model based on lab-on-a-chip and its application to solving the timetabling problem

    Institute of Scientific and Technical Information of China (English)

    Fengyue Zhang; Bo Liu; Wenbin Liu; Qiang Zhang

    2008-01-01

    The essential characteristic of DNA computation is its massive parallelism in obtaining and managing information. With the development of molecular biology techniques, the field of DNA computation has made great progress. By using an advanced biochip technique, lab-on-a-chip, a new DNA computing model is presented in this paper to solve a simple timetabling problem, which is a special version of the optimization problems. Timetabling also plays an important role in education and other industries. With a simulated biological experiment, the results suggest that DNA computation with lab-on-a-chip has the potential to solve a real, complex timetabling problem.

  16. Security Assessment of Web Based Distributed Applications

    Directory of Open Access Journals (Sweden)

    Catalin BOJA

    2010-01-01

    Full Text Available This paper presents an overview of the evaluation of risks and vulnerabilities in a web-based distributed application, emphasizing aspects of the security assessment process with regard to the audit field. In the audit process, an important activity is dedicated to the measurement of the characteristics taken into consideration for evaluation. From this point of view, the quality of the audit process depends on the quality of the assessment methods and techniques. By reviewing the fields involved in the research process, the approach reflects the main concerns of web-based distributed applications using exploratory research techniques. The results show that many aspects across a distributed system must be handled carefully, and they can be revealed by an in-depth introspective analysis of the information flow and internal processes that are part of the system. This paper reveals the limitations arising from the absence of a unified security risk assessment model that could prevent the risks and vulnerabilities discussed. Based on such standardized models, secure web-based distributed applications can be easily audited, and many vulnerabilities that can appear due to the lack of access to information can be avoided.

  17. Application of a Model to Evaluate Infrared Exposure Limits in Aluminum Foundries Based on Threshold Temperature in the Range of 770-1400 nm

    Directory of Open Access Journals (Sweden)

    FARAMARZ MADJIDI

    2015-10-01

    Full Text Available High-intensity optical radiation can cause damage to the eye, and intense radiation in the range of 770-1400 nm can cause thermal retinal damage. In workplaces with high-temperature sources, workers in front of hot sources that emit no bright light may be exposed to intense IR radiation, so regular measurement of this radiation is crucial. Measurement of IR radiation by radiometer in specific wavelength ranges is difficult. Moreover, when radiometers are used, the correct application of the recommended exposure limits requires knowledge of spectral radiance, which can be demanding for hygienists. The main objective of the present study is to apply a model that expresses retinal thermal injury in terms of temperature for molten aluminum ovens in an aluminum foundry that emit optical radiation without visible light. In the proposed model, the ACGIH TLVs for retinal thermal injury in the range of 770 to 1400 nm were used where source luminance was under 0.01 cd/cm2. Using the output of this model, it is also possible to present a new chart for evaluating exposure to IR from hot sources based on threshold temperature.

  18. Voxel-Based LIDAR Analysis and Applications

    Science.gov (United States)

    Hagstrom, Shea T.

    One of the greatest recent changes in the field of remote sensing is the addition of high-quality Light Detection and Ranging (LIDAR) instruments. In particular, the past few decades have been greatly beneficial to these systems because of increases in data collection speed and accuracy, as well as a reduction in the costs of components. These improvements allow modern airborne instruments to resolve sub-meter details, making them ideal for a wide variety of applications. Because LIDAR uses active illumination to capture 3D information, its output is fundamentally different from other modalities. Despite this difference, LIDAR datasets are often processed using methods appropriate for 2D images, which do not take advantage of their primary virtue: 3-dimensional data. It is this problem we explore by using volumetric voxel modeling. Voxel-based analysis has been used in many applications, especially medical imaging, but rarely in traditional remote sensing. In part this is because the memory requirements are substantial when handling large areas, but with modern computing and storage this is no longer a significant impediment. Our reason for using voxels to model scenes from LIDAR data is that there are several advantages over standard triangle-based models, including better handling of overlapping surfaces and complex shapes. We show how incorporating system position information from early in the LIDAR point cloud generation process allows radiometrically-correct transmission and other novel voxel properties to be recovered. This voxelization technique is validated on simulated data using the Digital Imaging and Remote Sensing Image Generation (DIRSIG) software, a first-principles-based ray tracer developed at the Rochester Institute of Technology. Voxel-based modeling of LIDAR can be useful on its own, but we believe its primary advantage is when applied to problems where simpler surface-based 3D models conflict with the requirement of realistic geometry. To
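
The basic binning step behind voxel-based analysis can be sketched with a dense occupancy-count grid. Real airborne datasets would need sparse storage and the radiometric bookkeeping the abstract describes; this illustrates only the voxelization itself.

```python
import numpy as np

def voxelize(points, voxel_size):
    """Bin an (N, 3) LIDAR point cloud into a dense occupancy-count grid.
    Returns the grid and its minimum corner, so voxel (i, j, k) covers
    mins + [i, j, k] * voxel_size."""
    points = np.asarray(points, dtype=float)
    mins = points.min(axis=0)
    idx = np.floor((points - mins) / voxel_size).astype(int)
    grid = np.zeros(idx.max(axis=0) + 1, dtype=int)
    np.add.at(grid, tuple(idx.T), 1)  # unbuffered add handles repeated indices
    return grid, mins
```

`np.add.at` is used instead of fancy-index assignment so that multiple returns falling in the same voxel are all counted.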

  19. A robust hybrid model integrating enhanced inputs based extreme learning machine with PLSR (PLSR-EIELM) and its application to intelligent measurement.

    Science.gov (United States)

    He, Yan-Lin; Geng, Zhi-Qiang; Xu, Yuan; Zhu, Qun-Xiong

    2015-09-01

    In this paper, a robust hybrid model integrating an enhanced-inputs-based extreme learning machine with partial least squares regression (PLSR-EIELM) is proposed. The PLSR-EIELM model overcomes two main flaws of the extreme learning machine (ELM): the intractable problem of determining the optimal number of hidden layer neurons, and over-fitting. First, a traditional extreme learning machine (ELM) is selected. Second, the weights between the input layer and the hidden layer are randomly assigned, and the nonlinear transformation of the independent variables is obtained from the output of the hidden layer neurons. In particular, the original input variables are regarded as enhanced inputs; the enhanced inputs and the nonlinearly transformed variables are then tied together as the whole set of independent variables. In this way, PLSR can identify the PLS components not only from the nonlinearly transformed variables but also from the original input variables, which removes the correlation among the independent variables and the expected outputs. Finally, the optimal relationship model between the whole set of independent variables and the expected outputs is obtained using PLSR. Thus the PLSR-EIELM model is developed. The PLSR-EIELM model then served as an intelligent measurement tool for the key variables of the Purified Terephthalic Acid (PTA) process and the High Density Polyethylene (HDPE) process. The experimental results show that the predictive accuracy of PLSR-EIELM is stable, which indicates that PLSR-EIELM is robust. Moreover, compared with ELM, PLSR, hierarchical ELM (HELM), and PLSR-ELM, PLSR-EIELM achieves much smaller relative prediction errors in these two applications.
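
The enhanced-inputs construction and the PLSR step can be sketched end to end: random tanh hidden features are concatenated with the original inputs, and a minimal NIPALS PLS1 regresses the response on the combined matrix. The toy data are an assumption for illustration, not the PTA/HDPE process variables from the paper.

```python
import numpy as np

def pls1_fit(X, y, n_comp):
    """Minimal NIPALS PLS1 (single response): returns coefficients B so that
    yhat = (X - X.mean(0)) @ B + y.mean()."""
    Xc, yc = X - X.mean(axis=0), y - y.mean()
    W, P, Q = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)          # weight vector
        t = Xc @ w                       # score
        tt = t @ t
        p = Xc.T @ t / tt                # X loading
        q = (yc @ t) / tt                # y loading
        Xc, yc = Xc - np.outer(t, p), yc - q * t   # deflate
        W.append(w); P.append(p); Q.append(q)
    W, P = np.array(W).T, np.array(P).T
    return W @ np.linalg.solve(P.T @ W, np.array(Q))

def enhanced_inputs(X, n_hidden, rng):
    """ELM-style random hidden layer; the original inputs are kept
    alongside the nonlinear random features, as the paper proposes."""
    Wh = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    return np.hstack([X, np.tanh(X @ Wh + b)])

# Toy regression: a target with both a linear and a nonlinear part.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(300, 2))
y = X[:, 0] ** 2 + X[:, 1]
Z = enhanced_inputs(X, 50, rng)
B = pls1_fit(Z, y, 10)
yhat = (Z - Z.mean(axis=0)) @ B + y.mean()
```

A purely linear fit on the raw inputs cannot capture the quadratic term; the random features make it linearly accessible, and PLSR handles the resulting collinear feature block.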

  20. The application of GIS based decision-tree models for generating the spatial distribution of hydromorphic organic landscapes in relation to digital terrain data

    Directory of Open Access Journals (Sweden)

    R. Bou Kheir

    2010-06-01

    Full Text Available Accurate information about organic/mineral soil occurrence is a prerequisite for many land resources management applications (including climate change mitigation). This paper aims at investigating the potential of using geomorphometrical analysis and decision tree modeling to predict the geographic distribution of hydromorphic organic landscapes in an unsampled area in Denmark. Nine primary topographic parameters (elevation, slope angle, slope aspect, plan curvature, profile curvature, tangent curvature, flow direction, flow accumulation, and specific catchment area) and one secondary parameter (steady-state topographic wetness index) were generated from Digital Elevation Models (DEMs) acquired using airborne LIDAR (Light Detection and Ranging) systems. They were used along with existing digital data collected from other sources (soil type, geological substrate and landscape type) to explain organic/mineral field measurements in hydromorphic landscapes of the chosen Danish area. A large number of tree-based classification models (186) were developed using (1) all of the parameters, (2) the primary DEM-derived topographic (morphological/hydrological) parameters only, (3) selected pairs of parameters, and (4) all parameters excluding each parameter one at a time from the potential pool of predictor parameters. The best classification tree model (with the lowest misclassification error and the smallest number of terminal nodes and predictor parameters) combined the steady-state topographic wetness index and soil type, and explained 68% of the variability in organic/mineral field measurements. The overall accuracy of the predictive organic/mineral landscapes map produced (at 1:50 000 cartographic scale) using the best tree was estimated to be ca. 75%.
The proposed classification-tree model is relatively simple, quick, realistic and practical, and it can be applied to other areas, thereby providing a tool to facilitate the implementation of pedological/hydrological plans for conservation

  1. A robust hybrid model integrating enhanced inputs based extreme learning machine with PLSR (PLSR-EIELM) and its application to intelligent measurement.

    Science.gov (United States)

    He, Yan-Lin; Geng, Zhi-Qiang; Xu, Yuan; Zhu, Qun-Xiong

    2015-09-01

    In this paper, a robust hybrid model integrating an enhanced-inputs-based extreme learning machine with partial least squares regression (PLSR-EIELM) is proposed. The proposed PLSR-EIELM model can overcome two main flaws of the extreme learning machine (ELM): the intractable problem of determining the optimal number of hidden layer neurons, and the over-fitting phenomenon. First, a traditional ELM is selected. Second, the weights between the input layer and the hidden layer are assigned randomly, and the nonlinear transformation of the independent variables is obtained from the output of the hidden layer neurons. In particular, the original input variables are regarded as enhanced inputs: the enhanced inputs and the nonlinearly transformed variables are tied together as the whole set of independent variables. In this way, PLSR can identify the PLS components not only from the nonlinearly transformed variables but also from the original input variables, which removes the correlation among the independent variables while relating them to the expected outputs. Finally, the optimal relationship model between the whole set of independent variables and the expected outputs is obtained using PLSR. Thus, the PLSR-EIELM model is developed. The PLSR-EIELM model then served as an intelligent measurement tool for the key variables of the Purified Terephthalic Acid (PTA) process and the High Density Polyethylene (HDPE) process. The experimental results show that the predictive accuracy of PLSR-EIELM is stable, which indicates that PLSR-EIELM is robust. Moreover, compared with ELM, PLSR, hierarchical ELM (HELM), and PLSR-ELM, PLSR-EIELM achieves much smaller relative prediction errors in these two applications. PMID:26112928

  2. The application of GIS based decision-tree models for generating the spatial distribution of hydromorphic organic landscapes in relation to digital terrain data

    Directory of Open Access Journals (Sweden)

    R. Bou Kheir

    2010-01-01

    Full Text Available Accurate information about soil organic carbon (SOC), presented in a spatially explicit form, is a prerequisite for many land resources management applications (including climate change mitigation). This paper investigates the potential of using geomorphometrical analysis and decision tree modeling to predict the geographic distribution of hydromorphic organic landscapes in unsampled areas of Denmark. Nine primary (elevation, slope angle, slope aspect, plan curvature, profile curvature, tangent curvature, flow direction, flow accumulation, and specific catchment area) and one secondary (steady-state topographic wetness index) topographic parameters were generated from Digital Elevation Models (DEMs) acquired using airborne LIDAR (Light Detection and Ranging) systems. They were used along with existing digital data collected from other sources (soil type, geological substrate and landscape type) to statistically explain SOC field measurements in hydromorphic landscapes of the chosen Danish area. A large number of tree-based classification models (186) were developed using (1) all of the parameters, (2) the primary DEM-derived topographic (morphological/hydrological) parameters only, (3) selected pairs of parameters, and (4) the potential pool of predictor parameters with each parameter excluded one at a time. The best classification tree model (with the lowest misclassification error and the smallest number of terminal nodes and predictor parameters) combined the steady-state topographic wetness index and soil type, and explained 68% of the variability in field SOC measurements. The overall accuracy of the predictive SOC map produced (at 1:50 000 cartographic scale) using the best tree was estimated to be ca. 75%. 
The proposed classification-tree model is relatively simple, quick, realistic and practical, and it can be applied to other areas, thereby providing a tool to help with the implementation of pedological/hydrological plans for conservation and sustainable
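    The abstract's best model, a small classification tree on the topographic wetness index and soil type, can be illustrated with synthetic data. The feature names follow the abstract, but the data and the organic/mineral rule below are invented for the sketch.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
n = 300
twi = rng.uniform(2, 20, n)            # steady-state topographic wetness index
soil_type = rng.integers(0, 3, n)      # encoded soil-type classes
# Invented rule: wet cells on soil type 2 tend to carry organic soils.
organic = ((twi > 12) & (soil_type == 2)).astype(int)

X = np.column_stack([twi, soil_type])
# A small tree (few terminal nodes), mirroring the parsimony criterion
# described in the abstract.
tree = DecisionTreeClassifier(max_leaf_nodes=4, random_state=0).fit(X, organic)
print("training accuracy:", tree.score(X, organic))
```

    In the paper the analogous tree was selected among 186 candidates by misclassification error and tree size; here the structure is fixed up front purely for illustration.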

  3. INCLUSION RATIO BASED ESTIMATOR FOR THE MEAN LENGTH OF THE BOOLEAN LINE SEGMENT MODEL WITH AN APPLICATION TO NANOCRYSTALLINE CELLULOSE

    Directory of Open Access Journals (Sweden)

    Mikko Niilo-Rämä

    2014-06-01

    Full Text Available A novel estimator for the mean length of fibres is proposed for censored data observed in square windows. Instead of observing the fibre lengths directly, we observe the ratio between the intensity estimates of minus-sampling and plus-sampling. It is well known that both intensity estimators are biased. In the present work, we derive the ratio of these biases as a function of the mean length, assuming a Boolean line segment model with exponentially distributed lengths and uniformly distributed directions. Given the observed ratio of the intensity estimators, the inverse of the derived function is suggested as a new estimator of the mean length. An approximation of this estimator's variance is derived, and the accuracies of the approximations are evaluated by means of simulation experiments. The novel method is compared to other methods and applied to real-world industrial data on nanocrystalline cellulose.
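    The ratio the abstract works with can be approximated by a small Monte Carlo simulation. This is a sketch under simplifying assumptions: the window test uses segment endpoints only, so the plus-sampling count (segments hitting the window) is approximate, and all parameter values are invented.

```python
import numpy as np

def sampling_ratio(mean_len, lam=0.02, a=100.0, n_rep=50, seed=0):
    """Monte Carlo estimate of the minus-/plus-sampling count ratio for a
    Boolean line segment model with exponential lengths and uniform
    directions, observed in an a-by-a window."""
    rng = np.random.default_rng(seed)
    minus = plus = 0
    pad = 10 * mean_len                  # simulate a larger region so segments
    lo, hi = -pad, a + pad               # reaching the window are included
    for _ in range(n_rep):
        n = rng.poisson(lam * (hi - lo) ** 2)
        x1 = rng.uniform(lo, hi, n)
        y1 = rng.uniform(lo, hi, n)
        length = rng.exponential(mean_len, n)
        theta = rng.uniform(0.0, np.pi, n)
        x2, y2 = x1 + length * np.cos(theta), y1 + length * np.sin(theta)
        in1 = (0 <= x1) & (x1 <= a) & (0 <= y1) & (y1 <= a)
        in2 = (0 <= x2) & (x2 <= a) & (0 <= y2) & (y2 <= a)
        minus += int((in1 & in2).sum())  # segment entirely inside (minus-sampling)
        plus += int((in1 | in2).sum())   # segment touches window (plus, approx.)
    return minus / plus

print("ratio for mean length 5:", round(sampling_ratio(5.0), 3))
```

    The simulated ratio decreases as the mean segment length grows, which is the monotonicity that makes inverting the bias-ratio function a workable mean-length estimator.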

  4. Optimal multi-agent path planning for fast inverse modeling in UAV-based flood sensing applications

    KAUST Repository

    Abdelkader, Mohamed

    2014-05-01

    Floods are the most common natural disasters, causing thousands of casualties every year worldwide. Flash flood events are particularly deadly because of the short timescales on which they occur. Unmanned air vehicles equipped with mobile microsensors could be capable of sensing flash floods in real time, saving lives and greatly improving the efficiency of the emergency response. However, one of the main issues in flood sensing is the difficulty of planning the paths of the sensing agents in advance so as to obtain meaningful data as fast as possible. In this article, we present a fast numerical scheme to quickly compute the trajectories of a set of UAVs in order to maximize the accuracy of model parameter estimation over a time horizon. Simulation results are presented, a preliminary testbed is briefly described, and future research directions and problems are discussed. © 2014 IEEE.

  5. Research and application on integration modeling of 3D bodies in coal mine with blended data model based on TIN and ARTP

    Institute of Scientific and Technical Information of China (English)

    HAN Zuo-zhen; HAN Rui-dong; MAO Shan-jun; HAN Jing-min

    2007-01-01

    Data modeling is the foundation of three-dimensional visualization technology. The paper first proposes a 3D integrated data model of stratum, laneway and drill on the basis of TIN and ARTP, designs the corresponding conceptual and logical models from the viewpoint of the data model, and describes the data structure of the model's geometric elements using an object-oriented modeling approach. It then studies the key modeling technologies for stratum, laneway and drill, introduces the ARTP modeling process for each, and examines the 3D geometric modeling of laneways with different cross-sections. Finally, the paper realizes a coalmine-oriented three-dimensional visualization system, using SQL Server as the background database and Visual C++ 6.0 and OpenGL as the foreground development tools.

  6. Application of the Western-based adjuvant online model to Korean colon cancer patients; a single institution experience

    International Nuclear Information System (INIS)

    Adjuvant Online (AOL) is a web-accessible risk-assessment model that predicts mortality and the benefits of adjuvant therapy. AOL has never been validated for Asian colon cancer patients. Using the Yonsei Tumor Registry database, patients who were treated within the Yonsei University Health System between 1990 and 2005 for T1-4, N0-2, M0 colon cancer were included in the survival calculations. Observed and predicted 5-year overall survival (OS) were compared for each patient. The median age of the study population of 1431 patients was 60 years (range, 15–87 years), and the median follow-up duration was 7.9 years (range, 0.06–19.8 years). The predicted 5-year OS rate (77.7%) and the observed survival (79.5%) were not statistically different (95% confidence interval, 76.3–81.5) across all patients. Predicted outcomes were within the 95% confidence interval of observed survival in both stage II and stage III disease, including most demographic and pathologic subgroups. Moreover, AOL predicted OS more accurately for patients with stage II than with stage III disease. AOL tended to offer reliable predictions of 5-year overall survival and could be used as a decision-making tool for adjuvant treatment in Korean colon cancer patients, whose prognosis is similar to that of other Asian patients.

  7. Application of Multicast-based Video Conference on CERNET Backbone

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Multicast-based video conferencing is a representative application of advanced networks. In multi-point video conferencing, multicast achieves better efficiency through its intra-group broadcast mechanism. In such applications, multicast-based network resource assignment, management and security must be considered together. This paper presents a three-layer framework model for multicast-based video conferencing applications, and a practical multicast-based video conferencing system implemented on the CERNET (China Education and Research Network) backbone. This experience is valuable for the development of multicast-based video conferencing applications in China.

  8. Cotton Assorting Optimized Model Based on HVI Data and Its Application

    Institute of Scientific and Technical Information of China (English)

    邱兆宝

    2012-01-01

    This paper studies the construction and application of an optimized cotton-assorting model based on HVI data. Taking HVI data as the basis and cotton-assorting technical standards as the foundation, it applies the ideas and methods of systems engineering while following cotton-assorting principles. The decision-making process is mathematized using a small amount of quantitative information, providing a simple decision method for the multi-objective, multi-criteria cotton-assorting problem. The model is solved by implicit enumeration, which effectively handles the nonlinear integer-programming formulation of cotton assorting. The optimized cotton-assorting model and the associated evaluation method can serve as components of an expert guidance system for cotton-spinning quality and processing.

  9. DATA MODELING METHOD BASED ON PARTIAL LEAST SQUARE REGRESSION AND APPLICATION IN CORRELATION ANALYSIS OF THE STATOR BARS CONDITION PARAMETERS

    Institute of Scientific and Technical Information of China (English)

    李锐华; 高乃奎; 谢恒堃; 史维祥

    2004-01-01

    Objective To investigate the various data messages of the stator bar condition parameters when only a few samples are available, especially the correlation between the nondestructive parameters and the residual breakdown voltage of the stator bars. Methods Artificial stator bars were designed to simulate generator bars. Partial discharge (PD) and dielectric loss experiments were performed to obtain the nondestructive parameters, and the residual breakdown voltage was acquired by an AC damage experiment. To eliminate dimension effects on the measurement data, the raw data were preprocessed by centering and compression. Based on the idea of extracting principal components, a partial least squares (PLS) method was applied to screen and synthesize the correlation between the nondestructive parameters and the residual breakdown voltage. Various data messages about the condition parameters are also discussed. Results The graphical analysis functions of PLS make the various data messages of the stator bar condition parameters easy to understand. The analysis results are consistent with those of aging tests. Conclusion The method can select and extract PLS components of the condition parameters from sample data, effectively solving the problems of small samples and multicollinearity in regression analysis.
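    The core step, centering and scaling the data and then extracting a PLS component that summarizes the covariance between nondestructive parameters and breakdown voltage, can be sketched on synthetic data. The first PLS weight vector is computed directly here; this is an illustration of the technique, not the authors' implementation, and all values are invented.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(40, 4))             # e.g. PD and dielectric-loss features
y = X @ np.array([2.0, -1.0, 0.0, 0.5]) + 0.1 * rng.normal(size=40)

# Center and scale to unit variance to remove dimension effects
# (the preprocessing step described in the abstract).
Xc = (X - X.mean(0)) / X.std(0)
yc = y - y.mean()

# First PLS weight vector: the direction in X-space most covariant with y.
w = Xc.T @ yc
w /= np.linalg.norm(w)
t = Xc @ w                               # scores of the first PLS component
corr = np.corrcoef(t, yc)[0, 1]
print("correlation of first PLS component with y:", round(corr, 3))
```

    Subsequent components would be extracted the same way after deflating Xc and yc by the part already explained, which is what lets PLS cope with few samples and collinear predictors.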

  10. The NFC Application Model Based on Social Network

    Institute of Scientific and Technical Information of China (English)

    赵云辉; 张慧琳

    2015-01-01

    In order to explore applications of NFC (Near Field Communication) technology, this paper discusses an application model for a new type of social network that combines cloud computing, NFC technology and social networking. It proposes community services based on NFC-enabled phones, such as NFC check-in, virtual stores and targeted advertising push, describes the profitability model of the NFC applications, and demonstrates the operability of implementing and promoting the NFC social network model.

  11. Two Strategies Of Agent-Based Modelling Application For Management Of Lakeland Landscapes At A Regional Scale

    Directory of Open Access Journals (Sweden)

    Giełda-Pinas Katarzyna

    2015-09-01

    Full Text Available This work presents two different strategies of ABM for the management of selected lakeland landscapes and their impact on sustainable development. Two different lakeland research areas, as well as two different sets of agents and their decision rules, were compared. In Strategy 1, the decisions made by farmers and their influence on the land use/cover pattern, as well as the indirect consequence of phosphorus and nitrogen delivery to the water bodies, were investigated. In this strategy, a group of farmer agents is encouraged to participate in an agri-environmental program. Strategy 2 combines the decisions of farmers, foresters and local authorities. The agents in this model share a common goal of producing a spatial plan. The land use/cover patterns arising from the different attitudes and decision rules of the involved actors were investigated. As the basic spatial unit, the first strategy employed a landscape unit (the lake catchment), whereas the second strategy used an administrative unit (the commune). The strategies resulted in different land use/cover patterns and changes, which were evaluated in terms of sustainability policy. The main conclusion for Strategy 1 is that during the five years of farmers' participation in the agri-environmental program, there was a significant decrease in nutrient leaching to the lake. The main conclusion for Strategy 2 is that cooperation among the agents is better for the natural environment than competition between them. In both strategies, agents' decisions influence the environment, but this environment is expressed through different spatial units of analysis.
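    A toy version of Strategy 1's farmer agents might look like the following. All decision rules, the subsidy threshold and the leaching numbers are invented for illustration; the actual model is far richer.

```python
import random

random.seed(4)

class Farmer:
    """Agent that may join an agri-environmental program, reducing leaching."""
    def __init__(self):
        self.participates = False
        self.leaching = 10.0            # kg N/yr leached from this farm (invented)

    def decide(self, subsidy):
        # Join if the subsidy outweighs a random private opportunity cost.
        if not self.participates and subsidy > random.uniform(0.0, 1.0):
            self.participates = True

    def step(self):
        if self.participates:
            self.leaching *= 0.8        # assumed 20%/yr reduction while enrolled

farms = [Farmer() for _ in range(50)]
for year in range(5):                   # five years, as in Strategy 1
    for f in farms:
        f.decide(subsidy=0.6)
        f.step()

total = sum(f.leaching for f in farms)
print("total leaching after 5 years:", round(total, 1))
```

    Even this caricature reproduces the qualitative Strategy 1 result: as enrollment spreads, aggregate nutrient leaching to the lake falls below its initial level.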

  12. Cardiac C-arm CT: 4D non-model based heart motion estimation and its application

    Science.gov (United States)

    Prümmer, M.; Fahrig, R.; Wigström, L.; Boese, J.; Lauritsch, G.; Strobel, N.; Hornegger, J.

    2007-03-01

    The combination of real-time fluoroscopy and 3D cardiac imaging on the same C-arm system is a promising technique that might improve therapy planning, guiding, and monitoring in the interventional suite. In principle, to reconstruct a 3D image of the beating heart at a particular cardiac phase, a complete set of X-ray projection data representing that phase is required. One approximate approach is the retrospectively ECG-gated FDK reconstruction (RG-FDK). From the acquired data set of N_s multiple C-arm sweeps, those projection images which are acquired closest in time to the desired cardiac phase are retrospectively selected. However, this approach uses only 1/N_s of the obtained data. Our goal is to utilize data from other cardiac phases as well. In order to minimize blurring and motion artifacts, cardiac motion has to be compensated for, which can be achieved using a temporally dependent spatial 3D warping of the filtered backprojections. In this work we investigate the computation of the 4D heart motion based on prior reconstructions of several cardiac phases using RG-FDK. A 4D motion estimation framework is presented using standard fast non-rigid registration. A smooth 4D motion vector field (MVF) represents the relative deformation compared to a reference cardiac phase. A 4D deformation regridding by adaptive supersampling allows any reference phase to be selected independently of the set of phases used in the RG-FDK for a motion-corrected reconstruction. Promising initial results from in vivo experiments are shown. The subject's individual 4D cardiac MVF could be computed from only three RG-FDK image volumes. In addition, all acquired projection data were motion corrected and subsequently used for image reconstruction to improve the signal-to-noise ratio compared to RG-FDK.

  13. Research and application of a data integration transition model based on the campus network

    Institute of Scientific and Technical Information of China (English)

    郭政慧

    2012-01-01

    With the construction of campus information systems, a wide variety of isolated applications has emerged. This paper analyzes the characteristics and software architectures of the business systems on the campus network and proposes a transition scheme that uses a service-oriented architecture to achieve data integration. Following the principle of multi-granularity service design, legacy systems are encapsulated as service components. After comparison, a unified Web-service standard interface is selected. The paper then describes the steps and methods of business integration on the campus network, analyzes their feasibility, and applies them in practice.

  14. Application of Improved Radiation Modeling to General Circulation Models

    Energy Technology Data Exchange (ETDEWEB)

    Michael J Iacono

    2011-04-07

    This research has accomplished its primary objectives of developing accurate and efficient radiation codes, validating them with measurements and higher resolution models, and providing these advancements to the global modeling community to enhance the treatment of cloud and radiative processes in weather and climate prediction models. A critical component of this research has been the development of the longwave and shortwave broadband radiative transfer code for general circulation model (GCM) applications, RRTMG, which is based on the single-column reference code, RRTM, also developed at AER. RRTMG is a rigorously tested radiation model that retains a considerable level of accuracy relative to higher resolution models and measurements despite the performance enhancements that have made it possible to apply this radiation code successfully to global dynamical models. This model includes the radiative effects of all significant atmospheric gases, and it treats the absorption and scattering from liquid and ice clouds and aerosols. RRTMG also includes a statistical technique for representing small-scale cloud variability, such as cloud fraction and the vertical overlap of clouds, which has been shown to improve cloud radiative forcing in global models. This development approach has provided a direct link from observations to the enhanced radiative transfer provided by RRTMG for application to GCMs. Recent comparison of existing climate model radiation codes with high resolution models has documented the improved radiative forcing capability provided by RRTMG, especially at the surface, relative to other GCM radiation models. Due to its high accuracy, its connection to observations, and its computational efficiency, RRTMG has been implemented operationally in many national and international dynamical models to provide validated radiative transfer for improving weather forecasts and enhancing the prediction of global climate change.

  15. Crowdsourcing Based 3d Modeling

    Science.gov (United States)

    Somogyi, A.; Barsi, A.; Molnar, B.; Lovas, T.

    2016-06-01

    Web-based photo albums that support organizing and viewing users' images are widely used. These services provide a convenient solution for storing, editing and sharing images. In many cases, users attach geotags to the images in order to enable using them, e.g., in location-based applications on social networks. Our paper discusses a procedure that collects open-access images of a site frequently visited by tourists. Geotagged pictures showing a sight or tourist attraction are selected and processed in photogrammetric processing software that produces a 3D model of the captured object. For this investigation we selected three attractions in Budapest. To assess the geometrical accuracy, we used laser scanning as well as DSLR and smartphone photography to derive reference values for verifying the spatial model obtained from the web-album images. The investigation shows how detailed and accurate models can be derived with photogrammetric processing software simply by using images from the community, without visiting the site.

  16. Comparison and applicability of landslide susceptibility models based on landslide ratio-based logistic regression, frequency ratio, weight of evidence, and instability index methods in an extreme rainfall event

    Science.gov (United States)

    Wu, Chunhung

    2016-04-01

    Few studies have discussed the applicability of statistical landslide susceptibility (LS) models to extreme rainfall-induced landslide events. This research focuses on the comparison and applicability of LS models based on four methods, namely the landslide ratio-based logistic regression (LRBLR), frequency ratio (FR), weight of evidence (WOE), and instability index (II) methods, in an extreme rainfall-induced landslide case. The landslide inventory of the Chishan river watershed, Southwestern Taiwan, after the 2009 Typhoon Morakot is the main material in this research. The Chishan river watershed is a tributary of the Kaoping river watershed, a landslide- and erosion-prone watershed with an annual average suspended load of 3.6×10^7 MT/yr (ranking 11th in the world). Typhoon Morakot struck Southern Taiwan from Aug. 6-10, 2009, and dumped nearly 2,000 mm of rainfall on the Chishan river watershed. The 24-hour, 48-hour, and 72-hour accumulated rainfall in the Chishan river watershed exceeded the 200-year return period accumulated rainfall. 2,389 landslide polygons in the Chishan river watershed were extracted from SPOT 5 images after 2009 Typhoon Morakot. The total landslide area is around 33.5 km^2, equal to a landslide ratio of 4.1%. The main landslide types based on Varnes' (1978) classification are rotational and translational slides. The two characteristics of this extreme rainfall-induced landslide event are a dense landslide distribution and a large share of downslope landslide areas owing to headward erosion and bank erosion during the flooding processes. The downslope landslide area in the Chishan river watershed after 2009 Typhoon Morakot is 3.2 times larger than the upslope landslide area. The prediction accuracy of the LS models based on the LRBLR, FR, WOE, and II methods has been shown to exceed 70%. The model performance and applicability of four models in a landslide-prone watershed with dense distribution of rainfall
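    The frequency ratio (FR) method named above assigns each class of a causative factor the ratio of its share of landslide cells to its share of all cells; classes with FR above 1 are landslide-prone. A minimal sketch on invented raster values:

```python
import numpy as np

# One causative factor (e.g. a slope class per raster cell) and the
# landslide inventory (1 = landslide cell). Values are synthetic.
slope_class = np.array([0, 0, 1, 1, 1, 2, 2, 2, 2, 2])
landslide   = np.array([0, 0, 0, 1, 0, 1, 1, 0, 1, 0])

fr = {}
for c in np.unique(slope_class):
    in_class = slope_class == c
    # (landslide cells in class / all landslide cells)
    #  divided by (cells in class / all cells)
    fr[c] = (landslide[in_class].sum() / landslide.sum()) / (
        in_class.sum() / len(slope_class))
print(fr)   # classes with FR > 1 are relatively landslide-prone
```

    A susceptibility index is then typically obtained by summing the FR values of each cell's classes across all causative factors; that aggregation step is omitted here.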

  17. Application of simulation models for the optimization of business processes

    Science.gov (United States)

    Jašek, Roman; Sedláček, Michal; Chramcov, Bronislav; Dvořák, Jiří

    2016-06-01

    The paper deals with applications of modeling and simulation tools in the optimization of business processes, especially in optimizing signal flow in a security company. Simul8 was selected as the modeling tool; it supports process modeling based on discrete-event simulation and enables the creation of visual models of production and distribution processes.

  18. Auditory model inversion and its application

    Institute of Scientific and Technical Information of China (English)

    ZHAO Heming; WANG Yongqi; CHEN Xueqin

    2005-01-01

    Auditory models have been applied to several areas of the speech signal processing field and appear to be effective. This paper presents the inverse transform of each stage of one widely used auditory model. First, the correlogram is inverted and phase information is reconstructed by repeated iterations in order to obtain the auditory-nerve firing rate. The next step is to recover the negative parts of the signal by reversing the half-wave rectification (HWR). Finally, the inner hair cell/synapse model and the Gammatone filters are inverted. Thus the whole auditory model inversion is achieved. An application to noisy speech enhancement based on the auditory model inversion algorithm is proposed. Many experiments show that this method is effective in reducing noise; in particular, when the SNR of the noisy speech is low, it is more effective than other methods. The auditory model inversion method given in this paper is therefore applicable to the speech enhancement field.

  19. Using models to determine irrigation applications for water management

    Science.gov (United States)

    Simple models are used by field researchers and in production agriculture to estimate crop water use for the purpose of scheduling irrigation applications. These are generally based on a simple volume-balance approach using estimates of soil water-holding capacity, irrigation application amounts, pr...
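    A volume-balance irrigation scheduler of the kind described can be sketched in a few lines: track root-zone depletion from daily crop evapotranspiration, credit rainfall and irrigation, and trigger an irrigation when depletion exceeds a management-allowed threshold. All capacities, ET rates and the refill rule below are illustrative assumptions.

```python
# Volume-balance irrigation scheduling sketch (all numbers are invented).
taw = 120.0          # total available water in the root zone (mm)
mad = 0.5 * taw      # management-allowed depletion: irrigate at 50% of TAW
depletion = 0.0
schedule = []

daily_et = [6.0] * 20          # crop evapotranspiration (mm/day)
rain     = [0.0] * 20
rain[7]  = 15.0                # one rainfall event on day 7

for day, (et, p) in enumerate(zip(daily_et, rain)):
    depletion = max(0.0, depletion + et - p)   # water balance for the day
    if depletion >= mad:
        schedule.append(day)
        depletion = 0.0        # assume irrigation refills to field capacity
print("irrigate on days:", schedule)
```

    Real schedulers refine each term (crop coefficients for ET, effective rather than total rainfall, partial refills), but the bookkeeping is exactly this running balance.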

  20. Rigorous model-based uncertainty quantification with application to terminal ballistics—Part II. Systems with uncontrollable inputs and large scatter

    Science.gov (United States)

    Adams, M.; Lashgari, A.; Li, B.; McKerns, M.; Mihaly, J.; Ortiz, M.; Owhadi, H.; Rosakis, A. J.; Stalzer, M.; Sullivan, T. J.

    2012-05-01

    Part II of this series is concerned with establishing the feasibility of an extended data-on-demand (XDoD) uncertainty quantification (UQ) protocol based on concentration-of-measure inequalities and martingale theory. The specific aims are to establish the feasibility of the protocol and its basic properties, including the tightness of the predictions it affords. The assessment is based on an application to terminal ballistics, with a system configuration consisting of 6061-T6 aluminum plates struck by spherical 440C stainless steel projectiles at ballistic impact speeds in the range of 2.4-2.8 km/s. The system's inputs are the plate thickness, plate obliquity and impact velocity. The perforation area is chosen as the sole performance measure of the system. The objective of the UQ analysis is to certify the lethality of the projectile, i.e., that the projectile perforates the plate with high probability over a prespecified range of impact velocities, plate thicknesses and plate obliquities. All tests were conducted at Caltech's Small Particle Hypervelocity Impact Range (SPHIR), which houses a two-stage gas gun. A feature of this facility is that the impact velocity, while amenable to precise measurement, cannot be controlled precisely but varies randomly according to a known probability density function. In addition, due to a competition between petalling and plugging mechanisms for the material system under consideration, the measured perforation area exhibits considerable scatter. The analysis establishes the feasibility of the XDoD UQ protocol as a rigorous yet practical approach for model-based certification of complex systems characterized by uncontrollable inputs and noisy experimental data.

  1. Dynamic modeling of breast tissue with application of model reference adaptive system identification technique based on clinical robot-assisted palpation.

    Science.gov (United States)

    Keshavarz, M; Mojra, A

    2015-11-01

    Accurate identification of breast tissue's dynamic behavior in physical examination is critical to successful diagnosis and treatment. In this study a model reference adaptive system identification (MRAS) algorithm was utilized to estimate the dynamic behavior of breast tissue from mechanical stress-strain datasets. A robot-assisted device (Robo-Tac-BMI) was used to mimic physical palpation on a 45-year-old woman with a benign mass in the left breast. Stress-strain datasets were collected over 14 regions of both breasts during a specified period of time. A 2nd-order linear model was then adapted to the experimental datasets. It was confirmed that a unique dynamic model, with a maximum error of about 0.89%, describes the breast tissue behavior, while mass detection may be achieved via a 56.1% difference from the normal tissue.

  2. Grid-based Meteorological and Crisis Applications

    Science.gov (United States)

    Hluchy, Ladislav; Bartok, Juraj; Tran, Viet; Lucny, Andrej; Gazak, Martin

    2010-05-01

    forecast model is subject to parameterization and parameter optimization before its real deployment. The parameter optimization requires tens of evaluations of the parameterized model's accuracy, and each evaluation of the model parameters requires re-running hundreds of meteorological situations collected over the years and comparing the model output with the observed data. The architecture and inherent heterogeneity of both examples, their computational complexity, and their interfaces to other systems and services make them well suited for decomposition into a set of web and grid services. Such decomposition has been performed within several projects we have participated or participate in, in cooperation with the academic sphere, namely int.eu.grid (a dispersion model deployed as a pilot application on an interactive grid), SEMCO-WS (semantic composition of web and grid services), DMM (development of a significant-meteorological-phenomena prediction system based on data mining), VEGA 2009-2011 and EGEE III. We present useful and practical applications of high-performance computing technologies. The use of grid technology provides access to much higher computational power not only for modeling and simulation, but also for model parameterization and validation. This results in optimized model parameters and more accurate simulation outputs. Given that the simulations are used for aviation, road traffic and crisis management, even a small improvement in prediction accuracy may result in a significant improvement of safety as well as a cost reduction. We have found grid computing useful for our applications. We are satisfied with this technology, and our experience encourages us to extend its use. Within an ongoing project (DMM) we plan to include the processing of satellite images, which will increase our computational requirements very rapidly. We believe that thanks to grid computing we will be able to handle the job almost in real time.

  3. System identification application using Hammerstein model

    Indian Academy of Sciences (India)

    SABAN OZER; HASAN ZORLU; SELCUK METE

    2016-06-01

    In the literature, a memoryless polynomial model is generally preferred for the nonlinear part of a Hammerstein model, with a finite impulse response (FIR) or infinite impulse response model for the linear part. In this paper, system identification applications of a Hammerstein model that is a cascade of a nonlinear second-order Volterra model and a linear FIR model are studied. A recursive least squares algorithm is used to identify the proposed Hammerstein model parameters. Furthermore, the results are compared to assess the success of the proposed Hammerstein model against different types of models
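
    The identification scheme described above can be sketched with ordinary recursive least squares (RLS): a Hammerstein model with a polynomial nonlinearity feeding an FIR block is linear in an over-parameterised regressor built from powers of delayed inputs. The system, coefficients and model orders below are illustrative choices, not those of the paper:

```python
import random

def rls_identify(phi_list, y_list, lam=1.0, delta=100.0):
    """Recursive least squares for a linear-in-parameters model y = phi . theta."""
    n = len(phi_list[0])
    theta = [0.0] * n
    # P initialised to a large diagonal (weak prior on the parameters)
    P = [[delta if i == j else 0.0 for j in range(n)] for i in range(n)]
    for phi, y in zip(phi_list, y_list):
        Pphi = [sum(P[i][j] * phi[j] for j in range(n)) for i in range(n)]
        denom = lam + sum(phi[i] * Pphi[i] for i in range(n))
        K = [v / denom for v in Pphi]                       # gain vector
        err = y - sum(phi[i] * theta[i] for i in range(n))  # prediction error
        theta = [theta[i] + K[i] * err for i in range(n)]
        P = [[(P[i][j] - K[i] * Pphi[j]) / lam for j in range(n)] for i in range(n)]
    return theta

# toy Hammerstein system: static nonlinearity f(u) = u + 0.5*u^2 feeding FIR [1.0, 0.4]
random.seed(1)
f = lambda u: u + 0.5 * u * u
u = [random.uniform(-1, 1) for _ in range(400)]
y = [f(u[t]) + 0.4 * f(u[t - 1]) for t in range(1, 400)]
# over-parameterised regressor: powers of the current and delayed input
phi = [[u[t], u[t] ** 2, u[t - 1], u[t - 1] ** 2] for t in range(1, 400)]
theta = rls_identify(phi, y)
```

    In this noise-free toy case the recovered parameter vector approaches [1.0, 0.5, 0.4, 0.2], i.e. the products of FIR taps and polynomial coefficients.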

  4. Sticker DNA computer model -- Part II: Application

    Institute of Scientific and Technical Information of China (English)

    XU Jin; LI Sanping; DONG Yafei; WEI Xiaopeng

    2004-01-01

    The sticker model is one of the basic DNA computing models. It is coded with single- and double-stranded DNA molecules. Its advantages are that the operations require no strand extension and use no enzymes; moreover, the materials are reusable. It has therefore aroused the attention and interest of scientists in many fields. In this paper, we extend and improve the sticker model, which will be beneficial to the construction of DNA computers. This second part of our series paper focuses on applications of the sticker model and consists of three sections: first, the matrix representation of the sticker model is presented; then a brief review of past research on graph and combinatorial optimization problems, such as the minimal set covering problem, the vertex covering problem, the Hamiltonian path or cycle problem, the maximal clique problem, the maximal independent set problem and the Steiner spanning tree problem, is given; finally, a DNA algorithm for the graph isomorphism problem, based on the sticker model, is described.

  5. Model-based Abstraction of Data Provenance

    OpenAIRE

    Probst, Christian W.; Hansen, René Rydhof

    2014-01-01

    Identifying provenance of data provides insights to the origin of data and intermediate results, and has recently gained increased interest due to data-centric applications. In this work we extend a data-centric system view with actors handling the data and policies restricting actions. This extension is based on provenance analysis performed on system models. System models have been introduced to model and analyse spatial and organisational aspects of organisations, to identify, e.g., potent...

  6. Model-based tolerance intervals derived from cumulative historical composition data: application for substantial equivalence assessment of a genetically modified crop.

    Science.gov (United States)

    Hong, Bonnie; Fisher, Tracey L; Sult, Theresa S; Maxwell, Carl A; Mickelson, James A; Kishino, Hirohisa; Locke, Mary E H

    2014-10-01

    Compositional analysis is a requisite component of the substantial equivalence framework utilized to assess genetically modified (GM) crop safety. Statistical differences in composition data between GM and non-GM crops require a context in which to determine biological relevance. This context is provided by surveying the natural variation of key nutrient and antinutrient levels within the crop population with a history of safe use. Data accumulated from various genotypes with a history of safe use cultivated in relevant commercial crop-growing environments over multiple seasons are discussed as the appropriate data representative of this natural variation. A model-based parametric tolerance interval approach, which accounts for the correlated and unbalanced data structure of cumulative historical data collected from multisite field studies conducted over multiple seasons, is presented. This paper promotes the application of this tolerance interval approach to generate reference ranges for evaluation of the biological relevance of statistical differences identified during substantial equivalence assessment of a GM crop. PMID:25208038

  7. Holistic, model-based optimization of edge leveling as an enabler for lithographic focus control: application to a memory use case

    Science.gov (United States)

    Hasan, T.; Kang, Y.-S.; Kim, Y.-J.; Park, S.-J.; Jang, S.-Y.; Hu, K.-Y.; Koop, E. J.; Hinnen, P. C.; Voncken, M. M. A. J.

    2016-03-01

    Advancement of the next generation technology nodes and emerging memory devices demand tighter lithographic focus control. Although the leveling performance of the latest-generation scanners is state of the art, challenges remain at the wafer edge due to large process variations. Several customer-configurable leveling control options are available in ASML scanners, some of which are application specific in their scope of leveling improvement. In this paper, we assess the usability of leveling non-correctable error models to identify yield-limiting edge dies. We introduce a novel dies-in-spec based holistic methodology for leveling optimization to guide tool users in selecting an optimal configuration of leveling options. Significant focus gain, and consequently yield gain, can be achieved with this integrated approach. The Samsung site in Hwaseong observed improved edge focus performance in production of a mid-end memory product layer running on an ASML NXT 1960 system: a 50% improvement in focus and a 1.5%p gain in edge yield were measured with the optimized configurations.

  8. Intelligent Model for Traffic Safety Applications

    Directory of Open Access Journals (Sweden)

    C. Chellappan

    2012-01-01

    Problem statement: This study presents an analysis of a road traffic system, focused on the use of communications to detect dangerous vehicles on roads and highways and on how this could be used to enhance driver safety. Approach: The intelligent traffic safety application model is based on the traffic flow theories developed in recent years, leading to reliable representations of road traffic, which is of major importance in achieving the attenuation of traffic problems. The model also includes the driver's decision-making process in accelerating, decelerating and changing lanes. Results: The individuality of each of these processes arises from the model parameters that are randomly generated from statistical distributions introduced as input parameters. Conclusion: This allows the integration of the individuality factor of the population elements, yielding knowledge of various driving modes in a wide variety of situations.

  9. Method for gesture based modeling

    DEFF Research Database (Denmark)

    2006-01-01

    A computer program based method is described for creating models using gestures. On an input device, such as an electronic whiteboard, a user draws a gesture which is recognized by a computer program and interpreted relative to a predetermined meta-model. Based on the interpretation, an algorithm...... is assigned to the gesture drawn by the user. The executed algorithm may, for example, consist in creating a new model element, modifying an existing model element, or deleting an existing model element....

  10. PBG based terahertz antenna for aerospace applications

    CERN Document Server

    Choudhury, Balamati; Jha, Rakesh Mohan

    2016-01-01

    This book focuses on high-gain antennas in the terahertz spectrum and their optimization. The terahertz spectrum is an unallocated EM spectrum, which is being explored for a number of applications, especially to meet increasing demands of high data rates for wireless space communications. Space communication systems using the terahertz spectrum can resolve the problems of limited bandwidth of present wireless communications without radio-frequency interference. This book describes design of such high-gain antennas and their performance enhancement using photonic band gap (PBG) substrates. Further, optimization of antenna models using evolutionary algorithm based computational engine has been included. The optimized high-performance compact antenna may be used for various wireless applications, such as inter-orbital communications and on-vehicle satellite communications.

  11. Multiagent-Based Model For ESCM

    OpenAIRE

    Delia MARINCAS

    2011-01-01

    Web-based applications for Supply Chain Management (SCM) are now a necessity for every company in order to meet increasing customer demands, face global competition and make a profit. A multiagent-based approach is appropriate for eSCM because it exhibits many of the characteristics an SCM system should have. For this reason, we have proposed a multiagent-based eSCM model which configures a virtual SC and automates the SC activities: selling, purchasing, manufacturing, planning, inventory,...

  12. Multi-variable grey model (MGM (1,n,q)) based on genetic algorithm and its application in urban water consumption

    Institute of Scientific and Technical Information of China (English)

    Yan Han; Shi Guoxu

    2007-01-01

    Urban water consumption has grey characteristics because it is influenced by the economy, population, standard of living and so on. The multi-variable grey model (MGM(1,n)), as an expansion and complement of the GM(1,1) model, reveals the relationships of restriction and stimulation among variables, while the genetic algorithm has global optimization and parallel search characteristics. In this paper, the parameter q of the MGM(1,n) model was optimized, and a multi-variable grey model (MGM(1,n,q)) was built using the genetic algorithm. The model was validated against urban water consumption from 1990 to 2003 in Dalian City. The results indicate that the genetic-algorithm-based MGM(1,n,q) model outperforms the MGM(1,n) model, which in turn outperforms the GM(1,1) model.
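
    As background for the MGM(1,n,q) discussion, the single-variable GM(1,1) model it extends can be sketched in a few lines: accumulate the series, fit the whitened equation dx⁽¹⁾/dt + a·x⁽¹⁾ = b by least squares, and forecast by differencing the exponential response. The series in the usage example is synthetic, not the Dalian consumption data:

```python
import math

def gm11(x0, horizon=1):
    """GM(1,1) grey forecast on a positive series x0; returns (a, b, fitted),
    where fitted covers the sample plus `horizon` forecast steps."""
    n = len(x0)
    x1 = [sum(x0[:k + 1]) for k in range(n)]            # accumulated series
    z1 = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]  # background values
    y = x0[1:]
    # least squares for [a, b] in the grey equation x0[k] = -a*z1[k] + b
    szz = sum(z * z for z in z1)
    sz = sum(z1)
    szy = sum(z * v for z, v in zip(z1, y))
    sy = sum(y)
    m = len(z1)
    det = szz * m - sz * sz
    a = (sz * sy - m * szy) / det
    b = (szz * sy - sz * szy) / det

    def x1_hat(k):  # time response of the whitened equation
        return (x0[0] - b / a) * math.exp(-a * k) + b / a

    fitted = [x0[0]] + [x1_hat(k) - x1_hat(k - 1) for k in range(1, n + horizon)]
    return a, b, fitted

# synthetic geometric 'consumption' series: GM(1,1) fits exponentials closely
a, b, fitted = gm11([100 * 1.05 ** k for k in range(6)], horizon=1)
```

    A negative development coefficient a corresponds to a growing series; the one-step forecast for this geometric series lands very close to the true next value.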

  13. GIS modelling of forest wood residues potential for energy use based on forest inventory data: Methodological approach and case study application

    OpenAIRE

    Panichelli, L.; Gnansounou, E.

    2008-01-01

    This paper presents an approach to performing geo-referenced estimations of forest wood residue availability for energy use, based on the integration of forest inventory data into a GIS. Three different estimation methods are described. The first evaluates biomass availability by applying biomass expansion factors to stem volume data from the forest inventories. The method accounts for forest dynamics and assigns management treatments as a function of forest properties. The second metho...

  14. Model-based Abstraction of Data Provenance

    DEFF Research Database (Denmark)

    Probst, Christian W.; Hansen, René Rydhof

    2014-01-01

    to bigger models, and the analyses adapt accordingly. Our approach extends provenance both with the origin of data, the actors and processes involved in the handling of data, and policies applied while doing so. The model and corresponding analyses are based on a formal model of spatial and organisational......Identifying provenance of data provides insights to the origin of data and intermediate results, and has recently gained increased interest due to data-centric applications. In this work we extend a data-centric system view with actors handling the data and policies restricting actions....... This extension is based on provenance analysis performed on system models. System models have been introduced to model and analyse spatial and organisational aspects of organisations, to identify, e.g., potential insider threats. Both the models and analyses are naturally modular; models can be combined...

  15. A Comprehensive Methodology for Development, Parameter Estimation, and Uncertainty Analysis of Group Contribution Based Property Models - An Application to the Heat of Combustion

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Marcarie, Camille; Abildskov, Jens;

    2016-01-01

    A rigorous methodology is developed that addresses numerical and statistical issues in the development of group contribution (GC) based property models, such as regression methods, optimization algorithms, performance statistics, outlier treatment, parameter identifiability, and uncertainty of the predi...

  16. Application of model studies for quality control of bottom pressure based GLOSS sea level gauge at Takoradi Harbour (Ghana, West Africa)

    Digital Repository Service at National Institute of Oceanography (India)

    Joseph, A.; Mehra, P.; Desai, R.G.P.; Dotse, J.; Odammetey, J.T.; Nkebi, E.K.; VijayKumar, K.; Prabhudesai, S.

    Quality-control of bottom pressure based sea level gauge has been effected using a statistically derived simple linear model constructed from a set of bottom pressures and concurrent tide-staff measurements. The study reveals that the crucial factor...

  17. Model Construct Based Enterprise Model Architecture and Its Modeling Approach

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    In order to support enterprise integration, a model-construct-based enterprise model architecture and its modeling approach are studied in this paper. First, the structural makeup and internal relationships of the enterprise model architecture are discussed. Then, the concept of the reusable model construct (MC), which belongs to the control view and can help to derive other views, is proposed. The modeling approach based on model constructs consists of three steps: reference model architecture synthesis, enterprise model customization, and system design and implementation. A case study of the MC-based modeling approach, set against the background of one-kind-product machinery manufacturing enterprises, is illustrated. It is shown that the proposed model-construct-based enterprise model architecture and modeling approach are practical and efficient.

  18. Document-partition based model for maintaining data consistency in collaborative web-based applications

    Institute of Scientific and Technical Information of China (English)

    陈小碾

    2012-01-01

    The existing approaches for maintaining data consistency in collaborative web-based applications impose a serious cost on the server. To solve this problem, a document-partition based model for consistency maintenance is proposed. The model introduces the idea of document partitioning on the basis of the operational transformation algorithm SLOT (symmetric linear operational transformation). With the aim of reducing both communication and memory cost, a dynamic document partitioning strategy and corresponding algorithms are proposed to adapt to changes in the number of users and the frequency of operations. The experimental results show that the model can effectively reduce both the communication and memory cost of the server in a large-scale collaborative application.

  19. Model-based Software Engineering

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2010-01-01

    The vision of model-based software engineering is to make models the main focus of software development and to automatically generate software from these models. Part of that idea works already today. But, there are still difficulties when it comes to behaviour. Actually, there is no lack in models...

  20. Application of stakeholder-based and modelling approaches for supporting robust adaptation decision making under future climatic uncertainty and changing urban-agricultural water demand

    Science.gov (United States)

    Bhave, Ajay; Dessai, Suraje; Conway, Declan; Stainforth, David

    2016-04-01

    Deep uncertainty in future climate change and socio-economic conditions necessitates the use of assess-risk-of-policy approaches over predict-then-act approaches for adaptation decision making. Robust Decision Making (RDM) approaches embody this principle and help evaluate the ability of adaptation options to satisfy stakeholder preferences under wide-ranging future conditions. This study involves the simultaneous application of two RDM approaches, qualitative and quantitative, in the Cauvery River Basin in Karnataka (population ~23 million), India. The study aims to (a) determine robust water resources adaptation options for the 2030s and 2050s and (b) compare the usefulness of a qualitative stakeholder-driven approach with a quantitative modelling approach. A combination of climate narratives and socio-economic narratives was used to develop a large set of future scenarios. Climatic narratives were developed through structured expert elicitation with a group of experts on the Indian Summer Monsoon. Socio-economic narratives were developed to reflect potential future urban and agricultural water demand. In the qualitative RDM approach, a stakeholder workshop helped elicit key vulnerabilities, water resources adaptation options and performance criteria for evaluating options. During a second workshop, stakeholders discussed and evaluated adaptation options against the performance criteria for a large number of scenarios of climatic and socio-economic change in the basin. In the quantitative RDM approach, a Water Evaluation And Planning (WEAP) model was forced with precipitation and evapotranspiration data coherent with the climatic narratives, together with water demand data based on the socio-economic narratives. We find that, compared to business-as-usual conditions, options addressing urban water demand satisfy performance criteria across scenarios and provide co-benefits like energy savings and reduction in groundwater depletion, while options reducing

  1. A Model Study of Open Teaching Based on the Application of a Knowledge Base System

    Institute of Scientific and Technical Information of China (English)

    翁胜斌; 李勇

    2014-01-01

    In this age of knowledge explosion, college education should move away from the commonly used cramming method of teaching and promote an open teaching system that reappraises the roles of teacher, students, and classroom teaching. Against that background, a new open teaching system was developed, based on the guidance of action lists and the application of a knowledge base system and an information system. This study brings open teaching from the concept research stage to the applied theory stage, and the new open teaching system can serve as a reference model for innovation in college education.

  2. On the applicability of unimodal and bimodal van Genuchten-Mualem based models to peat and other organic soils under evaporation conditions

    Science.gov (United States)

    Dettmann, Ullrich; Bechtold, Michel; Frahm, Enrico; Tiemeyer, Bärbel

    2014-07-01

    Soil moisture is one of the key parameters controlling biogeochemical processes in peat and other organic soils. To understand and accurately model soil moisture dynamics, and peatland hydrological functioning in general, knowledge of soil hydraulic properties is crucial. As peat differs in several respects from mineral soils, the applicability of standard hydraulic functions developed for mineral soils (e.g. the van Genuchten-Mualem model) to peat soil moisture dynamics is questionable. In this study, the hydraulic properties of five types of peat and other organic soils from different German peatlands were investigated by laboratory evaporation experiments. Soil hydraulic parameters of the commonly applied van Genuchten-Mualem model and of the bimodal model by Durner (1994) were inversely estimated using HYDRUS-1D and global optimization. The objective function included measured pressure heads and cumulative evaporation. The performance of eight model set-ups differing in the degree of complexity and the choice of fitting parameters was evaluated. Depending on the model set-up, botanical origin and degree of peat decomposition, the quality of the model results differed strongly. We show that the fitted 'tortuosity' parameter τ of the van Genuchten-Mualem model can deviate considerably from the default value of 0.5 that is frequently applied to mineral soils. The results indicate a rather small decrease of hydraulic conductivity with increasing suction compared to mineral soils. Optimizing τ therefore strongly reduced the model error under dry conditions, when high pressure head gradients occurred. As strongly negative pressure heads rarely occur in the investigated peatlands, we also reduced the range of pressure heads in the inversion to a 'wet range' from 0 to -200 cm. For the 'wet range', model performance was highly dependent on the inclusion of macropores. Here, fitting only the macropore fraction of the bimodal model as immediately drainable
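
    For reference, the van Genuchten retention curve and the Mualem conductivity function discussed above can be written down directly; τ is the 'tortuosity' exponent whose fitted values the study finds to deviate from the mineral-soil default of 0.5. The parameter values in the usage example are illustrative, not fitted peat properties:

```python
def van_genuchten_theta(h, theta_r, theta_s, alpha, n):
    """Volumetric water content at pressure head h (negative = suction),
    using the van Genuchten retention curve with m = 1 - 1/n."""
    if h >= 0:
        return theta_s
    m = 1.0 - 1.0 / n
    se = (1.0 + (alpha * abs(h)) ** n) ** (-m)   # effective saturation
    return theta_r + (theta_s - theta_r) * se

def mualem_k(h, ks, alpha, n, tau=0.5):
    """Unsaturated hydraulic conductivity from the Mualem model;
    tau is the tortuosity exponent (0.5 is the common mineral-soil default)."""
    if h >= 0:
        return ks
    m = 1.0 - 1.0 / n
    se = (1.0 + (alpha * abs(h)) ** n) ** (-m)
    return ks * se ** tau * (1.0 - (1.0 - se ** (1.0 / m)) ** m) ** 2

# illustrative parameters only (theta_r, theta_s, alpha [1/cm], n, Ks)
theta_wet = van_genuchten_theta(-100.0, 0.1, 0.8, 0.02, 1.4)
theta_dry = van_genuchten_theta(-1000.0, 0.1, 0.8, 0.02, 1.4)
```

    Larger τ steepens the conductivity drop with suction, which is why fitting τ matters most for the dry-range evaporation data described in the abstract.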

  3. Graph Model Based Indoor Tracking

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard; Lu, Hua; Yang, Bin

    2009-01-01

    infrastructure for different symbolic positioning technologies, e.g., Bluetooth and RFID. More specifically, the paper proposes a model of indoor space that comprises a base graph and mappings that represent the topology of indoor space at different levels. The resulting model can be used for one or several...... indoor positioning technologies. Focusing on RFID-based positioning, an RFID specific reader deployment graph model is built from the base graph model. This model is then used in several algorithms for constructing and refining trajectories from raw RFID readings. Empirical studies with implementations...

  4. Software Testing Method Based on Model Comparison

    Institute of Scientific and Technical Information of China (English)

    XIE Xiao-dong; LU Yan-sheng; MAO Cheng-yin

    2008-01-01

    A model comparison based software testing method (MCST) is proposed. In this method, the requirements and programs of the software under test are transformed into the same form and described by the same model description language (MDL). The requirements are transformed into a specification model and the programs into an implementation model. The elements and structures of the two models are then compared, and the differences between them are obtained. Based on the differences, a test suite is generated. Different MDLs can be chosen for the software under test. The usage of two classical MDLs in MCST, the equivalence classes model and the extended finite state machine (EFSM) model, is described with example applications. The results show that the test suites generated by MCST are more efficient and smaller than those of some other testing methods, such as the path-coverage testing method, the object state diagram testing method, etc.
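
    The core idea of MCST, comparing a specification model and an implementation model expressed in the same form and deriving tests from their differences, can be illustrated on a toy state-machine encoding. The dictionary representation and the example protocol below are assumptions for illustration, not the paper's MDL:

```python
def model_diff(spec, impl):
    """Compare two models encoded as {state: {event: next_state}} and return
    the transitions on which they disagree; each difference is a candidate
    test case exercising that state/event pair."""
    diffs = []
    for state in sorted(spec.keys() | impl.keys()):
        s_t, i_t = spec.get(state, {}), impl.get(state, {})
        for event in sorted(s_t.keys() | i_t.keys()):
            if s_t.get(event) != i_t.get(event):
                diffs.append((state, event, s_t.get(event), i_t.get(event)))
    return diffs

# specification vs. a (buggy) implementation of a small protocol
spec = {"idle": {"open": "ready"}, "ready": {"send": "ready", "close": "idle"}}
impl = {"idle": {"open": "ready"}, "ready": {"send": "idle", "close": "idle"}}
test_suite = model_diff(spec, impl)
# -> [("ready", "send", "ready", "idle")]: the one transition to cover with a test
```

    A real MCST tool would compare richer model elements (guards, variables, outputs), but the diff-then-generate flow is the same.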

  5. Agent Based Modelling for Social Simulation

    NARCIS (Netherlands)

    Smit, S.K.; Ubink, E.M.; Vecht, B. van der; Langley, D.J.

    2013-01-01

    This document is the result of an exploratory project looking into the status of, and opportunities for Agent Based Modelling (ABM) at TNO. The project focussed on ABM applications containing social interactions and human factors, which we termed ABM for social simulation (ABM4SS). During the course

  6. The application of GIS based decision-tree models for generating the spatial distribution of hydromorphic organic landscapes in relation to digital terrain data

    OpenAIRE

    R. Bou Kheir; P. K. Bøcher; M. B. Greve; M. H. Greve

    2010-01-01

    Accurate information about organic/mineral soil occurrence is a prerequisite for many land resources management applications (including climate change mitigation). This paper aims at investigating the potential of using geomorphometrical analysis and decision tree modeling to predict the geographic distribution of hydromorphic organic landscapes in unsampled area in Denmark. Nine primary (elevation, slope angle, slope aspect, plan curvature, profile curvature, tangent curvature, flow directio...

  7. Constitutive modeling of two-phase metallic composites with application to tungsten-based composite 93W–4.9Ni–2.1Fe

    Energy Technology Data Exchange (ETDEWEB)

    Lu, W.R.; Gao, C.Y., E-mail: lxgao@zju.edu.cn; Ke, Y.L.

    2014-01-13

    Two-phase metallic composites, composed of a metallic particulate reinforcing phase and a metallic matrix phase, have attracted much attention in recent years for their excellent material properties. However, constitutive modeling of two-phase metallic composites is still lacking. Most models used for them are oriented toward single-phase homogeneous metallic materials and do not consider the microstructural evolution of the components in the composite. This paper develops a new constitutive model for two-phase metallic composites based on the thermally activated dislocation motion mechanism and volume fraction evolution. By establishing the relation between the microscopic volume fraction and the macroscopic state variables (strain, strain rate and temperature), an evolution law for the volume fraction during plastic deformation in two-phase composites is proposed for the first time and introduced into the new model. The model is then applied to a typical two-phase tungsten-based composite, the 93W–4.9Ni–2.1Fe tungsten heavy alloy. We find that the model can effectively describe the plastic deformation behavior of the tungsten-based composite, owing to the introduction of volume fraction evolution and the connection between macroscopic state variables and micromechanical characteristics in the constitutive model. Validation against experimental data indicates that the new model provides a satisfactory prediction of flow stress for two-phase metallic composites, better than conventional single-phase homogeneous constitutive models including the Johnson–Cook (JC), Khan–Huang–Liang (KHL), Nemat-Nasser–Li (NNL), Zerilli–Armstrong (ZA) and Voyiadjis–Abed (VA) models.
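
    Of the single-phase models listed for comparison, the Johnson-Cook flow stress is the simplest to state: a strain-hardening term, a strain-rate term and a thermal-softening term multiplied together. The constants in the usage example are placeholders for illustration, not calibrated values for 93W-4.9Ni-2.1Fe:

```python
import math

def johnson_cook_stress(eps_p, eps_rate, T, A, B, n, C, m,
                        eps_rate0=1.0, T_room=293.0, T_melt=1723.0):
    """Johnson-Cook flow stress:
    (A + B*eps_p^n) * (1 + C*ln(eps_rate/eps_rate0)) * (1 - T*^m),
    with homologous temperature T* = (T - T_room) / (T_melt - T_room)."""
    T_star = max((T - T_room) / (T_melt - T_room), 0.0)
    return ((A + B * eps_p ** n)
            * (1.0 + C * math.log(eps_rate / eps_rate0))
            * (1.0 - T_star ** m))

# placeholder constants (illustrative only, units nominally MPa)
args = dict(A=700.0, B=300.0, n=0.3, C=0.02, m=1.0)
s_cold = johnson_cook_stress(0.1, 1000.0, 293.0, **args)
s_hot = johnson_cook_stress(0.1, 1000.0, 1000.0, **args)
```

    The abstract's point is that such single-phase forms carry no volume-fraction state, which is what the proposed two-phase model adds.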

  8. Model validation, science and application

    NARCIS (Netherlands)

    Builtjes, P.J.H.; Flossmann, A.

    1998-01-01

    In recent years there has been growing interest in establishing a proper validation of atmospheric chemistry-transport (ATC) models. Model validation deals with the comparison of model results with experimental data, and in this way addresses both model uncertainty and uncertainty in, and adequacy of, the experimental data

  9. Model-based reasoning and large-knowledge bases

    International Nuclear Information System (INIS)

    In such engineering fields as nuclear power plant engineering, technical information expressed in the form of schematics is frequently used. A new paradigm for model-based reasoning (MBR) and an AI tool called PLEXSYS (plant expert system) using this paradigm has been developed. PLEXSYS and the underlying paradigm are specifically designed to handle schematic drawings, by expressing drawings as models and supporting various sophisticated searches on these models. Two application systems have been constructed with PLEXSYS: one generates PLEXSYS models from existing CAD data files, and the other provides functions for nuclear power plant design support. Since the models can be generated from existing data resources, the design support system automatically has full access to a large-scale model or knowledge base representing actual nuclear power plants. (author)

  10. Application of numerical models and codes

    OpenAIRE

    Vyzikas, Thomas

    2014-01-01

    This report indicates the importance of numerical modelling in the modelling process, gradually builds the essential background theory in the fields of fluid mechanics, wave mechanics and numerical modelling, discusses a list of commonly used software and finally recommends which models are more suitable for different engineering applications in a marine renewable energy project.

  11. Atom-Role-Based Access Control Model

    Science.gov (United States)

    Cai, Weihong; Huang, Richeng; Hou, Xiaoli; Wei, Gang; Xiao, Shui; Chen, Yindong

    The role-based access control (RBAC) model has been widely recognized as an efficient access control model and has become a hot research topic in information security. However, in large-scale enterprise application environments, the traditional RBAC model based on the role hierarchy has the following deficiencies. Firstly, it is unable to reflect role relationships effectively in complicated cases, which does not accord with practical applications. Secondly, a senior role unconditionally inherits all permissions of its junior roles; thus a user holding a supervisor role may accumulate all permissions, which easily causes permission abuse and violates the least privilege principle, one of the main security principles. To deal with these problems, after analyzing permission types and role relationships, we propose the concept of the atom role and build an atom-role-based access control model, called ATRBAC, by dividing the permission set of each regular role according to inheritance path relationships. Application-specific analysis shows that this model can meet access control requirements well.
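
    The atom-role idea can be sketched as follows: permissions are grouped into atom roles, a regular role is a set of atoms, and a senior role is granted only the atoms it actually needs instead of inheriting everything a junior role holds. The class and the example roles are illustrative, not the ATRBAC formalism itself:

```python
class AtomRBAC:
    """Toy sketch of atom-role-based access control: permissions are grouped
    into atom roles, a regular role is a set of atoms, and a senior role is
    granted only the atoms it needs (least privilege)."""

    def __init__(self):
        self.atoms = {}   # atom role -> set of permissions
        self.roles = {}   # regular role -> set of atom roles
        self.users = {}   # user -> set of regular roles

    def define_atom(self, atom, perms):
        self.atoms[atom] = set(perms)

    def define_role(self, role, atom_names):
        self.roles[role] = set(atom_names)

    def assign(self, user, role):
        self.users.setdefault(user, set()).add(role)

    def check(self, user, perm):
        return any(perm in self.atoms.get(atom, ())
                   for role in self.users.get(user, ())
                   for atom in self.roles.get(role, ()))

rbac = AtomRBAC()
rbac.define_atom("clerk_entry", {"create_order"})
rbac.define_atom("clerk_edit", {"modify_order"})
rbac.define_atom("approval", {"approve_order"})
rbac.define_role("clerk", ["clerk_entry", "clerk_edit"])
# the senior role receives only the atoms it needs, not all clerk permissions
rbac.define_role("manager", ["clerk_entry", "approval"])
rbac.assign("alice", "manager")
```

    Here the manager can create and approve orders but, unlike under unconditional hierarchy inheritance, cannot modify them.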

  12. Model-based consensus

    NARCIS (Netherlands)

    Boumans, Marcel

    2014-01-01

    The aim of the rational-consensus method is to produce “rational consensus”, that is, “mathematical aggregation”, by weighing the performance of each expert on the basis of his or her knowledge and ability to judge relevant uncertainties. The measurement of the performance of the experts is based on

  14. Multi-dimensional rheology-based two-phase model for sediment transport and applications to sheet flow and pipeline scour

    Science.gov (United States)

    Lee, Cheng-Hsien; Low, Ying Min; Chiew, Yee-Meng

    2016-05-01

    Sediment transport is fundamentally a two-phase phenomenon involving fluid and sediments; however, many existing numerical models are one-phase approaches, which are unable to capture the complex fluid-particle and inter-particle interactions. In the last decade, two-phase models have gained traction; however, there are still many limitations in these models. For example, several existing two-phase models are confined to one-dimensional problems; in addition, the existing two-dimensional models simulate only the region outside the sand bed. This paper develops a new three-dimensional two-phase model for simulating sediment transport in the sheet flow condition, incorporating recently published rheological characteristics of sediments. The enduring-contact, inertial, and fluid viscosity effects are considered in determining sediment pressure and stresses, enabling the model to be applicable to a wide range of particle Reynolds number. A k-ε turbulence model is adopted to compute the Reynolds stresses. In addition, a novel numerical scheme is proposed, thus avoiding numerical instability caused by high sediment concentration and allowing the sediment dynamics to be computed both within and outside the sand bed. The present model is applied to two classical problems, namely sheet flow and scour under a pipeline, with favorable results. For sheet flow, the computed velocity is consistent with measured data reported in the literature. For pipeline scour, the computed scour rate beneath the pipeline agrees with previous experimental observations. However, the present model is unable to capture vortex shedding; consequently, the sediment deposition behind the pipeline is overestimated. Sensitivity analyses reveal that model parameters associated with turbulence have strong influence on the computed results.

  15. Cluster Based Text Classification Model

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah; Wiil, Uffe Kock

    2011-01-01

    We propose a cluster based classification model for suspicious email detection and other text classification tasks. Text classification tasks comprise many training examples that require a complex classification model. Using clusters for classification makes the model simpler and increases......, the classifier is trained on each cluster, having reduced dimensionality and fewer examples. The experimental results show that the proposed model outperforms the existing classification models for the task of suspicious email detection and topic categorization on the Reuters-21578 and 20 Newsgroups...... datasets. Our model also outperforms the A Decision Cluster Classification (ADCC) and the Decision Cluster Forest Classification (DCFC) models on the Reuters-21578 dataset....
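
    The cluster-then-classify idea summarized above can be sketched in a few lines. This is a hypothetical illustration, not the authors' implementation: documents are represented as numeric feature vectors, a plain k-means pass groups the training examples, and a simple nearest-class-centroid classifier is trained separately on each cluster.

```python
import math
import random

def dist(a, b):
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def assign(p, centroids):
    # index of the nearest cluster centroid
    return min(range(len(centroids)), key=lambda i: dist(p, centroids[i]))

def kmeans(points, k, iters=20, seed=0):
    centroids = random.Random(seed).sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[assign(p, centroids)].append(p)
        centroids = [tuple(sum(c) / len(g) for c in zip(*g)) if g else centroids[i]
                     for i, g in enumerate(groups)]
    return centroids

def fit(X, y, k=2):
    """Cluster the training set, then build per-cluster class centroids."""
    centroids = kmeans(X, k)
    buckets = [{} for _ in range(k)]
    for p, lab in zip(X, y):
        buckets[assign(p, centroids)].setdefault(lab, []).append(p)
    models = [{lab: tuple(sum(c) / len(pts) for c in zip(*pts))
               for lab, pts in bucket.items()} for bucket in buckets]
    return centroids, models

def predict(p, centroids, models):
    model = models[assign(p, centroids)]  # assumes every cluster got examples
    return min(model, key=lambda lab: dist(p, model[lab]))

# toy data: two well-separated clusters, each containing both classes
X = [(0, 0), (0, 1), (1, 0), (1, 1), (10, 10), (10, 11), (11, 10), (11, 11)]
y = ['a', 'a', 'b', 'b', 'a', 'a', 'b', 'b']
centroids, models = fit(X, y)
```

    Each per-cluster classifier sees only that cluster's (fewer, lower-variance) examples, which is the source of the simplification the abstract describes.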

  16. APPLICATION OF FRF ESTIMATOR BASED ON ERRORS-IN-VARIABLES MODEL IN MULTI-INPUT MULTI-OUTPUT VIBRATION CONTROL SYSTEM

    Institute of Scientific and Technical Information of China (English)

    GUAN Guangfeng; CONG Dacheng; HAN Junwei; LI Hongren

    2007-01-01

    The FRF estimator based on the errors-in-variables (EV) model of a multi-input multi-output (MIMO) system is presented to reduce the bias error of the FRF H1 estimator. The FRF H1 estimator is influenced by the noises in the inputs of the system and generates an under-estimation of the true FRF. The FRF estimator based on the EV model takes into account the errors in both the inputs and outputs of the system and leads to more accurate FRF estimation. The FRF estimator based on the EV model is applied to waveform replication on a 6-DOF (degree-of-freedom) hydraulic vibration table. The result shows that it is favorable for improving the control precision of the MIMO vibration control system.
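
    The bias mechanism described above can be demonstrated with a scalar toy system. This is an illustrative sketch, not the paper's MIMO spectral formulation: with noise on the measured input, a least-squares H1-type estimate of a static gain is biased low, while an errors-in-variables (total least-squares) estimate, which admits noise on both channels, recovers the true value.

```python
import math
import random

random.seed(1)
a_true = 2.0                       # true system gain
n = 20000
x_clean = [random.gauss(0.0, 1.0) for _ in range(n)]
x = [v + random.gauss(0.0, 0.5) for v in x_clean]           # noisy measured input
y = [a_true * v + random.gauss(0.0, 0.5) for v in x_clean]  # noisy measured output

sxx = sum(v * v for v in x)
syy = sum(v * v for v in y)
sxy = sum(u * v for u, v in zip(x, y))

# H1-type estimator: attributes all noise to the output -> biased low,
# since input noise inflates sxx
a_h1 = sxy / sxx

# EV (total least-squares) gain for equal noise variances on x and y
a_ev = (syy - sxx + math.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)
```

    With these noise levels the H1 estimate settles near 1.6 (an under-estimation, as the abstract states), while the EV estimate stays close to the true gain of 2.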

  17. Agent-based modeling and network dynamics

    CERN Document Server

    Namatame, Akira

    2016-01-01

    The book integrates agent-based modeling and network science. It is divided into three parts, namely, foundations, primary dynamics on and of social networks, and applications. The book begins with the network origin of agent-based models, known as cellular automata, and introduces a number of classic models, such as Schelling’s segregation model and Axelrod’s spatial game. The essence of the foundations part is the network-based agent-based models, in which agents follow network-based decision rules. Under the influence of the substantial progress in network science in the late 1990s, these models have been extended from using lattices to using small-world networks, scale-free networks, etc. The book also shows that modern network science, mainly driven by game theorists and sociophysicists, has inspired agent-based social scientists to develop alternative formation algorithms, known as agent-based social networks. The book reviews a number of pioneering and representative models in this family. Upon the gi...
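
    The flavor of the classic models mentioned above can be conveyed by a minimal one-dimensional Schelling-style sketch (illustrative only, not code from the book): agents of two types live on a ring, an agent is unhappy when fewer than half of its occupied neighbours share its type, and unhappy agents relocate to random vacancies.

```python
import random

def neighbours(grid, i, r=2):
    n = len(grid)
    return [grid[(i + d) % n] for d in range(-r, r + 1) if d != 0]

def satisfied(grid, i, threshold=0.5):
    occupied = [v for v in neighbours(grid, i) if v is not None]
    if grid[i] is None or not occupied:
        return True
    return sum(v == grid[i] for v in occupied) / len(occupied) >= threshold

def step(grid, rng):
    # one unhappy agent moves to one random vacant cell
    unhappy = [i for i, v in enumerate(grid) if v is not None and not satisfied(grid, i)]
    empty = [i for i, v in enumerate(grid) if v is None]
    if unhappy and empty:
        src, dst = rng.choice(unhappy), rng.choice(empty)
        grid[dst], grid[src] = grid[src], None

def satisfaction(grid):
    agents = [i for i, v in enumerate(grid) if v is not None]
    return sum(satisfied(grid, i) for i in agents) / len(agents)

rng = random.Random(0)
grid = [0] * 45 + [1] * 45 + [None] * 10   # two types plus vacancies
rng.shuffle(grid)
before = satisfaction(grid)
for _ in range(2000):
    step(grid, rng)
after = satisfaction(grid)
```

    Purely local relocation rules of this kind typically drive the ring toward segregated, high-satisfaction configurations, which is Schelling's original point.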

  18. Neural Network-Based Modeling of PEM fuel cell and Controller Synthesis of a stand-alone system for residential application

    Directory of Open Access Journals (Sweden)

    Khaled Mammar

    2012-11-01

    The paper focuses on the possibilities of applying artificial neural networks to create an optimal model of a PEM fuel cell. Various ANN approaches have been tested; the back-propagation feed-forward networks show satisfactory performance with regard to cell voltage prediction. The model is then used in a power system for residential application. This model includes an ANN fuel cell stack model, a reformer model, and a DC/AC inverter model. Furthermore, a neural network (NNTC) and a fuzzy logic (FLC) controller are used to control the active power of the PEM fuel cell system. The controllers modify the hydrogen flow feedback from the terminal load. The validity of the controller is verified when the fuel cell system model is used in conjunction with the NNTC controller to predict the response of the active power to: (a) computer-simulated step changes in the load active and reactive power demand, and (b) the actual active and reactive load demand of a single-family residence. Simulation results confirmed the high performance capability of the neural network controller (NNTC) to control power generation.
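
    As a toy illustration of the modeling approach above (not the authors' network, which is trained on real stack data), a small feed-forward network with one tanh hidden layer can be trained by back-propagation to fit a synthetic polarization-style curve in which voltage falls with current density:

```python
import math
import random

rng = random.Random(0)
# synthetic polarization-style data: voltage falls with current density
data = [(i / 10.0, 1.0 - 0.5 * (i / 10.0) - 0.1 * (i / 10.0) ** 2)
        for i in range(11)]

H = 4  # hidden units
w1 = [rng.uniform(-0.5, 0.5) for _ in range(H)]
b1 = [0.0] * H
w2 = [rng.uniform(-0.5, 0.5) for _ in range(H)]
b2 = 0.0

def forward(x):
    h = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
    return h, sum(w2[j] * h[j] for j in range(H)) + b2

def mse():
    return sum((forward(x)[1] - y) ** 2 for x, y in data) / len(data)

loss_before = mse()
lr = 0.05
for _ in range(2000):              # plain stochastic gradient descent
    for x, y in data:
        h, out = forward(x)
        err = out - y              # d(loss)/d(out), up to a factor of 2
        for j in range(H):
            grad_h = err * w2[j] * (1.0 - h[j] ** 2)
            w2[j] -= lr * err * h[j]
            w1[j] -= lr * grad_h * x
            b1[j] -= lr * grad_h
        b2 -= lr * err
loss_after = mse()
```

    The same fitted forward pass can then stand in for the physical stack inside a larger system simulation, which is essentially the role the ANN stack model plays in the paper.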

  19. Experimental-based Modelling and Simulation of Water Hydraulic Mechatronics Test Facilities for Motion Control and Operation in Environmental Sensitive Applications' Areas

    DEFF Research Database (Denmark)

    Conrad, Finn; Pobedza, J.; Sobczyk, A.

    2003-01-01

    The paper presents experimental-based modelling, simulation, analysis and design of water hydraulic actuators for motion control of machines, lifts, cranes and robots. The contributions include results from on-going research projects on fluid power and mechatronics based on tap water hydraulic...... proportional valves and servo actuators for motion control and power transmission undertaken in co-operation by the Technical University of Denmark (DTU) and the Cracow University of Technology (CUT). The results of this research co-operation include engineering design and test of simulation models compared with two mechatronic...

  20. Business model concept and application

    OpenAIRE

    Ogonowska, Kinga

    2010-01-01

    In this thesis I would like to clarify the major approaches to business models, define business model innovation, identify the types of business models and innovations that are applied in the companies under research, indicate the strengths and weaknesses of the business models studied, and determine their innovative value. The sources of data include secondary data from literature review, reports, and corporate web pages, and primary data from interviews with employees of the Polish companies under ...

  1. Metagenomics: An Application Based Perspective

    Directory of Open Access Journals (Sweden)

    Yasir Bashir; Salam Pradeep Singh; Bolin Kumar Konwar

    2014-01-01

    Metagenomics deals with the isolation of genetic material directly recovered from environmental samples. Metagenomics as an approach has emerged over the past two decades to elucidate a host of microbial communities inhabiting a specific niche with the goal of understanding their genetic diversity, population structure, and ecological role played by them. A number of new and novel molecules with significant functionalities and applications have been identified through this approach. In fact, many investigators are engaged in this field to unlock the untapped genetic resources with funding from the government sector. The sustainable economic future of modern industrialized societies requires the development of novel molecules, enzymes, processes, products, and applications. Metagenomics can also be applied to solve practical challenges in the field of medicine, agriculture, sustainability, and ecology. Metagenomics promises to provide new molecules and novel enzymes with diverse functions and enhanced features compared with enzymes from culturable microorganisms. Besides the application of metagenomics for unlocking novel biocatalysts from nature, it also has found applications in fields as diverse as bioremediation, personalized medicine, xenobiotic metabolism, and so forth.

  2. Applications and extensions of degradation modeling

    Energy Technology Data Exchange (ETDEWEB)

    Hsu, F.; Subudhi, M.; Samanta, P.K. (Brookhaven National Lab., Upton, NY (United States)); Vesely, W.E. (Science Applications International Corp., Columbus, OH (United States))

    1991-01-01

    Component degradation modeling being developed to understand the aging process can have many applications with potential advantages. Previous work has focused on developing the basic concepts and mathematical development of a simple degradation model. Using this simple model, times of degradation and failure occurrences were analyzed for standby components to detect indications of aging and to infer the effectiveness of maintenance in preventing age-related degradations from transforming to failures. Degradation modeling approaches can have broader applications in aging studies, and in this paper we discuss some of the extensions and applications of degradation modeling. The application and extension of degradation modeling approaches, presented in this paper, cover two aspects: (1) application to a continuously operating component, and (2) extension of the approach to analyze the degradation-failure rate relationship. The application of the modeling approach to a continuously operating component (namely, air compressors) shows the usefulness of this approach in studying aging effects and the role of maintenance in this type of component. In this case, aging effects in air compressors are demonstrated by the increase in both the degradation and failure rates; the faster increase in the failure rate compared to the degradation rate shows the ineffectiveness of the existing maintenance practices. The degradation-failure rate relationship was analyzed using data from residual heat removal system pumps. A simple linear model with a time-lag between these two parameters was studied. The application in this case showed a time-lag of 2 years for degradations to affect failure occurrences. 2 refs.
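
    The lagged linear degradation-failure relationship described above can be sketched as follows. This is an illustrative reconstruction, not the report's estimator: for each candidate lag, fit a proportional (through-origin) relation between the degradation-rate series and the failure-rate series shifted by that lag, and pick the lag with the smallest residual.

```python
def best_lag(degradation, failures, max_lag=5):
    """Return the time lag (in series steps) that best aligns the two rates."""
    def misfit(lag):
        pairs = [(degradation[t], failures[t + lag])
                 for t in range(len(degradation) - lag)]
        # least-squares slope through the origin for this alignment
        num = sum(d * f for d, f in pairs)
        den = sum(d * d for d, _ in pairs) or 1.0
        a = num / den
        return sum((f - a * d) ** 2 for d, f in pairs) / len(pairs)
    return min(range(max_lag + 1), key=misfit)

# synthetic yearly rates in which failures echo degradation 2 years later
deg = [1.0, 1.5, 2.3, 3.1, 4.2, 5.0, 6.1, 7.4, 8.2, 9.0]
fail = [0.1, 0.1] + [0.5 * d for d in deg[:-2]]
lag = best_lag(deg, fail)   # -> 2, matching the built-in 2-year lag
```

    On the synthetic data the recovered lag is the 2 years that were built in, mirroring the 2-year lag the report found for residual heat removal system pumps.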

  4. Application of a G-equation based combustion model and detailed chemistry to prediction of autoignition in a gasoline direct injection engine

    Energy Technology Data Exchange (ETDEWEB)

    Juneja, Harmit [Wisconsin Engine Research Consultants (WERC), Madison, WI (United States); Sczomak, David P. [General Motors Powertrain Advanced Engineering, Pontiac, MI (United States); Ge, Hai-Wen; Yang, Shiyou [Wisconsin Univ., Madison (United States); Reitz, Rolf D. [Wisconsin Univ., Madison (United States). Dept. of Mechanical Engineering

    2010-07-01

    Autoignition in an experimental single cylinder homogeneous gasoline direct injection engine was modeled using a level set method (G-equation) based combustion model incorporating detailed chemical kinetics. Several improvements have been made to the combustion model recently and implemented in the KIVA-3V code. These improvements include a transport equation residual model, the modeling of flame front quenching in highly stratified mixtures, and a recently developed primary reference fuel (PRF) chemistry mechanism. An adaptive multi grid chemistry (AMC) model was also implemented to speed-up the chemistry calculation. The integrated model was used to simulate the combustion process including the prediction of autoignition in a gasoline direct-injection engine. Experimental data in the form of a spark timing sweep covering a highly knocking to a non-knocking operating condition was used to validate the combustion modeling approach. The improved G-equation model combined with detailed chemistry matches the experimental data very well and does an excellent job at predicting the onset, location, and intensity of knock as a function of spark timing. (orig.)

  5. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    The paper demonstrates that a wide variety of event-based modeling approaches are based on special cases of the same general event concept, and that the general event concept can be used to unify the otherwise unrelated fields of information modeling and process modeling. A set of event......-based modeling approaches are analyzed and the results are used to formulate a general event concept that can be used for unifying the seemingly unrelated event concepts. Events are characterized as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms...... of information structures. The general event concept can be used to guide systems analysis and design and to improve modeling approaches....

  6. The Application Model of Moving Objects in Cargo Delivery System

    Institute of Scientific and Technical Information of China (English)

    ZHANG Feng-li; ZHOU Ming-tian; XU Bo

    2004-01-01

    The development of spatio-temporal database systems is primarily motivated by applications which track and present mobile objects. In this paper, solutions for establishing a moving object database based on a GPS/GIS environment are presented, and a data model of moving objects is given by using temporal logic to extend the query language; finally, the application model in a cargo delivery system is shown.
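
    The core query behind such tracking applications can be sketched in a few lines (a hypothetical illustration, not the paper's model): GPS fixes are stored as time-stamped samples, and the object's position at an arbitrary query time is linearly interpolated between the surrounding fixes.

```python
from bisect import bisect_right

def position_at(track, t):
    """track: list of (t, x, y) sorted by t; returns the interpolated (x, y)."""
    times = [p[0] for p in track]
    if t <= times[0]:
        return track[0][1:]          # clamp before the first fix
    if t >= times[-1]:
        return track[-1][1:]         # clamp after the last fix
    i = bisect_right(times, t)
    (t0, x0, y0), (t1, x1, y1) = track[i - 1], track[i]
    w = (t - t0) / (t1 - t0)         # fraction of the way along the segment
    return (x0 + w * (x1 - x0), y0 + w * (y1 - y0))

# hypothetical cargo-truck track: fixes at t = 0, 10, 20
truck = [(0, 0.0, 0.0), (10, 10.0, 0.0), (20, 10.0, 5.0)]
```

    A spatio-temporal query language effectively wraps primitives like this (plus range and trajectory predicates) behind declarative syntax.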

  7. Potential Teachers' Appropriate and Inappropriate Application of Pedagogical Resources in a Model-Based Physics Course: A "Knowledge in Pieces" Perspective on Teacher Learning

    Science.gov (United States)

    Harlow, Danielle B.; Bianchini, Julie A.; Swanson, Lauren H.; Dwyer, Hilary A.

    2013-01-01

    We used a "knowledge in pieces" perspective on teacher learning to document undergraduates' pedagogical resources in a model-based physics course for potential teachers. We defined pedagogical resources as small, discrete ideas about teaching science that are applied appropriately or inappropriately in specific contexts. Neither…

  8. Application of physiologically based toxicokinetic modelling to study the impact of the exposure scenario on the toxicokinetics and the behavioural effects of toluene in rats

    NARCIS (Netherlands)

    Asperen, J. van; Rijcken, W.R.P.; Lammers, J.H.C.M.

    2003-01-01

    The toxicity of inhalatory exposure to organic solvents may not only be related to the total external dose, but also to the pattern of exposure. In this study physiologically based toxicokinetic (PBTK) modelling has been used to study the impact of the exposure scenario on the toxicokinetics and the

  9. Property Modelling for Applications in Chemical Product and Process Design

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Physical-chemical properties of pure chemicals and their mixtures play an important role in the design of chemicals based products and the processes that manufacture them. Although the use of experimental data in design and analysis of chemicals based products and their processes is desirable...... When a needed group interaction is missing, the atom connectivity based model is employed to predict the missing group interaction. In this way, a wide application range of the property modeling tool is ensured. Based on the property models, targeted computer-aided techniques have been developed for design and analysis of organic chemicals, polymers, and mixtures as well as separation processes. The presentation will highlight the framework (ICAS software) for property modeling, the property models, and issues such as prediction accuracy, flexibility, maintenance, and updating of the database. Also, application issues related to the use of property......
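
    To give a flavor of what a group-contribution property model looks like, here is a minimal sketch in the style of the Joback method for the normal boiling point. The two group values are the published Joback CH3/CH2 contributions; everything else is illustrative, and the framework discussed above (ICAS) implements far richer models.

```python
# Joback-style group-contribution estimate of the normal boiling point:
# Tb = 198.2 K + sum of group contributions (Joback & Reid, 1987)
GROUP_TB = {
    "-CH3": 23.58,   # contribution in K per occurrence
    "-CH2-": 22.88,
}

def boiling_point(groups):
    """groups: mapping of group name -> count of that group in the molecule."""
    return 198.2 + sum(GROUP_TB[g] * n for g, n in groups.items())

# propane = 2 x CH3 + 1 x CH2
tb_propane = boiling_point({"-CH3": 2, "-CH2-": 1})
```

    Joback estimates are known to deviate for small molecules (propane's experimental boiling point is about 231 K versus roughly 268 K here), which is one motivation for backup models, such as the atom-connectivity model mentioned above, and for maintained property databases.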

  11. Towards an MDA-based development methodology for distributed applications

    NARCIS (Netherlands)

    Gavras, A.; Belaunde, M.; Ferreira Pires, L.; Andrade Almeida, J.P.; van Sinderen, M.J.; Ferreira Pires, L.

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  12. A nonlinear-elastic constitutive model for soft connective tissue based on a histologic description: Application to female pelvic soft tissue.

    Science.gov (United States)

    Brieu, Mathias; Chantereau, Pierre; Gillibert, Jean; de Landsheere, Laurent; Lecomte, Pauline; Cosson, Michel

    2016-05-01

    To understand the mechanical behavior of soft tissues, two fields of science are essential: biomechanics and histology. Nonetheless, those two fields have not yet been studied together often enough to be unified by a comprehensive model. This study attempts to produce such a model. Biomechanical uniaxial tension tests were performed on vaginal tissues from 7 patients undergoing surgery. In parallel, vaginal tissue from the same patients was histologically assessed to determine the elastic fiber ratio. These observations demonstrated a relationship between the stiffness of tissue and its elastin content. To extend this study, a mechanical model, based on a histologic description, was developed to quantitatively correlate the mechanical behavior of vaginal tissue to its elastic fiber content. A satisfactory single-parameter model was developed assuming that the mechanical behavior of collagen and elastin was the same for all patients and that tissues are only composed of collagen and elastin. This single-parameter model showed good correlation with experimental results. The single-parameter mechanical model described here, based on histological description, could be very useful in helping to understand and better describe soft tissues with a view to their characterization. The mechanical behavior of a tissue can thus be determined thanks to its elastin content without introducing too many unidentified parameters.
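
    The single-parameter idea can be sketched schematically. This is an illustrative reconstruction only, not the paper's constitutive law: assume a measured stiffness scales with the histologically determined elastin fraction, and fit the one free parameter k by least squares across patients. The numbers, and even the direction of the trend, are made up for the example.

```python
patients = [
    # (elastin fraction, measured stiffness, arbitrary units) - synthetic data
    (0.10, 2.1), (0.15, 2.9), (0.20, 4.2), (0.25, 5.0), (0.30, 6.1),
]

# least-squares slope through the origin: the model's single parameter
num = sum(r * e for r, e in patients)
den = sum(r * r for r, _ in patients)
k = num / den

def predict_stiffness(elastin_fraction):
    """One-parameter model: stiffness predicted from histology alone."""
    return k * elastin_fraction
```

    The appeal the abstract points to is exactly this economy: once k is fixed, a tissue's mechanical response is predicted from its elastin content alone, with no further unidentified parameters.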

  13. Flower solid modeling based on sketches

    Institute of Scientific and Technical Information of China (English)

    Zhan DING; Shu-chang XU; Xiu-zi YE; Yin ZHANG; San-yuan ZHANG

    2008-01-01

    In this paper we propose a method for modeling flowers of solid shape. Based on the method of Ijiri et al. (2005), we separate the individual flower modeling and inflorescence modeling procedures into structure and geometry modeling. We incorporate interactive editing gestures to allow the user to edit structure parameters freely on the structure diagram. Furthermore, we use free-hand sketching techniques to allow users to create and edit 3D geometrical elements freely and easily. The final step is to automatically merge all independent 3D geometrical elements into a single waterproof mesh. Our experiments show that this solid modeling approach is promising. Using our approach, novice users can create vivid flower models easily and freely. The generated flower model is waterproof, and can have applications in visualization, animation, and gaming, and in toys and decorations if printed out on 3D rapid prototyping devices.

  14. Recent Applications of Mesoscale Modeling to Nanotechnology and Drug Delivery

    Energy Technology Data Exchange (ETDEWEB)

    Maiti, A; Wescott, J; Kung, P; Goldbeck-Wood, G

    2005-02-11

    Mesoscale simulations have traditionally been used to investigate the structural morphology of polymers in solution, melts, and blends. Recently we have been pushing such modeling methods into important areas of nanotechnology and drug delivery that are well out of reach of classical molecular dynamics. This paper summarizes our efforts in three important emerging areas: (1) polymer-nanotube composites; (2) drug diffusivity through cell membranes; and (3) solvent exchange in nanoporous membranes. The first two applications are based on a bead-spring approach as encoded in the Dissipative Particle Dynamics (DPD) module. The last application used density-based mesoscale modeling as implemented in the Mesodyn module.

  15. Model-based testing for embedded systems

    CERN Document Server

    Zander, Justyna; Mosterman, Pieter J

    2011-01-01

    What the experts have to say about Model-Based Testing for Embedded Systems: "This book is exactly what is needed at the exact right time in this fast-growing area. From its beginnings over 10 years ago of deriving tests from UML statecharts, model-based testing has matured into a topic with both breadth and depth. Testing embedded systems is a natural application of MBT, and this book hits the nail exactly on the head. Numerous topics are presented clearly, thoroughly, and concisely in this cutting-edge book. The authors are world-class leading experts in this area and teach us well-used

  16. Photocell modelling for thermophotovoltaic applications

    Energy Technology Data Exchange (ETDEWEB)

    Mayor, J.-C.; Durisch, W.; Grob, B.; Panitz, J.-C. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1999-08-01

    The goal of the modelling described here is the extrapolation of the performance characteristics of solar photocells to TPV working conditions. The model accounts for the higher flux of radiation and for the higher temperatures reached in TPV converters. (author) 4 figs., 1 tab., 2 refs.

  17. Markov chains models, algorithms and applications

    CERN Document Server

    Ching, Wai-Ki; Ng, Michael K; Siu, Tak-Kuen

    2013-01-01

    This new edition of Markov Chains: Models, Algorithms and Applications has been completely reformatted as a text, complete with end-of-chapter exercises, a new focus on management science, new applications of the models, and new examples with applications in financial risk management and modeling of financial data.This book consists of eight chapters.  Chapter 1 gives a brief introduction to the classical theory on both discrete and continuous time Markov chains. The relationship between Markov chains of finite states and matrix theory will also be highlighted. Some classical iterative methods
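
    A small worked example in the spirit of the book's introductory chapter (illustrative, not taken from the text): the stationary distribution of a two-state discrete-time Markov chain, found by repeatedly applying the transition matrix.

```python
# row-stochastic transition matrix of a two-state chain
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(pi, P):
    # one application of pi <- pi P
    return [sum(pi[i] * P[i][j] for i in range(len(P))) for j in range(len(P))]

pi = [1.0, 0.0]          # start with all mass in state 0
for _ in range(200):     # power iteration converges geometrically here
    pi = step(pi, P)
# analytic stationary distribution of this chain: (5/6, 1/6)
```

    Solving the balance equation pi[0] * 0.1 = pi[1] * 0.5 with pi[0] + pi[1] = 1 gives (5/6, 1/6), which the iteration reproduces; this matrix-theory view of finite-state chains is exactly the link the book's first chapter highlights.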

  18. Electrochemistry-based Battery Modeling for Prognostics

    Science.gov (United States)

    Daigle, Matthew J.; Kulkarni, Chetan Shrikant

    2013-01-01

    Batteries are used in a wide variety of applications. In recent years, they have become popular as a source of power for electric vehicles such as cars, unmanned aerial vehicles, and commercial passenger aircraft. In such application domains, it becomes crucial to both monitor battery health and performance and to predict end of discharge (EOD) and end of useful life (EOL) events. To implement such technologies, it is essential to understand how batteries work and to capture that knowledge in the form of models that can be used by monitoring, diagnosis, and prognosis algorithms. In this work, we develop electrochemistry-based models of lithium-ion batteries that capture the significant electrochemical processes, are computationally efficient, capture the effects of aging, and are of suitable accuracy for reliable EOD prediction in a variety of usage profiles. This paper reports on the progress of such a model, with results demonstrating the model validity and accurate EOD predictions.
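
    The EOD-prediction idea can be illustrated with a deliberately simple discharge simulation. This is not NASA's electrochemistry model; it is a toy equivalent-circuit-style sketch with an assumed linear open-circuit-voltage curve, predicting EOD as the time when terminal voltage crosses a cutoff.

```python
def predict_eod(capacity_ah, current_a, v_full=4.2, v_cutoff=3.0,
                r_internal=0.05, dt_s=1.0):
    """Return seconds until terminal voltage drops below the cutoff."""
    soc, t = 1.0, 0.0                           # state of charge, elapsed time
    while True:
        ocv = 3.0 + (v_full - 3.0) * soc        # assumed linear OCV-vs-SOC curve
        v_term = ocv - current_a * r_internal   # IR drop under load
        if v_term <= v_cutoff or soc <= 0.0:
            return t
        soc -= current_a * dt_s / (capacity_ah * 3600.0)   # coulomb counting
        t += dt_s

eod_2a = predict_eod(capacity_ah=2.0, current_a=2.0)   # ~3300 s at 2 A
eod_4a = predict_eod(capacity_ah=2.0, current_a=4.0)   # ~1500 s at 4 A
```

    Heavier loads reach EOD sooner both through faster charge depletion and through the larger IR drop; electrochemistry-based models refine exactly these terms (concentration overpotentials, aging effects) while keeping the same prediction structure.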

  19. The Research and Application of Extended RBAC Model Based on Organization Structure

    Institute of Scientific and Technical Information of China (English)

    范志; 顾治波

    2013-01-01

    Aiming at the problems of the traditional role-based access control model, an extended RBAC model based on organization structure is proposed. The model introduces the organizational structure and user groups, effectively solving the problems in current RBAC permission management and meeting large enterprises' requirements for fine-grained permission management. The model conforms to the organizational management characteristics of modern large-scale enterprises.

  20. Modeling Guru: Knowledge Base for NASA Modelers

    Science.gov (United States)

    Seablom, M. S.; Wojcik, G. S.; van Aartsen, B. H.

    2009-05-01

    Modeling Guru is an on-line knowledge-sharing resource for anyone involved with or interested in NASA's scientific models or High End Computing (HEC) systems. Developed and maintained by NASA's Software Integration and Visualization Office (SIVO) and the NASA Center for Computational Sciences (NCCS), Modeling Guru's combined forums and knowledge base for research and collaboration is becoming a repository for the accumulated expertise of NASA's scientific modeling and HEC communities. All NASA modelers and associates are encouraged to participate and provide knowledge about the models and systems so that other users may benefit from their experience. Modeling Guru is divided into a hierarchy of communities, each with its own set of forums and knowledge base documents. Current modeling communities include those for space science, land and atmospheric dynamics, atmospheric chemistry, and oceanography. In addition, there are communities focused on NCCS systems, HEC tools and libraries, and programming and scripting languages. Anyone may view most of the content on Modeling Guru (available at http://modelingguru.nasa.gov/), but you must log in to post messages and subscribe to community postings. The site offers a full range of "Web 2.0" features, including discussion forums, "wiki" document generation, document uploading, RSS feeds, search tools, blogs, email notification, and "breadcrumb" links. A discussion (a.k.a. forum "thread") is used to post comments, solicit feedback, or ask questions. If marked as a question, SIVO will monitor the thread and normally respond within a day. Discussions can include embedded images, tables, and formatting through the use of the Rich Text Editor. Also, the user can add "Tags" to their thread to facilitate later searches. The "knowledge base" comprises documents that are used to capture and share expertise with others. The default "wiki" document lets users edit within the browser so others can easily collaborate on the