WorldWideScience

Sample records for previously implemented model-based

  1. A model based security testing method for protocol implementation.

    Science.gov (United States)

    Fu, Yu Long; Xin, Xiao Long

    2014-01-01

    The security of a protocol implementation is important and hard to verify. Because penetration testing usually relies on the experience of the security tester and on the specific protocol specification, a formal and automatic verification method is required. In this paper, we propose an extended model of IOLTS to describe the legitimate roles and the intruders of security protocol implementations, and then combine them to generate suitable test cases for verifying the security of the protocol implementation.
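
    As a rough illustration of model-based test generation in general (not the authors' extended IOLTS construction), the Python sketch below enumerates input/output sequences from a hypothetical labelled transition system; the protocol states, labels, and depth bound are invented for the example.

        from collections import deque

        # Hypothetical labelled transition system for a toy challenge-response protocol.
        # Transitions map a state to a list of (label, next_state); labels prefixed with
        # "?" are inputs sent to the implementation, "!" are expected outputs.
        LTS = {
            "idle":      [("?connect", "wait_auth")],
            "wait_auth": [("?credentials", "check"), ("!timeout", "idle")],
            "check":     [("!accept", "session"), ("!reject", "idle")],
            "session":   [("?disconnect", "idle")],
        }

        def generate_test_sequences(lts, start, max_depth):
            """Breadth-first enumeration of label sequences up to max_depth.

            Each sequence is a candidate test case: drive the implementation with the
            "?" inputs and check that it produces the "!" outputs in order.
            """
            tests = []
            queue = deque([(start, [])])
            while queue:
                state, trace = queue.popleft()
                if trace:
                    tests.append(trace)
                if len(trace) >= max_depth:
                    continue
                for label, nxt in lts.get(state, []):
                    queue.append((nxt, trace + [label]))
            return tests

        if __name__ == "__main__":
            for seq in generate_test_sequences(LTS, "idle", 3):
                print(" -> ".join(seq))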

  2. Implementing model based systems engineering in South Africa

    CSIR Research Space (South Africa)

    Oosthuizen, Rudolph

    2017-10-01

    Full Text Available Model-based systems engineering (MBSE) offers significant benefits such as enhancing productivity and quality, reducing risk, and providing improved communications among the system development team. Implementing MBSE in the South African environment requires a process suited to the local environment...

  3. Potential implementation of reservoir computing models based on magnetic skyrmions

    Science.gov (United States)

    Bourianoff, George; Pinna, Daniele; Sitte, Matthias; Everschor-Sitte, Karin

    2018-05-01

    Reservoir Computing is a type of recursive neural network commonly used for recognizing and predicting spatio-temporal events, relying on a complex hierarchy of nested feedback loops to generate a memory functionality. The Reservoir Computing paradigm does not require any knowledge of the reservoir topology or node weights for training purposes and can therefore utilize naturally existing networks formed by a wide variety of physical processes. Most prior efforts to implement reservoir computing have focused on utilizing memristor techniques to implement recursive neural networks. This paper examines the potential of magnetic skyrmion fabrics and the complex current patterns which form in them as an attractive physical instantiation for Reservoir Computing. We argue that their nonlinear dynamical interplay resulting from anisotropic magnetoresistance and spin-torque effects allows for an effective and energy-efficient nonlinear processing of spatio-temporal events with the aim of event recognition and prediction.
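
    The training property referred to above (only a linear readout is trained, while the reservoir itself stays fixed) can be made concrete with a minimal echo state network sketch in Python; the reservoir size, spectral radius, and toy prediction task are assumptions for illustration and do not model skyrmion fabrics.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy task: one-step-ahead prediction of a sine wave.
        t = np.linspace(0, 40 * np.pi, 4000)
        u = np.sin(t)[:, None]            # input sequence
        y = np.sin(t + 0.1)[:, None]      # target: slightly advanced signal

        n_res = 200                       # reservoir size (assumed)
        W_in = rng.uniform(-0.5, 0.5, (n_res, 1))
        W = rng.normal(0, 1, (n_res, n_res))
        W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # scale spectral radius below 1

        # Run the fixed, untrained reservoir and collect its states.
        x = np.zeros(n_res)
        states = np.zeros((len(u), n_res))
        for k in range(len(u)):
            x = np.tanh(W_in @ u[k] + W @ x)
            states[k] = x

        # Train only the linear readout (ridge regression), discarding a warm-up period.
        warm = 200
        S, Y = states[warm:], y[warm:]
        W_out = np.linalg.solve(S.T @ S + 1e-6 * np.eye(n_res), S.T @ Y)

        pred = states @ W_out
        print("RMSE after warm-up:", np.sqrt(np.mean((pred[warm:] - y[warm:]) ** 2)))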

  4. The implementation of assessment model based on character building to improve students’ discipline and achievement

    Science.gov (United States)

    Rusijono; Khotimah, K.

    2018-01-01

    The purpose of this research was to investigate the effect of implementing the assessment model based on character building on students' discipline and achievement. The assessment model based on character building includes three components: student behaviour, effort, and achievement. The model was implemented in the science philosophy and educational assessment courses of the Graduate Program of the Educational Technology Department, Educational Faculty, Universitas Negeri Surabaya. This research used a control-group pre-test and post-test design. The data collection methods were observation and testing: observation was used to collect data about student discipline during the instructional process, while the test was used to collect data about student achievement. A t-test was applied for data analysis. The results showed that the assessment model based on character building improved students' discipline and achievement.

  5. Implementing Lumberjacks and Black Swans Into Model-Based Tools to Support Human-Automation Interaction.

    Science.gov (United States)

    Sebok, Angelia; Wickens, Christopher D

    2017-03-01

    The objectives were to (a) implement theoretical perspectives regarding human-automation interaction (HAI) into model-based tools to assist designers in developing systems that support effective performance and (b) conduct validations to assess the ability of the models to predict operator performance. Two key concepts in HAI, the lumberjack analogy and black swan events, have been studied extensively. The lumberjack analogy describes the effects of imperfect automation on operator performance. In routine operations, an increased degree of automation supports performance, but in failure conditions, increased automation results in more significantly impaired performance. Black swans are the rare and unexpected failures of imperfect automation. The lumberjack analogy and black swan concepts have been implemented into three model-based tools that predict operator performance in different systems. These tools include a flight management system, a remotely controlled robotic arm, and an environmental process control system. Each modeling effort included a corresponding validation. In one validation, the software tool was used to compare three flight management system designs, which were ranked in the same order as predicted by subject matter experts. The second validation compared model-predicted operator complacency with empirical performance in the same conditions. The third validation compared model-predicted and empirically determined time to detect and repair faults in four automation conditions. The three model-based tools offer useful ways to predict operator performance in complex systems. The three tools offer ways to predict the effects of different automation designs on operator performance.

  6. Streaming Model Based Volume Ray Casting Implementation for Cell Broadband Engine

    Directory of Open Access Journals (Sweden)

    Jusub Kim

    2009-01-01

    Full Text Available Interactive high-quality volume rendering is becoming increasingly important as the amount of complex volumetric data steadily grows. While a number of volumetric rendering techniques have been widely used, ray casting has been recognized as an effective approach for generating high-quality visualization. However, for most users, the use of ray casting has been limited to very small datasets because of its high demands on computational power and memory bandwidth. The recent introduction of the Cell Broadband Engine (Cell B.E.) processor, which consists of 9 heterogeneous cores designed to handle extremely demanding computations with large streams of data, provides an opportunity to put ray casting into practical use. In this paper, we introduce an efficient parallel implementation of volume ray casting on the Cell B.E. The implementation is designed to take full advantage of the computational power and memory bandwidth of the Cell B.E. using an intricate orchestration of the ray casting computation on the available heterogeneous resources. Specifically, we introduce streaming model-based schemes and techniques to efficiently implement acceleration techniques for ray casting on the Cell B.E. In addition to ensuring effective SIMD utilization, our method provides two key benefits: there is no cost for empty space skipping and there is no memory bottleneck on moving volumetric data for processing. Our experimental results show that we can interactively render practical datasets on a single Cell B.E. processor.
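
    The abstract's acceleration claims concern the Cell B.E. streaming implementation, but the underlying ray-casting loop can be sketched generically as front-to-back compositing with early ray termination; the synthetic volume, sampling step, and transfer function below are assumptions chosen only for illustration.

        import numpy as np

        # Synthetic 64^3 volume: a soft sphere of density values in [0, 1].
        n = 64
        g = np.linspace(-1, 1, n)
        X, Y, Z = np.meshgrid(g, g, g, indexing="ij")
        volume = np.clip(1.0 - np.sqrt(X**2 + Y**2 + Z**2), 0.0, 1.0)

        def cast_ray(origin, direction, step=0.01, n_steps=300):
            """Front-to-back compositing along one ray with early ray termination."""
            direction = direction / np.linalg.norm(direction)
            color, alpha = 0.0, 0.0
            for i in range(n_steps):
                p = origin + i * step * direction          # sample position in [-1, 1]^3
                idx = np.clip(((p + 1) / 2 * (n - 1)).astype(int), 0, n - 1)
                density = volume[tuple(idx)]               # nearest-neighbour lookup
                sample_alpha = density * 0.05              # simple assumed transfer function
                color += (1.0 - alpha) * sample_alpha * density
                alpha += (1.0 - alpha) * sample_alpha
                if alpha > 0.99:                           # early ray termination
                    break
            return color

        print(cast_ray(np.array([-1.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])))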

  7. Implementing model-based system engineering for the whole lifecycle of a spacecraft

    Science.gov (United States)

    Fischer, P. M.; Lüdtke, D.; Lange, C.; Roshani, F.-C.; Dannemann, F.; Gerndt, A.

    2017-09-01

    Design information of a spacecraft is collected over all phases in the lifecycle of a project. A lot of this information is exchanged between different engineering tasks and business processes. In some lifecycle phases, model-based system engineering (MBSE) has introduced system models and databases that help to organize such information and to keep it consistent for everyone. Nevertheless, none of the existing databases has yet covered the whole lifecycle. Virtual Satellite is the MBSE database developed at DLR. It has been used for quite some time in Phase A studies and is currently being extended to cover the whole lifecycle of spacecraft projects. Since it is unforeseeable which future use cases such a database needs to support in all these different projects, the underlying data model has to provide tailoring and extension mechanisms to its conceptual data model (CDM). This paper explains the mechanisms as they are implemented in Virtual Satellite, which enables extending the CDM along the project without corrupting already stored information. As an upcoming major use case, Virtual Satellite will be implemented as the MBSE tool in the S2TEP project. This project provides a new satellite bus for internal research and several different payload missions in the future. This paper explains how Virtual Satellite will be used to manage configuration control problems associated with such a multi-mission platform. It discusses how the S2TEP project starts using the software for collecting the first design information from concurrent engineering studies, then making use of the extension mechanisms of the CDM to introduce further information artefacts such as the functional electrical architecture, thus linking more and more processes into an integrated MBSE approach.

  8. Validation and implementation of model based control strategies at an industrial wastewater treatment plant.

    Science.gov (United States)

    Demey, D; Vanderhaegen, B; Vanhooren, H; Liessens, J; Van Eyck, L; Hopkins, L; Vanrolleghem, P A

    2001-01-01

    In this paper, the practical implementation and validation of advanced control strategies, designed using model-based techniques, at an industrial wastewater treatment plant are demonstrated. The plant under study treats the wastewater of a large pharmaceutical production facility. The process characteristics of the wastewater treatment were quantified by means of tracer tests, intensive measurement campaigns and the use of on-line sensors. In parallel, a dynamical model of the complete wastewater plant was developed according to the specific kinetic characteristics of the sludge and the highly varying composition of the industrial wastewater. Based on real-time data and dynamic models, control strategies for the equalisation system, the polymer dosing and phosphorus addition were established. The control strategies are being integrated in the existing SCADA system, combining traditional PLC technology with robust PC-based control calculations. The use of intelligent control in wastewater treatment offers a wide spectrum of possibilities to upgrade existing plants, to increase the capacity of the plant and to eliminate peaks. This can result in a more stable and secure overall performance and, finally, in cost savings. The use of on-line sensors has a potential not only for monitoring concentrations, but also for manipulating flows and concentrations. In this way, the performance of the plant can be secured.

  9. Technical Note: FreeCT_ICD: An Open Source Implementation of a Model-Based Iterative Reconstruction Method using Coordinate Descent Optimization for CT Imaging Investigations.

    Science.gov (United States)

    Hoffman, John M; Noo, Frédéric; Young, Stefano; Hsieh, Scott S; McNitt-Gray, Michael

    2018-06-01

    To facilitate investigations into the impacts of acquisition and reconstruction parameters on quantitative imaging, radiomics and CAD using CT imaging, we previously released an open source implementation of a conventional weighted filtered backprojection reconstruction called FreeCT_wFBP. Our purpose was to extend that work by providing an open-source implementation of a model-based iterative reconstruction method using coordinate descent optimization, called FreeCT_ICD. Model-based iterative reconstruction offers the potential for substantial radiation dose reduction, but can impose substantial computational processing and storage requirements. FreeCT_ICD is an open source implementation of a model-based iterative reconstruction method that provides a reasonable tradeoff between these requirements. This was accomplished by adapting a previously proposed method that allows the system matrix to be stored with a reasonable memory requirement. The method amounts to describing the attenuation coefficient using rotating slices that follow the helical geometry. In the initially-proposed version, the rotating slices are themselves described using blobs. We have replaced this description by a unique model that relies on tri-linear interpolation together with the principles of Joseph's method. This model offers an improvement in memory requirement while still allowing highly accurate reconstruction for conventional CT geometries. The system matrix is stored column-wise and combined with an iterative coordinate descent (ICD) optimization. The result is FreeCT_ICD, which is a reconstruction program developed on the Linux platform using C++ libraries and the open source GNU GPL v2.0 license. The software is capable of reconstructing raw projection data of helical CT scans. In this work, the software has been described and evaluated by reconstructing datasets exported from a clinical scanner which consisted of an ACR accreditation phantom dataset and a clinical pediatric
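
    As a conceptual illustration of the optimization pattern named here, and not of FreeCT_ICD itself, the sketch below applies iterative coordinate descent to a small least-squares reconstruction problem with a nonnegativity constraint; the toy system matrix, data, and sweep count are assumed.

        import numpy as np

        rng = np.random.default_rng(1)

        # Toy "system matrix" A (rays x voxels) and noiseless projection data b.
        n_rays, n_voxels = 120, 40
        A = np.abs(rng.normal(size=(n_rays, n_voxels)))   # assumed stand-in for a CT system matrix
        x_true = np.abs(rng.normal(size=n_voxels))
        b = A @ x_true

        # Iterative coordinate descent: update one voxel at a time, keeping the
        # residual r = b - A @ x consistent so each update costs O(n_rays).
        x = np.zeros(n_voxels)
        r = b.copy()
        col_norm2 = (A ** 2).sum(axis=0)
        for sweep in range(50):
            for j in range(n_voxels):
                step = (A[:, j] @ r) / col_norm2[j]
                new_xj = max(0.0, x[j] + step)             # nonnegativity constraint
                delta = new_xj - x[j]
                if delta != 0.0:
                    r -= delta * A[:, j]
                    x[j] = new_xj

        print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))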

  10. The implementation of discovery learning model based on lesson study to increase student's achievement in colloid

    Science.gov (United States)

    Suyanti, Retno Dwi; Purba, Deby Monika

    2017-03-01

    The objective of this research is to measure the increase in students' achievement under the discovery learning model based on lesson study. In addition, the research examines the cognitive aspects. The research was conducted in three schools, including SMA N 3 Medan; the population reported here comprises all students of SMA N 11 Medan, selected by purposive random sampling. The research instrument was a validated achievement test. The research data were analyzed statistically using MS Excel. The results show that the achievement of students taught with the discovery learning model based on lesson study is higher than that of students taught with the direct instruction method. This can be seen from the average gain and is confirmed by a t-test: the normalized gain in the experimental class of SMA N 11 is 0.74±0.12 versus 0.45±0.12 in the control class, and at significance level α = 0.05, Ha is accepted and Ho is rejected, since tcount > ttable in SMA N 11 (9.81 > 1.66). The cognitive aspect with the greatest improvement across the three schools is C2, reaching 0.84 (high) in SMA N 11. The lesson-study observation sheets from SMA N 11 show 92% of students working together, while 67% were less active in using media.
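
    As a side note on the statistics reported in this record, the normalized gain and the two-sample t-test can be reproduced along the following lines; the score arrays are invented placeholders, not the study's data.

        import numpy as np
        from scipy import stats

        # Hypothetical pre/post-test scores (percent) for two classes -- placeholders only.
        pre_exp,  post_exp  = np.array([40, 35, 50, 45]), np.array([85, 82, 90, 86])
        pre_ctrl, post_ctrl = np.array([42, 38, 48, 44]), np.array([65, 60, 72, 68])

        def normalized_gain(pre, post):
            """Hake's normalized gain <g> = (post - pre) / (100 - pre), averaged per student."""
            return np.mean((post - pre) / (100 - pre))

        g_exp, g_ctrl = normalized_gain(pre_exp, post_exp), normalized_gain(pre_ctrl, post_ctrl)

        # Two-sample t-test on the per-student gains (H0: equal mean gains).
        gains_exp = (post_exp - pre_exp) / (100 - pre_exp)
        gains_ctrl = (post_ctrl - pre_ctrl) / (100 - pre_ctrl)
        t, p = stats.ttest_ind(gains_exp, gains_ctrl)
        print(f"<g> exp = {g_exp:.2f}, <g> ctrl = {g_ctrl:.2f}, t = {t:.2f}, p = {p:.4f}")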

  11. Factors influencing long-term adherence to two previously implemented hospital guidelines

    NARCIS (Netherlands)

    Knops, A. M.; Storm-Versloot, M. N.; Mank, A. P. M.; Ubbink, D. T.; Vermeulen, H.; Bossuyt, P. M. M.; Goossens, A.

    2010-01-01

    After successful implementation, adherence to hospital guidelines should be sustained. Long-term adherence to two hospital guidelines was audited. The overall aim was to explore factors accounting for their long-term adherence or non-adherence. A fluid balance guideline (FBG) and body temperature

  12. The Implementation of Character Education Model Based on Empowerment Theatre for Primary School Students

    Science.gov (United States)

    Anggraini, Purwati; Kusniarti, Tuti

    2016-01-01

    This study aimed at constructing character education model implemented in primary school. The research method was qualitative with five samples in total, comprising primary schools in Malang city/regency and one school as a pilot model. The pilot model was instructed by theatre coach teacher, parents, and school society. The result showed that…

  13. Ares Upper Stage Processes to Implement Model Based Design - Going Paperless

    Science.gov (United States)

    Gregory, Melanie

    2012-01-01

    Computer-Aided Design (CAD) has all but replaced the drafting board for design work. Increased productivity and accuracy should be natural outcomes of using CAD. Going from paper drawings only to paper drawings based on CAD models to CAD models and no drawings, or Model Based Design (MBD), is a natural progression in today's world. There are many advantages to MBD over traditional design methods. To make the most of those advantages, standards should be in place and the proper foundation should be laid prior to transitioning to MBD. However, without a full understanding of the implications of MBD and the proper control of the data, the advantages are greatly diminished. Transitioning from a paper design world to an electronic design world means re-thinking how information gets controlled at its origin and distributed from one point to another. It means design methodology is critical, especially for large projects. It means preparation of standardized parts and processes as well as strong communication between all parties in order to maximize the benefits of MBD.

  14. Model-based design of RNA hybridization networks implemented in living cells.

    Science.gov (United States)

    Rodrigo, Guillermo; Prakash, Satya; Shen, Shensi; Majer, Eszter; Daròs, José-Antonio; Jaramillo, Alfonso

    2017-09-19

    Synthetic gene circuits allow the behavior of living cells to be reprogrammed, and non-coding small RNAs (sRNAs) are increasingly being used as programmable regulators of gene expression. However, sRNAs (natural or synthetic) are generally used to regulate single target genes, while complex dynamic behaviors would require networks of sRNAs regulating each other. Here, we report a strategy for implementing such networks that exploits hybridization reactions carried out exclusively by multifaceted sRNAs that are both targets of and triggers for other sRNAs. These networks are ultimately coupled to the control of gene expression. We relied on a thermodynamic model of the different stable conformational states underlying this system at the nucleotide level. To test our model, we designed five different RNA hybridization networks with a linear architecture, and we implemented them in Escherichia coli. We validated the network architecture at the molecular level by native polyacrylamide gel electrophoresis, as well as the network function at the bacterial population and single-cell levels with a fluorescent reporter. Our results suggest that it is possible to engineer complex cellular programs based on RNA from first principles. Because these networks are mainly based on physical interactions, our designs could be expanded to other organisms as portable regulatory resources or to implement biological computations. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  15. Design and Implementation of Mobile Blended Learning Model Based on WeChat Public Platform

    Directory of Open Access Journals (Sweden)

    Han Yanyan

    2017-01-01

    Full Text Available Merging together the ideas of mobile learning, blended learning and the flipped classroom, a Mobile Blended Learning Model (MBLM) is constructed. Based on the WeChat Public Platform (WPP), MBLM can optimize the instructional process and improve learning efficiency. A Mobile Blended Learning System (MBLS) is implemented using MBLM; it is built on both the WPP and an auxiliary learning system based on Java Web. The system has well-designed functions, easy operation, and an attractive interface, so it can effectively promote the adoption of MBLM.

  16. Automatic treatment planning implementation using a database of previously treated patients

    International Nuclear Information System (INIS)

    Moore, J A; Evans, K; Yang, W; Herman, J; McNutt, T

    2014-01-01

    Purpose: Using a database of previously treated patients, it is possible to predict the dose to critical structures for future patients. Automatic treatment planning speeds the planning process by generating a good initial plan from predicted dose values. Methods: A SQL relational database of previously approved treatment plans is populated via an automated export from Pinnacle 3. The script outputs dose and machine information and selected Regions of Interest (ROIs), together with their associated Dose-Volume Histograms (DVHs) and Overlap Volume Histograms (OVHs) with respect to the target structures. Toxicity information is exported from Mosaiq and added to the database for each patient. The SQL query asks the system for the lowest dose achieved for a specified ROI among prior patients whose volume of that ROI lies as close to or closer to the target than the current patient's. Results: The additional time needed to calculate OVHs is approximately 1.5 minutes for a typical patient. Database lookup of planning objectives takes approximately 4 seconds. The combined additional time is less than that of a typical single plan optimization (2.5 mins). Conclusions: An automatic treatment planning interface has been successfully used by dosimetrists to quickly produce a number of SBRT pancreas treatment plans. The database can be used to compare the dose to individual structures with the toxicity experienced and to predict toxicities before planning for future patients.
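
    The sketch below illustrates the kind of lookup described in the Methods, using a simplified, assumed schema rather than the authors' actual export: for a given ROI it returns the lowest dose previously achieved among prior patients whose overlap-volume metric indicates geometry at least as unfavourable as the current patient's.

        import sqlite3

        # Hypothetical schema: one row per (prior patient, region of interest) with the
        # achieved dose and an overlap-volume-histogram metric relative to the target.
        conn = sqlite3.connect(":memory:")
        conn.execute("""CREATE TABLE prior_plans (
                            patient_id TEXT, roi TEXT,
                            ovh_distance_cm REAL,   -- smaller = ROI closer to the target
                            achieved_dose_gy REAL)""")
        conn.executemany(
            "INSERT INTO prior_plans VALUES (?, ?, ?, ?)",
            [("p1", "duodenum", 0.4, 28.0),
             ("p2", "duodenum", 0.9, 22.5),
             ("p3", "duodenum", 0.2, 30.5)])

        def predicted_objective(roi, current_ovh_distance_cm):
            """Lowest dose achieved among prior patients whose ROI was as close to or
            closer to the target than the current patient's (i.e. harder geometry)."""
            row = conn.execute(
                """SELECT MIN(achieved_dose_gy) FROM prior_plans
                   WHERE roi = ? AND ovh_distance_cm <= ?""",
                (roi, current_ovh_distance_cm)).fetchone()
            return row[0]

        print(predicted_objective("duodenum", 0.5))   # -> 28.0, used as the planning objective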

  17. Behavior-based network management: a unique model-based approach to implementing cyber superiority

    Science.gov (United States)

    Seng, Jocelyn M.

    2016-05-01

    Behavior-Based Network Management (BBNM) is a technological and strategic approach to mastering the identification and assessment of network behavior, whether human-driven or machine-generated. Recognizing that all five U.S. Air Force (USAF) mission areas rely on the cyber domain to support, enhance and execute their tasks, BBNM is designed to elevate awareness and improve the ability to better understand the degree of reliance placed upon a digital capability and the operational risk. Thus, the objective of BBNM is to provide a holistic view of the digital battle space to better assess the effects of security, monitoring, provisioning, utilization management, allocation to support mission sustainment and change control. Leveraging advances in conceptual modeling made possible by a novel advancement in software design and implementation known as Vector Relational Data Modeling (VRDM™), the BBNM approach entails creating a network simulation in which meaning can be inferred and used to manage network behavior according to policy, such as quickly detecting and countering malicious behavior. Initial research configurations have yielded executable BBNM models as combinations of conceptualized behavior within a network management simulation that includes only concepts of threats and definitions of "good" behavior. A proof-of-concept assessment called "Lab Rat" was designed to demonstrate the simplicity of network modeling and the ability to perform adaptation. The model was tested on real world threat data and demonstrated adaptive and inferential learning behavior. Preliminary results indicate this is a viable approach towards achieving cyber superiority in today's volatile, uncertain, complex and ambiguous (VUCA) environment.

  18. Teaching and implementing autonomous robotic lab walkthroughs in a biotech laboratory through model-based visual tracking

    Science.gov (United States)

    Wojtczyk, Martin; Panin, Giorgio; Röder, Thorsten; Lenz, Claus; Nair, Suraj; Heidemann, Rüdiger; Goudar, Chetan; Knoll, Alois

    2010-01-01

    After utilizing robots for more than 30 years for classic industrial automation applications, service robots form a constantly increasing market, although the big breakthrough is still awaited. Our approach to service robots was driven by the idea of supporting lab personnel in a biotechnology laboratory. After initial development in Germany, a mobile robot platform extended with an industrial manipulator and the necessary sensors for indoor localization and object manipulation, has been shipped to Bayer HealthCare in Berkeley, CA, USA, a global player in the sector of biopharmaceutical products, located in the San Francisco bay area. The determined goal of the mobile manipulator is to support the off-shift staff to carry out completely autonomous or guided, remote controlled lab walkthroughs, which we implement utilizing a recent development of our computer vision group: OpenTL - an integrated framework for model-based visual tracking.

  19. Model-based Systems Engineering: Creation and Implementation of Model Validation Rules for MOS 2.0

    Science.gov (United States)

    Schmidt, Conrad K.

    2013-01-01

    Model-based Systems Engineering (MBSE) is an emerging modeling application that is used to enhance the system development process. MBSE allows for the centralization of project and system information that would otherwise be stored in extraneous locations, yielding better communication, expedited document generation and increased knowledge capture. Based on MBSE concepts and the employment of the Systems Modeling Language (SysML), extremely large and complex systems can be modeled from conceptual design through all system lifecycles. The Operations Revitalization Initiative (OpsRev) seeks to leverage MBSE to modernize the aging Advanced Multi-Mission Operations Systems (AMMOS) into the Mission Operations System 2.0 (MOS 2.0). The MOS 2.0 will be delivered in a series of conceptual and design models and documents built using the modeling tool MagicDraw. To ensure model completeness and cohesiveness, it is imperative that the MOS 2.0 models adhere to the specifications, patterns and profiles of the Mission Service Architecture Framework, thus leading to the use of validation rules. This paper outlines the process by which validation rules are identified, designed, implemented and tested. Ultimately, these rules provide the ability to maintain model correctness and synchronization in a simple, quick and effective manner, thus allowing the continuation of project and system progress.

  20. Implementation of internal model based control and individual pitch control to reduce fatigue loads and tower vibrations in wind turbines

    Science.gov (United States)

    Mohammadi, Ebrahim; Fadaeinedjad, Roohollah; Moschopoulos, Gerry

    2018-05-01

    Vibration control and fatigue load reduction are important issues in large-scale wind turbines. Identifying the vibration frequencies and tuning dampers and controllers at these frequencies are major concerns in many control methods. In this paper, an internal model control (IMC) method with an adaptive algorithm is implemented to first identify the vibration frequency of the wind turbine tower and then cancel the vibration signal. Standard individual pitch control (IPC) is also implemented to compare the performance of the controllers in terms of fatigue load reduction. Finally, the performance of the system when both controllers are implemented together is evaluated. Simulation results demonstrate that using IMC or IPC alone has advantages and can reduce fatigue loads on specific components. IMC can identify and suppress tower vibrations in both the fore-aft and side-to-side directions, whereas IPC can reduce fatigue loads on the blades, shaft and yaw bearings. When both IMC and IPC are implemented together, the advantages of both controllers can be combined. This analysis and comparison had not previously been studied in the literature, and this study fills that gap. FAST, AeroDyn and Simulink are used to simulate the mechanical, aerodynamic and electrical aspects of the wind turbine.
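
    As a generic stand-in for the adaptive identify-and-cancel idea (not the authors' IMC controller), the sketch below uses an LMS adaptive line enhancer to learn and subtract a single sinusoidal vibration component; the tower frequency, noise level, filter length, and step size are assumed.

        import numpy as np

        fs, f_tower = 100.0, 0.32            # sample rate [Hz] and assumed tower mode [Hz]
        t = np.arange(0, 600, 1 / fs)
        rng = np.random.default_rng(2)
        signal = np.sin(2 * np.pi * f_tower * t) + 0.3 * rng.standard_normal(t.size)

        # Adaptive line enhancer: an LMS FIR predictor fed with a delayed copy of the
        # measurement learns the narrowband (tower-frequency) component, which can then
        # be subtracted; the prediction error is the broadband residual.
        n_taps, delay, mu = 32, 5, 1e-3
        w = np.zeros(n_taps)
        enhanced = np.zeros_like(signal)     # running estimate of the tonal component
        for k in range(delay + n_taps, signal.size):
            x_ref = signal[k - delay - n_taps:k - delay][::-1]   # delayed tap vector
            y_hat = w @ x_ref
            e = signal[k] - y_hat            # residual after cancelling the tone
            w += 2 * mu * e * x_ref          # LMS weight update
            enhanced[k] = y_hat

        print("residual power after adaptation:",
              np.var(signal[-1000:] - enhanced[-1000:]))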

  1. Implementation of a phenomenological DNB prediction model based on macroscale boiling flow processes in PWR fuel bundles

    International Nuclear Information System (INIS)

    Mohitpour, Maryam; Jahanfarnia, Gholamreza; Shams, Mehrzad

    2014-01-01

    Highlights: • A numerical framework was developed to mechanistically predict DNB in PWR bundles. • The DNB evaluation module was incorporated into the two-phase flow solver module. • Three-dimensional two-fluid model was the basis of two-phase flow solver module. • Liquid sublayer dryout model was adapted as CHF-triggering mechanism in DNB module. • Ability of DNB modeling approach was studied based on PSBT DNB tests in rod bundle. - Abstract: In this study, a numerical framework, comprising a two-phase flow subchannel solver module and a Departure from Nucleate Boiling (DNB) evaluation module, was developed to mechanistically predict DNB in rod bundles of a Pressurized Water Reactor (PWR). In this regard, the liquid sublayer dryout model was adapted as the Critical Heat Flux (CHF) triggering mechanism to reduce the dependency of the model on empirical correlations in the DNB evaluation module. To predict local flow boiling processes, a three-dimensional two-fluid formalism coupled with heat conduction was selected as the basic tool for the development of the two-phase flow subchannel analysis solver. Evaluation of the DNB modeling approach was performed against OECD/NRC NUPEC PWR Bundle tests (PSBT Benchmark) which supplied an extensive database for the development of truly mechanistic and consistent models for boiling transition and CHF. The results of the analyses demonstrated the need for additional assessment of the subcooled boiling model and the bulk condensation model implemented in the two-phase flow solver module. The proposed model slightly under-predicts the DNB power in comparison with the values obtained from steady-state benchmark measurements. However, this prediction is acceptable compared with other codes. Another point about the DNB prediction model is that it behaves conservatively. Examination of the axial and radial position of the first detected DNB using code-to-code comparisons on the basis of PSBT data indicated that our

  2. Implementation of an electronic medical record system in previously computer-naïve primary care centres: a pilot study from Cyprus.

    Science.gov (United States)

    Samoutis, George; Soteriades, Elpidoforos S; Kounalakis, Dimitris K; Zachariadou, Theodora; Philalithis, Anastasios; Lionis, Christos

    2007-01-01

    The computer-based electronic medical record (EMR) is an essential new technology in health care, contributing to high-quality patient care and efficient patient management. The majority of southern European countries, however, have not yet implemented universal EMR systems and many efforts are still ongoing. We describe the development of an EMR system and its pilot implementation and evaluation in two previously computer-naïve public primary care centres in Cyprus. One urban and one rural primary care centre along with their personnel (physicians and nurses) were selected to participate. Both qualitative and quantitative evaluation tools were used during the implementation phase. Qualitative data analysis was based on the framework approach, whereas quantitative assessment was based on a nine-item questionnaire and EMR usage parameters. Two public primary care centres participated, and a total of ten health professionals served as EMR system evaluators. Physicians and nurses rated EMR relatively highly, while patients were the most enthusiastic supporters of the new information system. Major implementation impediments were the physicians' perceptions that EMR usage negatively affected their workflow, physicians' legal concerns, lack of incentives, system breakdowns, software design problems, transition difficulties and lack of familiarity with electronic equipment. The importance of combining qualitative and quantitative evaluation tools is highlighted. More efforts are needed for the universal adoption and routine use of EMR in the primary care system of Cyprus as several barriers to adoption exist; however, none is insurmountable. Computerised systems could improve efficiency and quality of care in Cyprus, benefiting the entire population.

  3. Experimental Design of a Polymeric Solution to Improve the Mobility Ratio in a Reservoir previous implementation of a pilot project of EOR

    Directory of Open Access Journals (Sweden)

    Vanessa Cuenca

    2016-12-01

    Full Text Available This paper describes experimental formulations of polymeric solutions evaluated in the laboratory, with the objective of finding the optimum solution concentration for improving fluid mobility in the reservoir as a preliminary step before implementing a pilot project of enhanced oil recovery. The polymers were first selected based on the properties of the reservoir fluids. Two types of polymers were used, TCC-330 and EOR909, and the experimental tests comprised thermal stability, compatibility, adsorption, salinity, and displacement. The design with the best results used polymer TCC-330 at a concentration of 1,500 ppm.

  4. Laboratory Load Model Based on 150 kVA Power Frequency Converter and Simulink Real-Time – Concept, Implementation, Experiments

    Directory of Open Access Journals (Sweden)

    Robert Małkowski

    2016-09-01

    Full Text Available The first section of the paper provides the technical specification of a laboratory load model based on a 150 kVA power frequency converter and the Simulink Real-Time platform. The assumptions and the control algorithm structure are presented. Theoretical considerations of which load types may be simulated with the discussed laboratory setup are described. Since the described model contains a transformer with a thyristor-controlled tap changer, a wider scope of device capabilities is presented. The paper lists and describes the tunable parameters, both those adjustable during device operation and those set only before starting an experiment. Implementation details are given in the second section of the paper. The hardware structure is presented and described, along with information about the communication interface used, the data maintenance and storage solution, and the Simulink Real-Time features employed. A list and description of all measurements is provided, and the potential for modifications of the laboratory setup is evaluated. The third section describes the laboratory tests performed. Different load configurations are described and experimental results are presented, including simulation of under-frequency load shedding, frequency- and voltage-dependent characteristics of groups of load units, time characteristics of a group of different load units in a chosen area, and arbitrary active and reactive power regulation based on a defined schedule. Different operating modes of the control algorithm are described: apparent power control, active and reactive power control, and active and reactive current RMS value control.
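
    As a numerical companion to the frequency- and voltage-dependent load characteristics mentioned above, the sketch below evaluates a standard exponential static load model with frequency sensitivity; the exponents and sensitivity coefficients are typical textbook-style values assumed for illustration, not parameters of the described laboratory setup.

        # Exponential static load model with frequency dependence:
        #   P = P0 * (V / V0)**np_exp * (1 + Kpf * (f - f0))
        #   Q = Q0 * (V / V0)**nq_exp * (1 + Kqf * (f - f0))
        # Exponents and sensitivities below are assumed, textbook-style values.

        P0, Q0 = 100.0, 30.0        # nominal active/reactive power [kW, kvar]
        V0, f0 = 400.0, 50.0        # nominal voltage [V] and frequency [Hz]
        np_exp, nq_exp = 1.5, 2.5   # voltage exponents
        Kpf, Kqf = 1.0, -1.5        # frequency sensitivities [1/Hz]

        def load_power(v, f):
            p = P0 * (v / V0) ** np_exp * (1 + Kpf * (f - f0))
            q = Q0 * (v / V0) ** nq_exp * (1 + Kqf * (f - f0))
            return p, q

        # Example: 5% undervoltage and 0.2 Hz underfrequency.
        p, q = load_power(0.95 * V0, f0 - 0.2)
        print(f"P = {p:.1f} kW, Q = {q:.1f} kvar")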

  5. Principles of models based engineering

    Energy Technology Data Exchange (ETDEWEB)

    Dolin, R.M.; Hefele, J.

    1996-11-01

    This report describes a Models Based Engineering (MBE) philosophy and implementation strategy that has been developed at Los Alamos National Laboratory's Center for Advanced Engineering Technology. A major theme in this discussion is that models based engineering is an information management technology enabling the development of information driven engineering. Unlike other information management technologies, models based engineering encompasses the breadth of engineering information, from design intent through product definition to consumer application.

  6. SLS Navigation Model-Based Design Approach

    Science.gov (United States)

    Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas

    2018-01-01

    The SLS Program chose to implement a Model-based Design and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team has been responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for the navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1-B design, the additional GPS Receiver hardware is managed as a DMM at the vehicle design level. This paper provides a discussion of the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the Navigation components. These include composing system requirements, requirements verification, model development, model verification and validation, and modeling and analysis approaches. The Model-based Design and Requirements approach does not reduce the effort associated with the design process versus previous processes used at Marshall Space Flight Center. Instead, the approach takes advantage of overlap between the requirements development and management process, and the design and analysis process by efficiently combining the control (i.e. the requirement) and the design mechanisms. The design mechanism is the representation of the component behavior and performance in design and analysis tools. The focus in the early design process shifts from the development and

  7. Model-Based Power Plant Master Control

    Energy Technology Data Exchange (ETDEWEB)

    Boman, Katarina; Thomas, Jean; Funkquist, Jonas

    2010-08-15

    The main goal of the project has been to evaluate the potential of a coordinated master control for a solid fuel power plant in terms of tracking capability, stability and robustness. The control strategy has been model-based predictive control (MPC) and the plant used in the case study has been the Vattenfall power plant Idbaecken in Nykoeping. A dynamic plant model based on nonlinear physical models was used to imitate the true plant in MATLAB/SIMULINK simulations. The basis for this model was already developed in previous Vattenfall internal projects, along with a simulation model of the existing control implementation with traditional PID controllers. The existing PID control is used as a reference performance, and it has been thoroughly studied and tuned in these previous Vattenfall internal projects. A turbine model was developed with characteristics based on the results of steady-state simulations of the plant using the software EBSILON. Using the derived model as a representative for the actual process, an MPC control strategy was developed using linearization and gain-scheduling. The control signal constraints (rate of change) and constraints on outputs were implemented to comply with plant constraints. After tuning the MPC control parameters, a number of simulation scenarios were performed to compare the MPC strategy with the existing PID control structure. The simulation scenarios also included cases highlighting the robustness properties of the MPC strategy. From the study, the main conclusions are: - The proposed Master MPC controller shows excellent set-point tracking performance even though the plant has strong interactions and non-linearity, and the controls and their rate of change are bounded. - The proposed Master MPC controller is robust, stable in the presence of disturbances and parameter variations. Even though the current study only considered a very small number of the possible disturbances and modelling errors, the considered cases are
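
    To make the MPC idea concrete for readers, here is a minimal unconstrained linear MPC sketch on a toy first-order plant; the plant model, horizon, and weights are assumptions for illustration and are unrelated to the Idbaecken plant model.

        import numpy as np

        # Toy first-order plant x[k+1] = a*x[k] + b*u[k]; parameters are assumed.
        a, b = 0.95, 0.1
        N, rho = 20, 0.01                      # prediction horizon and input weight
        r = 1.0                                # set-point

        # Condensed prediction over the horizon: X = F*x0 + G*U.
        F = np.array([a ** (i + 1) for i in range(N)])
        G = np.zeros((N, N))
        for i in range(N):
            for j in range(i + 1):
                G[i, j] = a ** (i - j) * b

        # Unconstrained MPC solution: U* = (G'G + rho*I)^-1 G' (r - F*x0); apply only u[0].
        H_inv = np.linalg.inv(G.T @ G + rho * np.eye(N))

        x = 0.0
        for k in range(60):                    # receding-horizon simulation
            U = H_inv @ G.T @ (r * np.ones(N) - F * x)
            u = U[0]
            x = a * x + b * u
        print("state after 60 steps:", x)      # settles close to the set-point r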

  8. Consideration of the bioavailability of metal/metalloid species in freshwaters: experiences regarding the implementation of biotic ligand model-based approaches in risk assessment frameworks.

    Science.gov (United States)

    Rüdel, Heinz; Díaz Muñiz, Cristina; Garelick, Hemda; Kandile, Nadia G; Miller, Bradley W; Pantoja Munoz, Leonardo; Peijnenburg, Willie J G M; Purchase, Diane; Shevah, Yehuda; van Sprang, Patrick; Vijver, Martina; Vink, Jos P M

    2015-05-01

    After the scientific development of biotic ligand models (BLMs) in recent decades, these models are now considered suitable for implementation in regulatory risk assessment of metals in freshwater bodies. The BLM approach has been described in many peer-reviewed publications, and the original complex BLMs have been applied in prospective risk assessment reports for metals and metal compounds. BLMs are now also recommended as suitable concepts for the site-specific evaluation of monitoring data in the context of the European Water Framework Directive. However, the use is hampered by the data requirements for the original BLMs (about 10 water parameters). Recently, several user-friendly BLM-based bioavailability software tools for assessing the aquatic toxicity of relevant metals (mainly copper, nickel, and zinc) became available. These tools only need a basic set of commonly determined water parameters as input (i.e., pH, hardness, dissolved organic matter, and dissolved metal concentration). Such tools seem appropriate to foster the implementation of routine site-specific water quality assessments. This work aims to review the existing bioavailability-based regulatory approaches and the application of available BLM-based bioavailability tools for this purpose. Advantages and possible drawbacks of these tools (e.g., feasibility, boundaries of validity) are discussed, and recommendations for further implementation are given.

  9. Population pharmacokinetics of busulfan in pediatric and young adult patients undergoing hematopoietic cell transplant: a model-based dosing algorithm for personalized therapy and implementation into routine clinical use.

    Science.gov (United States)

    Long-Boyle, Janel R; Savic, Rada; Yan, Shirley; Bartelink, Imke; Musick, Lisa; French, Deborah; Law, Jason; Horn, Biljana; Cowan, Morton J; Dvorak, Christopher C

    2015-04-01

    Population pharmacokinetic (PK) studies of busulfan in children have shown that individualized model-based algorithms provide improved targeted busulfan therapy when compared with conventional dose guidelines. The adoption of population PK models into routine clinical practice has been hampered by the tendency of pharmacologists to develop complex models too impractical for clinicians to use. The authors aimed to develop a population PK model for busulfan in children that can reliably achieve therapeutic exposure (concentration at steady state) and implement a simple model-based tool for the initial dosing of busulfan in children undergoing hematopoietic cell transplantation. Model development was conducted using retrospective data available in 90 pediatric and young adult patients who had undergone hematopoietic cell transplantation with busulfan conditioning. Busulfan drug levels and potential covariates influencing drug exposure were analyzed using the nonlinear mixed effects modeling software, NONMEM. The final population PK model was implemented into a clinician-friendly Microsoft Excel-based tool and used to recommend initial doses of busulfan in a group of 21 pediatric patients prospectively dosed based on the population PK model. Modeling of busulfan time-concentration data indicates that busulfan clearance displays nonlinearity in children, decreasing up to approximately 20% between the concentrations of 250-2000 ng/mL. Important patient-specific covariates found to significantly impact busulfan clearance were actual body weight and age. The percentage of individuals achieving a therapeutic concentration at steady state was significantly higher in subjects receiving initial doses based on the population PK model (81%) than in historical controls dosed on conventional guidelines (52%) (P = 0.02). When compared with the conventional dosing guidelines, the model-based algorithm demonstrates significant improvement for providing targeted busulfan therapy in
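
    Purely to illustrate the arithmetic behind targeting a steady-state concentration in general, and emphatically not as a reproduction of the published busulfan model or as dosing guidance, a target-exposure calculation can be sketched as follows; the clearance relationship and all numbers are placeholders.

        # Illustrative only -- NOT the published busulfan model and NOT dosing guidance.
        # For an intermittent IV regimen, the average steady-state concentration obeys
        #   Css = dose / (CL * tau)   =>   dose = Css_target * CL * tau

        def model_based_dose(weight_kg, css_target_ng_ml, tau_h, cl_per_kg_l_h=0.2):
            """Dose [mg] from a placeholder weight-based clearance (0.2 L/h/kg assumed)."""
            cl_l_h = cl_per_kg_l_h * weight_kg            # clearance [L/h]
            css_mg_l = css_target_ng_ml / 1000.0          # ng/mL == ug/L -> mg/L
            return css_mg_l * cl_l_h * tau_h

        # Example with placeholder values: 20 kg patient, 900 ng/mL target, q6h dosing.
        print(f"{model_based_dose(20, 900, 6):.1f} mg per dose")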

  10. SLS Model Based Design: A Navigation Perspective

    Science.gov (United States)

    Oliver, T. Emerson; Anzalone, Evan; Park, Thomas; Geohagan, Kevin

    2018-01-01

    The SLS Program has implemented a Model-based Design (MBD) and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team is responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1B design, the additional GPS Receiver hardware model is managed as a DMM at the vehicle design level. This paper describes the models, and discusses the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the navigation components.

  11. Clinical Implementation of a Model-Based In Vivo Dose Verification System for Stereotactic Body Radiation Therapy–Volumetric Modulated Arc Therapy Treatments Using the Electronic Portal Imaging Device

    Energy Technology Data Exchange (ETDEWEB)

    McCowan, Peter M., E-mail: pmccowan@cancercare.mb.ca [Medical Physics Department, CancerCare Manitoba, Winnipeg, Manitoba (Canada); Asuni, Ganiyu [Medical Physics Department, CancerCare Manitoba, Winnipeg, Manitoba (Canada); Van Uytven, Eric [Medical Physics Department, CancerCare Manitoba, Winnipeg, Manitoba (Canada); Department of Physics and Astronomy, University of Manitoba, Winnipeg, Manitoba (Canada); VanBeek, Timothy [Medical Physics Department, CancerCare Manitoba, Winnipeg, Manitoba (Canada); McCurdy, Boyd M.C. [Medical Physics Department, CancerCare Manitoba, Winnipeg, Manitoba (Canada); Department of Physics and Astronomy, University of Manitoba, Winnipeg, Manitoba (Canada); Department of Radiology, University of Manitoba, Winnipeg, Manitoba (Canada); Loewen, Shaun K. [Department of Oncology, University of Calgary, Calgary, Alberta (Canada); Ahmed, Naseer; Bashir, Bashir; Butler, James B.; Chowdhury, Amitava; Dubey, Arbind; Leylek, Ahmet; Nashed, Maged [CancerCare Manitoba, Winnipeg, Manitoba (Canada)

    2017-04-01

    Purpose: To report findings from an in vivo dosimetry program implemented for all stereotactic body radiation therapy patients over a 31-month period and discuss the value and challenges of utilizing in vivo electronic portal imaging device (EPID) dosimetry clinically. Methods and Materials: From December 2013 to July 2016, 117 stereotactic body radiation therapy–volumetric modulated arc therapy patients (100 lung, 15 spine, and 2 liver) underwent 602 EPID-based in vivo dose verification events. A developed model-based dose reconstruction algorithm calculates the 3-dimensional dose distribution to the patient by back-projecting the primary fluence measured by the EPID during treatment. The EPID frame-averaging was optimized in June 2015. For each treatment, a 3%/3-mm γ comparison between our EPID-derived dose and the Eclipse AcurosXB–predicted dose to the planning target volume (PTV) and the ≥20% isodose volume was performed. Alert levels were defined as γ pass rates <85% (lung and liver) and <80% (spine). Investigations were carried out for all fractions exceeding the alert level and were classified as follows: EPID-related, algorithmic, patient setup, anatomic change, or unknown/unidentified errors. Results: The percentages of fractions exceeding the alert levels were 22.6% for lung before frame-average optimization and 8.0% for lung, 20.0% for spine, and 10.0% for liver after frame-average optimization. Overall, mean (± standard deviation) PTV γ pass rates were 90.7% ± 9.2%, 87.0% ± 9.3%, and 91.2% ± 3.4% for the lung, spine, and liver patients, respectively. Conclusions: Results from the clinical implementation of our model-based in vivo dose verification method using on-treatment EPID images are reported. The method is demonstrated to be valuable for routine clinical use for verifying the delivered dose as well as for detecting errors.
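
    For context on the 3%/3-mm γ metric used to trigger alerts, a simplified global γ pass-rate computation on 1-D dose profiles can be sketched as follows; the profiles, grid spacing, and criteria below are assumptions for illustration, whereas the clinical tool operates on full 3-D dose distributions.

        import numpy as np

        def gamma_pass_rate(ref, eval_, dx_mm, dose_crit=0.03, dist_crit_mm=3.0):
            """Global 1-D gamma: for each reference point take the minimum combined
            dose-difference / distance-to-agreement metric over all evaluated points."""
            positions = np.arange(len(ref)) * dx_mm
            norm = dose_crit * ref.max()                    # global dose criterion
            gammas = np.empty(len(ref))
            for i, (xi, di) in enumerate(zip(positions, ref)):
                dose_term = (eval_ - di) / norm
                dist_term = (positions - xi) / dist_crit_mm
                gammas[i] = np.sqrt(dose_term ** 2 + dist_term ** 2).min()
            return np.mean(gammas <= 1.0) * 100.0

        # Hypothetical reference and measured profiles (Gaussian beam, slight shift/scaling).
        x = np.linspace(-50, 50, 201)                       # 0.5 mm grid
        ref = 2.0 * np.exp(-x**2 / (2 * 15**2))
        meas = 2.04 * np.exp(-(x - 1.0)**2 / (2 * 15**2))   # 2% scaling + 1 mm shift
        print(f"gamma pass rate: {gamma_pass_rate(ref, meas, dx_mm=0.5):.1f}%")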

  12. Model-based machine learning.

    Science.gov (United States)

    Bishop, Christopher M

    2013-02-13

    Several decades of research in the field of machine learning have resulted in a multitude of different algorithms for solving a broad range of problems. To tackle a new application, a researcher typically tries to map their problem onto one of these existing methods, often influenced by their familiarity with specific algorithms and by the availability of corresponding software implementations. In this study, we describe an alternative methodology for applying machine learning, in which a bespoke solution is formulated for each new application. The solution is expressed through a compact modelling language, and the corresponding custom machine learning code is then generated automatically. This model-based approach offers several major advantages, including the opportunity to create highly tailored models for specific scenarios, as well as rapid prototyping and comparison of a range of alternative models. Furthermore, newcomers to the field of machine learning do not have to learn about the huge range of traditional methods, but instead can focus their attention on understanding a single modelling environment. In this study, we show how probabilistic graphical models, coupled with efficient inference algorithms, provide a very flexible foundation for model-based machine learning, and we outline a large-scale commercial application of this framework involving tens of millions of users. We also describe the concept of probabilistic programming as a powerful software environment for model-based machine learning, and we discuss a specific probabilistic programming language called Infer.NET, which has been widely used in practical applications.
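
    As a small concrete example of the workflow described here (write down a probabilistic model, then run inference), the sketch below computes the exact posterior for Bayesian linear regression under conjugate Gaussian assumptions; the prior scale, noise level, and data are assumptions for the example, and no probabilistic programming framework such as Infer.NET is involved.

        import numpy as np

        rng = np.random.default_rng(3)

        # Model: y = X w + noise,  w ~ N(0, alpha^-1 I),  noise ~ N(0, beta^-1).
        alpha, beta = 1.0, 25.0                      # assumed prior precision and noise precision
        w_true = np.array([0.5, -0.3])

        X = np.column_stack([np.ones(30), rng.uniform(-1, 1, 30)])   # bias + one feature
        y = X @ w_true + rng.normal(0, 1 / np.sqrt(beta), 30)

        # Exact Gaussian posterior over the weights (conjugacy):
        #   S_N = (alpha*I + beta*X'X)^-1,   m_N = beta * S_N X' y
        S_N = np.linalg.inv(alpha * np.eye(2) + beta * X.T @ X)
        m_N = beta * S_N @ X.T @ y

        # Posterior predictive mean and variance at a new input.
        x_new = np.array([1.0, 0.2])
        pred_mean = x_new @ m_N
        pred_var = 1 / beta + x_new @ S_N @ x_new
        print("posterior mean weights:", m_N,
              " predictive:", pred_mean, "+/-", np.sqrt(pred_var))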

  13. Least-squares model-based halftoning

    Science.gov (United States)

    Pappas, Thrasyvoulos N.; Neuhoff, David L.

    1992-08-01

    A least-squares model-based approach to digital halftoning is proposed. It exploits both a printer model and a model for visual perception. It attempts to produce an 'optimal' halftoned reproduction, by minimizing the squared error between the response of the cascade of the printer and visual models to the binary image and the response of the visual model to the original gray-scale image. Conventional methods, such as clustered ordered dither, use the properties of the eye only implicitly, and resist printer distortions at the expense of spatial and gray-scale resolution. In previous work we showed that our printer model can be used to modify error diffusion to account for printer distortions. The modified error diffusion algorithm has better spatial and gray-scale resolution than conventional techniques, but produces some well-known artifacts and asymmetries because it does not make use of an explicit eye model. Least-squares model-based halftoning uses explicit eye models and relies on printer models that predict distortions and exploit them to increase, rather than decrease, both spatial and gray-scale resolution. We have shown that the one-dimensional least-squares problem, in which each row or column of the image is halftoned independently, can be implemented with Viterbi's algorithm. Unfortunately, no closed form solution can be found in two dimensions. The two-dimensional least squares solution is obtained by iterative techniques. Experiments show that least-squares model-based halftoning produces more gray levels and better spatial resolution than conventional techniques. We also show that the least-squares approach eliminates the problems associated with error diffusion. Model-based halftoning can be especially useful in transmission of high quality documents using high fidelity gray-scale image encoders. As we have shown, in such cases halftoning can be performed at the receiver, just before printing. Apart from coding efficiency, this approach
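
    Written out, the criterion summarized above takes the following form, in notation assumed for this summary, with P the printer model, H the model of the human visual system, x the original gray-scale image and b the binary halftone:

        \hat{b} = \arg\min_{b \in \{0,1\}^{N}} \left\| H\big(P(b)\big) - H(x) \right\|_{2}^{2}

    The one-dimensional Viterbi solution and the two-dimensional iterative techniques mentioned in the abstract are different ways of (approximately) minimizing this same objective.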

  14. Model Based Temporal Reasoning

    Science.gov (United States)

    Rabin, Marla J.; Spinrad, Paul R.; Fall, Thomas C.

    1988-03-01

    Systems that assess the real world must cope with evidence that is uncertain, ambiguous, and spread over time. Typically, the most important function of an assessment system is to identify when activities are occurring that are unusual or unanticipated. Model based temporal reasoning addresses both of these requirements. The differences among temporal reasoning schemes lie in the methods used to avoid computational intractability. If we had n pieces of data and we wanted to examine how they were related, the worst case would be where we had to examine every subset of these points to see if that subset satisfied the relations. This would be 2^n subsets, which is intractable. Models compress this; if several data points are all compatible with a model, then that model represents all those data points. Data points are then considered related if they lie within the same model or if they lie in models that are related. Models thus address the intractability problem. They also address the problem of detecting unusual activities: if the data do not agree with the models indicated by earlier data, then something out of the norm is taking place. The models can summarize what we know up to that time, so when they are not predicting correctly, either something unusual is happening or we need to revise our models. The model based reasoner developed at Advanced Decision Systems is thus both intuitive and powerful. It is currently being used on one operational system and several prototype systems. It has enough power to be used in domains spanning the spectrum from manufacturing engineering and project management to low-intensity conflict and strategic assessment.

  15. Laparoscopy After Previous Laparotomy

    Directory of Open Access Journals (Sweden)

    Zulfo Godinjak

    2006-11-01

    Full Text Available Following abdominal surgery, extensive adhesions often occur and can cause difficulties during laparoscopic operations. However, previous laparotomy is not considered a contraindication for laparoscopy. The aim of this study is to show that insertion of a Veres needle in the region of the umbilicus is a safe method for creating a pneumoperitoneum for laparoscopic operations after previous laparotomy. In the last three years, we have performed 144 laparoscopic operations in patients who had previously undergone one or two laparotomies. Pathology of the digestive system or genital organs, Cesarean section, or abdominal war injuries were the most common causes of previous laparotomy. During these operations, and while entering the abdominal cavity, we did not experience any complications, while in 7 patients we converted to laparotomy following diagnostic laparoscopy. In all patients, insertion of the Veres needle and trocar was performed in the umbilical region, i.e. the closed laparoscopy technique. In no patient were adhesions found in the region of the umbilicus, and no abdominal organs were injured.

  16. Automated model-based testing of hybrid systems

    NARCIS (Netherlands)

    Osch, van M.P.W.J.

    2009-01-01

    In automated model-based input-output conformance testing, tests are automatically generated from a specification and automatically executed on an implementation. Input is applied to the implementation and output is observed from the implementation. If the observed output is allowed according to

  17. Cognitive components underpinning the development of model-based learning.

    Science.gov (United States)

    Potter, Tracey C S; Bryce, Nessa V; Hartley, Catherine A

    2017-06-01

    Reinforcement learning theory distinguishes "model-free" learning, which fosters reflexive repetition of previously rewarded actions, from "model-based" learning, which recruits a mental model of the environment to flexibly select goal-directed actions. Whereas model-free learning is evident across development, recruitment of model-based learning appears to increase with age. However, the cognitive processes underlying the development of model-based learning remain poorly characterized. Here, we examined whether age-related differences in cognitive processes underlying the construction and flexible recruitment of mental models predict developmental increases in model-based choice. In a cohort of participants aged 9-25, we examined whether the abilities to infer sequential regularities in the environment ("statistical learning"), maintain information in an active state ("working memory") and integrate distant concepts to solve problems ("fluid reasoning") predicted age-related improvements in model-based choice. We found that age-related improvements in statistical learning performance did not mediate the relationship between age and model-based choice. Ceiling performance on our working memory assay prevented examination of its contribution to model-based learning. However, age-related improvements in fluid reasoning statistically mediated the developmental increase in the recruitment of a model-based strategy. These findings suggest that gradual development of fluid reasoning may be a critical component process underlying the emergence of model-based learning. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  18. Model-based Software Engineering

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2010-01-01

    The vision of model-based software engineering is to make models the main focus of software development and to automatically generate software from these models. Part of that idea works already today. But, there are still difficulties when it comes to behaviour. Actually, there is no lack in models...

  19. Graph Model Based Indoor Tracking

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard; Lu, Hua; Yang, Bin

    2009-01-01

    The tracking of the locations of moving objects in large indoor spaces is important, as it enables a range of applications related to, e.g., security and indoor navigation and guidance. This paper presents a graph model based approach to indoor tracking that offers a uniform data management...

  20. Model-based security testing

    OpenAIRE

    Schieferdecker, Ina; Großmann, Jürgen; Schneider, Martin

    2012-01-01

    Security testing aims at validating software system requirements related to security properties like confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Although security testing techniques have been available for many years, there have been few approaches that allow for specification of test cases at a higher level of abstraction, for enabling guidance on test identification and specification as well as for automated test generation. Model-based security...

  1. Embracing model-based designs for dose-finding trials.

    Science.gov (United States)

    Love, Sharon B; Brown, Sarah; Weir, Christopher J; Harbron, Chris; Yap, Christina; Gaschler-Markefski, Birgit; Matcham, James; Caffrey, Louise; McKevitt, Christopher; Clive, Sally; Craddock, Charlie; Spicer, James; Cornelius, Victoria

    2017-07-25

    Dose-finding trials are essential to drug development as they establish recommended doses for later-phase testing. We aim to motivate wider use of model-based designs for dose finding, such as the continual reassessment method (CRM). We carried out a literature review of dose-finding designs and conducted a survey to identify perceived barriers to their implementation. We describe the benefits of model-based designs (flexibility, superior operating characteristics, extended scope), their current uptake, and existing resources. The most prominent barriers to implementation of a model-based design were lack of suitable training, chief investigators' preference for algorithm-based designs (e.g., 3+3), and limited resources for study design before funding. We use a real-world example to illustrate how these barriers can be overcome. There is overwhelming evidence for the benefits of CRM. Many leading pharmaceutical companies routinely implement model-based designs. Our analysis identified barriers for academic statisticians and clinical academics in mirroring the progress industry has made in trial design. Unified support from funders, regulators, and journal editors could result in more accurate doses for later-phase testing, and increase the efficiency and success of clinical drug development. We give recommendations for increasing the uptake of model-based designs for dose-finding trials in academia.
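
    To make the contrast with algorithm-based designs such as 3+3 concrete, the following is a minimal, illustrative sketch of a continual reassessment method (CRM) dose-selection update using a one-parameter power model evaluated on a grid. The skeleton, prior and target values are placeholders, not those of any trial in the review:

        import numpy as np

        skeleton = np.array([0.05, 0.10, 0.20, 0.35, 0.50])  # prior toxicity guesses per dose
        target = 0.25                                         # target toxicity probability

        def next_dose(doses_given, tox_observed, grid=np.linspace(-3, 3, 601)):
            # power model p_i(a) = skeleton_i ** exp(a), normal prior on a (variance 1.34)
            prior = np.exp(-0.5 * grid**2 / 1.34)
            p = skeleton[np.asarray(doses_given, dtype=int)] ** np.exp(grid[:, None])
            y = np.asarray(tox_observed)
            lik = np.prod(p**y * (1.0 - p)**(1 - y), axis=1)
            post = prior * lik
            post /= post.sum()
            # posterior-mean toxicity at each dose; pick the dose closest to the target
            p_hat = np.array([(skeleton[d] ** np.exp(grid) * post).sum()
                              for d in range(len(skeleton))])
            return int(np.argmin(np.abs(p_hat - target)))

        # e.g. after three patients on dose index 1 with one toxicity:
        # next_dose([1, 1, 1], [0, 1, 0])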

  2. Model-based reasoning technology for the power industry

    International Nuclear Information System (INIS)

    Touchton, R.A.; Subramanyan, N.S.; Naser, J.A.

    1991-01-01

    This paper reports on model-based reasoning, which refers to an expert system implementation methodology that uses a model of the system being reasoned about. Model-based representation and reasoning techniques offer many advantages and are highly suitable for domains where the individual components, their interconnection, and their behavior are well known. Technology Applications, Inc. (TAI), under contract to the Electric Power Research Institute (EPRI), investigated the use of model-based reasoning in the power industry, including the nuclear power industry. During this project, a model-based monitoring and diagnostic tool, called ProSys, was developed. Also, an alarm prioritization system was developed as a demonstration prototype.

  3. Sandboxes for Model-Based Inquiry

    Science.gov (United States)

    Brady, Corey; Holbert, Nathan; Soylu, Firat; Novak, Michael; Wilensky, Uri

    2015-04-01

    In this article, we introduce a class of constructionist learning environments that we call Emergent Systems Sandboxes (ESSs), which have served as a centerpiece of our recent work in developing curriculum to support scalable model-based learning in classroom settings. ESSs are a carefully specified form of virtual construction environment that support students in creating, exploring, and sharing computational models of dynamic systems that exhibit emergent phenomena. They provide learners with "entity"-level construction primitives that reflect an underlying scientific model. These primitives can be directly "painted" into a sandbox space, where they can then be combined, arranged, and manipulated to construct complex systems and explore the emergent properties of those systems. We argue that ESSs offer a means of addressing some of the key barriers to adopting rich, constructionist model-based inquiry approaches in science classrooms at scale. Situating the ESS in a large-scale science modeling curriculum we are implementing across the USA, we describe how the unique "entity-level" primitive design of an ESS facilitates knowledge system refinement at both an individual and social level, we describe how it supports flexible modeling practices by providing both continuous and discrete modes of executability, and we illustrate how it offers students a variety of opportunities for validating their qualitative understandings of emergent systems as they develop.

  4. Model Based Control of Refrigeration Systems

    DEFF Research Database (Denmark)

    Larsen, Lars Finn Sloth

    for automation of these procedures, that is, to incorporate some "intelligence" in the control system, this project was started up. The main emphasis of this work has been on model based methods for system optimizing control in supermarket refrigeration systems. The idea of implementing a system optimizing control is to let an optimization procedure take over the task of operating the refrigeration system and thereby replace the role of the operator in the traditional control structure. In the context of refrigeration systems, the idea is to divide the optimizing control structure into two parts: a part optimizing the steady state operation ("set-point optimizing control") and a part optimizing the dynamic behaviour of the system ("dynamical optimizing control"). A novel approach for set-point optimization will be presented. The general idea is to use a prediction of the steady state for computation of the cost...

  5. Model-Based Security Testing

    Directory of Open Access Journals (Sweden)

    Ina Schieferdecker

    2012-02-01

    Full Text Available Security testing aims at validating software system requirements related to security properties like confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Although security testing techniques have been available for many years, there have been few approaches that allow for specification of test cases at a higher level of abstraction, for enabling guidance on test identification and specification as well as for automated test generation. Model-based security testing (MBST) is a relatively new field and especially dedicated to the systematic and efficient specification and documentation of security test objectives, security test cases and test suites, as well as to their automated or semi-automated generation. In particular, the combination of security modelling and test generation approaches is still a challenge in research and of high interest for industrial applications. MBST includes e.g. security functional testing, model-based fuzzing, risk- and threat-oriented testing, and the usage of security test patterns. This paper provides a survey on MBST techniques and the related models as well as samples of new methods and tools that are under development in the European ITEA2-project DIAMONDS.

  6. Commercial Implementation of Model-Based Manufacturing of Nanostructured Metals

    Energy Technology Data Exchange (ETDEWEB)

    Lowe, Terry C. [Los Alamos National Laboratory

    2012-07-24

    Computational modeling is an essential tool for the commercial production of nanostructured metals. Strength is limited by imperfections at the high strength levels achievable in nanostructured metals, so processing to achieve homogeneity at the micro- and nano-scales is critical. Manufacturing of nanostructured metal products requires computer control, monitoring and modeling, and large-scale manufacturing of bulk nanostructured metals by Severe Plastic Deformation is intrinsically a multi-scale problem. Computational modeling at all scales is therefore essential, and multiple scales of modeling must be integrated to predict and control nanostructural, microstructural and macrostructural product characteristics as well as the production processes.

  7. Model-based analysis and simulation of regenerative heat wheel

    DEFF Research Database (Denmark)

    Wu, Zhuang; Melnik, Roderick V. N.; Borup, F.

    2006-01-01

    The rotary regenerator (also called the heat wheel) is an important component of energy intensive sectors, which is used in many heat recovery systems. In this paper, a model-based analysis of a rotary regenerator is carried out with a major emphasis given to the development and implementation of...

  8. Model-Based Engineering of Supervisory Controllers using CIF

    NARCIS (Netherlands)

    Schiffelers, R.R.H.; Theunissen, R.J.M.; Beek, van D.A.; Rooda, J.E.; Levendovsky, T.; Lengyel, L.

    2009-01-01

    In the Model-Based Engineering (MBE) paradigm, models are the core elements in the design process of a system from its requirements to the actual implementation of the system. By means of Supervisory Control Theory (SCT), supervisory controllers (supervisors) can be synthesized instead of

  9. Issues in practical model-based diagnosis

    NARCIS (Netherlands)

    Bakker, R.R.; van den Bempt, P.C.A.; Mars, Nicolaas; Out, D.J.; van Soest, D.C.

    1993-01-01

    The model-based diagnosis project at the University of Twente has been directed at improving the practical usefulness of model-based diagnosis. In cooperation with industrial partners, the research addressed the modeling problem and the efficiency problem in model-based reasoning. Main results of

  10. Model-based sensor diagnosis

    International Nuclear Information System (INIS)

    Milgram, J.; Dormoy, J.L.

    1994-09-01

    Running a nuclear power plant involves monitoring data provided by the installation's sensors. Operators and computerized systems then use these data to establish a diagnostic of the plant. However, the instrumentation system is complex, and is not immune to faults and failures. This paper presents a system for detecting sensor failures using a topological description of the installation and a set of component models. This model of the plant implicitly contains relations between sensor data. These relations must always be checked if all the components are functioning correctly. The failure detection task thus consists of checking these constraints. The constraints are extracted in two stages. Firstly, a qualitative model of their existence is built using structural analysis. Secondly, the models are formally handled according to the results of the structural analysis, in order to establish the constraints on the sensor data. This work constitutes an initial step in extending model-based diagnosis, as the information on which it is based is suspect. This work will be followed by surveillance of the detection system. When the instrumentation is assumed to be sound, the unverified constraints indicate errors on the plant model. (authors). 8 refs., 4 figs
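
    A minimal sketch of the constraint-checking step described above is given below in generic Python; the example constraint, signal names and tolerance are invented for illustration and are not the plant-specific relations derived in the paper:

        # Constraint-based sensor fault detection sketch. Each constraint is an analytical
        # relation that must hold between sensor readings when all components function
        # correctly; a residual exceeding its tolerance flags a possible sensor failure.
        def check_constraints(readings, constraints, tol=1e-2):
            violated = []
            for name, residual_fn in constraints.items():
                residual = residual_fn(readings)
                if abs(residual) > tol:
                    violated.append((name, residual))
            return violated

        # Hypothetical example: mass balance over a junction, flow_in = flow_a + flow_b.
        constraints = {
            "junction_mass_balance": lambda r: r["flow_in"] - (r["flow_a"] + r["flow_b"]),
        }
        print(check_constraints({"flow_in": 10.0, "flow_a": 6.0, "flow_b": 3.5}, constraints))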

  11. Model-Based Reconstructive Elasticity Imaging Using Ultrasound

    Directory of Open Access Journals (Sweden)

    Salavat R. Aglyamov

    2007-01-01

    Full Text Available Elasticity imaging is a reconstructive imaging technique where tissue motion in response to mechanical excitation is measured using modern imaging systems, and the estimated displacements are then used to reconstruct the spatial distribution of Young's modulus. Here we present an ultrasound elasticity imaging method that utilizes the model-based technique for Young's modulus reconstruction. Based on the geometry of the imaged object, only one axial component of the strain tensor is used. The numerical implementation of the method is highly efficient because the reconstruction is based on an analytic solution of the forward elastic problem. The model-based approach is illustrated using two potential clinical applications: differentiation of liver hemangioma and staging of deep venous thrombosis. Overall, these studies demonstrate that model-based reconstructive elasticity imaging can be used in applications where the geometry of the object and the surrounding tissue is somewhat known and certain assumptions about the pathology can be made.

  12. Intellectual Model-Based Configuration Management Conception

    Directory of Open Access Journals (Sweden)

    Bartusevics Arturs

    2014-07-01

    Full Text Available Software configuration management is one of the most important disciplines within a software development project; it helps control the software evolution process and ensures that only tested and validated changes are included in the end product. To achieve this, configuration management carries out a number of tasks, and concrete tools are used for their technical implementation, such as version control systems, continuous integration servers, compilers, etc. A correct configuration management process usually requires several tools, which mutually exchange information through various kinds of transfers. When the configuration management process is being introduced, tool installation is often started before there is a general picture of the overall process. The article offers a model-based configuration management concept, in which an abstract model of the configuration management process is developed first and later transformed into lower-abstraction-level models, with tools indicated to support the technical process. Such a solution allows a more rational introduction and configuration of tools.

  13. Previously unknown species of Aspergillus.

    Science.gov (United States)

    Gautier, M; Normand, A-C; Ranque, S

    2016-08-01

    The use of multi-locus DNA sequence analysis has led to the description of previously unknown 'cryptic' Aspergillus species, whereas classical morphology-based identification of Aspergillus remains limited to the section or species-complex level. The current literature highlights two main features concerning these 'cryptic' Aspergillus species. First, the prevalence of such species in clinical samples is relatively high compared with emergent filamentous fungal taxa such as Mucorales, Scedosporium or Fusarium. Second, it is clearly important to identify these species in the clinical laboratory because of the high frequency of antifungal drug-resistant isolates of such Aspergillus species. Matrix-assisted laser desorption/ionization-time of flight mass spectrometry (MALDI-TOF MS) has recently been shown to enable the identification of filamentous fungi with an accuracy similar to that of DNA sequence-based methods. As MALDI-TOF MS is well suited to the routine clinical laboratory workflow, it facilitates the identification of these 'cryptic' Aspergillus species at the routine mycology bench. The rapid establishment of enhanced filamentous fungi identification facilities will lead to a better understanding of the epidemiology and clinical importance of these emerging Aspergillus species. Based on routine MALDI-TOF MS-based identification results, we provide original insights into the key interpretation issues of a positive Aspergillus culture from a clinical sample. Which ubiquitous species that are frequently isolated from air samples are rarely involved in human invasive disease? Can both the species and the type of biological sample indicate Aspergillus carriage, colonization or infection in a patient? Highly accurate routine filamentous fungi identification is central to enhance the understanding of these previously unknown Aspergillus species, with a vital impact on further improved patient care.

  14. Analytical Model-based Fault Detection and Isolation in Control Systems

    DEFF Research Database (Denmark)

    Vukic, Z.; Ozbolt, H.; Blanke, M.

    1998-01-01

    The paper gives an introduction and an overview of the field of fault detection and isolation for control systems. A summary of analytical (quantitative model-based) methods and their implementation is presented. The focus is given to the analytical model-based fault-detection and fault

  15. Energy, mass, model-based displays, and memory recall

    International Nuclear Information System (INIS)

    Beltracchi, L.

    1989-01-01

    The operation of a pressurized water reactor in the context of the conservation laws for energy and mass is discussed. These conservation laws are the basis of the Rankine heat engine cycle. Computer graphic implementation of the heat engine cycle, in terms of temperature-entropy coordinates for water, serves as a model-based display of the plant process. A human user of this display, trained in first principles of the process, may exercise a monitoring strategy based on the conservation laws

  16. On Model Based Synthesis of Embedded Control Software

    OpenAIRE

    Alimguzhin, Vadim; Mari, Federico; Melatti, Igor; Salvo, Ivano; Tronci, Enrico

    2012-01-01

    Many Embedded Systems are indeed Software Based Control Systems (SBCSs), that is control systems whose controller consists of control software running on a microcontroller device. This motivates investigation on Formal Model Based Design approaches for control software. Given the formal model of a plant as a Discrete Time Linear Hybrid System and the implementation specifications (that is, number of bits in the Analog-to-Digital (AD) conversion) correct-by-construction control software can be...

  17. A tool for model based diagnostics of the AGS Booster

    International Nuclear Information System (INIS)

    Luccio, A.

    1993-01-01

    A model-based algorithmic tool was developed to search for lattice errors by a systematic analysis of orbit data in the AGS Booster synchrotron. The algorithm employs transfer matrices calculated with MAD between points in the ring. Iterative model fitting of the data allows one to find and eventually correct magnet displacements and angles or field errors. The tool, implemented on a HP-Apollo workstation system, has proved very general and of immediate physical interpretation

  18. Development of a robust model-based reactivity control system

    International Nuclear Information System (INIS)

    Rovere, L.A.; Otaduy, P.J.; Brittain, C.R.

    1990-01-01

    This paper describes the development and implementation of a digital model-based reactivity control system that incorporates knowledge of the plant physics into the control algorithm to improve system performance. This controller is composed of a model-based module and a modified proportional-integral-derivative (PID) module. The model-based module has an estimation component to synthesize unmeasurable process variables that are necessary for the control action computation. These estimated variables, besides being used within the control algorithm, will be used for diagnostic purposes by a supervisory control system under development. The PID module compensates for inaccuracies in model coefficients by supplementing the model-based output with a correction term that eliminates any demand tracking or steady state errors. This control algorithm has been applied to develop controllers for a simulation of liquid metal reactors in a multimodular plant. It has shown its capability to track demands in neutron power much more accurately than conventional controllers, reducing overshoots to almost negligible values while providing a good degree of robustness to unmodeled dynamics. 10 refs., 4 figs
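
    The structure described above, a model-based feedforward term supplemented by a PID correction, can be sketched generically as follows; this is an illustrative Python outline under that assumption, not the reactor-specific algorithm of the paper:

        # Generic model-based feedforward plus PID correction controller.
        class ModelPlusPIDController:
            def __init__(self, model_output_fn, kp, ki, kd, dt):
                self.model_output_fn = model_output_fn   # u_model = f(demand, estimated states)
                self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
                self.integral = 0.0
                self.prev_error = 0.0

            def update(self, demand, measurement, estimated_states):
                u_model = self.model_output_fn(demand, estimated_states)
                error = demand - measurement
                self.integral += error * self.dt
                derivative = (error - self.prev_error) / self.dt
                self.prev_error = error
                # PID term removes tracking and steady-state errors left by model inaccuracies
                u_pid = self.kp * error + self.ki * self.integral + self.kd * derivative
                return u_model + u_pid                   # total actuator demand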

  19. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  20. Model-based expert systems for linac computer controls

    International Nuclear Information System (INIS)

    Lee, M.J.

    1988-09-01

    The use of machine modeling and beam simulation programs for the control of accelerator operation has become standard practice. The success of a model-based control operation depends on how the parameter to be controlled is measured, how the measured data is analyzed, how the result of the analysis is interpreted, and how a solution is implemented. There is considerable interest in applying expert systems technology that can automate all of these processes. The design of an expert system to control the beam trajectory in linear accelerators will be discussed as an illustration of this approach. 4 figs., 1 tab

  1. Traceability in Model-Based Testing

    Directory of Open Access Journals (Sweden)

    Mathew George

    2012-11-01

    Full Text Available The growing complexities of software and the demand for shorter time to market are two important challenges that face today’s IT industry. These challenges demand the increase of both productivity and quality of software. Model-based testing is a promising technique for meeting these challenges. Traceability modeling is a key issue and challenge in model-based testing. Relationships between the different models will help to navigate from one model to another, and trace back to the respective requirements and the design model when the test fails. In this paper, we present an approach for bridging the gaps between the different models in model-based testing. We propose a relation definition markup language (RDML) for defining the relationships between models.

  2. Model-based testing for embedded systems

    CERN Document Server

    Zander, Justyna; Mosterman, Pieter J

    2011-01-01

    What the experts have to say about Model-Based Testing for Embedded Systems: "This book is exactly what is needed at the exact right time in this fast-growing area. From its beginnings over 10 years ago of deriving tests from UML statecharts, model-based testing has matured into a topic with both breadth and depth. Testing embedded systems is a natural application of MBT, and this book hits the nail exactly on the head. Numerous topics are presented clearly, thoroughly, and concisely in this cutting-edge book. The authors are world-class leading experts in this area and teach us well-used

  3. Model-based internal wave processing

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J.V.; Chambers, D.H.

    1995-06-09

    A model-based approach is proposed to solve the oceanic internal wave signal processing problem that is based on state-space representations of the normal-mode vertical velocity and plane wave horizontal velocity propagation models. It is shown that these representations can be utilized to spatially propagate the modal (depth) vertical velocity functions given the basic parameters (wave numbers, Brunt-Vaisala frequency profile, etc.) developed from the solution of the associated boundary value problem, as well as the horizontal velocity components. Based on this framework, investigations are made of model-based solutions to the signal enhancement problem for internal waves.

  4. Model Based Control of Reefer Container Systems

    DEFF Research Database (Denmark)

    Sørensen, Kresten Kjær

    This thesis is concerned with the development of model based control for the Star Cool refrigerated container (reefer) with the objective of reducing energy consumption. This project has been carried out under the Danish Industrial PhD programme and has been financed by Lodam together with the Da...

  5. Model-based human reliability analysis: prospects and requirements

    International Nuclear Information System (INIS)

    Mosleh, A.; Chang, Y.H.

    2004-01-01

    Major limitations of the conventional methods for human reliability analysis (HRA), particularly those developed for operator response analysis in probabilistic safety assessments (PSA) of nuclear power plants, are summarized as a motivation for the need and a basis for developing requirements for the next generation of HRA methods. It is argued that a model-based approach that provides explicit cognitive causal links between operator behaviors and directly or indirectly measurable causal factors should be at the core of the advanced methods. An example of such a causal model is briefly reviewed; due to its complexity and input requirements, it can currently be implemented only in a dynamic PSA environment. The computer simulation code developed for this purpose is also described briefly, together with current limitations in the models, data, and the computer implementation

  6. Service creation: a model-based approach

    NARCIS (Netherlands)

    Quartel, Dick; van Sinderen, Marten J.; Ferreira Pires, Luis

    1999-01-01

    This paper presents a model-based approach to support service creation. In this approach, services are assumed to be created from (available) software components. The creation process may involve multiple design steps in which the requested service is repeatedly decomposed into more detailed

  7. Model based development of engine control algorithms

    NARCIS (Netherlands)

    Dekker, H.J.; Sturm, W.L.

    1996-01-01

    Model based development of engine control systems has several advantages. The development time and costs are strongly reduced because much of the development and optimization work is carried out by simulating both engine and control system. After optimizing the control algorithm it can be executed

  8. An acoustical model based monitoring network

    NARCIS (Netherlands)

    Wessels, P.W.; Basten, T.G.H.; Eerden, F.J.M. van der

    2010-01-01

    In this paper the approach for an acoustical model based monitoring network is demonstrated. This network is capable of reconstructing a noise map, based on the combination of measured sound levels and an acoustic model of the area. By pre-calculating the sound attenuation within the network the

  9. Approximation Algorithms for Model-Based Diagnosis

    NARCIS (Netherlands)

    Feldman, A.B.

    2010-01-01

    Model-based diagnosis is an area of abductive inference that uses a system model, together with observations about system behavior, to isolate sets of faulty components (diagnoses) that explain the observed behavior, according to some minimality criterion. This thesis presents greedy approximation

  10. Probabilistic Model-based Background Subtraction

    DEFF Research Database (Denmark)

    Krüger, Volker; Anderson, Jakob; Prehn, Thomas

    2005-01-01

    is the correlation between pixels. In this paper we introduce a model-based background subtraction approach which facilitates prior knowledge of pixel correlations for clearer and better results. Model knowledge is being learned from good training video data, the data is stored for fast access in a hierarchical...

  11. Opinion dynamics model based on quantum formalism

    Energy Technology Data Exchange (ETDEWEB)

    Artawan, I. Nengah, E-mail: nengahartawan@gmail.com [Theoretical Physics Division, Department of Physics, Udayana University (Indonesia); Trisnawati, N. L. P., E-mail: nlptrisnawati@gmail.com [Biophysics, Department of Physics, Udayana University (Indonesia)

    2016-03-11

    Opinion dynamics model based on quantum formalism is proposed. The core of the quantum formalism is the half-spin dynamics system. In this research the implicit time evolution operators are derived. The analogy between the model and the Deffuant and Sznajd models is discussed.

  12. Model-based auditing using REA

    NARCIS (Netherlands)

    Weigand, H.; Elsas, P.

    2012-01-01

    The recent financial crisis has renewed interest in the value of the owner-ordered auditing tradition that starts from society's long-term interest rather than management interest. This tradition uses a model-based auditing approach in which control requirements are derived in a principled way. A

  13. Model-based testing for software safety

    NARCIS (Netherlands)

    Gurbuz, Havva Gulay; Tekinerdogan, Bedir

    2017-01-01

    Testing safety-critical systems is crucial since a failure or malfunction may result in death or serious injuries to people, equipment, or environment. An important challenge in testing is the derivation of test cases that can identify the potential faults. Model-based testing adopts models of a

  14. Model-Based Engine Control Architecture with an Extended Kalman Filter

    Science.gov (United States)

    Csank, Jeffrey T.; Connolly, Joseph W.

    2016-01-01

    This paper discusses the design and implementation of an extended Kalman filter (EKF) for model-based engine control (MBEC). Previously proposed MBEC architectures feature an optimal tuner Kalman Filter (OTKF) to produce estimates of both unmeasured engine parameters and estimates for the health of the engine. The success of this approach relies on the accuracy of the linear model and the ability of the optimal tuner to update its tuner estimates based on only a few sensors. Advances in computer processing are making it possible to replace the piece-wise linear model, developed off-line, with an on-board nonlinear model running in real-time. This will reduce the estimation errors associated with the linearization process, and is typically referred to as an extended Kalman filter. The nonlinear extended Kalman filter approach is applied to the Commercial Modular Aero-Propulsion System Simulation 40,000 (C-MAPSS40k) and compared to the previously proposed MBEC architecture. The results show that the EKF reduces the estimation error, especially during transient operation.
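
    For reference, a generic EKF predict/update step of the kind referred to above can be sketched as follows. This is the textbook form; f, h and their Jacobians stand in for an on-board nonlinear model and sensor model and are not the C-MAPSS40k implementation:

        import numpy as np

        # One extended Kalman filter step: predict with the nonlinear model f, then
        # correct the state estimate x and covariance P with the measurement z.
        def ekf_step(x, P, u, z, f, h, F_jac, H_jac, Q, R):
            # Predict
            x_pred = f(x, u)
            F = F_jac(x, u)                          # Jacobian of f at the current estimate
            P_pred = F @ P @ F.T + Q
            # Update
            H = H_jac(x_pred)                        # Jacobian of h at the prediction
            y = z - h(x_pred)                        # innovation
            S = H @ P_pred @ H.T + R
            K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
            x_new = x_pred + K @ y
            P_new = (np.eye(len(x)) - K @ H) @ P_pred
            return x_new, P_new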

  15. Springer handbook of model-based science

    CERN Document Server

    Bertolotti, Tommaso

    2017-01-01

    The handbook offers the first comprehensive reference guide to the interdisciplinary field of model-based reasoning. It highlights the role of models as mediators between theory and experimentation, and as educational devices, as well as their relevance in testing hypotheses and explanatory functions. The Springer Handbook merges philosophical, cognitive and epistemological perspectives on models with the more practical needs related to the application of this tool across various disciplines and practices. The result is a unique, reliable source of information that guides readers toward an understanding of different aspects of model-based science, such as the theoretical and cognitive nature of models, as well as their practical and logical aspects. The inferential role of models in hypothetical reasoning, abduction and creativity once they are constructed, adopted, and manipulated for different scientific and technological purposes is also discussed. Written by a group of internationally renowned experts in ...

  16. Model-based version management system framework

    International Nuclear Information System (INIS)

    Mehmood, W.

    2016-01-01

    In this paper we present a model-based version management system. A version management system (VMS), a branch of software configuration management (SCM), aims to provide a controlling mechanism for the evolution of software artifacts created during the software development process. Controlling the evolution requires many activities, such as construction and creation of versions, identification of differences between versions, conflict detection and merging. Traditional VMS systems are file-based and consider software systems as a set of text files. File-based VMS systems are not adequate for performing software configuration management activities such as version control on software artifacts produced in earlier phases of the software life cycle. New challenges of model differencing, merging, and evolution control arise while using models as the central artifact. The goal of this work is to present a generic framework for model-based VMS which can be used to overcome the problems of traditional file-based VMS systems and provide model versioning services. (author)

  17. Online constrained model-based reinforcement learning

    CSIR Research Space (South Africa)

    Van Niekerk, B

    2017-08-01

    Full Text Available Constrained Model-based Reinforcement Learning. Authors: Benjamin van Niekerk (School of Computer Science, University of the Witwatersrand, South Africa), Andreas Damianou (Amazon.com, Cambridge, UK), Benjamin Rosman (Council for Scientific and Industrial Research, and School...). Excerpt on multiple shooting: using direct multiple shooting (Bock and Plitt, 1984), problem (1) can be transformed into a structured nonlinear program (NLP). First, the time horizon [t0, t0 + T] is partitioned into N equal subintervals [tk, tk+1] for k = 0...
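
    The excerpt stops mid-derivation; for reference, the standard direct multiple-shooting transcription it alludes to can be written as follows (generic textbook form with generic stage cost, dynamics and constraint symbols, not taken from the cited paper):

        \begin{aligned}
        & t_k = t_0 + k\,\tfrac{T}{N}, \qquad k = 0, \dots, N, \\
        & \min_{x_0,\dots,x_N,\;u_0,\dots,u_{N-1}} \;\; \sum_{k=0}^{N-1} \ell(x_k, u_k) + \ell_N(x_N) \\
        & \text{subject to} \quad x_{k+1} = \Phi(x_k, u_k), \quad g(x_k, u_k) \le 0, \quad k = 0, \dots, N-1,
        \end{aligned}

    where \Phi(x_k, u_k) denotes numerical integration of the system dynamics over the subinterval [t_k, t_{k+1}], so the states at the shooting nodes become decision variables coupled only through these continuity constraints.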

  18. Statistical models based on conditional probability distributions

    International Nuclear Information System (INIS)

    Narayanan, R.S.

    1991-10-01

    We present a formulation of statistical mechanics models based on conditional probability distribution rather than a Hamiltonian. We show that it is possible to realize critical phenomena through this procedure. Closely linked with this formulation is a Monte Carlo algorithm, in which a configuration generated is guaranteed to be statistically independent from any other configuration for all values of the parameters, in particular near the critical point. (orig.)

  19. PV panel model based on datasheet values

    DEFF Research Database (Denmark)

    Sera, Dezso; Teodorescu, Remus; Rodriguez, Pedro

    2007-01-01

    This work presents the construction of a model for a PV panel using the single-diode five-parameter model, based exclusively on data-sheet parameters. The model takes into account the series and parallel (shunt) resistance of the panel. The equivalent circuit and the basic equations of the PV cell... Based on these equations, a PV panel model, which is able to predict the panel behavior under different temperature and irradiance conditions, is built and tested...
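
    As an illustration of the single-diode five-parameter model mentioned above, the implicit panel equation I = Iph - I0*(exp((V + I*Rs)/(n*Ns*Vt)) - 1) - (V + I*Rs)/Rsh can be solved numerically as sketched below. The parameter values in the usage comment are placeholders, not a datasheet-extraction result from the paper:

        import numpy as np
        from scipy.optimize import brentq

        def pv_current(V, Iph, I0, Rs, Rsh, n, Ns, Vt=0.0257):
            # Residual of the implicit single-diode equation for a trial current I;
            # n is the diode ideality factor and Ns the number of series-connected cells.
            def residual(I):
                return (Iph
                        - I0 * (np.exp((V + I * Rs) / (n * Ns * Vt)) - 1.0)
                        - (V + I * Rs) / Rsh
                        - I)
            # The residual is strictly decreasing in I; for operating voltages between
            # short circuit and open circuit this bracket changes sign.
            return brentq(residual, -1.0, Iph + 1.0)

        # e.g. an I-V point near the maximum-power region of a hypothetical 36-cell panel:
        # pv_current(17.0, Iph=8.2, I0=1e-7, Rs=0.3, Rsh=300.0, n=1.3, Ns=36)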

  20. A framework for model-based optimization of bioprocesses under uncertainty: Identifying critical parameters and operating variables

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Meyer, Anne S.; Gernaey, Krist

    2011-01-01

    This study presents the development and application of a systematic model-based framework for bioprocess optimization, evaluated on a cellulosic ethanol production case study. The implementation of the framework involves the use of dynamic simulations, sophisticated uncertainty analysis (Monte...

  1. Blind Separation of Acoustic Signals Combining SIMO-Model-Based Independent Component Analysis and Binary Masking

    Directory of Open Access Journals (Sweden)

    Hiekata Takashi

    2006-01-01

    Full Text Available A new two-stage blind source separation (BSS) method for convolutive mixtures of speech is proposed, in which a single-input multiple-output (SIMO) model-based independent component analysis (ICA) and a new SIMO-model-based binary masking are combined. SIMO-model-based ICA enables us to separate the mixed signals, not into monaural source signals but into SIMO-model-based signals from independent sources in their original form at the microphones. Thus, the separated signals of SIMO-model-based ICA can maintain the spatial qualities of each sound source. Owing to this attractive property, our novel SIMO-model-based binary masking can be applied to efficiently remove the residual interference components after SIMO-model-based ICA. The experimental results reveal that the separation performance can be considerably improved by the proposed method compared with that achieved by conventional BSS methods. In addition, the real-time implementation of the proposed BSS is illustrated.
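
    A toy sketch of the binary-masking stage is given below; it shows generic time-frequency masking that keeps a bin only where the target dominates, which is an assumption on my part and not the authors' exact SIMO-model-based rule:

        import numpy as np

        # target_stft and interferer_stft are complex STFT matrices of the two
        # SIMO-separated channels; bins where the target dominates are kept.
        def binary_mask(target_stft, interferer_stft, threshold_db=0.0):
            ratio_db = 20.0 * np.log10(
                (np.abs(target_stft) + 1e-12) / (np.abs(interferer_stft) + 1e-12)
            )
            mask = (ratio_db > threshold_db).astype(float)   # 1 where the target dominates
            return mask * target_stft                        # residual interference removed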

  2. A model-based risk management framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune

    2002-08-15

    The ongoing research activity addresses these issues through two co-operative activities. The first is the IST-funded research project CORAS, where Institutt for energiteknikk takes part as responsible for the work package for Risk Analysis. The main objective of the CORAS project is to develop a framework to support risk assessment of security-critical systems. The second, called the Halden Open Dependability Demonstrator (HODD), is established in cooperation between Oestfold University College, local companies and HRP. The objective of HODD is to provide an open-source test bed for testing, teaching and learning about risk analysis methods, risk analysis tools, and fault tolerance techniques. The Inverted Pendulum Control System (IPCON), whose main task is to keep a pendulum balanced and controlled, is the first system that has been established. In order to make a risk assessment, one needs to know what a system does, or is intended to do. Furthermore, the risk assessment requires correct descriptions of the system, its context and all relevant features. A basic assumption is that a precise model of this knowledge, based on formal or semi-formal descriptions, such as UML, will facilitate a systematic risk assessment. It is also necessary to have a framework to integrate the different risk assessment methods. The experiences so far support this hypothesis. This report presents CORAS and the CORAS model-based risk management framework, including a preliminary guideline for model-based risk assessment. The CORAS framework for model-based risk analysis offers a structured and systematic approach to identify and assess security issues of ICT systems. From the initial assessment of IPCON, we also believe that the framework is applicable in a safety context. Further work on IPCON, as well as the experiences from the CORAS trials, will provide insight and feedback for further improvements. (Author)

  3. Model-Based Motion Tracking of Infants

    DEFF Research Database (Denmark)

    Olsen, Mikkel Damgaard; Herskind, Anna; Nielsen, Jens Bo

    2014-01-01

    Even though motion tracking is a widely used technique to analyze and measure human movements, only a few studies focus on motion tracking of infants. In recent years, a number of studies have emerged focusing on analyzing the motion pattern of infants, using computer vision. Most of these studies...... are based on 2D images, but few are based on 3D information. In this paper, we present a model-based approach for tracking infants in 3D. The study extends a novel study on graph-based motion tracking of infants and we show that the extension improves the tracking results. A 3D model is constructed...

  4. Embedded Control System Design A Model Based Approach

    CERN Document Server

    Forrai, Alexandru

    2013-01-01

    Control system design is a challenging task for practicing engineers. It requires knowledge of different engineering fields, a good understanding of technical specifications and good communication skills. The current book introduces the reader to practical control system design, bridging the gap between theory and practice. The control design techniques presented in the book are all model based, considering the needs and possibilities of practicing engineers. Classical control design techniques are reviewed and methods are presented to verify the robustness of the design. It is shown how the designed control algorithm can be implemented in real time and tested, fulfilling different safety requirements. Good design practices and a systematic software development process are emphasized in the book according to the generic standard IEC61508. The book is mainly addressed to practicing control and embedded software engineers - working in research and development – as well as graduate students who are face...

  5. Test-Driven, Model-Based Systems Engineering

    DEFF Research Database (Denmark)

    Munck, Allan

    Hearing systems have evolved over many years from simple mechanical devices (horns) to electronic units consisting of microphones, amplifiers, analog filters, loudspeakers, batteries, etc. Digital signal processors replaced analog filters to provide better performance end new features. Central....... This thesis concerns methods for identifying, selecting and implementing tools for various aspects of model-based systems engineering. A comprehensive method was proposed that include several novel steps such as techniques for analyzing the gap between requirements and tool capabilities. The method...... was verified with good results in two case studies for selection of a traceability tool (single-tool scenario) and a set of modeling tools (multi-tool scenarios). Models must be subjected to testing to allow engineers to predict functionality and performance of systems. Test-first strategies are known...

  6. Intelligent Transportation and Evacuation Planning A Modeling-Based Approach

    CERN Document Server

    Naser, Arab

    2012-01-01

    Intelligent Transportation and Evacuation Planning: A Modeling-Based Approach provides a new paradigm for evacuation planning strategies and techniques. Recently, evacuation planning and modeling have increasingly attracted interest among researchers as well as government officials. This interest stems from the recent catastrophic hurricanes and weather-related events that occurred in the southeastern United States (Hurricane Katrina and Rita). The evacuation methods that were in place before and during the hurricanes did not work well and resulted in thousands of deaths. This book offers insights into the methods and techniques that allow for implementing mathematical-based, simulation-based, and integrated optimization and simulation-based engineering approaches for evacuation planning. This book also: Comprehensively discusses the application of mathematical models for evacuation and intelligent transportation modeling Covers advanced methodologies in evacuation modeling and planning Discusses principles a...

  7. Integrating Design Decision Management with Model-based Software Development

    DEFF Research Database (Denmark)

    Könemann, Patrick

    Design decisions are continuously made during the development of software systems and are important artifacts for design documentation. Dedicated decision management systems are often used to capture such design knowledge. Most such systems are, however, separated from the design artifacts...... of the system. In model-based software development, where design models are used to develop a software system, outcomes of many design decisions have big impact on design models. The realization of design decisions is often manual and tedious work on design models. Moreover, keeping design models consistent......, or by ignoring the causes. This substitutes manual reviews to some extent. The concepts, implemented in a tool, have been validated with design patterns, refactorings, and domain level tests that comprise a replay of a real project. This proves the applicability of the solution to realistic examples...

  8. Optimal Model-Based Control in HVAC Systems

    DEFF Research Database (Denmark)

    Komareji, Mohammad; Stoustrup, Jakob; Rasmussen, Henrik

    2008-01-01

    This paper presents optimal model-based control of a heating, ventilating, and air-conditioning (HVAC) system. This HVAC system is made of two heat exchangers: an air-to-air heat exchanger (a rotary wheel heat recovery) and a water-to-air heat exchanger. First, a dynamic model of the HVAC system is developed. Then the optimal control structure is designed and implemented. The HVAC system is split into two subsystems. By selecting the right set-points and appropriate cost functions for each subsystem controller, the optimal control strategy is respected to guarantee the minimum thermal and electrical energy consumption. Finally, the controller is applied to control the mentioned HVAC system and the results show that the expected goals are fulfilled.
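
    The set-point selection idea can be sketched very roughly as follows; the names, the scalar set-point parameterisation and the weights are hypothetical simplifications, not the controller of the paper:

        from scipy.optimize import minimize_scalar

        # An outer layer chooses a steady-state set-point that minimises a weighted sum
        # of predicted electrical and thermal power; inner loops would then track it.
        def optimal_recovery_setpoint(predict_powers, w_el=1.0, w_th=1.0):
            # predict_powers(x) -> (electrical_power, thermal_power) at set-point x in [0, 1]
            cost = lambda x: w_el * predict_powers(x)[0] + w_th * predict_powers(x)[1]
            return minimize_scalar(cost, bounds=(0.0, 1.0), method="bounded").x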

  9. Model-Based Systems Engineering Approach to Managing Mass Margin

    Science.gov (United States)

    Chung, Seung H.; Bayer, Todd J.; Cole, Bjorn; Cooke, Brian; Dekens, Frank; Delp, Christopher; Lam, Doris

    2012-01-01

    When designing a flight system from concept through implementation, one of the fundamental systems engineering tasks is managing the mass margin and a mass equipment list (MEL) of the flight system. While generating a MEL and computing a mass margin is conceptually a trivial task, maintaining consistent and correct MELs and mass margins can be challenging due to the current practices of maintaining duplicate information in various forms, such as diagrams and tables, and in various media, such as files and emails. We have overcome this challenge through a model-based systems engineering (MBSE) approach within which we allow only a single-source-of-truth. In this paper we describe the modeling patterns used to capture the single-source-of-truth and the views that have been developed for the Europa Habitability Mission (EHM) project, a mission concept study, at the Jet Propulsion Laboratory (JPL).

  10. Model-Based Systems Engineering Pilot Program at NASA Langley

    Science.gov (United States)

    Vipavetz, Kevin G.; Murphy, Douglas G.; Infeld, Samatha I.

    2012-01-01

    NASA Langley Research Center conducted a pilot program to evaluate the benefits of using a Model-Based Systems Engineering (MBSE) approach during the early phase of the Materials International Space Station Experiment-X (MISSE-X) project. The goal of the pilot was to leverage MBSE tools and methods, including the Systems Modeling Language (SysML), to understand the net gain of utilizing this approach on a moderate-size flight project. The System Requirements Review (SRR) success criteria were used to guide the work products desired from the pilot. This paper discusses the pilot project implementation, provides SysML model examples, identifies lessons learned, and describes plans for further use of MBSE on MISSE-X.

  11. Modular Architecture for Integrated Model-Based Decision Support.

    Science.gov (United States)

    Gaebel, Jan; Schreiber, Erik; Oeser, Alexander; Oeltze-Jafra, Steffen

    2018-01-01

    Model-based decision support systems promise to be a valuable addition to oncological treatments and the implementation of personalized therapies. For the integration and sharing of decision models, the involved systems must be able to communicate with each other. In this paper, we propose a modularized architecture of dedicated systems for the integration of probabilistic decision models into existing hospital environments. These systems interconnect via web services and provide model sharing and processing capabilities for clinical information systems. Along the lines of IHE integration profiles from other disciplines and the meaningful reuse of routinely recorded patient data, our approach aims for the seamless integration of decision models into hospital infrastructure and the physicians' daily work.

  12. Model Based Control of Moisture Sorption in a Historical Interior

    Directory of Open Access Journals (Sweden)

    P. Zítek

    2005-01-01

    Full Text Available This paper deals with a novel scheme for microclimate control in historical exhibition rooms, inhibiting moisture sorption phenomena that are inadmissible from the preventive conservation point of view. The impact of air humidity is the most significant harmful exposure for a great deal of the cultural heritage deposited in remote historical buildings. Leaving the interior temperature to run almost its spontaneous yearly cycle, the proposed non-linear model-based control protects exhibits from harmful variations in moisture content by compensating for temperature drifts with an adequate adjustment of the air humidity. Already implemented in a medieval interior since 1999, the proposed microclimate control has proved capable of permanently maintaining a desirable moisture content in organic or porous materials in the interior of a building.
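
    The compensation principle can be illustrated with a small sketch under an added assumption: if the equilibrium moisture content of hygroscopic materials is taken to depend mainly on relative humidity, then holding moisture content constant amounts to adjusting the water-vapour pressure so that relative humidity stays at its set-point as the temperature drifts. The Magnus approximation is used below; this is an illustration, not the non-linear controller of the paper:

        import math

        # Saturation vapour pressure of water in Pa (Magnus approximation).
        def saturation_pressure_pa(t_celsius):
            return 611.2 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

        # Vapour pressure that keeps relative humidity at target_rh (0..1) at temperature t.
        def required_vapour_pressure_pa(t_celsius, target_rh):
            return target_rh * saturation_pressure_pa(t_celsius)

        # e.g. holding 55 % RH as the room drifts from 12 °C to 18 °C:
        # required_vapour_pressure_pa(12.0, 0.55), required_vapour_pressure_pa(18.0, 0.55)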

  13. A Requirements Analysis Model Based on QFD

    Institute of Scientific and Technical Information of China (English)

    TANG Zhi-wei; Nelson K.H.Tang

    2004-01-01

    The enterprise resource planning (ERP) system has emerged to offer an integrated IT solution, and more and more enterprises are adopting this system and regarding it as an important innovation. However, there is already evidence of high failure risks in ERP project implementation; one major reason is poor analysis of the requirements for system implementation. In this paper, the importance of requirements analysis for ERP project implementation is highlighted, and a requirements analysis model applying quality function deployment (QFD) is presented, which supports conducting requirements analysis for ERP projects.

  14. Model-free and model-based reward prediction errors in EEG.

    Science.gov (United States)

    Sambrook, Thomas D; Hardwick, Ben; Wills, Andy J; Goslin, Jeremy

    2018-05-24

    Learning theorists posit two reinforcement learning systems: model-free and model-based. Model-based learning incorporates knowledge about structure and contingencies in the world to assign candidate actions with an expected value. Model-free learning is ignorant of the world's structure; instead, actions hold a value based on prior reinforcement, with this value updated by expectancy violation in the form of a reward prediction error. Because they use such different learning mechanisms, it has been previously assumed that model-based and model-free learning are computationally dissociated in the brain. However, recent fMRI evidence suggests that the brain may compute reward prediction errors to both model-free and model-based estimates of value, signalling the possibility that these systems interact. Because of its poor temporal resolution, fMRI risks confounding reward prediction errors with other feedback-related neural activity. In the present study, EEG was used to show the presence of both model-based and model-free reward prediction errors and their place in a temporal sequence of events including state prediction errors and action value updates. This demonstration of model-based prediction errors questions a long-held assumption that model-free and model-based learning are dissociated in the brain. Copyright © 2018 Elsevier Inc. All rights reserved.
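
    For reference, the two prediction-error signals can be written down in their textbook tabular form; the sketch below is an illustration of those definitions, not the authors' EEG analysis pipeline:

        import numpy as np

        # Model-free reward prediction error against a cached action value q[s, a].
        def model_free_rpe(q, s, a, r, s_next, gamma=0.95):
            return r + gamma * np.max(q[s_next]) - q[s, a]

        # Model-based reward prediction error: the expectation is computed from a learned
        # transition model T[s, a, s'] and reward model R[s'], then compared with the outcome.
        def model_based_rpe(T, R, q_mb, s, a, r, s_next, gamma=0.95):
            expected = np.sum(T[s, a] * (R + gamma * np.max(q_mb, axis=1)))
            return r + gamma * np.max(q_mb[s_next]) - expected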

  15. Confronting uncertainty in model-based geostatistics using Markov Chain Monte Carlo simulation

    NARCIS (Netherlands)

    Minasny, B.; Vrugt, J.A.; McBratney, A.B.

    2011-01-01

    This paper demonstrates for the first time the use of Markov Chain Monte Carlo (MCMC) simulation for parameter inference in model-based soil geostatistics. We implemented the recently developed DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm to jointly summarize the posterior

  16. Advanced autonomous model-based operation of industrial process systems (Autoprofit) : technological developments and future perspectives

    NARCIS (Netherlands)

    Ozkan, L.; Bombois, X.J.A.; Ludlage, J.H.A.; Rojas, C.R.; Hjalmarsson, H.; Moden, P.E.; Lundh, M.; Backx, A.C.P.M.; Van den Hof, P.M.J.

    2016-01-01

    Model-based operation support technology such as Model Predictive Control (MPC) is a proven and accepted technology for multivariable and constrained large scale control problems in process industry. Despite the growing number of successful implementations, the low level of operational efficiency of

  17. Beyond the Central Dogma: Model-Based Learning of How Genes Determine Phenotypes

    Science.gov (United States)

    Reinagel, Adam; Speth, Elena Bray

    2016-01-01

    In an introductory biology course, we implemented a learner-centered, model-based pedagogy that frequently engaged students in building conceptual models to explain how genes determine phenotypes. Model-building tasks were incorporated within case studies and aimed at eliciting students' understanding of 1) the origin of variation in a population…

  18. A model-based software development methodology for high-end automotive components

    NARCIS (Netherlands)

    Ravanan, Mahmoud

    2014-01-01

    This report provides a model-based software development methodology for high-end automotive components. The V-model is used as a process model throughout the development of the software platform. It offers a framework that simplifies the relation between requirements, design, implementation,

  19. Virtual medical plant modeling based on L-system | Ding | African ...

    African Journals Online (AJOL)

    ... aid of graphics and PlantVR, we implemented the reconstruction of plant shape and 3-D structure. Conclusion: A three-dimensional virtual plant growth model based on a time-controlled L-system has been successfully established. Keywords: Drug R&D, toxicity, medical plants, fractals; L-system; quasi binary-trees.

  20. Unifying Model-Based and Reactive Programming within a Model-Based Executive

    Science.gov (United States)

    Williams, Brian C.; Gupta, Vineet; Norvig, Peter (Technical Monitor)

    1999-01-01

    Real-time, model-based deduction has recently emerged as a vital component in AI's tool box for developing highly autonomous reactive systems. Yet one of the current hurdles towards developing model-based reactive systems is the number of methods simultaneously employed, and their corresponding melange of programming and modeling languages. This paper offers an important step towards unification. We introduce RMPL, a rich modeling language that combines probabilistic, constraint-based modeling with reactive programming constructs, while offering a simple semantics in terms of hidden state Markov processes. We introduce probabilistic, hierarchical constraint automata (PHCA), which allow Markov processes to be expressed in a compact representation that preserves the modularity of RMPL programs. Finally, a model-based executive, called Reactive Burton, is described that exploits this compact encoding to perform efficient simulation, belief state update and control sequence generation.

  1. Preoperative screening: value of previous tests.

    Science.gov (United States)

    Macpherson, D S; Snow, R; Lofgren, R P

    1990-12-15

    To determine the frequency of tests done in the year before elective surgery that might substitute for preoperative screening tests and to determine the frequency of test results that change from a normal value to a value likely to alter perioperative management. Retrospective cohort analysis of computerized laboratory data (complete blood count, sodium, potassium, and creatinine levels, prothrombin time, and partial thromboplastin time). Urban tertiary care Veterans Affairs Hospital. Consecutive sample of 1109 patients who had elective surgery in 1988. At admission, 7549 preoperative tests were done, 47% of which duplicated tests performed in the previous year. Of 3096 previous results that were normal as defined by the hospital reference range and obtained closest to, but before, admission (median interval, 2 months), 13 (0.4%; 95% CI, 0.2% to 0.7%) had repeat values outside a range considered acceptable for surgery. Most of the abnormalities were predictable from the patient's history, and most were not noted in the medical record. Of 461 previous tests that were abnormal, 78 (17%; CI, 13% to 20%) had repeat values at admission outside a range considered acceptable for surgery (P less than 0.001 for the comparison of the frequency of clinically important abnormalities between patients with normal and those with abnormal previous results). Physicians evaluating patients preoperatively could safely substitute the previous test results analyzed in this study for preoperative screening tests if the previous tests are normal and no obvious indication for retesting is present.

  2. Automatic electromagnetic valve for previous vacuum

    International Nuclear Information System (INIS)

    Granados, C. E.; Martin, F.

    1959-01-01

    A valve which permits the maintenance of an installation vacuum when electric current fails is described. It also lets air into the fore-vacuum (previous vacuum) pump to prevent the oil from rising into the vacuum tubes. (Author)

  3. Model based risk assessment - the CORAS framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune; Thunem, Atoosa P-J.

    2004-04-15

    Traditional risk analysis and assessment is based on failure-oriented models of the system. In contrast to this, model-based risk assessment (MBRA) utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The target models are then used as input sources for complementary risk analysis and assessment techniques, as well as a basis for the documentation of the assessment results. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tested with successful outcome through a series of seven trials within the telemedicine and e-commerce areas. The CORAS project in general and the CORAS application of MBRA in particular have contributed positively to the visibility of model-based risk assessment and thus to the disclosure of several potentials for further exploitation of various aspects within this important research field. In that connection, the CORAS methodology's possibilities for further improvement towards utilization in more complex architectures and also in other application domains such as the nuclear field can be addressed. The latter calls for adapting the framework to address nuclear standards such as IEC 60880 and IEC 61513. For this development we recommend applying a trial-driven approach within the nuclear field. The tool-supported approach for combining risk analysis and system development also fits well with the HRP proposal for developing an Integrated Design Environment (IDE) providing efficient methods and tools to support control room systems design. (Author)

  4. Soft object deformation monitoring and learning for model-based robotic hand manipulation.

    Science.gov (United States)

    Cretu, Ana-Maria; Payeur, Pierre; Petriu, Emil M

    2012-06-01

    This paper discusses the design and implementation of a framework that automatically extracts and monitors the shape deformations of soft objects from a video sequence and maps them with force measurements with the goal of providing the necessary information to the controller of a robotic hand to ensure safe model-based deformable object manipulation. Measurements corresponding to the interaction force at the level of the fingertips and to the position of the fingertips of a three-finger robotic hand are associated with the contours of a deformed object tracked in a series of images using neural-network approaches. The resulting model captures the behavior of the object and is able to predict its behavior for previously unseen interactions without any assumption on the object's material. The availability of such models can contribute to the improvement of a robotic hand controller, therefore allowing more accurate and stable grasp while providing more elaborate manipulation capabilities for deformable objects. Experiments performed for different objects, made of various materials, reveal that the method accurately captures and predicts the object's shape deformation while the object is submitted to external forces applied by the robot fingers. The proposed method is also fast and insensitive to severe contour deformations, as well as to smooth changes in lighting, contrast, and background.

  5. Model-based estimation with boundary side information or boundary regularization

    International Nuclear Information System (INIS)

    Chiao, P.C.; Rogers, W.L.; Fessler, J.A.; Clinthorne, N.H.; Hero, A.O.

    1994-01-01

    The authors have previously developed a model-based strategy for joint estimation of myocardial perfusion and boundaries using ECT (Emission Computed Tomography). The authors have also reported difficulties with boundary estimation in low contrast and low count rate situations. In this paper, the authors propose using boundary side information (obtainable from high resolution MRI and CT images) or boundary regularization to improve both perfusion and boundary estimation in these situations. To fuse boundary side information into the emission measurements, the authors formulate a joint log-likelihood function to include auxiliary boundary measurements as well as ECT projection measurements. In addition, the authors introduce registration parameters to align auxiliary boundary measurements with ECT measurements and jointly estimate these parameters with other parameters of interest from the composite measurements. In simulated PET O-15 water myocardial perfusion studies using a simplified model, the authors show that the joint estimation improves perfusion estimation performance and gives boundary alignment accuracy of <0.5 mm even at 0.2 million counts. The authors implement boundary regularization through formulating a penalized log-likelihood function. The authors also demonstrate in simulations that simultaneous regularization of the epicardial boundary and myocardial thickness gives comparable perfusion estimation accuracy with the use of boundary side information

  6. Model-based estimation with boundary side information or boundary regularization [cardiac emission CT].

    Science.gov (United States)

    Chiao, P C; Rogers, W L; Fessler, J A; Clinthorne, N H; Hero, A O

    1994-01-01

    The authors have previously developed a model-based strategy for joint estimation of myocardial perfusion and boundaries using ECT (emission computed tomography). They have also reported difficulties with boundary estimation in low contrast and low count rate situations. Here they propose using boundary side information (obtainable from high resolution MRI and CT images) or boundary regularization to improve both perfusion and boundary estimation in these situations. To fuse boundary side information into the emission measurements, the authors formulate a joint log-likelihood function to include auxiliary boundary measurements as well as ECT projection measurements. In addition, they introduce registration parameters to align auxiliary boundary measurements with ECT measurements and jointly estimate these parameters with other parameters of interest from the composite measurements. In simulated PET O-15 water myocardial perfusion studies using a simplified model, the authors show that the joint estimation improves perfusion estimation performance and gives boundary alignment accuracy of <0.5 mm even at 0.2 million counts. They implement boundary regularization through formulating a penalized log-likelihood function. They also demonstrate in simulations that simultaneous regularization of the epicardial boundary and myocardial thickness gives comparable perfusion estimation accuracy with the use of boundary side information.

  7. The Actualization of Literary Learning Model Based on Verbal-Linguistic Intelligence

    Science.gov (United States)

    Hali, Nur Ihsan

    2017-01-01

    This article is inspired by Howard Gardner's concept of linguistic intelligence and also from some authors' previous writings. All of them became the authors' reference in developing ideas on constructing a literary learning model based on linguistic intelligence. The writing of this article is not done by collecting data empirically, but by…

  8. Diminishing musyarakah investment model based on equity

    Science.gov (United States)

    Jaffar, Maheran Mohd; Zain, Shaharir Mohamad; Jemain, Abdul Aziz

    2017-11-01

    Most of the mudharabah and musyarakah contract funds are involved in debt financing. This does not support the theory that a profit-sharing contract is better than debt financing, due to the sharing of risks and ownership of equity. Indeed, it is believed that Islamic banking is a financial model based on equity or musyarakah which emphasizes the sharing of risks, profit and loss in the investment between the investor and entrepreneur. The focus of this paper is to introduce a mathematical model that internalizes diminishing musyarakah, the sharing of profit and equity between entrepreneur and investor. The entrepreneur pays monthly deferred payments to buy out the equity that belongs to the investor (bank), so that at the end of the specified period the entrepreneur owns the business and the investor (bank) exits the joint venture. The model is able to calculate the amount of equity at any time for both parties and hence would be a guide in helping to estimate the value of the investment should the entrepreneur or investor exit before the end of the specified period. The model is closer to the Islamic principles of justice and fairness.
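
    A minimal sketch of how the two parties' equity could evolve in a diminishing musyarakah arrangement; the uniform monthly buy-out schedule and the figures used are illustrative assumptions, not the contract terms or formulas derived in the paper.

        def diminishing_musyarakah(total_capital, investor_share, months):
            """Equity of investor (bank) and entrepreneur at the end of each month,
            assuming the entrepreneur buys out an equal slice of the investor's equity every month."""
            investor_equity = investor_share * total_capital
            entrepreneur_equity = total_capital - investor_equity
            monthly_buyout = investor_equity / months
            schedule = []
            for m in range(1, months + 1):
                investor_equity -= monthly_buyout
                entrepreneur_equity += monthly_buyout
                schedule.append((m, round(investor_equity, 2), round(entrepreneur_equity, 2)))
            return schedule

        for month, bank, client in diminishing_musyarakah(100_000, 0.8, 12):
            print(month, bank, client)   # the bank's equity falls to zero as the client takes full ownership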

  9. Model-Based Method for Sensor Validation

    Science.gov (United States)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered to be part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is most popular. Therefore, these methods can only predict the most probable faulty sensors, which are subject to the initial probabilities defined for the failures. The method developed in this work is based on a model-based approach and provides the faulty sensors (if any), which can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for systems in which it is hard, or even impossible, to find the probability functions of the system. The method starts from a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concepts of analytical redundant relations (ARRs).
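
    A minimal illustration of sensor validation through analytical redundancy relations: each violated relation implicates the sensors it involves, and the sensors common to all violated relations are the logically inferred suspects. The three-sensor flow-balance example and the threshold are illustrative assumptions, not the algorithm developed in this work.

        # Hypothetical plant: flow_in should equal flow_out_1 + flow_out_2, and a backup
        # sensor duplicates flow_in. Each relation is an analytical redundancy relation (ARR).

        def check_relations(readings, tol=0.5):
            r1 = abs(readings["flow_in"] - (readings["flow_out_1"] + readings["flow_out_2"]))
            r2 = abs(readings["flow_in"] - readings["flow_in_backup"])
            violated = []
            if r1 > tol:
                violated.append({"flow_in", "flow_out_1", "flow_out_2"})
            if r2 > tol:
                violated.append({"flow_in", "flow_in_backup"})
            if not violated:
                return set()
            return set.intersection(*violated)   # sensors consistent with every violated relation

        readings = {"flow_in": 12.0, "flow_in_backup": 10.1,
                    "flow_out_1": 6.0, "flow_out_2": 4.1}
        print(check_relations(readings))          # {'flow_in'} is implicated by both violated relations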

  10. Model based energy benchmarking for glass furnace

    International Nuclear Information System (INIS)

    Sardeshpande, Vishal; Gaitonde, U.N.; Banerjee, Rangan

    2007-01-01

    Energy benchmarking of processes is important for setting energy efficiency targets and planning energy management strategies. Most approaches used for energy benchmarking are based on statistical methods by comparing with a sample of existing plants. This paper presents a model based approach for benchmarking of energy intensive industrial processes and illustrates this approach for industrial glass furnaces. A simulation model for a glass furnace is developed using mass and energy balances, and heat loss equations for the different zones and empirical equations based on operating practices. The model is checked with field data from end fired industrial glass furnaces in India. The simulation model enables calculation of the energy performance of a given furnace design. The model results show the potential for improvement and the impact of different operating and design preferences on specific energy consumption. A case study for a 100 TPD end fired furnace is presented. An achievable minimum energy consumption of about 3830 kJ/kg is estimated for this furnace. The useful heat carried by glass is about 53% of the heat supplied by the fuel. Actual furnaces operating at these production scales have a potential for reduction in energy consumption of about 20-25%

  11. 3-D model-based vehicle tracking.

    Science.gov (United States)

    Lou, Jianguang; Tan, Tieniu; Hu, Weiming; Yang, Hao; Maybank, Steven J

    2005-10-01

    This paper aims at tracking vehicles from monocular intensity image sequences and presents an efficient and robust approach to three-dimensional (3-D) model-based vehicle tracking. Under the weak perspective assumption and the ground-plane constraint, the movements of model projection in the two-dimensional image plane can be decomposed into two motions: translation and rotation. They are the results of the corresponding movements of 3-D translation on the ground plane (GP) and rotation around the normal of the GP, which can be determined separately. A new metric based on point-to-line segment distance is proposed to evaluate the similarity between an image region and an instantiation of a 3-D vehicle model under a given pose. Based on this, we provide an efficient pose refinement method to refine the vehicle's pose parameters. An improved EKF is also proposed to track and to predict vehicle motion with a precise kinematics model. Experimental results with both indoor and outdoor data show that the algorithm obtains desirable performance even under severe occlusion and clutter.
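
    A minimal sketch of the point-to-line-segment distance that underlies the similarity metric described above; how image edge points and projected model segments are obtained is outside the scope of this fragment.

        import numpy as np

        def point_to_segment_distance(p, a, b):
            """Euclidean distance from point p to the 2-D line segment with endpoints a and b."""
            p, a, b = map(np.asarray, (p, a, b))
            ab = b - a
            t = np.dot(p - a, ab) / np.dot(ab, ab)   # projection parameter along the segment
            t = np.clip(t, 0.0, 1.0)                 # clamp to the segment's extent
            closest = a + t * ab
            return np.linalg.norm(p - closest)

        # Example: score how well an image edge point fits a projected model edge.
        print(point_to_segment_distance([2.0, 1.0], [0.0, 0.0], [4.0, 0.0]))   # 1.0
        print(point_to_segment_distance([5.0, 3.0], [0.0, 0.0], [4.0, 0.0]))   # distance to endpoint (4, 0)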

  12. 77 FR 70176 - Previous Participation Certification

    Science.gov (United States)

    2012-11-23

    ... participants' previous participation in government programs and ensure that the past record is acceptable prior... information is designed to be 100 percent automated and digital submission of all data and certifications is... government programs and ensure that the past record is acceptable prior to granting approval to participate...

  13. On the Tengiz petroleum deposit previous study

    International Nuclear Information System (INIS)

    Nysangaliev, A.N.; Kuspangaliev, T.K.

    1997-01-01

    A previous study of the Tengiz petroleum deposit is described. Some considerations about the structure of the productive formation and the specific characteristic properties of the petroleum-bearing collectors are presented. Recommendations are given on their detailed study and on using experience from the exploration and development of petroleum deposits that are analogous in the most important geological and industrial parameters. (author)

  14. Subsequent pregnancy outcome after previous foetal death

    NARCIS (Netherlands)

    Nijkamp, J. W.; Korteweg, F. J.; Holm, J. P.; Timmer, A.; Erwich, J. J. H. M.; van Pampus, M. G.

    Objective: A history of foetal death is a risk factor for complications and foetal death in subsequent pregnancies as most previous risk factors remain present and an underlying cause of death may recur. The purpose of this study was to evaluate subsequent pregnancy outcome after foetal death and to

  15. Model based control of refrigeration systems

    Energy Technology Data Exchange (ETDEWEB)

    Sloth Larsen, L.F.

    2005-11-15

    The subject of this Ph.D. thesis is model based control of refrigeration systems. Model based control covers a variety of different types of control that incorporate mathematical models. In this thesis the main subject has therefore been restricted to system optimizing control. The optimizing control is divided into two layers, where the system oriented top layer deals with set-point optimizing control and the lower layer deals with dynamical optimizing control in the subsystems. The thesis has two main contributions, i.e. a novel approach for set-point optimization and a novel approach for desynchronization based on dynamical optimization. The focus in the development of the proposed set-point optimizing control has been on deriving a simple and general method that can with ease be applied to various compositions of the same class of systems, such as refrigeration systems. The method is based on a set of parameter-dependent static equations describing the considered process. By adapting the parameters to the given process, predicting the steady state and computing a steady-state gradient of the cost function, the process can be driven continuously towards zero gradient, i.e. the optimum (if the cost function is convex). The method furthermore deals with system constraints by introducing barrier functions; hereby the best possible performance, taking the given constraints into account, can be obtained, e.g. under extreme operational conditions. The proposed method has been applied on a test refrigeration system, placed at Aalborg University, for minimization of the energy consumption. Here it was proved that by using general static parameter-dependent system equations it was possible to drive the set-points close to the optimum and thus reduce the power consumption by up to 20%. In the dynamical optimizing layer the idea is to optimize the operation of the subsystem or the groupings of subsystems that limit the obtainable system performance. In systems
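
    A minimal numerical sketch of the set-point optimizing idea described above: the set-point is driven continuously towards zero cost gradient while a logarithmic barrier keeps it inside a constraint. The quadratic cost, the constraint bound and the step size are illustrative assumptions, not the refrigeration-system model of the thesis.

        import numpy as np

        def cost(x):
            return (x - 8.0) ** 2             # hypothetical steady-state power cost, unconstrained optimum at x = 8

        def barrier(x, x_max=6.0, mu=0.1):
            return -mu * np.log(x_max - x)    # grows without bound as x approaches the constraint x_max

        def gradient(f, x, h=1e-5):
            return (f(x + h) - f(x - h)) / (2 * h)

        x = 2.0                                # initial set-point
        for _ in range(200):
            g = gradient(lambda v: cost(v) + barrier(v), x)
            x -= 0.01 * g                      # walk continuously towards zero gradient
        print(round(x, 3))                     # settles just below the constraint at x_max = 6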

  16. Model Based Autonomy for Robust Mars Operations

    Science.gov (United States)

    Kurien, James A.; Nayak, P. Pandurang; Williams, Brian C.; Lau, Sonie (Technical Monitor)

    1998-01-01

    Space missions have historically relied upon a large ground staff, numbering in the hundreds for complex missions, to maintain routine operations. When an anomaly occurs, this small army of engineers attempts to identify and work around the problem. A piloted Mars mission, with its multiyear duration, cost pressures, half-hour communication delays and two-week blackouts, cannot be closely controlled by a battalion of engineers on Earth. Flight crew involvement in routine system operations must also be minimized to maximize science return. It also may be unrealistic to require the crew to have the expertise in each mission subsystem needed to diagnose a system failure and effect a timely repair, as engineers did for Apollo 13. Enter model-based autonomy, which allows complex systems to autonomously maintain operation despite failures or anomalous conditions, contributing to safe, robust, and minimally supervised operation of spacecraft, life support, In Situ Resource Utilization (ISRU) and power systems. Autonomous reasoning is central to the approach. A reasoning algorithm uses a logical or mathematical model of a system to infer how to operate the system, diagnose failures and generate appropriate behavior to repair or reconfigure the system in response. The 'plug and play' nature of the models enables low cost development of autonomy for multiple platforms. Declarative, reusable models capture relevant aspects of the behavior of simple devices (e.g. valves or thrusters). Reasoning algorithms combine device models to create a model of the system-wide interactions and behavior of a complex, unique artifact such as a spacecraft. Rather than requiring engineers to anticipate all possible interactions and failures at design time or to perform analysis during the mission, the reasoning engine generates the appropriate response to the current situation, taking into account its system-wide knowledge, the current state, and even sensor failures or unexpected behavior.

  17. On the Use of Generalized Volume Scattering Models for the Improvement of General Polarimetric Model-Based Decomposition

    Directory of Open Access Journals (Sweden)

    Qinghua Xie

    2017-01-01

    Full Text Available Recently, a general polarimetric model-based decomposition framework was proposed by Chen et al., which addresses several well-known limitations in previous decomposition methods and implements a simultaneous full-parameter inversion by using complete polarimetric information. However, it only employs four typical models to characterize the volume scattering component, which limits the parameter inversion performance. To overcome this issue, this paper presents two general polarimetric model-based decomposition methods by incorporating the generalized volume scattering model (GVSM) or the simplified adaptive volume scattering model (SAVSM), proposed by Antropov et al. and Huang et al., respectively, into the general decomposition framework proposed by Chen et al. By doing so, the final volume coherency matrix structure is selected from a wide range of volume scattering models within a continuous interval according to the data itself without adding unknowns. Moreover, the new approaches rely on one nonlinear optimization stage instead of four as in the previous method proposed by Chen et al. In addition, the parameter inversion procedure adopts the modified algorithm proposed by Xie et al. which leads to higher accuracy and more physically reliable output parameters. A number of Monte Carlo simulations of polarimetric synthetic aperture radar (PolSAR) data are carried out and show that the proposed method with GVSM yields an overall improvement in the final accuracy of estimated parameters and outperforms both the version using SAVSM and the original approach. In addition, C-band Radarsat-2 and L-band AIRSAR fully polarimetric images over the San Francisco region are also used for testing purposes. A detailed comparison and analysis of decomposition results over different land-cover types are conducted. According to this study, the use of general decomposition models leads to a more accurate quantitative retrieval of target parameters. However, there

  18. Subsequent childbirth after a previous traumatic birth.

    Science.gov (United States)

    Beck, Cheryl Tatano; Watson, Sue

    2010-01-01

    Nine percent of new mothers in the United States who participated in the Listening to Mothers II Postpartum Survey screened positive for meeting the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition criteria for posttraumatic stress disorder after childbirth. Women who have had a traumatic birth experience report fewer subsequent children and a longer length of time before their second baby. Childbirth-related posttraumatic stress disorder impacts couples' physical relationship, communication, conflict, emotions, and bonding with their children. The purpose of this study was to describe the meaning of women's experiences of a subsequent childbirth after a previous traumatic birth. Phenomenology was the research design used. An international sample of 35 women participated in this Internet study. Women were asked, "Please describe in as much detail as you can remember your subsequent pregnancy, labor, and delivery following your previous traumatic birth." Colaizzi's phenomenological data analysis approach was used to analyze the stories of the 35 women. Data analysis yielded four themes: (a) riding the turbulent wave of panic during pregnancy; (b) strategizing: attempts to reclaim their body and complete the journey to motherhood; (c) bringing reverence to the birthing process and empowering women; and (d) still elusive: the longed-for healing birth experience. Subsequent childbirth after a previous birth trauma has the potential to either heal or retraumatize women. During pregnancy, women need permission and encouragement to grieve their prior traumatic births to help remove the burden of their invisible pain.

  19. Model-based reasoning and the control of process plants

    International Nuclear Information System (INIS)

    Vaelisuo, Heikki

    1993-02-01

    In addition to feedback control, safe and economic operation of industrial process plants requires discrete-event type logic control like for example automatic control sequences, interlocks, etc. A lot of complex routine reasoning is involved in the design and verification and validation (V&V) of such automatics. Similar reasoning tasks are encountered during plant operation in action planning and fault diagnosis. The low-level part of the required problem solving is so straightforward that it could be accomplished by a computer if only there were plant models which allow versatile mechanised reasoning. Such plant models and corresponding inference algorithms are the main subject of this report. Deep knowledge and qualitative modelling play an essential role in this work. Deep knowledge refers to mechanised reasoning based on the first principles of the phenomena in the problem domain. Qualitative modelling refers to knowledge representation formalism and related reasoning methods which allow solving problems on an abstraction level higher than for example traditional simulation and optimisation. Prolog is a commonly used platform for artificial intelligence (AI) applications. Constraint logic languages like CLP(R) and Prolog-III extend the scope of logic programming to numeric problem solving. In addition they allow a programming style which often reduces the computational complexity significantly. An approach to model-based reasoning implemented in constraint logic programming language CLP(R) is presented. The approach is based on some of the principles of QSIM, an algorithm for qualitative simulation. It is discussed how model-based reasoning can be applied in the design and V&V of plant automatics and in action planning during plant operation. A prototype tool called ISIR is discussed and some initial results obtained during the development of the tool are presented. The results presented originate from preliminary test results of the prototype obtained

  20. Model-based Acceleration Control of Turbofan Engines with a Hammerstein-Wiener Representation

    Science.gov (United States)

    Wang, Jiqiang; Ye, Zhifeng; Hu, Zhongzhi; Wu, Xin; Dimirovsky, Georgi; Yue, Hong

    2017-05-01

    Acceleration control of turbofan engines is conventionally designed through either a schedule-based or an acceleration-based approach. With the widespread acceptance of model-based design in the aviation industry, it becomes necessary to investigate the issues associated with model-based design for acceleration control. In this paper, the challenges for implementing model-based acceleration control are explained; a novel Hammerstein-Wiener representation of engine models is introduced; based on the Hammerstein-Wiener model, a nonlinear generalized minimum variance type of optimal control law is derived; the feature of the proposed approach is that it does not require the inversion operation that usually upsets those nonlinear control techniques. The effectiveness of the proposed control design method is validated through a detailed numerical study.
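
    A minimal sketch of simulating a Hammerstein-Wiener structure (static input nonlinearity, linear dynamics, static output nonlinearity); the particular nonlinearities and the first-order linear block are illustrative assumptions, not the turbofan engine model identified in the paper.

        import numpy as np

        def hammerstein_wiener(u, a=0.9, b=0.1):
            """Input nonlinearity -> first-order linear dynamics -> output nonlinearity."""
            x, y = 0.0, []
            for uk in u:
                v = np.tanh(uk)          # static input nonlinearity (assumed)
                x = a * x + b * v        # linear block: x[k+1] = a*x[k] + b*v[k]
                y.append(x ** 2 + x)     # static output nonlinearity (assumed)
            return np.array(y)

        u = np.concatenate([np.zeros(10), np.ones(40)])   # step in demanded input (illustrative)
        print(hammerstein_wiener(u)[-1].round(4))          # output approaching its steady-state value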

  1. Optimal portfolio model based on WVAR

    OpenAIRE

    Hao, Tianyu

    2012-01-01

    This article is focused on using a new measurement of risk -- Weighted Value at Risk (WVaR) -- to develop a new method of constructing the optimal portfolio, initiated from the TVaR solving problem. Based on MATLAB software and using the historical simulation method (avoiding the assumption that the income distribution is normal), and building on the results of previous studies, the U.S. Nasdaq composite index is studied, combining the Simpson formula for the solution of TVaR with its deeper study; then, through the representation of WVAR for...
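
    A minimal sketch of historical-simulation VaR and tail VaR (TVaR) of the kind the abstract refers to; the synthetic heavy-tailed return series and the 95% level are illustrative assumptions.

        import numpy as np

        def historical_var_tvar(returns, alpha=0.95):
            """Historical-simulation VaR and TVaR at level alpha; no normality assumption is made."""
            losses = -np.asarray(returns)
            var = np.quantile(losses, alpha)          # loss exceeded only (1 - alpha) of the time
            tvar = losses[losses >= var].mean()       # average loss in the tail beyond VaR
            return var, tvar

        rng = np.random.default_rng(0)
        returns = rng.standard_t(df=4, size=1000) * 0.01   # heavy-tailed synthetic daily returns
        var, tvar = historical_var_tvar(returns)
        print(round(var, 4), round(tvar, 4))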

  2. Model-based Bayesian signal extraction algorithm for peripheral nerves

    Science.gov (United States)

    Eggers, Thomas E.; Dweiri, Yazan M.; McCallum, Grant A.; Durand, Dominique M.

    2017-10-01

    Objective. Multi-channel cuff electrodes have recently been investigated for extracting fascicular-level motor commands from mixed neural recordings. Such signals could provide volitional, intuitive control over a robotic prosthesis for amputee patients. Recent work has demonstrated success in extracting these signals in acute and chronic preparations using spatial filtering techniques. These extracted signals, however, had low signal-to-noise ratios and thus limited their utility to binary classification. In this work a new algorithm is proposed which combines previous source localization approaches to create a model based method which operates in real time. Approach. To validate this algorithm, a saline benchtop setup was created to allow the precise placement of artificial sources within a cuff and interference sources outside the cuff. The artificial source was taken from five seconds of chronic neural activity to replicate realistic recordings. The proposed algorithm, hybrid Bayesian signal extraction (HBSE), is then compared to previous algorithms, beamforming and a Bayesian spatial filtering method, on this test data. An example chronic neural recording is also analyzed with all three algorithms. Main results. The proposed algorithm improved the signal to noise and signal to interference ratio of extracted test signals two to three fold, as well as increased the correlation coefficient between the original and recovered signals by 10-20%. These improvements translated to the chronic recording example and increased the calculated bit rate between the recovered signals and the recorded motor activity. Significance. HBSE significantly outperforms previous algorithms in extracting realistic neural signals, even in the presence of external noise sources. These results demonstrate the feasibility of extracting dynamic motor signals from a multi-fascicled intact nerve trunk, which in turn could extract motor command signals from an amputee for the end goal of

  3. Model-based explanation of plant knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Huuskonen, P.J. [VTT Electronics, Oulu (Finland). Embedded Software

    1997-12-31

    This thesis deals with computer explanation of knowledge related to design and operation of industrial plants. The needs for explanation are motivated through case studies and literature reviews. A general framework for analysing plant explanations is presented. Prototypes demonstrate key mechanisms for implementing parts of the framework. Power plants, steel mills, paper factories, and high energy physics control systems are studied to set requirements for explanation. The main problems are seen to be either lack or abundance of information. Design knowledge in particular is found missing at plants. Support systems and automation should be enhanced with ways to explain plant knowledge to the plant staff. A framework is formulated for analysing explanations of plant knowledge. It consists of three parts: 1. a typology of explanation, organised by the class of knowledge (factual, functional, or strategic) and by the target of explanation (processes, automation, or support systems), 2. an identification of explanation tasks generic for the plant domain, and 3. an identification of essential model types for explanation (structural, behavioural, functional, and teleological). The tasks use the models to create the explanations of the given classes. Key mechanisms are discussed to implement the generic explanation tasks. Knowledge representations based on objects and their relations form a vocabulary to model and present plant knowledge. A particular class of models, means-end models, are used to explain plant knowledge. Explanations are generated through searches in the models. Hypertext is adopted to communicate explanations over dialogue based on context. The results are demonstrated in prototypes. The VICE prototype explains the reasoning of an expert system for diagnosis of rotating machines at power plants. The Justifier prototype explains design knowledge obtained from an object-oriented plant design tool. Enhanced access mechanisms into on-line documentation are

  4. Model-based explanation of plant knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Huuskonen, P J [VTT Electronics, Oulu (Finland). Embedded Software

    1998-12-31

    This thesis deals with computer explanation of knowledge related to design and operation of industrial plants. The needs for explanation are motivated through case studies and literature reviews. A general framework for analysing plant explanations is presented. Prototypes demonstrate key mechanisms for implementing parts of the framework. Power plants, steel mills, paper factories, and high energy physics control systems are studied to set requirements for explanation. The main problems are seen to be either lack or abundance of information. Design knowledge in particular is found missing at plants. Support systems and automation should be enhanced with ways to explain plant knowledge to the plant staff. A framework is formulated for analysing explanations of plant knowledge. It consists of three parts: 1. a typology of explanation, organised by the class of knowledge (factual, functional, or strategic) and by the target of explanation (processes, automation, or support systems), 2. an identification of explanation tasks generic for the plant domain, and 3. an identification of essential model types for explanation (structural, behavioural, functional, and teleological). The tasks use the models to create the explanations of the given classes. Key mechanisms are discussed to implement the generic explanation tasks. Knowledge representations based on objects and their relations form a vocabulary to model and present plant knowledge. A particular class of models, means-end models, are used to explain plant knowledge. Explanations are generated through searches in the models. Hypertext is adopted to communicate explanations over dialogue based on context. The results are demonstrated in prototypes. The VICE prototype explains the reasoning of an expert system for diagnosis of rotating machines at power plants. The Justifier prototype explains design knowledge obtained from an object-oriented plant design tool. Enhanced access mechanisms into on-line documentation are

  5. Symbolic Processing Combined with Model-Based Reasoning

    Science.gov (United States)

    James, Mark

    2009-01-01

    A computer program for the detection of present and prediction of future discrete states of a complex, real-time engineering system utilizes a combination of symbolic processing and numerical model-based reasoning. One of the biggest weaknesses of a purely symbolic approach is that it enables prediction of only future discrete states while missing all unmodeled states or leading to incorrect identification of an unmodeled state as a modeled one. A purely numerical approach is based on a combination of statistical methods and mathematical models of the applicable physics and necessitates development of a complete model to the level of fidelity required for prediction. In addition, a purely numerical approach does not afford the ability to qualify its results without some form of symbolic processing. The present software implements numerical algorithms to detect unmodeled events and symbolic algorithms to predict expected behavior, correlate the expected behavior with the unmodeled events, and interpret the results in order to predict future discrete states. The approach embodied in this software differs from that of the BEAM methodology (aspects of which have been discussed in several prior NASA Tech Briefs articles), which provides for prediction of future measurements in the continuous-data domain.

  6. A New Non-LTE Model based on Super Configurations

    Science.gov (United States)

    Bar-Shalom, A.; Klapisch, M.

    1996-11-01

    Non-LTE effects are vital for the simulation of radiation in hot plasmas involving even medium Z materials. However, the exceedingly large number of atomic energy levels forbids using a detailed collisional radiative model on-line in the hydrodynamic simulations. For this purpose, greatly simplified models are required. We recently implemented Busquet's model (M. Busquet, Phys. Fluids B, 5, 4191 (1993)) in NRL's RAD2D Hydro code in conservative form (M. Klapisch et al., Bull. Am. Phys. Soc., 40, 1806 (1995), and poster at this meeting). This model is quick and the results make sense, but in the absence of precisely defined experiments, it is difficult to assess its accuracy. We present here a new collisional radiative model based on superconfigurations (A. Bar-Shalom, J. Oreg, J. F. Seely, U. Feldman, C. M. Brown, B. A. Hammel, R. W. Lee and C. A. Back, Phys. Rev. E, 52, 6686 (1995)), intended to be a benchmark for approximate models used in hydro-codes. It uses accurate rates from the HULLAC Code. Results for various elements will be presented and compared with RADIOM.

  7. An innovation wall model based on interlayer ventilation

    International Nuclear Information System (INIS)

    Feng Jinmei; Lian Zhiwei; Hou Zhijian

    2008-01-01

    The thermal characteristics of the external wall are important to the energy consumption of the air conditioning system. Great attention should also be paid to the energy loss of the air exhaust. An innovation wall model based on interlayer ventilation is presented in this paper. The interlayer ventilation wall combines the wall and the air exhaust of heating, ventilating and air conditioning (HVAC). The results of the experiment show that the energy loss of the exhaust air can be fully recovered by the interlayer ventilation wall. The cooling load can be reduced greatly because the temperature difference between the internal surface of the interlayer ventilation wall and the indoor air is very small. Clearly, the small temperature difference can enhance thermal comfort. In order to popularize the interlayer ventilation wall, a technical and economic analysis is presented in this paper. Based on the buildings in the Shanghai area and a standard air conditioning system, a payback period of 4 years for interlayer ventilation wall implementation was found according to the analysis

  8. Books average previous decade of economic misery.

    Science.gov (United States)

    Bentley, R Alexander; Acerbi, Alberto; Ormerod, Paul; Lampos, Vasileios

    2014-01-01

    For the 20th century since the Depression, we find a strong correlation between a 'literary misery index' derived from English language books and a moving average of the previous decade of the annual U.S. economic misery index, which is the sum of inflation and unemployment rates. We find a peak in the goodness of fit at 11 years for the moving average. The fit between the two misery indices holds when using different techniques to measure the literary misery index, and this fit is significantly better than other possible correlations with different emotion indices. To check the robustness of the results, we also analysed books written in German language and obtained very similar correlations with the German economic misery index. The results suggest that millions of books published every year average the authors' shared economic experiences over the past decade.

  9. Books Average Previous Decade of Economic Misery

    Science.gov (United States)

    Bentley, R. Alexander; Acerbi, Alberto; Ormerod, Paul; Lampos, Vasileios

    2014-01-01

    For the 20th century since the Depression, we find a strong correlation between a ‘literary misery index’ derived from English language books and a moving average of the previous decade of the annual U.S. economic misery index, which is the sum of inflation and unemployment rates. We find a peak in the goodness of fit at 11 years for the moving average. The fit between the two misery indices holds when using different techniques to measure the literary misery index, and this fit is significantly better than other possible correlations with different emotion indices. To check the robustness of the results, we also analysed books written in German language and obtained very similar correlations with the German economic misery index. The results suggest that millions of books published every year average the authors' shared economic experiences over the past decade. PMID:24416159

  10. Mars 2020 Model Based Systems Engineering Pilot

    Science.gov (United States)

    Dukes, Alexandra Marie

    2017-01-01

    The pilot study is led by the Integration Engineering group in NASA's Launch Services Program (LSP). The Integration Engineering (IE) group is responsible for managing the interfaces between the spacecraft and launch vehicle. This pilot investigates the utility of Model-Based Systems Engineering (MBSE) with respect to managing and verifying interface requirements. The main objectives of the pilot are to model several key aspects of the Mars 2020 integrated operations and interface requirements based on the design and verification artifacts from Mars Science Laboratory (MSL) and to demonstrate how MBSE could be used by LSP to gain further insight on the interface between the spacecraft and launch vehicle as well as to enhance how LSP manages the launch service. The method used to accomplish this pilot started with familiarization with SysML, MagicDraw, and the Mars 2020 and MSL systems through books, tutorials, and NASA documentation. MSL was chosen as the focus of the model since its processes and verifications translate easily to the Mars 2020 mission. The study was further focused by modeling specialized systems and processes within MSL in order to demonstrate the utility of MBSE for the rest of the mission. The systems chosen were the In-Flight Disconnect (IFD) system and the Mass Properties process. The IFD was chosen as a system of focus since it is an interface between the spacecraft and launch vehicle which can demonstrate the usefulness of MBSE from a system perspective. The Mass Properties process was chosen as a process of focus since the verifications for mass properties occur throughout the lifecycle and can demonstrate the usefulness of MBSE from a multi-discipline perspective. Several iterations of both perspectives have been modeled and evaluated. While the pilot study will continue for another 2 weeks, pros and cons of using MBSE for LSP IE have been identified. A pro of using MBSE includes an integrated view of the disciplines, requirements, and

  11. Response to health insurance by previously uninsured rural children.

    Science.gov (United States)

    Tilford, J M; Robbins, J M; Shema, S J; Farmer, F L

    1999-08-01

    To examine the healthcare utilization and costs of previously uninsured rural children. Four years of claims data from a school-based health insurance program located in the Mississippi Delta. All children who were not Medicaid-eligible or were uninsured were eligible for limited benefits under the program. The 1987 National Medical Expenditure Survey (NMES) was used to compare utilization of services. The study represents a natural experiment in the provision of insurance benefits to a previously uninsured population. Premiums for the claims cost were set with little or no information on expected use of services. Claims from the insurer were used to form a panel data set. Mixed model logistic and linear regressions were estimated to determine the response to insurance for several categories of health services. The use of services increased over time and approached the level of utilization in the NMES. Conditional medical expenditures also increased over time. Actuarial estimates of claims cost greatly exceeded actual claims cost. The provision of a limited medical, dental, and optical benefit package cost approximately $20-$24 per member per month in claims paid. An important uncertainty in providing health insurance to previously uninsured populations is whether a pent-up demand exists for health services. Evidence of a pent-up demand for medical services was not supported in this study of rural school-age children. States considering partnerships with private insurers to implement the State Children's Health Insurance Program could lower premium costs by assembling basic data on previously uninsured children.

  12. Model-based accelerator controls: What, why and how

    International Nuclear Information System (INIS)

    Sidhu, S.S.

    1987-01-01

    Model-based control is defined as a gamut of techniques whose aim is to improve the reliability of an accelerator and enhance the capabilities of the operator, and therefore of the whole control system. The aim of model-based control is seen as gradually moving the function of model-reference from the operator to the computer. The role of the operator in accelerator control and the need for and application of model-based control are briefly summarized

  13. Previous Experience a Model of Practice UNAE

    Directory of Open Access Journals (Sweden)

    Ormary Barberi Ruiz

    2017-02-01

    Full Text Available The statements presented in this article represent a preliminary version of the proposed model of pre-professional practices (PPP) of the National University of Education (UNAE) of Ecuador. An urgent institutional necessity is revealed in the descriptive analyses conducted from technical-administrative support (reports, interviews, testimonials), from the pedagogical foundations of UNAE (curricular directionality, transverse axes in practice, career plan, approach and diagnostic examination) as the subject nature of the pre-professional practice, and from the demands of the socio-educational contexts where the practices have been emerging, in order to resize them. Relating these elements allowed conceiving the modeling of the processes of the pre-professional practices for the development of professional skills of future teachers through four components: contextual-projective, implementation (tutoring), accompaniment (teaching couple) and monitoring (meetings at the beginning, during and at the end of practice). The initial training of teachers is inherent to teaching (academic and professional training), research and links with the community; these are fundamental pillars of Ecuadorian higher education.

  14. Model-Based Photoacoustic Image Reconstruction using Compressed Sensing and Smoothed L0 Norm

    OpenAIRE

    Mozaffarzadeh, Moein; Mahloojifar, Ali; Nasiriavanaki, Mohammadreza; Orooji, Mahdi

    2018-01-01

    Photoacoustic imaging (PAI) is a novel medical imaging modality that uses the advantages of the spatial resolution of ultrasound imaging and the high contrast of pure optical imaging. Analytical algorithms are usually employed to reconstruct the photoacoustic (PA) images as a result of their simple implementation. However, they provide images of low accuracy. Model-based (MB) algorithms are used to improve the image quality and accuracy while a large number of transducers and data acquisition a...

  15. Model-based fault diagnosis techniques design schemes, algorithms, and tools

    CERN Document Server

    Ding, Steven

    2008-01-01

    The objective of this book is to introduce basic model-based FDI schemes, advanced analysis and design algorithms, and the needed mathematical and control theory tools at a level for graduate students and researchers as well as for engineers. This is a textbook with extensive examples and references. Most methods are given in the form of an algorithm that enables a direct implementation in a programme. Comparisons among different methods are included when possible.

  16. Research on user behavior authentication model based on stochastic Petri nets

    Science.gov (United States)

    Zhang, Chengyuan; Xu, Haishui

    2017-08-01

    A behavioural authentication model based on stochastic Petri nets is proposed to meet the randomness, uncertainty and concurrency characteristics of user behaviour. The places, transitions, arcs and tokens of the stochastic model are used to describe the characteristics of a variety of authentication and game relationships, so as to effectively implement a graphical analysis method for the user behaviour authentication model; the corresponding proofs verify that the model is valuable.

  17. Model-based normalization for iterative 3D PET image

    International Nuclear Information System (INIS)

    Bai, B.; Li, Q.; Asma, E.; Leahy, R.M.; Holdsworth, C.H.; Chatziioannou, A.; Tai, Y.C.

    2002-01-01

    We describe a method for normalization in 3D PET for use with maximum a posteriori (MAP) or other iterative model-based image reconstruction methods. This approach is an extension of previous factored normalization methods in which we include separate factors for detector sensitivity, geometric response, block effects and deadtime. Since our MAP reconstruction approach already models some of the geometric factors in the forward projection, the normalization factors must be modified to account only for effects not already included in the model. We describe a maximum likelihood approach to joint estimation of the count-rate independent normalization factors, which we apply to data from a uniform cylindrical source. We then compute block-wise and block-profile deadtime correction factors using singles and coincidence data, respectively, from a multiframe cylindrical source. We have applied this method for reconstruction of data from the Concorde microPET P4 scanner. Quantitative evaluation of this method using well-counter measurements of activity in a multicompartment phantom compares favourably with normalization based directly on cylindrical source measurements. (author)
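
    A minimal sketch of the factored-normalization idea: per-line-of-response factors are built as a product of crystal efficiencies and a geometric term, with the efficiencies estimated from uniform-cylinder data by a simple fixed-point (fan-sum style) iteration. The toy geometry and this particular update rule are illustrative assumptions, not the maximum-likelihood estimator derived in the paper.

        import numpy as np

        rng = np.random.default_rng(0)
        n_det = 16
        eps_true = 1.0 + 0.2 * rng.standard_normal(n_det)      # true crystal efficiencies (unknown in practice)

        # For a uniform source, take the expected true coincidences per detector pair as constant,
        # so the measured element for pair (i, j) is proportional to eps_i * eps_j (plus Poisson noise).
        expected = np.ones((n_det, n_det))
        np.fill_diagonal(expected, 0.0)
        measured = rng.poisson(1000 * np.outer(eps_true, eps_true) * expected)

        eps = np.ones(n_det)                                     # initial efficiency estimates
        for _ in range(50):                                      # fixed-point (fan-sum style) iteration
            eps = measured.sum(axis=1) / (1000 * (expected * eps).sum(axis=1))

        norm_factors = np.outer(eps, eps) * expected             # per-LOR normalization factors (up to a global scale)
        print(np.corrcoef(eps, eps_true)[0, 1].round(4))         # estimated efficiencies track the true ones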

  18. Underestimation of Severity of Previous Whiplash Injuries

    Science.gov (United States)

    Naqui, SZH; Lovell, SJ; Lovell, ME

    2008-01-01

    INTRODUCTION We noted a report that more significant symptoms may be expressed after second whiplash injuries by a suggested cumulative effect, including degeneration. We wondered if patients were underestimating the severity of their earlier injury. PATIENTS AND METHODS We studied recent medicolegal reports, to assess subjects with a second whiplash injury. They had been asked whether their earlier injury was worse, the same or lesser in severity. RESULTS From the study cohort, 101 patients (87%) felt that they had fully recovered from their first injury and 15 (13%) had not. Seventy-six subjects considered their first injury of lesser severity, 24 worse and 16 the same. Of the 24 that felt the violence of their first accident was worse, only 8 had worse symptoms, and 16 felt their symptoms were mainly the same or less than their symptoms from their second injury. Statistical analysis of the data revealed that the proportion of those claiming a difference who said the previous injury was lesser was 76% (95% CI 66–84%). The observed proportion with a lesser injury was considerably higher than the 50% anticipated. CONCLUSIONS We feel that subjects may underestimate the severity of an earlier injury and associated symptoms. Reasons for this may include secondary gain rather than any proposed cumulative effect. PMID:18201501

  19. [Electronic cigarettes - effects on health. Previous reports].

    Science.gov (United States)

    Napierała, Marta; Kulza, Maksymilian; Wachowiak, Anna; Jabłecka, Katarzyna; Florek, Ewa

    2014-01-01

    Electronic cigarettes (e-cigarettes) have recently become very popular on the market of tobacco products. These products are considered to be potentially less harmful compared to traditional tobacco products. However, current reports indicate that the producers' statements regarding the composition of the e-liquids are not always sufficient, and consumers often do not have reliable information on the quality of the product they use. This paper contains a review of previous reports on the composition of e-cigarettes and their impact on health. Most of the observed health effects were related to symptoms of the respiratory tract, mouth, throat, neurological complications and sensory organs. Particularly hazardous effects of the e-cigarettes were: pneumonia, congestive heart failure, confusion, convulsions, hypotension, aspiration pneumonia, second-degree burns to the face, blindness, chest pain and rapid heartbeat. In the literature there is no information relating to passive exposure to the aerosols released during e-cigarette smoking. Furthermore, information regarding the long-term use of these products is also not available.

  20. Enabling Accessibility Through Model-Based User Interface Development.

    Science.gov (United States)

    Ziegler, Daniel; Peissner, Matthias

    2017-01-01

    Adaptive user interfaces (AUIs) can increase the accessibility of interactive systems. They provide personalized display and interaction modes to fit individual user needs. Most AUI approaches rely on model-based development, which is considered relatively demanding. This paper explores strategies to make model-based development more attractive for mainstream developers.

  1. Model-Based Software Testing for Object-Oriented Software

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Model-based testing is one of the best solutions for testing object-oriented software. It has a better test coverage than other testing styles. Model-based testing takes into consideration behavioural aspects of a class, which are usually unchecked in other testing methods. An increase in the complexity of software has forced the software industry…

  2. Towards automatic model based controller design for reconfigurable plants

    DEFF Research Database (Denmark)

    Michelsen, Axel Gottlieb; Stoustrup, Jakob; Izadi-Zamanabadi, Roozbeh

    2008-01-01

    This paper introduces model-based Plug and Play Process Control, a novel concept for process control, which allows a model-based control system to be reconfigured when a sensor or an actuator is plugged into a controlled process. The work reported in this paper focuses on composing a monolithic m...

  3. Model based design introduction: modeling game controllers to microprocessor architectures

    Science.gov (United States)

    Jungwirth, Patrick; Badawy, Abdel-Hameed

    2017-04-01

    We present an introduction to model based design. Model based design is a visual representation, generally a block diagram, to model and incrementally develop a complex system. Model based design is a commonly used design methodology for digital signal processing, control systems, and embedded systems. Model based design's philosophy is: to solve a problem - a step at a time. The approach can be compared to a series of steps to converge to a solution. A block diagram simulation tool allows a design to be simulated with real world measurement data. For example, if an analog control system is being upgraded to a digital control system, the analog sensor input signals can be recorded. The digital control algorithm can be simulated with the real world sensor data. The output from the simulated digital control system can then be compared to the old analog based control system. Model based design can be compared to Agile software development. The Agile software development goal is to develop working software in incremental steps. Progress is measured in completed and tested code units. Progress is measured in model based design by completed and tested blocks. We present a concept for a video game controller and then use model based design to iterate the design towards a working system. We will also describe a model based design effort to develop an OS Friendly Microprocessor Architecture based on the RISC-V.

  4. Learning of Chemical Equilibrium through Modelling-Based Teaching

    Science.gov (United States)

    Maia, Poliana Flavia; Justi, Rosaria

    2009-01-01

    This paper presents and discusses students' learning process of chemical equilibrium from a modelling-based approach developed from the use of the "Model of Modelling" diagram. The investigation was conducted in a regular classroom (students 14-15 years old) and aimed at discussing how modelling-based teaching can contribute to students…

  5. Mechanics and model-based control of advanced engineering systems

    CERN Document Server

    Irschik, Hans; Krommer, Michael

    2014-01-01

    Mechanics and Model-Based Control of Advanced Engineering Systems collects 32 contributions presented at the International Workshop on Advanced Dynamics and Model Based Control of Structures and Machines, which took place in St. Petersburg, Russia in July 2012. The workshop continued a series of international workshops, which started with a Japan-Austria Joint Workshop on Mechanics and Model Based Control of Smart Materials and Structures and a Russia-Austria Joint Workshop on Advanced Dynamics and Model Based Control of Structures and Machines. In the present volume, 10 full-length papers based on presentations from Russia, 9 from Austria, 8 from Japan, 3 from Italy, one from Germany and one from Taiwan are included, which represent the state of the art in the field of mechanics and model based control, with particular emphasis on the application of advanced structures and machines.

  6. A human motion model based on maps for navigation systems

    Directory of Open Access Journals (Sweden)

    Kaiser Susanna

    2011-01-01

    Full Text Available Foot-mounted indoor positioning systems work remarkably well when knowledge of floor-plans is additionally used in the localization algorithm. Walls and other structures naturally restrict the motion of pedestrians. No pedestrian can walk through walls or jump from one floor to another when considering a building with different floor levels. By incorporating known floor-plans in sequential Bayesian estimation processes such as particle filters (PFs), long-term error stability can be achieved as long as the map is sufficiently accurate and the environment sufficiently constrains the pedestrians' motion. In this article, a new motion model based on maps and floor-plans is introduced that is capable of weighting the possible headings of the pedestrian as a function of the local environment. The motion model is derived from a diffusion algorithm that makes use of the principle of a source effusing gas and is used in the weighting step of a PF implementation. The diffusion algorithm is capable of including floor-plans as well as maps with areas of different degrees of accessibility. The motion model represents the probability density function of possible headings that are restricted by maps and floor-plans more effectively than a simple binary weighting of particles (i.e., eliminating those that crossed walls and keeping the rest). We will show that the motion model helps to obtain better performance in critical navigation scenarios where two or more modes may be competing for some of the time (multi-modal scenarios).
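
    A highly simplified sketch of the weighting idea (the occupancy grid, particle structure, and penalty value below are invented; the article's diffusion-based weighting is considerably richer): particle headings that would step into inaccessible map cells receive a low weight, while headings into open space keep theirs.

        import math
        import random

        # Invented toy occupancy grid: 1 = walkable, 0 = wall.
        GRID = [
            [1, 1, 1, 0],
            [1, 0, 1, 1],
            [1, 1, 1, 1],
        ]

        def accessible(x, y):
            xi, yi = int(x), int(y)
            return 0 <= yi < len(GRID) and 0 <= xi < len(GRID[0]) and GRID[yi][xi] == 1

        def heading_weight(x, y, heading, step=0.5):
            """Down-weight headings whose next step would leave walkable space."""
            nx, ny = x + step * math.cos(heading), y + step * math.sin(heading)
            return 1.0 if accessible(nx, ny) else 0.05   # assumed penalty for blocked headings

        # Particles: (x, y, heading, weight), all started in a walkable cell.
        particles = [(0.5, 0.5, random.uniform(0.0, 2.0 * math.pi), 1.0) for _ in range(100)]
        weighted = [(x, y, h, w * heading_weight(x, y, h)) for x, y, h, w in particles]
        total = sum(w for *_, w in weighted)
        particles = [(x, y, h, w / total) for x, y, h, w in weighted]   # normalize weights
        print("effective sample size:", 1.0 / sum(w * w for *_, w in particles))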

  7. Model-Based GN and C Simulation and Flight Software Development for Orion Missions beyond LEO

    Science.gov (United States)

    Odegard, Ryan; Milenkovic, Zoran; Henry, Joel; Buttacoli, Michael

    2014-01-01

    For Orion missions beyond low Earth orbit (LEO), the Guidance, Navigation, and Control (GN&C) system is being developed using a model-based approach for simulation and flight software. Lessons learned from the development of GN&C algorithms and flight software for the Orion Exploration Flight Test One (EFT-1) vehicle have been applied to the development of further capabilities for Orion GN&C beyond EFT-1. Continuing the use of a Model-Based Development (MBD) approach with the Matlab®/Simulink® tool suite, the process for GN&C development and analysis has been largely improved. Furthermore, a model-based simulation environment in Simulink, rather than an external C-based simulation, greatly eases the process for development of flight algorithms. The benefits seen by employing lessons learned from EFT-1 are described, as well as the approach for implementing additional MBD techniques. Also detailed are the key enablers for improvements to the MBD process, including enhanced configuration management techniques for model-based software systems, automated code and artifact generation, and automated testing and integration.

  8. Model-Based Reasoning in Humans Becomes Automatic with Training.

    Directory of Open Access Journals (Sweden)

    Marcos Economides

    2015-09-01

    Full Text Available Model-based and model-free reinforcement learning (RL have been suggested as algorithmic realizations of goal-directed and habitual action strategies. Model-based RL is more flexible than model-free but requires sophisticated calculations using a learnt model of the world. This has led model-based RL to be identified with slow, deliberative processing, and model-free RL with fast, automatic processing. In support of this distinction, it has recently been shown that model-based reasoning is impaired by placing subjects under cognitive load--a hallmark of non-automaticity. Here, using the same task, we show that cognitive load does not impair model-based reasoning if subjects receive prior training on the task. This finding is replicated across two studies and a variety of analysis methods. Thus, task familiarity permits use of model-based reasoning in parallel with other cognitive demands. The ability to deploy model-based reasoning in an automatic, parallelizable fashion has widespread theoretical implications, particularly for the learning and execution of complex behaviors. It also suggests a range of important failure modes in psychiatric disorders.

  9. HEART TRANSPLANTATION IN PATIENTS WITH PREVIOUS OPEN HEART SURGERY

    Directory of Open Access Journals (Sweden)

    R. Sh. Saitgareev

    2016-01-01

    Full Text Available Heart transplantation (HTx) to date remains the most effective and radical method of treatment of patients with end-stage heart failure. The deficit of donor hearts is forcing increasing resort to the use of different long-term mechanical circulatory support systems, including as a «bridge» to the follow-up HTx. According to the ISHLT Registry, the number of recipients who had previously undergone cardiopulmonary bypass surgery increased from 40% in the period from 2004 to 2008 to 49.6% for the period from 2009 to 2015. HTx performed in such repeated patients, on the one hand, involves considerable technical difficulties and high risks; on the other hand, there is often no alternative medical intervention to HTx, and unless dictated by absolute contraindications the denial of the surgery is equivalent to 100% mortality. This review summarizes the results of a number of published studies aimed at understanding the immediate and late results of HTx in patients who previously underwent open heart surgery. The effect of resternotomy during HTx, the specific features associated with its implementation in recipients previously operated on the open heart, and its effects on immediate and long-term survival were considered in this review. Results of studies analyzing the risk factors for perioperative complications in repeated recipients were also presented. Separately, HTx risks after implantation of prolonged mechanical circulatory support systems were examined. The literature does not allow clearly defining the impact of earlier open heart surgery on the course of the perioperative period and on the prognosis of survival in recipients who underwent HTx. On the other hand, provided HTx and the perioperative period proceed regularly, the risks in this clinical situation are justified, and the long-term prognosis of recipients with previous open heart surgery is comparable to that of patients who underwent primary HTx. Studies

  10. Implementing a Multiple Criteria Model Base in Co-Op with a Graphical User Interface Generator

    Science.gov (United States)

    1993-09-23

    [Table-of-contents fragment] Only a contents listing survives in this record, covering the PROMETHEE methods: the basic algorithm of PROMETHEE I and PROMETHEE II, the use of the algorithm in PROMETHEE I and in PROMETHEE II, the algorithm of PROMETHEE V, and screen designs of PROMETHEE I and PROMETHEE II.

  11. A Model-based Avionic Prognostic Reasoner (MAPR)

    Data.gov (United States)

    National Aeronautics and Space Administration — The Model-based Avionic Prognostic Reasoner (MAPR) presented in this paper is an innovative solution for non-intrusively monitoring the state of health (SoH) and...

  12. A Model-Based Prognostics Approach Applied to Pneumatic Valves

    Data.gov (United States)

    National Aeronautics and Space Administration — Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain...

  13. A Model-based Prognostics Approach Applied to Pneumatic Valves

    Data.gov (United States)

    National Aeronautics and Space Administration — Within the area of systems health management, the task of prognostics centers on predicting when components will fail. Model-based prognostics exploits domain...

  14. Model-based Prognostics with Concurrent Damage Progression Processes

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based prognostics approaches rely on physics-based models that describe the behavior of systems and their components. These models must account for the several...

  15. Model-based Prognostics with Fixed-lag Particle Filters

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based prognostics exploits domain knowl- edge of the system, its components, and how they fail by casting the underlying physical phenom- ena in a...

  16. Variability in Dopamine Genes Dissociates Model-Based and Model-Free Reinforcement Learning.

    Science.gov (United States)

    Doll, Bradley B; Bath, Kevin G; Daw, Nathaniel D; Frank, Michael J

    2016-01-27

    Considerable evidence suggests that multiple learning systems can drive behavior. Choice can proceed reflexively from previous actions and their associated outcomes, as captured by "model-free" learning algorithms, or flexibly from prospective consideration of outcomes that might occur, as captured by "model-based" learning algorithms. However, differential contributions of dopamine to these systems are poorly understood. Dopamine is widely thought to support model-free learning by modulating plasticity in striatum. Model-based learning may also be affected by these striatal effects, or by other dopaminergic effects elsewhere, notably on prefrontal working memory function. Indeed, prominent demonstrations linking striatal dopamine to putatively model-free learning did not rule out model-based effects, whereas other studies have reported dopaminergic modulation of verifiably model-based learning, but without distinguishing a prefrontal versus striatal locus. To clarify the relationships between dopamine, neural systems, and learning strategies, we combine a genetic association approach in humans with two well-studied reinforcement learning tasks: one isolating model-based from model-free behavior and the other sensitive to key aspects of striatal plasticity. Prefrontal function was indexed by a polymorphism in the COMT gene, differences of which reflect dopamine levels in the prefrontal cortex. This polymorphism has been associated with differences in prefrontal activity and working memory. Striatal function was indexed by a gene coding for DARPP-32, which is densely expressed in the striatum where it is necessary for synaptic plasticity. We found evidence for our hypothesis that variations in prefrontal dopamine relate to model-based learning, whereas variations in striatal dopamine function relate to model-free learning. Decisions can stem reflexively from their previously associated outcomes or flexibly from deliberative consideration of potential choice outcomes

  17. Combustion engine diagnosis model-based condition monitoring of gasoline and diesel engines and their components

    CERN Document Server

    Isermann, Rolf

    2017-01-01

    This book offers first a short introduction to advanced supervision, fault detection and diagnosis methods. It then describes model-based methods of fault detection and diagnosis for the main components of gasoline and diesel engines, such as the intake system, fuel supply, fuel injection, combustion process, turbocharger, exhaust system and exhaust gas aftertreatment. Additionally, model-based fault diagnosis of electrical motors, electric, pneumatic and hydraulic actuators and fault-tolerant systems is treated. In general series production sensors are used. It includes abundant experimental results showing the detection and diagnosis quality of implemented faults. Written for automotive engineers in practice, it is also of interest to graduate students of mechanical and electrical engineering and computer science. The Content Introduction.- I SUPERVISION, FAULT DETECTION AND DIAGNOSIS METHODS.- Supervision, Fault-Detection and Fault-Diagnosis Methods - a short Introduction.- II DIAGNOSIS OF INTERNAL COMBUST...

  18. Fuzzy model-based control of a nuclear reactor

    International Nuclear Information System (INIS)

    Van Den Durpel, L.; Ruan, D.

    1994-01-01

    The fuzzy model-based control of a nuclear power reactor is an emerging research topic worldwide. SCK-CEN is addressing this research at a preliminary stage, covering two aspects, namely fuzzy control and fuzzy modelling. The aim is to combine both methodologies, in contrast to conventional model-based PID control techniques, and to establish the advantages of including fuzzy parameters such as safety and operator feedback. This paper summarizes the general scheme of this new research project

  19. System Dynamics as Model-Based Theory Building

    OpenAIRE

    Schwaninger, Markus; Grösser, Stefan N.

    2008-01-01

    This paper introduces model-based theory building as a feature of system dynamics (SD) with large potential. It presents a systemic approach to actualizing that potential, thereby opening up a new perspective on theory building in the social sciences. The question addressed is whether and how SD enables the construction of high-quality theories. This contribution is based on field-experiment-type projects which have focused on model-based theory building, specifically the construction of a mi...

  20. A model-based approach to estimating forest area

    Science.gov (United States)

    Ronald E. McRoberts

    2006-01-01

    A logistic regression model based on forest inventory plot data and transformations of Landsat Thematic Mapper satellite imagery was used to predict the probability of forest for 15 study areas in Indiana, USA, and 15 in Minnesota, USA. Within each study area, model-based estimates of forest area were obtained for circular areas with radii of 5 km, 10 km, and 15 km and...
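
    A toy sketch of the kind of model described (the coefficients, predictors, and pixel values are invented for illustration; the study fits its logistic model to forest inventory plots and Landsat TM transformations):

        import math

        # Assumed fitted coefficients for an intercept and two spectral predictors.
        BETA = {"intercept": -1.2, "ndvi": 4.1, "band5": -0.8}

        def forest_probability(ndvi, band5):
            """Logistic model: P(forest) = 1 / (1 + exp(-x'beta))."""
            z = BETA["intercept"] + BETA["ndvi"] * ndvi + BETA["band5"] * band5
            return 1.0 / (1.0 + math.exp(-z))

        # Model-based area estimate: sum of per-pixel probabilities times pixel area.
        pixels = [(0.71, 0.22), (0.35, 0.40), (0.82, 0.18), (0.10, 0.55)]   # toy (ndvi, band5)
        pixel_area_ha = 0.09   # one 30 m x 30 m Landsat pixel
        estimate = sum(forest_probability(n, b) for n, b in pixels) * pixel_area_ha
        print(f"estimated forest area: {estimate:.3f} ha")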

  1. Model-based Sensor Data Acquisition and Management

    OpenAIRE

    Aggarwal, Charu C.; Sathe, Saket; Papaioannou, Thanasis G.; Jeung, Ho Young; Aberer, Karl

    2012-01-01

    In recent years, due to the proliferation of sensor networks, there has been a genuine need of researching techniques for sensor data acquisition and management. To this end, a large number of techniques have emerged that advocate model-based sensor data acquisition and management. These techniques use mathematical models for performing various, day-to-day tasks involved in managing sensor data. In this chapter, we survey the state-of-the-art techniques for model-based sensor data acquisition...

  2. The Comparison of Distributed P2P Trust Models Based on Quantitative Parameters in the File Downloading Scenarios

    Directory of Open Access Journals (Sweden)

    Jingpei Wang

    2016-01-01

    Full Text Available Various P2P trust models have been proposed recently; it is therefore necessary to develop an effective method to evaluate these trust models, both to resolve their commonalities (guiding newly generated trust models in theory) and their individuality (assisting a decision maker in choosing an optimal trust model to implement in a specific context). A new method for analyzing and comparing P2P trust models, based on hierarchical parameter quantization in file downloading scenarios, is proposed in this paper. Several parameters are extracted from the functional attributes and quality features of the trust relationship, as well as from requirements of the specific network context and the evaluators. Several distributed P2P trust models are analyzed quantitatively, with the extracted parameters arranged into a hierarchical model. A fuzzy inference method is applied to this hierarchy to fuse the evaluated values of the candidate trust models, and the relatively optimal model is then selected based on the sorted overall quantitative values. Finally, analyses and simulation are performed. The results show that the proposed method is reasonable and effective compared with previous algorithms.
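
    A much-simplified sketch of the evaluation idea (parameter names, weights, and scores are placeholders, and a plain weighted sum stands in for the paper's fuzzy inference step):

        # Candidate trust models scored on extracted parameters (all values are toy numbers).
        SCORES = {
            "trust_model_A": {"accuracy": 0.8, "overhead": 0.6, "robustness": 0.7},
            "trust_model_B": {"accuracy": 0.7, "overhead": 0.8, "robustness": 0.6},
        }
        # Hierarchy collapsed to one level of weights reflecting the context requirements.
        WEIGHTS = {"accuracy": 0.5, "overhead": 0.2, "robustness": 0.3}

        def overall(scores):
            """Fuse per-parameter scores into one quantitative value (weighted sum here)."""
            return sum(WEIGHTS[p] * v for p, v in scores.items())

        ranked = sorted(SCORES.items(), key=lambda kv: overall(kv[1]), reverse=True)
        for name, s in ranked:
            print(f"{name}: {overall(s):.2f}")
        print("selected trust model:", ranked[0][0])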

  3. Discrete event model-based simulation for train movement on a single-line railway

    International Nuclear Information System (INIS)

    Xu Xiao-Ming; Li Ke-Ping; Yang Li-Xing

    2014-01-01

    The aim of this paper is to present a discrete event model-based approach to simulate train movement with energy-saving factors taken into consideration. We conduct extensive case studies to show the dynamic characteristics of the traffic flow and demonstrate the effectiveness of the proposed approach. The simulation results indicate that the proposed discrete event model-based simulation approach is suitable for characterizing the movements of a group of trains on a single railway line with fewer iterations and less CPU time. Additionally, some other qualitative and quantitative characteristics are investigated. In particular, because of the cumulative influence from the previous trains, the following trains have to be accelerated or braked frequently to control the headway distance, leading to more energy consumption. (general)
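
    As a rough illustration of a discrete event update with headway control (all parameters and the two-train setup are invented; the paper's model and energy accounting are more detailed):

        # Toy discrete event update for two trains on a single line (all parameters invented).
        DT, SAFE_HEADWAY, LINE_SPEED = 1.0, 500.0, 25.0   # s, m, m/s

        def step(lead, follower):
            """Advance both trains; the follower brakes or accelerates to hold the headway."""
            lead["x"] += lead["v"] * DT
            headway = lead["x"] - follower["x"]
            if headway < SAFE_HEADWAY:
                follower["v"] = max(follower["v"] - 1.0, 0.0)          # brake
            else:
                follower["v"] = min(follower["v"] + 0.5, LINE_SPEED)   # accelerate
            follower["x"] += follower["v"] * DT
            return headway

        lead, follower = {"x": 600.0, "v": 20.0}, {"x": 0.0, "v": 25.0}
        headway = None
        for _ in range(60):
            headway = step(lead, follower)
        print(f"headway after 60 s: {headway:.1f} m, follower speed: {follower['v']:.1f} m/s")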

  4. Low-lying qq(qq)-bar states in a relativistic model based on the Bethe-Salpeter equation

    International Nuclear Information System (INIS)

    Ram, B.; Kriss, V.

    1985-01-01

    Low-lying qq(qq)-bar states are analysed in a previously given relativistic model based on the Bethe-Salpeter equation. The analysis yields no M-diquonia, P-mesonia, or meson molecules, but it does yield T-diquonia

  5. A Model Based Approach to Increase the Part Accuracy in Robot Based Incremental Sheet Metal Forming

    International Nuclear Information System (INIS)

    Meier, Horst; Laurischkat, Roman; Zhu Junhong

    2011-01-01

    One main influence on the dimensional accuracy in robot based incremental sheet metal forming results from the compliance of the involved robot structures. Compared to conventional machine tools, the low stiffness of the robot's kinematics results in a significant deviation from the planned tool path and therefore in a shape of insufficient quality. To predict and compensate these deviations offline, a model based approach has been developed, consisting of a finite element approach to simulate the sheet forming and a multi body system modeling the compliant robot structure. This paper describes the implementation and experimental verification of the multi body system model and its included compensation method.

  6. A model based message passing approach for flexible and scalable home automation controllers

    Energy Technology Data Exchange (ETDEWEB)

    Bienhaus, D. [INNIAS GmbH und Co. KG, Frankenberg (Germany); David, K.; Klein, N.; Kroll, D. [ComTec Kassel Univ., SE Kassel Univ. (Germany); Heerdegen, F.; Jubeh, R.; Zuendorf, A. [Kassel Univ. (Germany). FG Software Engineering; Hofmann, J. [BSC Computer GmbH, Allendorf (Germany)

    2012-07-01

    There is a large variety of home automation systems that are largely proprietary systems from different vendors. In addition, the configuration and administration of home automation systems is frequently a very complex task, especially if more complex functionality is to be achieved. Therefore, an open model for home automation was developed that is especially designed for easy integration of various home automation systems. This solution also provides a simple modeling approach that is inspired by typical home automation components like switches, timers, etc. In addition, a model based technology to achieve rich functionality and usability was implemented. (orig.)

  7. Model-Based, Closed-Loop Control of PZT Creep for Cavity Ring-Down Spectroscopy.

    Science.gov (United States)

    McCartt, A D; Ognibene, T J; Bench, G; Turteltaub, K W

    2014-09-01

    Cavity ring-down spectrometers typically employ a PZT stack to modulate the cavity transmission spectrum. While PZTs ease instrument complexity and aid measurement sensitivity, PZT hysteresis hinders the implementation of cavity-length-stabilized, data-acquisition routines. Once the cavity length is stabilized, the cavity's free spectral range imparts extreme linearity and precision to the measured spectrum's wavelength axis. Methods such as frequency-stabilized cavity ring-down spectroscopy have successfully mitigated PZT hysteresis, but their complexity limits commercial applications. Described herein is a single-laser, model-based, closed-loop method for cavity length control.
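
    The abstract does not give the control law; purely as a generic illustration of model-based creep compensation, the sketch below feeds forward a commonly used logarithmic creep model and trims the residual error in closed loop (the creep coefficient and gain are assumptions, not the authors' values):

        import math

        GAMMA = 0.02   # assumed fractional creep per decade of time
        KP = 0.5       # assumed proportional correction gain

        def predicted_creep(t, t0=0.1):
            """Common logarithmic PZT creep model: drift grows with log10(t / t0)."""
            return GAMMA * math.log10(max(t, t0) / t0)

        def control_step(target_len, measured_len, t):
            """Feed forward the modeled creep, then trim the residual error in closed loop."""
            feedforward = -predicted_creep(t) * target_len
            feedback = KP * (target_len - measured_len)
            return feedforward + feedback   # correction applied to the PZT drive

        # Example: hold a cavity length of 1.0 (arbitrary units) at t = 10 s.
        print(f"correction at t = 10 s: {control_step(1.0, 1.003, 10.0):+.4f}")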

  8. Pilot Implementations

    DEFF Research Database (Denmark)

    Manikas, Maria Ie

    This PhD dissertation engages in the study of pilot (system) implementation. In the field of information systems, pilot implementations are commissioned as a way to learn from real use of a pilot system with real data, by real users during an information systems development (ISD) project and before ... by conducting a literature review. The concept of pilot implementation, although commonly used in practice, is rather disregarded in research. In the literature, pilot implementations are mainly treated as secondary to the learning outcomes and are presented as merely a means to acquire knowledge about a given ... objective. The prevalent understanding is that pilot implementations are an ISD technique that extends prototyping from the lab and into test during real use. Another perception is that pilot implementations are a project multiple of co-existing enactments of the pilot implementation. From this perspective ...

  9. Model based process-product design and analysis

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    This paper gives a perspective on modelling and the important role it has within product-process design and analysis. Different modelling issues related to development and application of systematic model-based solution approaches for product-process design is discussed and the need for a hybrid...... model-based framework is highlighted. This framework should be able to manage knowledge-data, models, and associated methods and tools integrated with design work-flows and data-flows for specific product-process design problems. In particular, the framework needs to manage models of different types......, forms and complexity, together with their associated parameters. An example of a model-based system for design of chemicals based formulated products is also given....

  10. Model Based Mission Assurance: Emerging Opportunities for Robotic Systems

    Science.gov (United States)

    Evans, John W.; DiVenti, Tony

    2016-01-01

    The emergence of Model Based Systems Engineering (MBSE) in a Model Based Engineering framework has created new opportunities to improve effectiveness and efficiencies across the assurance functions. The MBSE environment supports not only system architecture development, but also Systems Safety, Reliability and Risk Analysis concurrently in the same framework. Linking to detailed design will further improve assurance capabilities to support failure avoidance and mitigation in flight systems. This is also leading to new assurance functions, including model assurance and management of uncertainty in the modeling environment. Further, assurance cases, structured hierarchical arguments or models, are emerging as a basis for supporting a comprehensive viewpoint in which to support Model Based Mission Assurance (MBMA).

  11. Theory and experiments in model-based space system anomaly management

    Science.gov (United States)

    Kitts, Christopher Adam

    This research program consists of an experimental study of model-based reasoning methods for detecting, diagnosing and resolving anomalies that occur when operating a comprehensive space system. Using a first principles approach, several extensions were made to the existing field of model-based fault detection and diagnosis in order to develop a general theory of model-based anomaly management. Based on this theory, a suite of algorithms were developed and computationally implemented in order to detect, diagnose and identify resolutions for anomalous conditions occurring within an engineering system. The theory and software suite were experimentally verified and validated in the context of a simple but comprehensive, student-developed, end-to-end space system, which was developed specifically to support such demonstrations. This space system consisted of the Sapphire microsatellite which was launched in 2001, several geographically distributed and Internet-enabled communication ground stations, and a centralized mission control complex located in the Space Technology Center in the NASA Ames Research Park. Results of both ground-based and on-board experiments demonstrate the speed, accuracy, and value of the algorithms compared to human operators, and they highlight future improvements required to mature this technology.

  12. What is needed to eliminate new pediatric HIV infections: The contribution of model-based analyses

    Science.gov (United States)

    Doherty, Katie; Ciaranello, Andrea

    2013-01-01

    Purpose of Review Computer simulation models can identify key clinical, operational, and economic interventions that will be needed to achieve the elimination of new pediatric HIV infections. In this review, we summarize recent findings from model-based analyses of strategies for prevention of mother-to-child HIV transmission (MTCT). Recent Findings In order to achieve elimination of MTCT (eMTCT), model-based studies suggest that scale-up of services will be needed in several domains: uptake of services and retention in care (the PMTCT “cascade”), interventions to prevent HIV infections in women and reduce unintended pregnancies (the “four-pronged approach”), efforts to support medication adherence through long periods of pregnancy and breastfeeding, and strategies to make breastfeeding safer and/or shorter. Models also project the economic resources that will be needed to achieve these goals in the most efficient ways to allocate limited resources for eMTCT. Results suggest that currently recommended PMTCT regimens (WHO Option A, Option B, and Option B+) will be cost-effective in most settings. Summary Model-based results can guide future implementation science, by highlighting areas in which additional data are needed to make informed decisions and by outlining critical interventions that will be necessary in order to eliminate new pediatric HIV infections. PMID:23743788

  13. Designing the database for a reliability aware Model-Based System Engineering process

    International Nuclear Information System (INIS)

    Cressent, Robin; David, Pierre; Idasiak, Vincent; Kratz, Frederic

    2013-01-01

    This article outlines the need for a reliability database to implement model-based description of components failure modes and dysfunctional behaviors. We detail the requirements such a database should honor and describe our own solution: the Dysfunctional Behavior Database (DBD). Through the description of its meta-model, the benefits of integrating the DBD in the system design process is highlighted. The main advantages depicted are the possibility to manage feedback knowledge at various granularity and semantic levels and to ease drastically the interactions between system engineering activities and reliability studies. The compliance of the DBD with other reliability database such as FIDES is presented and illustrated. - Highlights: ► Model-Based System Engineering is more and more used in the industry. ► It results in a need for a reliability database able to deal with model-based description of dysfunctional behavior. ► The Dysfunctional Behavior Database aims to fulfill that need. ► It helps dealing with feedback management thanks to its structured meta-model. ► The DBD can profit from other reliability database such as FIDES.
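
    The abstract only outlines the DBD meta-model; a hypothetical record structure along those lines (all field and class names are assumptions) might look like:

        from dataclasses import dataclass, field

        @dataclass
        class FailureMode:
            name: str
            effect: str
            rate_per_hour: float   # e.g. taken from a handbook such as FIDES

        @dataclass
        class DysfunctionalBehavior:
            component: str
            failure_modes: list = field(default_factory=list)

            def total_failure_rate(self):
                return sum(fm.rate_per_hour for fm in self.failure_modes)

        valve = DysfunctionalBehavior("solenoid_valve", [
            FailureMode("stuck_closed", "loss of flow", 2e-6),
            FailureMode("external_leak", "fluid loss", 5e-7),
        ])
        print(f"total failure rate: {valve.total_failure_rate():.2e} /h")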

  14. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  15. Towards model-based testing of electronic funds transfer systems

    OpenAIRE

    Asaadi, H.R.; Khosravi, R.; Mousavi, M.R.; Noroozi, N.

    2010-01-01

    We report on our first experience with applying model-based testing techniques to an operational Electronic Funds Transfer (EFT) switch. The goal is to test the conformance of the EFT switch to the standard flows described by the ISO 8583 standard. To this end, we first make a formalization of the transaction flows specified in the ISO 8583 standard in terms of a Labeled Transition System (LTS). This formalization paves the way for model-based testing based on the formal notion of Input-Outpu...

  16. Pilot implementation

    DEFF Research Database (Denmark)

    Hertzum, Morten; Bansler, Jørgen P.; Havn, Erling C.

    2012-01-01

    A recurrent problem in information-systems development (ISD) is that many design shortcomings are not detected during development, but only after the system has been delivered and implemented in its intended environment. Pilot implementations appear to promise a way to extend prototyping from...... the laboratory to the field, thereby allowing users to experience a system design under realistic conditions and developers to get feedback from realistic use while the design is still malleable. We characterize pilot implementation, contrast it with prototyping, propose a five-element model of pilot...... implementation and provide three empirical illustrations of our model. We conclude that pilot implementation has much merit as an ISD technique when system performance is contingent on context. But we also warn developers that, despite their seductive conceptual simplicity, pilot implementations can be difficult

  17. Improving Science Process Skills for Primary School Students Through 5E Instructional Model-Based Learning

    Science.gov (United States)

    Choirunnisa, N. L.; Prabowo, P.; Suryanti, S.

    2018-01-01

    The main objective of this study is to describe the effectiveness of 5E instructional model-based learning in improving primary school students' science process skills. Science process skills are important for students as they are the foundation for enhancing the mastery of concepts and the thinking skills needed in the 21st century. This study used an experimental one-group pre-test and post-test design. The results of this study show that (1) the implementation of learning in both classes, IVA and IVB, indicates that the percentage of learning implementation increased, which points to a better quality of learning, and (2) the percentage of students' science process skills test results on the aspects of observing, formulating hypotheses, determining variables, interpreting data and communicating increased as well.

  18. Non-frontal Model Based Approach to Forensic Face Recognition

    NARCIS (Netherlands)

    Dutta, A.; Veldhuis, Raymond N.J.; Spreeuwers, Lieuwe Jan

    2012-01-01

    In this paper, we propose a non-frontal model based approach which ensures that a face recognition system always gets to compare images having similar view (or pose). This requires a virtual suspect reference set that consists of non-frontal suspect images having pose similar to the surveillance

  19. Model-Based GUI Testing Using Uppaal at Novo Nordisk

    DEFF Research Database (Denmark)

    H. Hjort, Ulrik; Rasmussen, Jacob Illum; Larsen, Kim Guldstrand

    2009-01-01

    This paper details a collaboration between Aalborg University and Novo Nordisk in developing an automatic model-based test generation tool for system testing of the graphical user interface of a medical device on an embedded platform. The tool takes as input a UML state machine model and generates...

  20. Model Based Fault Detection in a Centrifugal Pump Application

    DEFF Research Database (Denmark)

    Kallesøe, Carsten; Cocquempot, Vincent; Izadi-Zamanabadi, Roozbeh

    2006-01-01

    A model based approach for fault detection in a centrifugal pump, driven by an induction motor, is proposed in this paper. The fault detection algorithm is derived using a combination of structural analysis, observer design and Analytical Redundancy Relation (ARR) design. Structural considerations...

  1. Perceptual decision neurosciences: a model-based review

    NARCIS (Netherlands)

    Mulder, M.J.; van Maanen, L.; Forstmann, B.U.

    2014-01-01

    In this review we summarize findings published over the past 10 years focusing on the neural correlates of perceptual decision-making. Importantly, this review highlights only studies that employ a model-based approach, i.e., they use quantitative cognitive models in combination with neuroscientific

  2. A comparative study of independent particle model based ...

    Indian Academy of Sciences (India)

    We find that among these three independent particle model based methods, the ss-VSCF method provides most accurate results in the thermal averages followed by t-SCF and the v-VSCF is the least accurate. However, the ss-VSCF is found to be computationally very expensive for the large molecules. The t-SCF gives ...

  3. Model-based safety architecture framework for complex systems

    NARCIS (Netherlands)

    Schuitemaker, Katja; Rajabali Nejad, Mohammadreza; Braakhuis, J.G.; Podofillini, Luca; Sudret, Bruno; Stojadinovic, Bozidar; Zio, Enrico; Kröger, Wolfgang

    2015-01-01

    The shift to transparency and rising need of the general public for safety, together with the increasing complexity and interdisciplinarity of modern safety-critical Systems of Systems (SoS) have resulted in a Model-Based Safety Architecture Framework (MBSAF) for capturing and sharing architectural

  4. Adopting a Models-Based Approach to Teaching Physical Education

    Science.gov (United States)

    Casey, Ashley; MacPhail, Ann

    2018-01-01

    Background: The popularised notion of models-based practice (MBP) is one that focuses on the delivery of a model, e.g. Cooperative Learning, Sport Education, Teaching Personal and Social Responsibility, Teaching Games for Understanding. Indeed, while an abundance of research studies have examined the delivery of a single model and some have…

  5. Towards model-based testing of electronic funds transfer systems

    NARCIS (Netherlands)

    Asaadi, H.R.; Khosravi, R.; Mousavi, M.R.; Noroozi, N.; Arbab, F.; Sirjani, M.

    2012-01-01

    We report on our first experience with applying model-based testing techniques to an operational Electronic Funds Transfer (EFT) switch. The goal is to test the conformance of the EFT switch to the standard flows described by the ISO 8583 standard. To this end, we first make a formalization of the

  6. Towards model-based testing of electronic funds transfer systems

    NARCIS (Netherlands)

    Asaadi, H.R.; Khosravi, R.; Mousavi, M.R.; Noroozi, N.

    2010-01-01

    We report on our first experience with applying model-based testing techniques to an operational Electronic Funds Transfer (EFT) switch. The goal is to test the conformance of the EFT switch to the standard flows described by the ISO 8583 standard. To this end, we first make a formalization of the

  7. Model-based monitoring of rotors with multiple coexisting faults

    International Nuclear Information System (INIS)

    Rossner, Markus

    2015-01-01

    Monitoring systems are applied to many rotors, but only a few monitoring systems can separate coexisting errors and identify their quantity. This research project solves this problem using a combination of signal-based and model-based monitoring. The signal-based part performs a pre-selection of possible errors; these errors are further separated with model-based methods. This approach is demonstrated for the errors unbalance, bow, stator-fixed misalignment, rotor-fixed misalignment and roundness errors. For the model-based part, unambiguous error definitions and models are set up. The Ritz approach reduces the model order and therefore speeds up the diagnosis. Identification algorithms are developed for the different rotor faults. To this end, reliable damage indicators and proper sub-steps of the diagnosis have to be defined. For several monitoring problems, measuring both deflection and bearing force is very useful. The monitoring system is verified by experiments on an academic rotor test rig. The interpretation of the measurements requires much knowledge concerning the dynamics of the rotor. Due to the model-based approach, the system can separate errors with similar signal patterns and identify bow and roundness errors online at operation speed. [de]

  8. Product Lifecycle Management Architecture: A Model Based Systems Engineering Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Noonan, Nicholas James [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-07-01

    This report is an analysis of the Product Lifecycle Management (PLM) program. The analysis is centered on a need statement generated by a Nuclear Weapons (NW) customer. The need statement captured in this report creates an opportunity for the PLM to provide a robust service as a solution. Lifecycles for both the NW and PLM are analyzed using Model Based System Engineering (MBSE).

  9. Expediting model-based optoacoustic reconstructions with tomographic symmetries

    International Nuclear Information System (INIS)

    Lutzweiler, Christian; Deán-Ben, Xosé Luís; Razansky, Daniel

    2014-01-01

    Purpose: Image quantification in optoacoustic tomography implies the use of accurate forward models of excitation, propagation, and detection of optoacoustic signals, while inversions with high spatial resolution usually involve very large matrices, leading to unreasonably long computation times. The development of fast and memory-efficient model-based approaches therefore represents an important challenge for advancing the quantitative and dynamic imaging capabilities of tomographic optoacoustic imaging. Methods: Herein, a method for simplification and acceleration of model-based inversions, relying on inherent symmetries present in common tomographic acquisition geometries, has been introduced. The method is showcased for the case of cylindrical symmetries by using polar image discretization of the time-domain optoacoustic forward model combined with efficient storage and inversion strategies. Results: The suggested methodology is shown to render fast and accurate model-based inversions in both numerical simulations and post mortem small animal experiments. In case of a full-view detection scheme, the memory requirements are reduced by one order of magnitude while high-resolution reconstructions are achieved at video rate. Conclusions: By considering the rotational symmetry present in many tomographic optoacoustic imaging systems, the proposed methodology allows exploiting the advantages of model-based algorithms with feasible computational requirements and fast reconstruction times, so that its convenience and general applicability in optoacoustic imaging systems with tomographic symmetries is anticipated.

  10. Model based active power control of a wind turbine

    DEFF Research Database (Denmark)

    Mirzaei, Mahmood; Soltani, Mohsen; Poulsen, Niels Kjølstad

    2014-01-01

    in the electricity market that selling the reserve power is more profitable than producing with the full capacity. Therefore wind turbines can be down-regulated and sell the differential capacity as the reserve power. In this paper we suggest a model based approach to control wind turbines for active power reference...

  11. Nonlinear Model-Based Fault Detection for a Hydraulic Actuator

    NARCIS (Netherlands)

    Van Eykeren, L.; Chu, Q.P.

    2011-01-01

    This paper presents a model-based fault detection algorithm for a specific fault scenario of the ADDSAFE project. The fault considered is the disconnection of a control surface from its hydraulic actuator. Detecting this type of fault as fast as possible helps to operate an aircraft more cost

  12. Model based decision support for planning of road maintenance

    NARCIS (Netherlands)

    van Harten, Aart; Worm, J.M.; Worm, J.M.

    1996-01-01

    In this article we describe a Decision Support Model, based on Operational Research methods, for the multi-period planning of maintenance of bituminous pavements. This model is a tool for the road manager to assist in generating an optimal maintenance plan for a road. Optimal means: minimising the

  13. An Approach to Quality Estimation in Model-Based Development

    DEFF Research Database (Denmark)

    Holmegaard, Jens Peter; Koch, Peter; Ravn, Anders Peter

    2004-01-01

    We present an approach to estimation of parameters for design space exploration in Model-Based Development, where synthesis of a system is done in two stages. Component qualities like space, execution time or power consumption are defined in a repository by platform dependent values. Connectors...

  14. The Actualization of Literary Learning Model Based on Verbal-Linguistic Intelligence

    Directory of Open Access Journals (Sweden)

    Nur Ihsan Halil

    2017-10-01

    Full Text Available This article is inspired by Howard Gardner's concept of linguistic intelligence and by several authors' previous writings, all of which served as the author's references in developing ideas for constructing a literary learning model based on linguistic intelligence. This article was not written by collecting data empirically, but by developing and constructing an existing concept, namely the concept of linguistic intelligence, which is elaborated into literary learning based on verbal-linguistic intelligence. The purpose of this paper is to answer the question of how to apply a literary learning model based on verbal-linguistic intelligence. Following Gardner's concept, the author formulated a literary learning model based on verbal-linguistic intelligence through a story-telling learning model with five steps, namely arguing, discussing, interpreting, speaking, and writing about literary works. In short, the writer draws the conclusion that a learning model based on verbal-linguistic intelligence can be designed with attention to five components, namely (1) definition, (2) characteristics, (3) teaching strategy, (4) final learning outcomes, and (5) figures.

  15. Comparing model-based and model-free analysis methods for QUASAR arterial spin labeling perfusion quantification.

    Science.gov (United States)

    Chappell, Michael A; Woolrich, Mark W; Petersen, Esben T; Golay, Xavier; Payne, Stephen J

    2013-05-01

    Amongst the various implementations of arterial spin labeling MRI methods for quantifying cerebral perfusion, the QUASAR method is unique. By using a combination of labeling with and without flow suppression gradients, the QUASAR method offers the separation of macrovascular and tissue signals. This permits local arterial input functions to be defined and "model-free" analysis, using numerical deconvolution, to be used. However, it remains unclear whether arterial spin labeling data are best treated using model-free or model-based analysis. This work provides a critical comparison of these two approaches for QUASAR arterial spin labeling in the healthy brain. An existing two-component (arterial and tissue) model was extended to the mixed flow suppression scheme of QUASAR to provide an optimal model-based analysis. The model-based analysis was extended to incorporate dispersion of the labeled bolus, generally regarded as the major source of discrepancy between the two analysis approaches. Model-free and model-based analyses were compared for perfusion quantification including absolute measurements, uncertainty estimation, and spatial variation in cerebral blood flow estimates. Major sources of discrepancies between model-free and model-based analysis were attributed to the effects of dispersion and the degree to which the two methods can separate macrovascular and tissue signal. Copyright © 2012 Wiley Periodicals, Inc.

  16. Advances in Bayesian Model Based Clustering Using Particle Learning

    Energy Technology Data Exchange (ETDEWEB)

    Merl, D M

    2009-11-19

    Recent work by Carvalho, Johannes, Lopes and Polson and Carvalho, Lopes, Polson and Taddy introduced a sequential Monte Carlo (SMC) alternative to traditional iterative Monte Carlo strategies (e.g. MCMC and EM) for Bayesian inference for a large class of dynamic models. The basis of SMC techniques involves representing the underlying inference problem as one of state space estimation, thus giving way to inference via particle filtering. The key insight of Carvalho et al was to construct the sequence of filtering distributions so as to make use of the posterior predictive distribution of the observable, a distribution usually only accessible in certain Bayesian settings. Access to this distribution allows a reversal of the usual propagate and resample steps characteristic of many SMC methods, thereby alleviating to a large extent many problems associated with particle degeneration. Furthermore, Carvalho et al point out that for many conjugate models the posterior distribution of the static variables can be parametrized in terms of [recursively defined] sufficient statistics of the previously observed data. For models where such sufficient statistics exist, particle learning, as it is being called, is especially well suited for the analysis of streaming data due to the relative invariance of its algorithmic complexity with the number of data observations. Through a particle learning approach, a statistical model can be fit to data as the data is arriving, allowing at any instant during the observation process direct quantification of uncertainty surrounding underlying model parameters. Here we describe the use of a particle learning approach for fitting a standard Bayesian semiparametric mixture model as described in Carvalho, Lopes, Polson and Taddy. In Section 2 we briefly review the previously presented particle learning algorithm for the case of a Dirichlet process mixture of multivariate normals. In Section 3 we describe several novel extensions to the original
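
    As a toy illustration of the resample-then-propagate mechanics described above (a two-component mixture with known means and an unknown weight, not the Dirichlet process mixture treated in the report), each particle carries the Beta sufficient statistics of the mixture weight:

        import math
        import random

        def normpdf(y, mu, sd=1.0):
            return math.exp(-0.5 * ((y - mu) / sd) ** 2) / (sd * math.sqrt(2.0 * math.pi))

        MU = (-2.0, 2.0)   # known component means; the mixture weight pi ~ Beta(1, 1) is unknown
        N_PART = 500
        particles = [[1.0, 1.0] for _ in range(N_PART)]   # sufficient statistics (a, b) per particle

        def predictive(y, a, b):
            """Posterior predictive of the next observation given a particle's statistics."""
            w = a / (a + b)   # E[pi | a, b]
            return w * normpdf(y, MU[0]) + (1.0 - w) * normpdf(y, MU[1])

        data = [random.gauss(MU[0] if random.random() < 0.3 else MU[1], 1.0) for _ in range(200)]
        for y in data:
            weights = [predictive(y, a, b) for a, b in particles]
            # Resample first using the predictive, then propagate the sufficient statistics.
            particles = random.choices(particles, weights=weights, k=N_PART)
            updated = []
            for a, b in particles:
                p1 = (a / (a + b)) * normpdf(y, MU[0])
                p2 = (b / (a + b)) * normpdf(y, MU[1])
                z_is_first = random.random() < p1 / (p1 + p2)   # sample the allocation
                updated.append([a + 1.0, b] if z_is_first else [a, b + 1.0])
            particles = updated

        estimate = sum(a / (a + b) for a, b in particles) / N_PART
        print(f"posterior mean of the mixture weight: {estimate:.3f} (true value 0.3)")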

  17. Treaty implementation

    International Nuclear Information System (INIS)

    Dunn, L.A.

    1990-01-01

    This paper touches on three aspects of the relationship between intelligence and treaty implementation, a two-way association. First, the author discusses the role of intelligence as a basis for compliance monitoring and treaty verification. Second, the author discusses the payoffs of treaty implementation, in particular on-site inspection, for intelligence gathering and the intelligence process. Third, the author goes in another direction and discusses some of the tensions between the intelligence-gathering and treaty-implementation processes, especially with regard to extensive use of on-site inspection, such as we are likely to see in monitoring compliance with future arms control treaties

  18. Model-Based Development of Control Systems for Forestry Cranes

    Directory of Open Access Journals (Sweden)

    Pedro La Hera

    2015-01-01

    Full Text Available Model-based methods are used in industry for prototyping concepts based on mathematical models. With our forest industry partners, we have established a model-based workflow for rapid development of motion control systems for forestry cranes. Applying this working method, we can verify control algorithms, both theoretically and practically. This paper is an example of this workflow and presents four topics related to the application of nonlinear control theory. The first topic presents the system of differential equations describing the motion dynamics. The second topic presents nonlinear control laws formulated according to sliding mode control theory. The third topic presents a procedure for model calibration and control tuning that are a prerequisite to realize experimental tests. The fourth topic presents the results of tests performed on an experimental crane specifically equipped for these tasks. Results of these studies show the advantages and disadvantages of these control algorithms, and they highlight their performance in terms of robustness and smoothness.
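
    The article presents its own sliding mode control laws; the fragment below is only a generic first-order sliding mode controller on an invented single-joint, double-integrator model, meant to illustrate the idea of a sliding surface and switching gain:

        # Generic first-order sliding mode controller for one joint (illustrative values only).
        LAMBDA, K = 2.0, 5.0   # assumed sliding-surface slope and switching gain

        def smc_torque(q, dq, q_ref, dq_ref):
            """u = -K * sign(s), with s = de + LAMBDA * e the sliding variable."""
            e, de = q - q_ref, dq - dq_ref
            s = de + LAMBDA * e
            return -K * (1.0 if s > 0 else -1.0 if s < 0 else 0.0)

        # Simulate an assumed unit-inertia (double integrator) joint tracking a fixed reference.
        q, dq, dt = 0.3, 0.0, 0.001
        for _ in range(3000):
            u = smc_torque(q, dq, q_ref=1.0, dq_ref=0.0)
            dq += u * dt
            q += dq * dt
        print(f"joint angle after 3 s: {q:.3f} (reference 1.0)")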

  19. A sediment graph model based on SCS-CN method

    Science.gov (United States)

    Singh, P. K.; Bhunya, P. K.; Mishra, S. K.; Chaube, U. C.

    2008-01-01

    Summary: This paper proposes new conceptual sediment graph models based on coupling of popular and extensively used methods, viz., the Nash model based instantaneous unit sediment graph (IUSG), the soil conservation service curve number (SCS-CN) method, and the Power law. These models vary in their complexity and this paper tests their performance using data of the Nagwan watershed (area = 92.46 km²) (India). The sensitivity of total sediment yield and peak sediment flow rate computations to model parameterisation is analysed. The exponent of the Power law, β, is more sensitive than other model parameters. The models are found to have substantial potential for computing sediment graphs (temporal sediment flow rate distribution) as well as total sediment yield.
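
    For reference, a textbook implementation of the SCS-CN rainfall-runoff relation that these sediment graph models couple with the Nash IUSG and the power law (standard form with initial abstraction Ia = 0.2S; the curve number and storm depth below are arbitrary examples):

        def scs_cn_runoff(p_mm, cn):
            """Standard SCS-CN direct runoff (mm) with initial abstraction Ia = 0.2 * S."""
            s = 25400.0 / cn - 254.0   # potential maximum retention (mm)
            ia = 0.2 * s
            if p_mm <= ia:
                return 0.0
            return (p_mm - ia) ** 2 / (p_mm - ia + s)

        print(f"direct runoff for a 60 mm storm with CN = 75: {scs_cn_runoff(60.0, 75.0):.1f} mm")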

  20. Automated extraction of knowledge for model-based diagnostics

    Science.gov (United States)

    Gonzalez, Avelino J.; Myler, Harley R.; Towhidnejad, Massood; Mckenzie, Frederic D.; Kladke, Robin R.

    1990-01-01

    The concept of accessing computer aided design (CAD) design databases and extracting a process model automatically is investigated as a possible source for the generation of knowledge bases for model-based reasoning systems. The resulting system, referred to as automated knowledge generation (AKG), uses an object-oriented programming structure and constraint techniques, as well as an internal database of component descriptions, to generate a frame-based structure that describes the model. The procedure has been designed to be general enough to be easily coupled to CAD systems that feature a database capable of providing label and connectivity data from the drawn system. The AKG system is capable of defining knowledge bases in formats required by various model-based reasoning tools.

  1. Bond graph model-based fault diagnosis of hybrid systems

    CERN Document Server

    Borutzky, Wolfgang

    2015-01-01

    This book presents a bond graph model-based approach to fault diagnosis in mechatronic systems appropriately represented by a hybrid model. The book begins by giving a survey of the fundamentals of fault diagnosis and failure prognosis, then recalls state-of-art developments referring to latest publications, and goes on to discuss various bond graph representations of hybrid system models, equations formulation for switched systems, and simulation of their dynamic behavior. The structured text: • focuses on bond graph model-based fault detection and isolation in hybrid systems; • addresses isolation of multiple parametric faults in hybrid systems; • considers system mode identification; • provides a number of elaborated case studies that consider fault scenarios for switched power electronic systems commonly used in a variety of applications; and • indicates that bond graph modelling can also be used for failure prognosis. In order to facilitate the understanding of fault diagnosis and the presented...

  2. Fuzzy model-based observers for fault detection in CSTR.

    Science.gov (United States)

    Ballesteros-Moncada, Hazael; Herrera-López, Enrique J; Anzurez-Marín, Juan

    2015-11-01

    Among the vast variety of fuzzy model-based observers reported in the literature, which would be the proper one to use for fault detection in a class of chemical reactors? In this study four fuzzy model-based observers for sensor fault detection of a Continuous Stirred Tank Reactor were designed and compared. The designs include (i) a Luenberger fuzzy observer, (ii) a Luenberger fuzzy observer with sliding modes, (iii) a Walcott-Zak fuzzy observer, and (iv) an Utkin fuzzy observer. A negative fault signal, an oscillating fault signal, and a bounded random noise signal with a maximum value of ±0.4 were used to evaluate and compare the performance of the fuzzy observers. The Utkin fuzzy observer showed the best performance under the tested conditions. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
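
    As a generic illustration of observer-based sensor fault detection (a plain linear Luenberger observer on an invented first-order plant, not the fuzzy Takagi-Sugeno-type observers compared in the paper):

        # Discrete-time Luenberger observer for a toy first-order plant x+ = A*x + B*u, y = x.
        A, B, L_GAIN = 0.95, 0.05, 0.4   # assumed plant and observer gain
        THRESHOLD = 0.05                 # residual threshold for flagging a sensor fault

        def simulate(steps=200, fault_at=120, fault_bias=0.3):
            x, x_hat = 0.0, 0.0
            for k in range(steps):
                u = 1.0                                             # constant input
                x = A * x + B * u                                   # true plant state
                y = x + (fault_bias if k >= fault_at else 0.0)      # sensor output, biased late
                residual = y - x_hat
                x_hat = A * x_hat + B * u + L_GAIN * residual       # observer update
                if abs(residual) > THRESHOLD:
                    return k                                        # fault detected
            return None

        print("sensor fault flagged at step:", simulate())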

  3. MTK: An AI tool for model-based reasoning

    Science.gov (United States)

    Erickson, William K.; Schwartz, Mary R.

    1987-01-01

    A 1988 goal for the Systems Autonomy Demonstration Project Office of the NASA Ames Research Center is to apply model-based representation and reasoning techniques in a knowledge-based system that will provide monitoring, fault diagnosis, control and trend analysis of the space station Thermal Management System (TMS). A number of issues raised during the development of the first prototype system inspired the design and construction of a model-based reasoning tool called MTK, which was used in the building of the second prototype. These issues are outlined, along with examples from the thermal system to highlight the motivating factors behind them. An overview of the capabilities of MTK is given.

  4. Route Choice Model Based on Game Theory for Commuters

    Directory of Open Access Journals (Sweden)

    Licai Yang

    2016-06-01

    Full Text Available The traffic behaviour of commuters may cause traffic congestion during peak hours. Advanced Traffic Information Systems can provide dynamic information to travellers, but due to a lack of timeliness and comprehensiveness, the provided information cannot satisfy travellers' needs. Since the assumptions of the traditional route choice model based on Expected Utility Theory conflict with the actual situation, this paper proposes a route choice model based on Game Theory to provide reliable route choices to commuters in real situations. The proposed model treats the alternative routes as game players and utilizes the precision of predicted information and familiarity with traffic conditions to build a game. The optimal route can be generated by considering the Nash Equilibrium obtained by solving the route choice game. Simulations and experimental analysis show that the proposed model describes commuters' routine route choice decisions exactly and that the provided route is reliable.
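
    A minimal sketch of locating a pure-strategy Nash equilibrium in a two-player route game (the payoffs are invented and stand for congestion effects; the paper's game between alternative routes, built on prediction precision and familiarity, is more elaborate):

        import itertools

        # Payoff matrices PAY[player][row_strategy][col_strategy]; higher is better.
        # Strategies stand for choosing route A (0) or route B (1); payoffs drop when
        # both players pick the same, congested route.
        PAY = [
            [[1, 3], [3, 1]],   # player 0
            [[1, 3], [3, 1]],   # player 1
        ]

        def is_nash(r, c):
            """Neither player gains by unilaterally switching strategy."""
            best_row = max(PAY[0][alt][c] for alt in (0, 1))
            best_col = max(PAY[1][r][alt] for alt in (0, 1))
            return PAY[0][r][c] == best_row and PAY[1][r][c] == best_col

        equilibria = [(r, c) for r, c in itertools.product((0, 1), repeat=2) if is_nash(r, c)]
        print("pure-strategy Nash equilibria (row, col):", equilibria)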

  5. Model based methods and tools for process systems engineering

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Process systems engineering (PSE) provides means to solve a wide range of problems in a systematic and efficient manner. This presentation will give a perspective on model based methods and tools needed to solve a wide range of problems in product-process synthesis-design. These methods and tools need to be integrated with work-flows and data-flows for specific product-process synthesis-design problems within a computer-aided framework. The framework therefore should be able to manage knowledge-data, models and the associated methods and tools needed by specific synthesis-design work...... of model based methods and tools within a computer aided framework for product-process synthesis-design will be highlighted.

  6. Model-based dispersive wave processing: A recursive Bayesian solution

    International Nuclear Information System (INIS)

    Candy, J.V.; Chambers, D.H.

    1999-01-01

    Wave propagation through dispersive media represents a significant problem in many acoustic applications, especially in ocean acoustics, seismology, and nondestructive evaluation. In this paper we propose a propagation model that can easily represent many classes of dispersive waves and proceed to develop the model-based solution to the wave processing problem. It is shown that the underlying wave system is nonlinear and time-variable requiring a recursive processor. Thus the general solution to the model-based dispersive wave enhancement problem is developed using a Bayesian maximum a posteriori (MAP) approach and shown to lead to the recursive, nonlinear extended Kalman filter (EKF) processor. The problem of internal wave estimation is cast within this framework. The specific processor is developed and applied to data synthesized by a sophisticated simulator demonstrating the feasibility of this approach. copyright 1999 Acoustical Society of America.
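
    The cited work arrives at a recursive, nonlinear extended Kalman filter (EKF) processor. As a reminder of what that recursion looks like in code, below is a generic discrete-time EKF step in Python; it is a textbook sketch in which the state-transition function f, measurement function h and the scalar example system are placeholders, not the paper's dispersive-wave state-space model.

      import numpy as np

      def ekf_step(x, P, u, z, f, F_jac, h, H_jac, Q, R):
          """One generic extended Kalman filter recursion (predict + update)."""
          # Predict
          x_pred = f(x, u)
          F = F_jac(x, u)
          P_pred = F @ P @ F.T + Q
          # Update with measurement z
          H = H_jac(x_pred)
          innovation = z - h(x_pred)
          S = H @ P_pred @ H.T + R
          K = P_pred @ H.T @ np.linalg.inv(S)          # Kalman gain
          x_new = x_pred + K @ innovation
          P_new = (np.eye(len(x)) - K @ H) @ P_pred
          return x_new, P_new

      # Toy scalar example: x_{k+1} = 0.9 x_k + u, measured through z = x^2 + noise.
      f = lambda x, u: 0.9 * x + u
      F_jac = lambda x, u: np.array([[0.9]])
      h = lambda x: x ** 2
      H_jac = lambda x: np.array([[2.0 * x[0]]])
      x, P = np.array([1.0]), np.eye(1)
      x, P = ekf_step(x, P, np.array([0.1]), np.array([1.1]),
                      f, F_jac, h, H_jac, Q=0.01 * np.eye(1), R=0.1 * np.eye(1))
      print(x, P)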

  7. GENI: A graphical environment for model-based control

    International Nuclear Information System (INIS)

    Kleban, S.; Lee, M.; Zambre, Y.

    1989-10-01

    A new method to operate machine and beam simulation programs for accelerator control has been developed. Existing methods, although cumbersome, have been used in control systems for commissioning and operation of many machines. We developed GENI, a generalized graphical interface to these programs for model-based control. This "object-oriented"-like environment is described and some typical applications are presented. 4 refs., 5 figs

  8. Model-based security engineering for the internet of things

    OpenAIRE

    NEISSE RICARDO; STERI GARY; NAI FOVINO Igor; BALDINI Gianmarco; VAN HOESEL Lodewijk

    2015-01-01

    We propose in this chapter a Model-based Security Toolkit (SecKit) and methodology to address the control and protection of user data in the deployment of the Internet of Things (IoT). This toolkit takes a more general approach for security engineering including risk analysis, establishment of aspect-specific trust relationships, and enforceable security policies. We describe the integrated metamodels used in the toolkit and the accompanying security engineering methodology for IoT systems...

  9. Applying Model Based Systems Engineering to NASA's Space Communications Networks

    Science.gov (United States)

    Bhasin, Kul; Barnes, Patrick; Reinert, Jessica; Golden, Bert

    2013-01-01

    System engineering practices for complex systems and networks now require that requirement, architecture, and concept of operations product development teams simultaneously harmonize their activities to provide timely, useful and cost-effective products. When dealing with complex systems of systems, traditional systems engineering methodology quickly falls short of achieving project objectives. This approach is encumbered by the use of a number of disparate hardware and software tools, spreadsheets and documents to grasp the concept of the network design and operation. In the case of NASA's space communication networks, since the networks are geographically distributed, and so are their subject matter experts, the team is challenged to create a common language and tools to produce its products. Using Model Based Systems Engineering methods and tools allows for a unified representation of the system in a model that enables a highly related level of detail. To date, the Program System Engineering (PSE) team has been able to model each network from their top-level operational activities and system functions down to the atomic level through relational modeling decomposition. These models allow for a better understanding of the relationships between NASA's stakeholders, internal organizations, and impacts to all related entities due to integration and sustainment of existing systems. Understanding the existing systems is essential to accurate and detailed study of the integration options being considered. In this paper, we identify the challenges the PSE team faced in its quest to unify complex legacy space communications networks and their operational processes. We describe the initial approaches undertaken and the evolution toward model based system engineering applied to produce Space Communication and Navigation (SCaN) PSE products. We will demonstrate the practice of Model Based System Engineering applied to integrating space communication networks and the summary of its

  10. European Climate - Energy Security Nexus. A model based scenario analysis

    International Nuclear Information System (INIS)

    Criqui, Patrick; Mima, Silvana

    2011-01-01

    In this research, we have provided an overview of the climate-energy security nexus in the European energy sector through a model based scenario analysis with the POLES model. The analysis underlines that, under stringent climate policies, Europe takes advantage of a double dividend: the capacity to develop a new, cleaner energy model and a lower vulnerability to potential shocks on the international energy markets. (authors)

  11. Constrained convex minimization via model-based excessive gap

    OpenAIRE

    Tran Dinh, Quoc; Cevher, Volkan

    2014-01-01

    We introduce a model-based excessive gap technique to analyze first-order primal-dual methods for constrained convex minimization. As a result, we construct new primal-dual methods with optimal convergence rates on the objective residual and the primal feasibility gap of their iterates separately. Through a dual smoothing and prox-function selection strategy, our framework subsumes the augmented Lagrangian and alternating methods as special cases, where our rates apply.

  12. Trojan detection model based on network behavior analysis

    International Nuclear Information System (INIS)

    Liu Junrong; Liu Baoxu; Wang Wenjin

    2012-01-01

    Based on an analysis of existing Trojan detection technology, this paper presents a Trojan detection model based on network behavior analysis. First, we give an abstract description of Trojan network behavior; then, according to certain rules, we establish a characteristic behavior library; finally, we use a support vector machine algorithm to determine whether a Trojan intrusion has occurred. Intrusion detection experiments show that this model can effectively detect Trojans. (authors)
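
    The classification step described above can be sketched with an off-the-shelf support vector machine. In the Python example below, the network-behaviour features (connection frequency, upload/download ratio, connection duration) and the data are synthetic stand-ins for the paper's characteristic behavior library, not its actual features.

      import numpy as np
      from sklearn.svm import SVC
      from sklearn.model_selection import train_test_split
      from sklearn.preprocessing import StandardScaler
      from sklearn.pipeline import make_pipeline

      # Synthetic network-behaviour features standing in for the paper's
      # behaviour library; the data and feature choice are illustrative only.
      rng = np.random.default_rng(0)
      normal = rng.normal(loc=[5.0, 0.3, 60.0], scale=[2.0, 0.1, 20.0], size=(200, 3))
      trojan = rng.normal(loc=[40.0, 2.0, 10.0], scale=[10.0, 0.5, 5.0], size=(200, 3))
      X = np.vstack([normal, trojan])
      y = np.array([0] * 200 + [1] * 200)        # 1 = Trojan-like behaviour

      X_train, X_test, y_train, y_test = train_test_split(
          X, y, test_size=0.25, random_state=0)
      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
      clf.fit(X_train, y_train)
      print("held-out accuracy:", clf.score(X_test, y_test))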

  13. Model-based design languages: A case study

    OpenAIRE

    Cibrario Bertolotti, Ivan; Hu, Tingting; Navet, Nicolas

    2017-01-01

    Fast-paced innovation in the embedded systems domain puts an ever increasing pressure on effective software development methods, leading to the growing popularity of Model-Based Design (MBD). In this context, a proper choice of modeling languages and related tools - depending on design goals and problem qualities - is crucial to make the most of MBD benefits. In this paper, a comparison between two dissimilar approaches to modeling is carried out, with the goal of highlighting their relative ...

  14. Constructing a justice model based on Sen's capability approach

    OpenAIRE

    Yüksel, Sevgi; Yuksel, Sevgi

    2008-01-01

    The thesis provides a possible justice model based on Sen's capability approach. For this goal, we first analyze the general structure of a theory of justice, identifying the main variables and issues. Furthermore, based on Sen (2006) and Kolm (1998), we look at 'transcendental' and 'comparative' approaches to justice and concentrate on the sufficiency condition for the comparative approach. Then, taking Rawls' theory of justice as a starting point, we present how Sen's capability approach em...

  15. Model-based Recursive Partitioning for Subgroup Analyses

    OpenAIRE

    Seibold, Heidi; Zeileis, Achim; Hothorn, Torsten

    2016-01-01

    The identification of patient subgroups with differential treatment effects is the first step towards individualised treatments. A current draft guideline by the EMA discusses potentials and problems in subgroup analyses and formulated challenges to the development of appropriate statistical procedures for the data-driven identification of patient subgroups. We introduce model-based recursive partitioning as a procedure for the automated detection of patient subgroups that are identifiable by...

  16. LEARNING CREATIVE WRITING MODEL BASED ON NEUROLINGUISTIC PROGRAMMING

    OpenAIRE

    Rustan, Edhy

    2017-01-01

    The objectives of the study are to determine: (1) the condition of creative writing learning among high school students in Makassar, (2) the requirements of a learning model for creative writing, (3) the program planning and design of an ideal creative writing model, (4) the feasibility of a creative writing learning model based on neurolinguistic programming, and (5) the effectiveness of the creative writing learning model based on neurolinguistic programming. The method of this research uses research development of L...

  17. GPU-accelerated 3-D model-based tracking

    International Nuclear Information System (INIS)

    Brown, J Anthony; Capson, David W

    2010-01-01

    Model-based approaches to tracking the pose of a 3-D object in video are effective but computationally demanding. While statistical estimation techniques, such as the particle filter, are often employed to minimize the search space, real-time performance remains unachievable on current generation CPUs. Recent advances in graphics processing units (GPUs) have brought massively parallel computational power to the desktop environment and powerful developer tools, such as NVIDIA Compute Unified Device Architecture (CUDA), have provided programmers with a mechanism to exploit it. NVIDIA GPUs' single-instruction multiple-thread (SIMT) programming model is well-suited to many computer vision tasks, particularly model-based tracking, which requires several hundred 3-D model poses to be dynamically configured, rendered, and evaluated against each frame in the video sequence. Using 6 degree-of-freedom (DOF) rigid hand tracking as an example application, this work harnesses consumer-grade GPUs to achieve real-time, 3-D model-based, markerless object tracking in monocular video.

  18. Impact of previously disadvantaged land-users on sustainable ...

    African Journals Online (AJOL)

    Impact of previously disadvantaged land-users on sustainable agricultural ... about previously disadvantaged land users involved in communal farming systems ... of input, capital, marketing, information and land use planning, with effect on ...

  19. Impact of Vocational Interests, Previous Academic Experience, Gender and Age on Situational Judgement Test Performance

    Science.gov (United States)

    Schripsema, Nienke R.; van Trigt, Anke M.; Borleffs, Jan C. C.; Cohen-Schotanus, Janke

    2017-01-01

    Situational Judgement Tests (SJTs) are increasingly implemented in medical school admissions. In this paper, we investigate the effects of vocational interests, previous academic experience, gender and age on SJT performance. The SJT was part of the selection process for the Bachelor's degree programme in Medicine at University of Groningen, the…

  20. Impact of vocational interests, previous academic experience, gender and age on Situational Judgement Test performance

    NARCIS (Netherlands)

    Schripsema, Nienke R.; Trigt, van Anke M.; Borleffs, Jan C. C.; Cohen-Schotanus, Janke

    Situational Judgement Tests (SJTs) are increasingly implemented in medical school admissions. In this paper, we investigate the effects of vocational interests, previous academic experience, gender and age on SJT performance. The SJT was part of the selection process for the Bachelor's degree

  1. 75 FR 68185 - Airworthiness Directives; EADS CASA (Type Certificate Previously Held by Construcciones...

    Science.gov (United States)

    2010-11-05

    ... Airworthiness Directives; EADS CASA (Type Certificate Previously Held by Construcciones Aeronauticas, S.A...) that were, at that moment, defined in issue C of EADS-CASA document DT-0-C00-05001. That document has... implementation of the revised Fuel Airworthiness Limitations contained in issue D of EADS- CASA document DT-0-C00...

  2. 22 CFR 40.91 - Certain aliens previously removed.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Certain aliens previously removed. 40.91... IMMIGRANTS UNDER THE IMMIGRATION AND NATIONALITY ACT, AS AMENDED Aliens Previously Removed § 40.91 Certain aliens previously removed. (a) 5-year bar. An alien who has been found inadmissible, whether as a result...

  3. An evolutionary model-based algorithm for accurate phylogenetic breakpoint mapping and subtype prediction in HIV-1.

    Directory of Open Access Journals (Sweden)

    Sergei L Kosakovsky Pond

    2009-11-01

    Genetically diverse pathogens (such as Human Immunodeficiency virus type 1, HIV-1) are frequently stratified into phylogenetically or immunologically defined subtypes for classification purposes. Computational identification of such subtypes is helpful in surveillance, epidemiological analysis and detection of novel variants, e.g., circulating recombinant forms in HIV-1. A number of conceptually and technically different techniques have been proposed for determining the subtype of a query sequence, but there is not a universally optimal approach. We present a model-based phylogenetic method for automatically subtyping an HIV-1 (or other viral or bacterial) sequence, mapping the location of breakpoints and assigning parental sequences in recombinant strains, as well as computing confidence levels for the inferred quantities. Our Subtype Classification Using Evolutionary ALgorithms (SCUEAL) procedure is shown to perform very well in a variety of simulation scenarios, runs in parallel when multiple sequences are being screened, and matches or exceeds the performance of existing approaches on typical empirical cases. We applied SCUEAL to all available polymerase (pol) sequences from two large databases, the Stanford Drug Resistance database and the UK HIV Drug Resistance Database. Comparing with subtypes which had previously been assigned revealed that a minor but substantial (approximately 5%) fraction of pure subtype sequences may in fact be within- or inter-subtype recombinants. A free implementation of SCUEAL is provided as a module for the HyPhy package and the Datamonkey web server. Our method is especially useful when an accurate automatic classification of an unknown strain is desired, and is positioned to complement and extend faster but less accurate methods. Given the increasingly frequent use of HIV subtype information in studies focusing on the effect of subtype on treatment, clinical outcome, pathogenicity and vaccine design, the importance

  4. A Model-Based Joint Identification of Differentially Expressed Genes and Phenotype-Associated Genes.

    Directory of Open Access Journals (Sweden)

    Samuel Sunghwan Cho

    Over the last decade, many analytical methods and tools have been developed for microarray data. The detection of differentially expressed genes (DEGs) among different treatment groups is often a primary purpose of microarray data analysis. In addition, association studies investigating the relationship between genes and a phenotype of interest such as survival time are also popular in microarray data analysis. Phenotype association analysis provides a list of phenotype-associated genes (PAGs). However, it is sometimes necessary to identify genes that are both DEGs and PAGs. We consider the joint identification of DEGs and PAGs in microarray data analyses. The first approach we used was a naïve approach that detects DEGs and PAGs separately and then identifies the genes in the intersection of the list of PAGs and DEGs. The second approach we considered was a hierarchical approach that detects DEGs first and then chooses PAGs from among the DEGs, or vice versa. In this study, we propose a new model-based approach for the joint identification of DEGs and PAGs. Unlike the previous two-step approaches, the proposed method identifies genes that are simultaneously DEGs and PAGs. This method uses standard regression models but adopts a different null hypothesis from ordinary regression models, which allows us to perform joint identification in one step. The proposed model-based methods were evaluated using experimental data and simulation studies. The proposed methods were used to analyze a microarray experiment in which the main interest lies in detecting genes that are both DEGs and PAGs, where DEGs are identified between two diet groups and PAGs are associated with four phenotypes reflecting the expression of leptin, adiponectin, insulin-like growth factor 1, and insulin. Model-based approaches provided a larger number of genes, which are both DEGs and PAGs, than other methods. Simulation studies showed that they have more power than other methods
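
    One way to read the "different null hypothesis" idea is as a joint test that a gene is associated with neither the treatment group nor the phenotype. The Python sketch below fits an ordinary regression per gene and applies such a joint F-test; the column names, the simulated data and the exact hypothesis are assumptions made for illustration, not the authors' implementation.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      # Per-gene sketch: regress expression on a diet-group indicator and a
      # phenotype, then test the joint null that *both* coefficients are zero
      # (a gene rejecting this null is a candidate DEG-and/or-PAG).
      rng = np.random.default_rng(1)
      n = 60
      df = pd.DataFrame({
          "group": np.repeat([0, 1], n // 2),
          "pheno": rng.normal(size=n),
      })
      df["expr"] = 0.8 * df["group"] + 0.5 * df["pheno"] + rng.normal(scale=0.5, size=n)

      fit = smf.ols("expr ~ group + pheno", data=df).fit()
      joint = fit.f_test("group = 0, pheno = 0")   # joint null: neither DEG nor PAG
      print("joint F-test p-value:", float(joint.pvalue))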

  5. A Model-Based Joint Identification of Differentially Expressed Genes and Phenotype-Associated Genes

    Science.gov (United States)

    Seo, Minseok; Shin, Su-kyung; Kwon, Eun-Young; Kim, Sung-Eun; Bae, Yun-Jung; Lee, Seungyeoun; Sung, Mi-Kyung; Choi, Myung-Sook; Park, Taesung

    2016-01-01

    Over the last decade, many analytical methods and tools have been developed for microarray data. The detection of differentially expressed genes (DEGs) among different treatment groups is often a primary purpose of microarray data analysis. In addition, association studies investigating the relationship between genes and a phenotype of interest such as survival time are also popular in microarray data analysis. Phenotype association analysis provides a list of phenotype-associated genes (PAGs). However, it is sometimes necessary to identify genes that are both DEGs and PAGs. We consider the joint identification of DEGs and PAGs in microarray data analyses. The first approach we used was a naïve approach that detects DEGs and PAGs separately and then identifies the genes in an intersection of the list of PAGs and DEGs. The second approach we considered was a hierarchical approach that detects DEGs first and then chooses PAGs from among the DEGs or vice versa. In this study, we propose a new model-based approach for the joint identification of DEGs and PAGs. Unlike the previous two-step approaches, the proposed method identifies genes simultaneously that are DEGs and PAGs. This method uses standard regression models but adopts different null hypothesis from ordinary regression models, which allows us to perform joint identification in one-step. The proposed model-based methods were evaluated using experimental data and simulation studies. The proposed methods were used to analyze a microarray experiment in which the main interest lies in detecting genes that are both DEGs and PAGs, where DEGs are identified between two diet groups and PAGs are associated with four phenotypes reflecting the expression of leptin, adiponectin, insulin-like growth factor 1, and insulin. Model-based approaches provided a larger number of genes, which are both DEGs and PAGs, than other methods. Simulation studies showed that they have more power than other methods. Through analysis of

  6. Determining root correspondence between previously and newly detected objects

    Science.gov (United States)

    Paglieroni, David W.; Beer, N Reginald

    2014-06-17

    A system that applies attribute and topology based change detection to networks of objects that were detected on previous scans of a structure, roadway, or area of interest. The attributes capture properties or characteristics of the previously detected objects, such as location, time of detection, size, elongation, orientation, etc. The topology of the network of previously detected objects is maintained in a constellation database that stores attributes of previously detected objects and implicitly captures the geometrical structure of the network. A change detection system detects change by comparing the attributes and topology of new objects detected on the latest scan to the constellation database of previously detected objects.

  7. Elusive Implementation

    DEFF Research Database (Denmark)

    Heering Holt, Ditte; Rod, Morten Hulvej; Waldorff, Susanne Boch

    2018-01-01

    in health. However, despite growing support for intersectoral policymaking, implementation remains a challenge. Critics argue that public health has remained naïve about the policy process and a better understanding is needed. Based on ethnographic data, this paper conducts an in-depth analysis of a local......: On the basis of an explorative study among ten Danish municipalities, we conducted an ethnographic study of the development of a municipal-wide implementation strategy for the intersectoral health policy of a medium-sized municipality. The main data sources consist of ethnographic field notes from participant...

  8. Audiovisual Rehabilitation in Hemianopia: A Model-Based Theoretical Investigation.

    Science.gov (United States)

    Magosso, Elisa; Cuppini, Cristiano; Bertini, Caterina

    2017-01-01

    Hemianopic patients exhibit visual detection improvement in the blind field when audiovisual stimuli are given in spatiotemporal coincidence. Beyond this "online" multisensory improvement, there is evidence of long-lasting, "offline" effects induced by audiovisual training: patients show improved visual detection and orientation after they were trained to detect and saccade toward visual targets given in spatiotemporal proximity with auditory stimuli. These effects are ascribed to the Superior Colliculus (SC), which is spared in these patients and plays a pivotal role in audiovisual integration and oculomotor behavior. Recently, we developed a neural network model of audiovisual cortico-collicular loops, including interconnected areas representing the retina, striate and extrastriate visual cortices, auditory cortex, and SC. The network simulated unilateral V1 lesion with possible spared tissue and reproduced "online" effects. Here, we extend the previous network to shed light on circuits, plastic mechanisms, and synaptic reorganization that can mediate the training effects and functionally implement visual rehabilitation. The network is enriched by the oculomotor SC-brainstem route and Hebbian mechanisms of synaptic plasticity, and is used to test different training paradigms (audiovisual/visual stimulation in eye-movements/fixed-eyes condition) on simulated patients. Results predict different training effects and associate them to synaptic changes in specific circuits. Thanks to the SC multisensory enhancement, the audiovisual training is able to effectively strengthen the retina-SC route, which in turn can foster reinforcement of the SC-brainstem route (this occurs only in the eye-movements condition) and reinforcement of the SC-extrastriate route (this occurs in the presence of surviving V1 tissue, regardless of eye condition). The retina-SC-brainstem circuit may mediate compensatory effects: the model assumes that reinforcement of this circuit can translate visual

  9. The Design of Model-Based Training Programs

    Science.gov (United States)

    Polson, Peter; Sherry, Lance; Feary, Michael; Palmer, Everett; Alkin, Marty; McCrobie, Dan; Kelley, Jerry; Rosekind, Mark (Technical Monitor)

    1997-01-01

    This paper proposes a model-based training program for the skills necessary to operate advanced avionics systems that incorporate advanced autopilots and flight management systems. The training model is based on a formalism, the operational procedure model, that represents the mission model, the rules, and the functions of a modern avionics system. This formalism has been defined such that it can be understood and shared by pilots, the avionics software, and design engineers. Each element of the software is defined in terms of its intent (What?), the rationale (Why?), and the resulting behavior (How?). The Advanced Computer Tutoring project at Carnegie Mellon University has developed a type of model-based, computer aided instructional technology called cognitive tutors. They summarize numerous studies showing that training to a specified level of competence can be achieved in one third the time of conventional classroom instruction. We are developing a similar model-based training program for the skills necessary to operate the avionics. The model underlying the instructional program, which simulates the effects of pilot entries and the behavior of the avionics, is based on the operational procedure model. Pilots are given a series of vertical flightpath management problems. Entries that result in violations, such as failure to make a crossing restriction or violating the speed limits, result in error messages with instruction. At any time, the flightcrew can request suggestions on the appropriate set of actions. A similar and successful training program for basic skills for the FMS on the Boeing 737-300 was developed and evaluated. The results strongly support the claim that the training methodology can be adapted to the cockpit.

  10. Predictive value of EEG in postanoxic encephalopathy: A quantitative model-based approach.

    Science.gov (United States)

    Efthymiou, Evdokia; Renzel, Roland; Baumann, Christian R; Poryazova, Rositsa; Imbach, Lukas L

    2017-10-01

    The majority of comatose patients after cardiac arrest do not regain consciousness due to severe postanoxic encephalopathy. Early and accurate outcome prediction is therefore essential in determining further therapeutic interventions. The electroencephalogram is a standardized and commonly available tool used to estimate prognosis in postanoxic patients. The identification of pathological EEG patterns with poor prognosis relies, however, primarily on visual EEG scoring by experts. We introduced a model-based approach of EEG analysis (state space model) that allows for an objective and quantitative description of spectral EEG variability. We retrospectively analyzed standard EEG recordings in 83 comatose patients after cardiac arrest between 2005 and 2013 in the intensive care unit of the University Hospital Zürich. Neurological outcome was assessed one month after cardiac arrest using the Cerebral Performance Category. For a dynamic and quantitative EEG analysis, we implemented a model-based approach (state space analysis) to quantify EEG background variability independent from visual scoring of EEG epochs. Spectral variability was compared between groups and correlated with clinical outcome parameters and visual EEG patterns. Quantitative assessment of spectral EEG variability (state space velocity) revealed significant differences between patients with poor and good outcome after cardiac arrest: lower mean velocity in temporal electrodes (T4 and T5) was significantly associated with poor prognostic outcome, as were pathological EEG patterns such as generalized periodic discharges. Model-based EEG analysis (state space analysis) provides a novel, complementary marker for prognosis in postanoxic encephalopathy. Copyright © 2017 Elsevier B.V. All rights reserved.
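
    The "state space velocity" notion can be illustrated, in spirit, as the average step size of a spectral trajectory over time. The Python sketch below computes such a quantity from a spectrogram of a single channel; the construction, the parameters and the toy signals are assumptions for illustration and are not the authors' state space model.

      import numpy as np
      from scipy.signal import spectrogram

      def spectral_velocity(eeg, fs=250.0, nperseg=512):
          """Mean step size of the spectral trajectory of a single channel.

          One plausible quantitative reading of 'state space velocity': build a
          spectrogram, normalize each time slice, and average the Euclidean
          distance between consecutive slices. Parameters are illustrative.
          """
          f, t, S = spectrogram(eeg, fs=fs, nperseg=nperseg)
          S = S / (S.sum(axis=0, keepdims=True) + 1e-12)   # normalize each time slice
          steps = np.linalg.norm(np.diff(S, axis=1), axis=0)
          return steps.mean()

      # Toy comparison: a monotonous signal vs. one with changing spectral content.
      fs = 250.0
      t = np.arange(0, 60, 1 / fs)
      monotonous = np.sin(2 * np.pi * 10 * t)
      variable = np.sin(2 * np.pi * (8 + 4 * np.sin(2 * np.pi * 0.05 * t)) * t)
      print(spectral_velocity(monotonous, fs), spectral_velocity(variable, fs))

    The monotonous signal yields a lower velocity than the spectrally varying one, mirroring the finding that low spectral variability is associated with poor outcome.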

  11. Integration of model-based control systems with artificial intelligence and workstations

    International Nuclear Information System (INIS)

    Lee, M.; Clearwater, S.

    1987-01-01

    Experience with model based accelerator control started at SPEAR. Since that time nearly all accelerator beam lines have been controlled using model-based application programs, for example, PEP and SLC at SLAC. In order to take advantage of state-of-the-art hardware and software technology, the design and implementation of the accelerator control programs have undergone radical changes with time. Consequently, SPEAR, PEP, and SLC all use different control programs. Since many of these application programs are embedded deep into the control system, they had to be rewritten each time. Each time this rewriting has occurred a great deal of time and effort has been spent on training physicists and programmers to do the job. Now, these application programs have been developed for a fourth time. This time, however, the programs being developed are generic so that they will not have to be done again. An integrated system called GOLD (Generic Orbit & Lattice Debugger) has been developed for debugging and correcting trajectory errors in accelerator lattices. The system consists of a lattice modeling program (COMFORT), a beam simulator (PLUS), a graphical workstation environment (micro-VAX) and an expert system (ABLE). This paper will describe some of the features and applications of our integrated system with emphasis on the automation offered by expert systems. 5 refs., 4 figs

  12. GOLD: Integration of model-based control systems with artificial intelligence and workstations

    International Nuclear Information System (INIS)

    Lee, M.; Clearwater, S.

    1987-08-01

    Our experience with model-based accelerator control started at SPEAR. Since that time nearly all accelerator beamlines have been controlled using model-based application programs, for example, PEP and SLC at SLAC. In order to take advantage of state-of-the-art hardware and software technology, the design and implementation of the accelerator control programs have undergone radical changes with time. Consequently, SPEAR, PEP and SLC all use different control programs. Since many of these application programs are embedded deep into the control system, they had to be rewritten each time. Each time this rewriting has occurred a great deal of time and effort has been spent on training physicists and programmers to do the job. Now, we have developed an integrated system called GOLD (Generic Orbit and Lattice Debugger) for debugging and correcting trajectory errors in accelerator lattices. The system consists of a lattice modeling program (COMFORT), a beam simulator (PLUS), a graphical workstation environment (micro-VAX) and an expert system (ABLE). This paper will describe some of the features and applications of our integrated system with emphasis on the automation offered by expert systems. 5 refs

  13. GOLD: Integration of model-based control systems with artificial intelligence and workstations

    International Nuclear Information System (INIS)

    Lee, M.; Clearwater, S.

    1987-08-01

    Our experience with model based accelerator control started at SPEAR. Since that time nearly all accelerator beam lines have been controlled using model-based application programs, for example, PEP and SLC at SLAC. In order to take advantage of state-of-the-art hardware and software technology, the design and implementation of the accelerator control programs have undergone radical change with time. Consequently, SPEAR, PEP, and SLC all use different control programs. Since many of these application programs are imbedded deep into the control system, they had to be rewritten each time. Each time this rewriting has occurred a great deal of time and effort has been spent on training physicists and programmers to do the job. Now, we have developed these application programs for a fourth time. This time, however, the programs we are developing are generic so that we will not have to do it again. We have developed an integrated system called GOLD (Generic Orbit and Lattice Debugger) for debugging and correcting trajectory errors in accelerator lattices. The system consists of a lattice modeling program (COMFORT), a beam simulator (PLUS), a graphical workstation environment (micro-VAX) and an expert system (ABLE). This paper will describe some of the features and applications of our integrated system with emphasis on the automation offered by expert systems. 5 refs

  14. Real-Time Model-Based Leak-Through Detection within Cryogenic Flow Systems

    Science.gov (United States)

    Walker, M.; Figueroa, F.

    2015-01-01

    The timely detection of leaks within cryogenic fuel replenishment systems is of significant importance to operators on account of the safety and economic impacts associated with material loss and operational inefficiencies. Associated loss in control of pressure also affects the stability and the ability to control the phase of cryogenic fluids during replenishment operations. Current research dedicated to providing Prognostics and Health Management (PHM) coverage of such cryogenic replenishment systems has focused on the detection of leaks to atmosphere involving relatively simple model-based diagnostic approaches that, while effective, are unable to isolate the fault to specific piping system components. The authors have extended this research to focus on the detection of leaks through closed valves that are intended to isolate sections of the piping system from the flow and pressurization of cryogenic fluids. The described approach employs model-based detection of leak-through conditions based on correlations of pressure changes across isolation valves and attempts to isolate the faults to specific valves. Implementation of this capability is enabled by knowledge and information embedded in the domain model of the system. The approach has been used effectively to detect such leak-through faults during cryogenic operational testing at the Cryogenic Testbed at NASA's Kennedy Space Center.
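
    A minimal sketch of the correlation idea, under assumptions: if differenced downstream pressure tracks differenced upstream pressure while a valve is commanded closed, leak-through is suspected. The signal names, the simple Pearson-correlation test and the threshold below are illustrative stand-ins, not the deployed NASA algorithm.

      import numpy as np

      def leak_through_suspected(p_up, p_down, valve_closed, corr_threshold=0.7):
          """Flag a possible leak through a commanded-closed isolation valve.

          While the valve is closed, downstream pressure changes should be largely
          decoupled from upstream changes; a strong correlation between the two
          differenced signals suggests leak-through.
          """
          mask = valve_closed[1:]                 # consider only closed-valve samples
          d_up = np.diff(p_up)[mask]
          d_down = np.diff(p_down)[mask]
          if len(d_up) < 10 or d_up.std() == 0 or d_down.std() == 0:
              return False
          rho = np.corrcoef(d_up, d_down)[0, 1]
          return rho > corr_threshold

      # Toy data: upstream pressure ramps while the valve is closed; a leak makes
      # the downstream pressure track it.
      n = 500
      valve_closed = np.ones(n, dtype=bool)
      p_up = np.cumsum(np.random.default_rng(2).normal(0.1, 0.05, n))
      p_down_leak = 0.4 * p_up + np.random.default_rng(3).normal(0, 0.005, n)
      print(leak_through_suspected(p_up, p_down_leak, valve_closed))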

  15. Model-based engineering for medical-device software.

    Science.gov (United States)

    Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi

    2010-01-01

    This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. By using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for (i) fast and efficient construction of executable device prototypes, (ii) creation of a standard, reusable baseline software architecture for a particular device family, (iii) formal verification of the design against safety requirements, and (iv) creation of a safety framework that reduces verification costs for future versions of the device software.

  16. Model-based Control of a Bottom Fired Marine Boiler

    DEFF Research Database (Denmark)

    Solberg, Brian; Karstensen, Claus M. S.; Andersen, Palle

    2005-01-01

    This paper focuses on applying model based MIMO control to minimize variations in water level for a specific boiler type. A first principles model is put up. The model is linearized and an LQG controller is designed. Furthermore the benefit of using a steam flow measurement is compared to a strategy...... relying on estimates of the disturbance. Preliminary tests at the boiler system show that the designed controller is able to control the boiler process. Furthermore it can be concluded that relying on estimates of the steam flow in the control strategy does not decrease the controller performance...
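
    The state-feedback half of an LQG design like the one described can be sketched by solving the continuous-time algebraic Riccati equation with SciPy. In the Python example below the linearized model and the weights are placeholders, not the boiler model, and the Kalman-filter half of LQG (which would supply the disturbance estimates mentioned in the abstract) is omitted.

      import numpy as np
      from scipy.linalg import solve_continuous_are

      # Illustrative linearized 2-state model (placeholder matrices).
      A = np.array([[-0.02,  1.0],
                    [ 0.0,  -0.5]])
      B = np.array([[0.0],
                    [0.2]])
      Q = np.diag([10.0, 1.0])       # penalize the water-level-like state most
      R = np.array([[1.0]])

      # Solve the continuous-time algebraic Riccati equation and form the gain.
      P = solve_continuous_are(A, B, Q, R)
      K = np.linalg.solve(R, B.T @ P)          # u = -K x minimizes the LQ cost
      print("LQR gain K:", K)

      # Closed-loop eigenvalues should all have negative real parts.
      print("closed-loop poles:", np.linalg.eigvals(A - B @ K))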

  17. Model-based Control of a Bottom Fired Marine Boiler

    DEFF Research Database (Denmark)

    Solberg, Brian; Karstensen, Claus M. S.; Andersen, Palle

    This paper focuses on applying model based MIMO control to minimize variations in water level for a specific boiler type. A first principles model is put up. The model is linearized and an LQG controller is designed. Furthermore the benefit of using a steam flow measurement is compared to a strategy...... relying on estimates of the disturbance. Preliminary tests at the boiler system show that the designed controller is able to control the boiler process. Furthermore it can be concluded that relying on estimates of the steam flow in the control strategy does not decrease the controller performance...

  18. A Cyber-Attack Detection Model Based on Multivariate Analyses

    Science.gov (United States)

    Sakai, Yuto; Rinsaka, Koichiro; Dohi, Tadashi

    In the present paper, we propose a novel cyber-attack detection model based on the application of two multivariate-analysis methods to the audit data observed on a host machine. The statistical techniques used here are the well-known Hayashi's quantification method IV and the cluster analysis method. We quantify the observed qualitative audit event sequence via the quantification method IV, and collect similar audit event sequences into the same groups based on the cluster analysis. It is shown in simulation experiments that our model can improve the cyber-attack detection accuracy in some realistic cases where both normal and attack activities are intermingled.

  19. Model-based efficiency evaluation of combine harvester traction drives

    Directory of Open Access Journals (Sweden)

    Steffen Häberle

    2015-08-01

    As part of this research the drive train of combine harvesters is investigated in detail, with an explicit focus on load and power distribution, energy consumption and usage distribution on two test machines. Based on the lessons learned during field operations, model-based studies of the energy saving potential in the traction train of combine harvesters can now be quantified. Beyond that, the virtual machine trial provides an opportunity to compare innovative drivetrain architectures and control solutions under reproducible conditions. As a result, an evaluation method is presented and generically used to draw comparisons under locally representative operating conditions.

  20. Line impedance estimation using model based identification technique

    DEFF Research Database (Denmark)

    Ciobotaru, Mihai; Agelidis, Vassilios; Teodorescu, Remus

    2011-01-01

    The estimation of the line impedance can be used by the control of numerous grid-connected systems, such as active filters, islanding detection techniques, non-linear current controllers, detection of the on/off grid operation mode. Therefore, estimating the line impedance can add extra functions...... into the operation of the grid-connected power converters. This paper describes a quasi passive method for estimating the line impedance of the distribution electricity network. The method uses the model based identification technique to obtain the resistive and inductive parts of the line impedance. The quasi...
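
    A minimal sketch of model based identification of a line impedance, assuming sampled voltage and current waveforms: fit v = R·i + L·di/dt by least squares to recover the resistive and inductive parts. The excitation, the sampling rate and the true values in the Python example below are assumptions; the cited quasi-passive method itself is not reproduced here.

      import numpy as np

      # Least-squares fit of v = R*i + L*di/dt from sampled waveforms.
      fs = 10_000.0
      t = np.arange(0, 0.2, 1 / fs)
      R_true, L_true = 0.5, 2e-3          # illustrative "true" line parameters

      i = 10 * np.sin(2 * np.pi * 50 * t) + 1.0 * np.sin(2 * np.pi * 250 * t)
      di_dt = np.gradient(i, 1 / fs)
      v = R_true * i + L_true * di_dt + np.random.default_rng(0).normal(0, 0.05, t.size)

      # Solve [i, di/dt] @ [R, L]^T = v in the least-squares sense.
      Phi = np.column_stack([i, di_dt])
      (R_hat, L_hat), *_ = np.linalg.lstsq(Phi, v, rcond=None)
      print(f"R ≈ {R_hat:.3f} ohm, L ≈ {L_hat * 1e3:.3f} mH")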

  1. Distributed model based control of multi unit evaporation systems

    International Nuclear Information System (INIS)

    Yudi Samyudia

    2006-01-01

    In this paper, we present a new approach to the analysis and design of distributed control systems for multi-unit plants. The approach is established after treating the effect of recycled dynamics as a gap metric uncertainty from which a distributed controller can be designed sequentially for each unit to tackle the uncertainty. We then use a single effect multi-unit evaporation system to illustrate how the proposed method is used to analyze different control strategies and to systematically achieve a better closed-loop performance using a distributed model-based controller

  2. An Industrial Model Based Disturbance Feedback Control Scheme

    DEFF Research Database (Denmark)

    Kawai, Fukiko; Nakazawa, Chikashi; Vinther, Kasper

    2014-01-01

    This paper presents a model based disturbance feedback control scheme. Industrial process systems have traditionally been controlled by using relay and PID controllers. However, these controllers are affected by disturbances and model errors, and these effects degrade control performance. The authors propose a new control method that can decrease the negative impact of disturbances and model errors. The control method is motivated by industrial practice at Fuji Electric. Simulation tests are examined with a conventional PID controller and the disturbance feedback control. The simulation results

  3. Stabilization of model-based networked control systems

    Energy Technology Data Exchange (ETDEWEB)

    Miranda, Francisco [CIDMA, Universidade de Aveiro, Aveiro (Portugal); Instituto Politécnico de Viana do Castelo, Viana do Castelo (Portugal); Abreu, Carlos [Instituto Politécnico de Viana do Castelo, Viana do Castelo (Portugal); CMEMS-UMINHO, Universidade do Minho, Braga (Portugal); Mendes, Paulo M. [CMEMS-UMINHO, Universidade do Minho, Braga (Portugal)

    2016-06-08

    A class of networked control systems called Model-Based Networked Control Systems (MB-NCSs) is considered. Stabilization of MB-NCSs is studied using feedback controls, and stabilization is simulated for different feedbacks with the purpose of reducing the network traffic. The feedback control input is applied to a compensated model of the plant that approximates the plant dynamics and stabilizes the plant even under slow network conditions. Conditions for global exponential stabilizability and for the choice of a feedback control input for a given constant time between the information moments of the network are derived. An optimal control problem to obtain an optimal feedback control is also presented.
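
    The MB-NCS idea can be illustrated with a toy simulation: between network transmissions the controller propagates a (slightly mismatched) model of the plant, and the model state is reset to the measured plant state at each update instant. All matrices, the mismatch and the update period in the Python sketch below are illustrative assumptions, not results from the cited work.

      import numpy as np

      dt, h, T = 0.001, 0.2, 10.0        # integration step, update period, horizon
      A  = np.array([[0.0, 1.0], [2.0, -1.0]])    # open-loop unstable plant
      Ah = np.array([[0.0, 1.0], [1.8, -1.1]])    # slightly mismatched model
      B  = np.array([[0.0], [1.0]])
      K  = np.array([[6.0, 2.0]])                 # stabilizing state feedback

      x  = np.array([[1.0], [0.0]])   # true plant state
      xh = x.copy()                   # model state held at the controller side
      steps_per_update = round(h / dt)
      for k in range(round(T / dt)):
          if k % steps_per_update == 0:
              xh = x.copy()                    # network update: reset model state
          u  = -K @ xh                         # control computed from the model
          x  = x  + dt * (A  @ x  + B @ u)     # Euler step of the true plant
          xh = xh + dt * (Ah @ xh + B @ u)     # Euler step of the controller model

      # The norm should decay toward zero if the scheme is stabilizing.
      print("final plant state norm:", float(np.linalg.norm(x)))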

  4. Model-Based Integration and Interpretation of Data

    DEFF Research Database (Denmark)

    Petersen, Johannes

    2004-01-01

    Data integration and interpretation plays a crucial role in supervisory control. The paper defines a set of generic inference steps for the data integration and interpretation process based on a three-layer model of system representations. The three-layer model is used to clarify the combination...... of constraint and object-centered representations of the work domain throwing new light on the basic principles underlying the data integration and interpretation process of Rasmussen's abstraction hierarchy as well as other model-based approaches combining constraint and object-centered representations. Based...

  5. Model-based automated testing of critical PLC programs.

    CERN Document Server

    Fernández Adiego, B; Tournier, J-C; González Suárez, V M; Bliudze, S

    2014-01-01

    Testing of critical PLC (Programmable Logic Controller) programs remains a challenging task for control system engineers as it can rarely be automated. This paper proposes a model based approach which uses the BIP (Behavior, Interactions and Priorities) framework to perform automated testing of PLC programs developed with the UNICOS (UNified Industrial COntrol System) framework. This paper defines the translation procedure and rules from UNICOS to BIP which can be fully automated in order to hide the complexity of the underlying model from the control engineers. The approach is illustrated and validated through the study of a water treatment process.

  6. Model-Based GUI Testing Using Uppaal at Novo Nordisk

    Science.gov (United States)

    Hjort, Ulrik H.; Illum, Jacob; Larsen, Kim G.; Petersen, Michael A.; Skou, Arne

    This paper details a collaboration between Aalborg University and Novo Nordisk in developing an automatic model-based test generation tool for system testing of the graphical user interface of a medical device on an embedded platform. The tool takes as input a UML Statemachine model and generates a test suite satisfying some testing criterion, such as edge or state coverage, and converts the individual test cases into a scripting language that can be automatically executed against the target. The tool has significantly reduced the time required for test construction and generation, and reduced the number of test scripts while increasing the coverage.

  7. Vehicle coordinated transportation dispatching model base on multiple crisis locations

    Science.gov (United States)

    Tian, Ran; Li, Shanwei; Yang, Guoying

    2018-05-01

    Many secondary disasters are often triggered after unconventional emergencies occur, and the requirements of different disaster sites often differ, so it is difficult for a single emergency resource center to satisfy all of them at the same time. Coordinating the emergency resources stored by multiple emergency resource centers and delivering them to the various disaster sites therefore requires the coordinated transportation of emergency vehicles. In this paper, addressing the problem of emergency logistics coordination scheduling and based on the related constraints of emergency logistics transportation, an emergency resource scheduling model based on multiple disaster locations is established.

  8. Implementation Politics

    DEFF Research Database (Denmark)

    Hegland, Troels Jacob; Raakjær, Jesper

    2008-01-01

    level are supplemented or even replaced by national priorities. The chapter concludes that in order to capture the domestic politics associated with CFP implementation in Denmark, it is important to understand the policy process as a synergistic interaction between dominant interests, policy alliances...

  9. Spatio-temporal Rich Model Based Video Steganalysis on Cross Sections of Motion Vector Planes.

    Science.gov (United States)

    Tasdemir, Kasim; Kurugollu, Fatih; Sezer, Sakir

    2016-05-11

    A rich model based motion vector steganalysis benefiting from both temporal and spatial correlations of motion vectors is proposed in this work. The proposed steganalysis method has substantially higher detection accuracy than previous methods, even targeted ones. The improvement in detection accuracy lies in several novel approaches introduced in this work. Firstly, it is shown that there is a strong correlation, not only spatially but also temporally, among neighbouring motion vectors over longer distances. Therefore, temporal motion vector dependency alongside spatial dependency is utilized for rigorous motion vector steganalysis. Secondly, unlike the filters previously used, which were heuristically designed against a specific motion vector steganography, a diverse set of many filters which can capture aberrations introduced by various motion vector steganography methods is used. The variety and also the number of the filter kernels are substantially greater than in previous methods. Besides that, filters up to fifth order are employed, whereas the previous methods use at most second order filters. As a result, the proposed system captures various decorrelations in a wide spatio-temporal range and provides a better cover model. The proposed method is tested against the most prominent motion vector steganalysis and steganography methods. To the best knowledge of the authors, the experiments section has the most comprehensive tests in the motion vector steganalysis field, including five stego and seven steganalysis methods. Test results show that the proposed method yields around a 20% detection accuracy increase in low payloads and 5% in higher payloads.

  10. Observer and data-driven model based fault detection in Power Plant Coal Mills

    DEFF Research Database (Denmark)

    Fogh Odgaard, Peter; Lin, Bao; Jørgensen, Sten Bay

    2008-01-01

    This paper presents and compares model-based and data-driven fault detection approaches for coal mill systems. The first approach detects faults with an optimal unknown input observer developed from a simplified energy balance model. Due to the time-consuming effort in developing a first principles model with motor power as the controlled variable, data-driven methods for fault detection are also investigated. Regression models that represent normal operating conditions (NOCs) are developed with both static and dynamic principal component analysis and partial least squares methods. The residual between process measurement and the NOC model prediction is used for fault detection. A hybrid approach, where a data-driven model is employed to derive an optimal unknown input observer, is also implemented. The three methods are evaluated with case studies on coal mill data, which includes a fault...
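
    The static-PCA flavour of the NOC-model idea can be sketched as follows: fit PCA on healthy data, score new samples by their squared reconstruction error (the SPE/Q statistic) and raise an alarm above an empirical limit. The synthetic data, the component count and the percentile rule in the Python example below are illustrative, not the coal-mill study's settings.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      # Healthy (normal operating condition) training data from a low-rank process.
      rng = np.random.default_rng(0)
      latent = rng.normal(size=(500, 2))
      mix = rng.normal(size=(2, 6))
      X_noc = latent @ mix + 0.1 * rng.normal(size=(500, 6))

      scaler = StandardScaler().fit(X_noc)
      pca = PCA(n_components=2).fit(scaler.transform(X_noc))

      def spe(X):
          """Squared prediction error (Q statistic) w.r.t. the NOC PCA model."""
          Z = scaler.transform(X)
          Z_hat = pca.inverse_transform(pca.transform(Z))
          return ((Z - Z_hat) ** 2).sum(axis=1)

      threshold = np.percentile(spe(X_noc), 99)        # empirical 99% control limit

      X_fault = X_noc[:50].copy()
      X_fault[:, 3] += 3.0                             # bias fault on one sensor
      print("alarms on faulty data:", int((spe(X_fault) > threshold).sum()), "of 50")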

  11. Towards An Intelligent Model-Based Decision Support System For An Integrated Oil Company (EGPC)

    International Nuclear Information System (INIS)

    Khorshid, M.; Hassan, H.; Abdel Latife, M.A.

    2004-01-01

    A Decision Support System (DSS) is an interactive, flexible and adaptable computer-based support system specially developed for supporting the solution of unstructured management problems [31]. DSS has become widespread in the oil industry domain in recent years. The computer-based DSSs which have been developed and implemented in the oil industry are used to address the complex short-term planning and operational issues associated with the downstream industry. Most of these applications concentrate on data-centered tools, while model-centered applications of DSS are still very limited up till now [20]. This study develops an Intelligent Model-Based DSS for an integrated oil company, to help policy makers and petroleum planners in improving the effectiveness of strategic planning in the oil sector. This domain basically imposes semi-structured or unstructured decisions and involves a very complex modeling process

  12. DEVELOPMENT OF SCIENCE PROCESS SKILLS STUDENTS WITH PROJECT BASED LEARNING MODEL- BASED TRAINING IN LEARNING PHYSICS

    Directory of Open Access Journals (Sweden)

    Ratna Malawati

    2016-06-01

    This study aims to improve students' physics science process skills in the cognitive and psychomotor aspects by using training based on the Project Based Learning model. The object of this study is the Project Based Learning model used in the learning process of Computational Physics. The method used is classroom action research through two learning cycles, each cycle consisting of the stages of planning, implementation, observation and reflection. In the first cycle, treatment emphasized training in the first through third phases of the Project Based Learning model, while in the second cycle additional treatment emphasized collaborative discussion to achieve the best product results for each group. The results of the data analysis showed an increase in students' cognitive thinking ability and psychomotor science process skills.

  13. Inverse grey-box model-based control of a dielectric elastomer actuator

    DEFF Research Database (Denmark)

    Jones, Richard William; Sarban, Rahimullah

    2012-01-01

    An accurate physical-based electromechanical model of a commercially available tubular dielectric elastomer (DE) actuator has been developed and validated. In this contribution, the use of the physical-based electromechanical model to formulate a model-based controller is examined. The choice of control scheme was dictated by the desire for transparency in both controller design and operation. The internal model control (IMC) approach was chosen. In this particular application, the inverse of the linearized form of the grey-box model is used to formulate the IMC controller. To ensure consistent control performance across the operating range of the DE actuator, a gain scheduling term, which linearizes the operating characteristics of the tubular dielectric elastomer actuator, is developed and implemented in series with the IMC controller. The IMC-based approach is investigated for servo control...

  14. A probabilistic graphical model based stochastic input model construction

    International Nuclear Information System (INIS)

    Wan, Jiang; Zabaras, Nicholas

    2014-01-01

    Model reduction techniques have been widely used in modeling of high-dimensional stochastic input in uncertainty quantification tasks. However, the probabilistic modeling of random variables projected into reduced-order spaces presents a number of computational challenges. Due to the curse of dimensionality, the underlying dependence relationships between these random variables are difficult to capture. In this work, a probabilistic graphical model based approach is employed to learn the dependence by running a number of conditional independence tests using observation data. Thus a probabilistic model of the joint PDF is obtained and the PDF is factorized into a set of conditional distributions based on the dependence structure of the variables. The estimation of the joint PDF from data is then transformed to estimating conditional distributions under reduced dimensions. To improve the computational efficiency, a polynomial chaos expansion is further applied to represent the random field in terms of a set of standard random variables. This technique is combined with both linear and nonlinear model reduction methods. Numerical examples are presented to demonstrate the accuracy and efficiency of the probabilistic graphical model based stochastic input models. - Highlights: • Data-driven stochastic input models without the assumption of independence of the reduced random variables. • The problem is transformed to a Bayesian network structure learning problem. • Examples are given in flows in random media

  15. Intelligent model-based diagnostics for vehicle health management

    Science.gov (United States)

    Luo, Jianhui; Tu, Fang; Azam, Mohammad S.; Pattipati, Krishna R.; Willett, Peter K.; Qiao, Liu; Kawamoto, Masayuki

    2003-08-01

    The recent advances in sensor technology, remote communication and computational capabilities, and standardized hardware/software interfaces are creating a dramatic shift in the way the health of vehicles is monitored and managed. These advances facilitate remote monitoring, diagnosis and condition-based maintenance of automotive systems. With the increased sophistication of electronic control systems in vehicles, there is a concomitant increased difficulty in the identification of the malfunction phenomena. Consequently, the current rule-based diagnostic systems are difficult to develop, validate and maintain. New intelligent model-based diagnostic methodologies that exploit the advances in sensor, telecommunications, computing and software technologies are needed. In this paper, we will investigate hybrid model-based techniques that seamlessly employ quantitative (analytical) models and graph-based dependency models for intelligent diagnosis. Automotive engineers have found quantitative simulation (e.g. MATLAB/SIMULINK) to be a vital tool in the development of advanced control systems. The hybrid method exploits this capability to improve the diagnostic system's accuracy and consistency, utilizes existing validated knowledge on rule-based methods, enables remote diagnosis, and responds to the challenges of increased system complexity. The solution is generic and has the potential for application in a wide range of systems.

  16. Muscles of mastication model-based MR image segmentation

    Energy Technology Data Exchange (ETDEWEB)

    Ng, H.P. [NUS Graduate School for Integrative Sciences and Engineering, Singapore (Singapore); Agency for Science Technology and Research, Singapore (Singapore). Biomedical Imaging Lab.; Ong, S.H. [National Univ. of Singapore (Singapore). Dept. of Electrical and Computer Engineering; National Univ. of Singapore (Singapore). Div. of Bioengineering; Hu, Q.; Nowinski, W.L. [Agency for Science Technology and Research, Singapore (Singapore). Biomedical Imaging Lab.; Foong, K.W.C. [NUS Graduate School for Integrative Sciences and Engineering, Singapore (Singapore); National Univ. of Singapore (Singapore). Dept. of Preventive Dentistry; Goh, P.S. [National Univ. of Singapore (Singapore). Dept. of Diagnostic Radiology

    2006-11-15

    Objective: The muscles of mastication play a major role in the orodigestive system as the principal motive force for the mandible. An algorithm for segmenting these muscles from magnetic resonance (MR) images was developed and tested. Materials and methods: Anatomical information about the muscles of mastication in MR images is used to obtain the spatial relationships relating the muscle region of interest (ROI) and head ROI. A model-based technique that involves the spatial relationships between head and muscle ROIs as well as muscle templates is developed. In the segmentation stage, the muscle ROI is derived from the model. Within the muscle ROI, anisotropic diffusion is applied to smooth the texture, followed by thresholding to exclude bone and fat. The muscle template and morphological operators are employed to obtain an initial estimate of the muscle boundary, which then serves as the input contour to the gradient vector flow snake that iterates to the final segmentation. Results: The method was applied to segmentation of the masseter, lateral pterygoid and medial pterygoid in 75 images. The overlap indices (K) achieved are 91.4, 92.1 and 91.2%, respectively. Conclusion: A model-based method for segmenting the muscles of mastication from MR images was developed and tested. The results show good agreement between manual and automatic segmentations. (orig.)

  17. Experience with model based display for advanced diagnostics and control

    International Nuclear Information System (INIS)

    Staffon, J.D.; Lindsay, R.W.

    1989-01-01

    A full color, model based display system based on the Rankine thermodynamic cycle has been developed for use at the Experimental Breeder Reactor II by plant operators, engineers, and experimenters. The displays generate a real time thermodynamic model of the plant processes on computer screens to provide a direct indication of the plant performance. Operators and others who view the displays are no longer required to mentally "construct" a model of the process before acting. The model based display accurately depicts the plant states. It appears to effectively reduce the gulf of evaluation, which should result in a significant reduction in human operator errors if this plant display approach is adopted by the nuclear industry. Preliminary comments from users, including operators, indicate an overwhelming acceptance of the display approach. The displays incorporate alarm functions as well as a level-of-detail "paging" capability. The system is developed on a computer network which allows the easy addition of displays as well as extra computers. Constructing a complete console can be rapid and inexpensive. 1 ref., 2 figs

  18. Model based rib-cage unfolding for trauma CT

    Science.gov (United States)

    von Berg, Jens; Klinder, Tobias; Lorenz, Cristian

    2018-03-01

    A CT rib-cage unfolding method is proposed that does not require rib centerlines to be determined; instead, it determines the visceral cavity surface by model-based segmentation. Image intensities are sampled across this surface, which is flattened using a model-based 3D thin-plate-spline registration. An average rib centerline model projected onto this surface serves as a reference system for registration. The flattening registration is designed so that ribs similar to the centerline model are mapped onto parallel lines preserving their relative length. Ribs deviating from this model accordingly appear as deviations from straight parallel ribs in the unfolded view. As the mapping is continuous, the details in the intercostal space and those adjacent to the ribs are also rendered well. The most beneficial application area is trauma CT, where fast detection of rib fractures is a crucial task. Specifically in trauma, automatic rib centerline detection may not be guaranteed due to fractures and dislocations. Visual assessment on the large public LIDC database of lung CT proved the general feasibility of this early work.

  19. Hybrid attacks on model-based social recommender systems

    Science.gov (United States)

    Yu, Junliang; Gao, Min; Rong, Wenge; Li, Wentao; Xiong, Qingyu; Wen, Junhao

    2017-10-01

    With the growing popularity of online social platforms, social network based approaches to recommendation have emerged. However, because of the open nature of rating systems and social networks, social recommender systems are susceptible to malicious attacks. In this paper, we present a novel attack, which inherits characteristics of both the rating attack and the relation attack, and term it the hybrid attack. Further, we explore the impact of the hybrid attack on model-based social recommender systems in multiple aspects. The experimental results show that the hybrid attack is more destructive than the rating attack in most cases. In addition, users and items with fewer ratings are influenced more when attacked. Last but not least, the findings suggest that spammers do not depend on feedback links from normal users to become more powerful; unilateral links can make the hybrid attack effective enough. Since unilateral links are much cheaper, the hybrid attack will be a great threat to model-based social recommender systems.

  20. Binaural model-based dynamic-range compression.

    Science.gov (United States)

    Ernst, Stephan M A; Kortlang, Steffen; Grimm, Giso; Bisitz, Thomas; Kollmeier, Birger; Ewert, Stephan D

    2018-01-26

    Binaural cues such as interaural level differences (ILDs) are used to organise auditory perception and to segregate sound sources in complex acoustical environments. In bilaterally fitted hearing aids, dynamic-range compression operating independently at each ear potentially alters these ILDs, thus distorting binaural perception and sound source segregation. A binaurally linked, model-based, fast-acting dynamic compression algorithm designed to approximate the normal-hearing basilar membrane (BM) input-output function in hearing-impaired listeners is suggested. A multi-center evaluation in comparison with an alternative binaural and two bilateral fittings was performed to assess the effect of binaural synchronisation on (a) speech intelligibility and (b) perceived quality in realistic conditions. Thirty and 12 hearing-impaired (HI) listeners were aided individually with the algorithms for the two experimental parts, respectively. A small preference towards the proposed model-based algorithm was found in the direct quality comparison. However, no benefit of binaural synchronisation regarding speech intelligibility was found, suggesting a dominant role of the better ear in all experimental conditions. The suggested binaural synchronisation of compression algorithms showed a limited effect on the tested outcome measures; however, linking could be situationally beneficial to preserve a natural binaural perception of the acoustical environment.

  1. Application of model based control to robotic manipulators

    Science.gov (United States)

    Petrosky, Lyman J.; Oppenheim, Irving J.

    1988-01-01

    A robot that can duplicate human motion capabilities in such activities as balancing, reaching, lifting, and moving has been built and tested. These capabilities are achieved through the use of real-time Model-Based Control (MBC) techniques which have recently been demonstrated. MBC accounts for all manipulator inertial forces and provides stable manipulator motion control even at high speeds. To effectively demonstrate the unique capabilities of MBC, an experimental robotic manipulator was constructed which stands upright, balancing on a two-wheel base. The mathematical modeling of dynamics inherent in MBC permits the control system to perform functions that are impossible with conventional non-model-based methods. These capabilities include: (1) Stable control at all speeds of operation; (2) Operations requiring dynamic stability such as balancing; (3) Detection and monitoring of applied forces without the use of load sensors; (4) Manipulator safing via detection of abnormal loads. The full potential of MBC has yet to be realized. The experiments performed for this research are only an indication of the potential applications. MBC has no inherent stability limitations and its range of applicability is limited only by the attainable sampling rate, modeling accuracy, and sensor resolution. Manipulators could be designed to operate at the highest speed mechanically attainable without being limited by control inadequacies. Manipulators capable of operating many times faster than current machines would certainly increase productivity for many tasks.
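
    Model-based manipulator control of this kind is commonly realised as an inverse-dynamics (computed-torque) law. The sketch below is a generic illustration under that assumption, not the controller of the balancing robot described; the `model` object exposing mass, Coriolis and gravity terms is a hypothetical interface.

        import numpy as np

        def computed_torque(q, qd, q_des, qd_des, qdd_des, model, Kp, Kv):
            # tau = M(q) (qdd_des + Kv*e_dot + Kp*e) + C(q, qd) qd + G(q)
            e = q_des - q
            e_dot = qd_des - qd
            v = qdd_des + Kv @ e_dot + Kp @ e      # reference acceleration with PD correction
            M = model.mass_matrix(q)               # joint-space inertia matrix (hypothetical API)
            C = model.coriolis_matrix(q, qd)       # Coriolis/centrifugal matrix (hypothetical API)
            G = model.gravity_vector(q)            # gravity torques (hypothetical API)
            return M @ v + C @ qd + G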

  2. Model-based design of adaptive embedded systems

    CERN Document Server

    Hamberg, Roelof; Reckers, Frans; Verriet, Jacques

    2013-01-01

    Today’s embedded systems have to operate in a wide variety of dynamically changing environmental circumstances. Adaptivity, the ability of a system to autonomously adapt itself, is a means to optimise a system’s behaviour to accommodate changes in its environment. It involves making in-product trade-offs between system qualities at system level. The main challenge in the development of adaptive systems is keeping control of the intrinsic complexity of such systems while working with multi-disciplinary teams to create different parts of the system. Model-Based Development of Adaptive Embedded Systems focuses on the development of adaptive embedded systems both from an architectural and methodological point of view. It describes architectural solution patterns for adaptive systems and state-of-the-art model-based methods and techniques to support adaptive system development. In particular, the book describes the outcome of the Octopus project, a cooperation of a multi-disciplinary team of academic and indus...

  3. Diagnosis and Model Based Identification of a Coupling Misalignment

    Directory of Open Access Journals (Sweden)

    P. Pennacchi

    2005-01-01

    This paper is focused on the application of two different diagnostic techniques aimed at identifying the most important faults in rotating machinery, as well as on the simulation and prediction of the frequency response of rotating machines. The application of the two diagnostic techniques, orbit shape analysis and model based identification in the frequency domain, is described by means of an experimental case study concerning a gas turbine-generator unit of a small power plant whose rotor-train was affected by an angular misalignment in a flexible coupling, caused by incorrect machine assembly. The fault type is identified by means of the orbit shape analysis; then the equivalent bending moments, which enable the shaft experimental vibrations to be simulated, are identified using a model based identification method. These excitations have been used to predict the machine vibrations over a large rotating speed range inside which no monitoring data were available. To the best of the authors' knowledge, this is the first case of identification of coupling misalignment and prediction of the consequent machine behaviour in actual-size rotating machinery. The successful results obtained emphasise the usefulness of integrating common condition monitoring techniques with diagnostic strategies.

  4. Muscles of mastication model-based MR image segmentation

    International Nuclear Information System (INIS)

    Ng, H.P.; Agency for Science Technology and Research, Singapore; Ong, S.H.; National Univ. of Singapore; Hu, Q.; Nowinski, W.L.; Foong, K.W.C.; National Univ. of Singapore; Goh, P.S.

    2006-01-01

    Objective: The muscles of mastication play a major role in the orodigestive system as the principal motive force for the mandible. An algorithm for segmenting these muscles from magnetic resonance (MR) images was developed and tested. Materials and methods: Anatomical information about the muscles of mastication in MR images is used to obtain the spatial relationships relating the muscle region of interest (ROI) and head ROI. A model-based technique that involves the spatial relationships between head and muscle ROIs as well as muscle templates is developed. In the segmentation stage, the muscle ROI is derived from the model. Within the muscle ROI, anisotropic diffusion is applied to smooth the texture, followed by thresholding to exclude bone and fat. The muscle template and morphological operators are employed to obtain an initial estimate of the muscle boundary, which then serves as the input contour to the gradient vector flow snake that iterates to the final segmentation. Results: The method was applied to segmentation of the masseter, lateral pterygoid and medial pterygoid in 75 images. The overlap indices (K) achieved are 91.4, 92.1 and 91.2%, respectively. Conclusion: A model-based method for segmenting the muscles of mastication from MR images was developed and tested. The results show good agreement between manual and automatic segmentations. (orig.)

  5. 49 CFR 173.23 - Previously authorized packaging.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 2 2010-10-01 2010-10-01 false Previously authorized packaging. 173.23 Section... REQUIREMENTS FOR SHIPMENTS AND PACKAGINGS Preparation of Hazardous Materials for Transportation § 173.23 Previously authorized packaging. (a) When the regulations specify a packaging with a specification marking...

  6. 28 CFR 10.5 - Incorporation of papers previously filed.

    Science.gov (United States)

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Incorporation of papers previously filed... CARRYING ON ACTIVITIES WITHIN THE UNITED STATES Registration Statement § 10.5 Incorporation of papers previously filed. Papers and documents already filed with the Attorney General pursuant to the said act and...

  7. 75 FR 76056 - FEDERAL REGISTER CITATION OF PREVIOUS ANNOUNCEMENT:

    Science.gov (United States)

    2010-12-07

    ... SECURITIES AND EXCHANGE COMMISSION Sunshine Act Meeting FEDERAL REGISTER CITATION OF PREVIOUS ANNOUNCEMENT: STATUS: Closed meeting. PLACE: 100 F Street, NE., Washington, DC. DATE AND TIME OF PREVIOUSLY ANNOUNCED MEETING: Thursday, December 9, 2010 at 2 p.m. CHANGE IN THE MEETING: Time change. The closed...

  8. Drawing-to-Learn: A Framework for Using Drawings to Promote Model-Based Reasoning in Biology

    Science.gov (United States)

    Quillin, Kim; Thomas, Stephen

    2015-01-01

    The drawing of visual representations is important for learners and scientists alike, such as the drawing of models to enable visual model-based reasoning. Yet few biology instructors recognize drawing as a teachable science process skill, as reflected by its absence in the Vision and Change report’s Modeling and Simulation core competency. Further, the diffuse research on drawing can be difficult to access, synthesize, and apply to classroom practice. We have created a framework of drawing-to-learn that defines drawing, categorizes the reasons for using drawing in the biology classroom, and outlines a number of interventions that can help instructors create an environment conducive to student drawing in general and visual model-based reasoning in particular. The suggested interventions are organized to address elements of affect, visual literacy, and visual model-based reasoning, with specific examples cited for each. Further, a Blooming tool for drawing exercises is provided, as are suggestions to help instructors address possible barriers to implementing and assessing drawing-to-learn in the classroom. Overall, the goal of the framework is to increase the visibility of drawing as a skill in biology and to promote the research and implementation of best practices. PMID:25713094

  9. Active tilting-pad journal bearings supporting flexible rotors: Part II–The model-based feedback-controlled lubrication

    DEFF Research Database (Denmark)

    Salazar, Jorge Andrés González; Santos, Ilmar

    2017-01-01

    This is part II of a twofold paper series dealing with the design and implementation of model-based controllers meant for assisting the hybrid and developing the feedback-controlled lubrication regimes in active tilting pad journal bearings (active TPJBs). In both papers theoretical and experimental analyses are presented with focus on the reduction of rotor lateral vibration. This part is devoted to synthesising model-based LQG optimal controllers (LQR regulator + Kalman filter) for the feedback-controlled lubrication and is based upon the mathematical model of the rotor-bearing system derived in part I. Results show further suppression of resonant vibrations when using the feedback-controlled or active lubrication, beyond the reduction already achieved with hybrid lubrication, thus improving the whole machine dynamic performance.
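
    The LQG synthesis mentioned (LQR regulator plus Kalman filter) can be reproduced in outline with standard Riccati solvers. The state-space matrices of the rotor-bearing system from part I are not given here, so A, B, C and the weighting matrices below are placeholders.

        import numpy as np
        from scipy.linalg import solve_continuous_are

        def lqg_gains(A, B, C, Q, R, W, V):
            # LQR gain for x_dot = A x + B u, minimising the integral of x'Qx + u'Ru
            P = solve_continuous_are(A, B, Q, R)
            K = np.linalg.solve(R, B.T @ P)
            # Kalman filter gain for process noise covariance W and measurement noise V
            S = solve_continuous_are(A.T, C.T, W, V)
            L = S @ C.T @ np.linalg.inv(V)
            # control law: u = -K x_hat; estimator: x_hat_dot = A x_hat + B u + L (y - C x_hat)
            return K, L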

  10. No discrimination against previous mates in a sexually cannibalistic spider

    Science.gov (United States)

    Fromhage, Lutz; Schneider, Jutta M.

    2005-09-01

    In several animal species, females discriminate against previous mates in subsequent mating decisions, increasing the potential for multiple paternity. In spiders, female choice may take the form of selective sexual cannibalism, which has been shown to bias paternity in favor of particular males. If cannibalistic attacks function to restrict a male's paternity, females may have little interest to remate with males having survived such an attack. We therefore studied the possibility of female discrimination against previous mates in sexually cannibalistic Argiope bruennichi, where females almost always attack their mate at the onset of copulation. We compared mating latency and copulation duration of males having experienced a previous copulation either with the same or with a different female, but found no evidence for discrimination against previous mates. However, males copulated significantly shorter when inserting into a used, compared to a previously unused, genital pore of the female.

  11. Implant breast reconstruction after salvage mastectomy in previously irradiated patients.

    Science.gov (United States)

    Persichetti, Paolo; Cagli, Barbara; Simone, Pierfranco; Cogliandro, Annalisa; Fortunato, Lucio; Altomare, Vittorio; Trodella, Lucio

    2009-04-01

    The most common surgical approach in case of local tumor recurrence after quadrantectomy and radiotherapy is salvage mastectomy. Breast reconstruction is the subsequent phase of the treatment and the plastic surgeon has to operate on previously irradiated and manipulated tissues. The medical literature highlights that breast reconstruction with tissue expanders is not a pursuable option, considering previous radiotherapy a contraindication. The purpose of this retrospective study is to evaluate the influence of previous radiotherapy on 2-stage breast reconstruction (tissue expander/implant). Only patients with analogous timing of radiation therapy and the same demolitive and reconstructive procedures were recruited. The results of this study prove that, after salvage mastectomy in previously irradiated patients, implant reconstruction is still possible. Further comparative studies are, of course, advisable to draw any conclusion on the possibility to perform implant reconstruction in previously irradiated patients.

  12. Implementation Strategy

    Science.gov (United States)

    1983-01-01

    Meeting the identified needs of Earth science requires approaching EOS as an information system and not simply as one or more satellites with instruments. Six elements of strategy are outlined as follows: implement the individual discipline missions as currently planned; use the sustained observational capabilities offered by operational satellites without waiting for the launch of new missions; put first priority on the data system; deploy an Advanced Data Collection and Location System; put a substantial new observing capability in a low Earth orbit in such a way as to provide for sustained measurements; and group instruments to exploit their capabilities for synergism, maximize the scientific utility of the mission, and minimize the costs of implementation where possible.

  13. Implementing Pseudonymity

    Directory of Open Access Journals (Sweden)

    Miranda Mowbray

    2006-03-01

    Full Text Available I will give an overview of some technologies that enable pseudonymity - allowing individuals to reveal or prove information about themselves to others without revealing their full identity. I will describe some functionalities relating to pseudonymity that can be implemented, and some that cannot. My intention is to present enough of the mathematics that underlies technology for pseudonymity to show that it is indeed possible to implement some functionalities that at first glance may appear impossible. In particular, I will show that several of the intended functions of the UK national ID could be provided in a pseudonymous fashion, allowing greater privacy. I will also outline some technology developed at HP Labs which ensures that users’ personal data is released only to software that has been checked to conform to their preferred privacy policies.

  14. Development of a CANDU Moderator Analysis Model; Based on Coupled Solver

    International Nuclear Information System (INIS)

    Yoon, Churl; Park, Joo Hwan

    2006-01-01

    A CFD model for predicting the CANDU-6 moderator temperature has been developed over several years at KAERI, based on CFX-4. This analytic model (CFX4-CAMO) has some strength in the modelling of hydraulic resistance in the core region and in the treatment of the heat source term in the energy equations. However, convergence difficulties and slow computing speed prove to be the limitations of this model, because the CFX-4 code adopts a segregated solver to solve governing equations with strong coupling effects. Compared to CFX-4 with its segregated solver, CFX-10 adopts a highly efficient and robust coupled solver. Before December 2005, when CFX-10 was distributed, the previous version (the CFX-5 series) also adopted a coupled solver but did not have the capability to apply porous media approaches correctly. In this study, the moderator analysis model based on CFX-4 (CFX4-CAMO) is transformed into a new moderator analysis model based on CFX-10. The new model is examined and the results are compared to those of the former.

  15. Re-evaluation of model-based light-scattering spectroscopy for tissue spectroscopy

    Science.gov (United States)

    Lau, Condon; Šćepanović, Obrad; Mirkovic, Jelena; McGee, Sasha; Yu, Chung-Chieh; Fulghum, Stephen; Wallace, Michael; Tunnell, James; Bechtel, Kate; Feld, Michael

    2009-01-01

    Model-based light scattering spectroscopy (LSS) seemed a promising technique for in-vivo diagnosis of dysplasia in multiple organs. In the studies, the residual spectrum, the difference between the observed and modeled diffuse reflectance spectra, was attributed to single elastic light scattering from epithelial nuclei, and diagnostic information due to nuclear changes was extracted from it. We show that this picture is incorrect. The actual single scattering signal arising from epithelial nuclei is much smaller than the previously computed residual spectrum, and does not have the wavelength dependence characteristic of Mie scattering. Rather, the residual spectrum largely arises from assuming a uniform hemoglobin distribution. In fact, hemoglobin is packaged in blood vessels, which alters the reflectance. When we include vessel packaging, which accounts for an inhomogeneous hemoglobin distribution, in the diffuse reflectance model, the reflectance is modeled more accurately, greatly reducing the amplitude of the residual spectrum. These findings are verified via numerical estimates based on light propagation and Mie theory, tissue phantom experiments, and analysis of published data measured from Barrett’s esophagus. In future studies, vessel packaging should be included in the model of diffuse reflectance and use of model-based LSS should be discontinued. PMID:19405760

  16. Model-based prediction of myelosuppression and recovery based on frequent neutrophil monitoring.

    Science.gov (United States)

    Netterberg, Ida; Nielsen, Elisabet I; Friberg, Lena E; Karlsson, Mats O

    2017-08-01

    To investigate whether more frequent monitoring of the absolute neutrophil counts (ANC) during myelosuppressive chemotherapy, together with model-based predictions, can improve therapy management compared to the limited clinical monitoring typically applied today. Daily ANC values in chemotherapy-treated cancer patients were simulated from a previously published population model describing docetaxel-induced myelosuppression. The simulated values were used to generate predictions of the individual ANC time-courses, given the myelosuppression model. The accuracy of the predicted ANC was evaluated under a range of conditions with a reduced number of ANC measurements. The predictions were most accurate when more data were available for generating the predictions and when making short forecasts. The inaccuracy of ANC predictions was highest around nadir, although a high sensitivity (≥90%) was demonstrated for forecasting Grade 4 neutropenia before it occurred. The time for a patient to recover to baseline could be well forecast 6 days (±1 day) before the typical value occurred on day 17. Daily monitoring of the ANC, together with model-based predictions, could improve anticancer drug treatment by identifying patients at risk of severe neutropenia and predicting when the next cycle could be initiated.
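
    The published myelosuppression model referred to is of the transit-compartment (Friberg-type) kind: a proliferating pool, a chain of maturation compartments and a feedback from circulating neutrophils. A minimal simulation sketch is given below; the parameter values and the mono-exponential drug-effect term are placeholders, not the published docetaxel estimates.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Hypothetical parameters: baseline ANC, mean transit time (h), feedback, drug slope, elimination
        BASE, MTT, GAMMA, SLOPE, KE = 5.0, 100.0, 0.16, 0.01, 0.05
        KTR = 4.0 / MTT                                  # four transit compartments

        def myelosuppression(t, y, dose):
            prol, t1, t2, t3, circ = y
            conc = dose * np.exp(-KE * t)                # crude mono-exponential drug profile
            edrug = SLOPE * conc                         # linear drug effect on proliferation
            feedback = (BASE / circ) ** GAMMA            # rebound driven by circulating cells
            dprol = KTR * prol * (1.0 - edrug) * feedback - KTR * prol
            return [dprol,
                    KTR * (prol - t1), KTR * (t1 - t2), KTR * (t2 - t3),
                    KTR * (t3 - circ)]

        sol = solve_ivp(myelosuppression, (0.0, 21 * 24.0), [BASE] * 5,
                        args=(100.0,), dense_output=True)
        daily_anc = sol.sol(np.arange(0.0, 21 * 24.0, 24.0))[-1]   # predicted daily circulating ANC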

  17. Using model-based screening to help discover unknown environmental contaminants.

    Science.gov (United States)

    McLachlan, Michael S; Kierkegaard, Amelie; Radke, Michael; Sobek, Anna; Malmvärn, Anna; Alsberg, Tomas; Arnot, Jon A; Brown, Trevor N; Wania, Frank; Breivik, Knut; Xu, Shihe

    2014-07-01

    Of the tens of thousands of chemicals in use, only a small fraction have been analyzed in environmental samples. To effectively identify environmental contaminants, methods to prioritize chemicals for analytical method development are required. We used a high-throughput model of chemical emissions, fate, and bioaccumulation to identify chemicals likely to have high concentrations in specific environmental media, and we prioritized these for target analysis. This model-based screening was applied to 215 organosilicon chemicals culled from industrial chemical production statistics. The model-based screening prioritized several recognized organosilicon contaminants and generated hypotheses leading to the selection of three chemicals that have not previously been identified as potential environmental contaminants for target analysis. Trace analytical methods were developed, and the chemicals were analyzed in air, sewage sludge, and sediment. All three substances were found to be environmental contaminants. Phenyl-tris(trimethylsiloxy)silane was present in all samples analyzed, with concentrations of ∼50 pg m⁻³ in Stockholm air and ∼0.5 ng g⁻¹ dw in sediment from the Stockholm archipelago. Tris(trifluoropropyl)trimethyl-cyclotrisiloxane and tetrakis(trifluoropropyl)tetramethyl-cyclotetrasiloxane were found in sediments from Lake Mjøsa at ∼1 ng g⁻¹ dw. The discovery of three novel environmental contaminants shows that models can be useful for prioritizing chemicals for exploratory assessment.

  18. Fusion Implementation

    International Nuclear Information System (INIS)

    Schmidt, J.A.

    2002-01-01

    If a fusion DEMO reactor can be brought into operation during the first half of this century, fusion power production can have a significant impact on carbon dioxide production during the latter half of the century. An assessment of fusion implementation scenarios shows that the resource demands and waste production associated with these scenarios are manageable factors. If fusion is implemented during the latter half of this century it will be one element of a portfolio of (hopefully) carbon dioxide limiting sources of electrical power. It is time to assess the regional implications of fusion power implementation. An important attribute of fusion power is the wide range of possible regions of the country, or countries in the world, where power plants can be located. Unlike most renewable energy options, fusion energy will function within a local distribution system and not require costly, and difficult, long distance transmission systems. For example, the East Coast of the United States is a prime candidate for fusion power deployment by virtue of its distance from renewable energy sources. As fossil fuels become less and less available as an energy option, the transmission of energy across bodies of water will become very expensive. On a global scale, fusion power will be particularly attractive for regions separated from sources of renewable energy by oceans

  19. A Model Based on Cocitation for Web Information Retrieval

    Directory of Open Access Journals (Sweden)

    Yue Xie

    2014-01-01

    According to the relationship between authority and cocitation in HITS, we propose a new hyperlink weighting scheme to describe the strength of the relevancy between any two webpages. We then combine hyperlink weight normalization and the random surfing scheme used in PageRank to justify the new model. In the new model based on cocitation (MBCC), pages with stronger relevancy are assigned higher values, not just depending on their outlinks. This model combines features of both HITS and PageRank. Finally, we present the results of some numerical experiments, showing that the MBCC ranking agrees with the HITS ranking, especially in the top 10. Meanwhile, MBCC keeps the superiority of PageRank, that is, the existence and uniqueness of the ranking vector.
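
    The abstract does not give the exact cocitation weighting, so the sketch below only illustrates the general construction it describes: weight each hyperlink by a cocitation-based relevancy strength, normalise, and run the usual damped random-surfer iteration. The specific weight `1 + cocitation` is a made-up placeholder, not the MBCC formula.

        import numpy as np

        def mbcc_rank(adj, d=0.85, tol=1e-10):
            # adj[i, j] = 1 if page i links to page j
            cocit = adj.T @ adj                        # cocitation counts between pages
            w = adj * (1.0 + cocit)                    # placeholder relevancy weighting of each link
            out = w.sum(axis=1, keepdims=True)
            out[out == 0] = 1.0                        # dangling pages handled crudely (sketch only)
            M = (w / out).T                            # column-stochastic transition matrix
            n = adj.shape[0]
            r = np.full(n, 1.0 / n)
            while True:                                # damped power iteration as in PageRank
                r_new = d * M @ r + (1.0 - d) / n
                if np.abs(r_new - r).sum() < tol:
                    return r_new
                r = r_new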

  20. A social discounting model based on Tsallis’ statistics

    Science.gov (United States)

    Takahashi, Taiki

    2010-09-01

    Social decision making (e.g. social discounting and social preferences) has been attracting attention in economics, econophysics, social physics, behavioral psychology, and neuroeconomics. This paper proposes a novel social discounting model based on the deformed algebra developed in Tsallis' non-extensive thermostatistics. Furthermore, it is suggested that this model can be utilized to quantify the degree of consistency in social discounting in humans and to analyze the relationships between behavioral tendencies in social discounting and other-regarding economic decision making under game-theoretic conditions. Future directions in the application of the model to studies in econophysics, neuroeconomics, and social physics, as well as to real-world problems such as the supply of live organ donations, are discussed.
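
    Tsallis-statistics discounting models are typically written with the q-exponential, the deformed exponential of non-extensive thermostatistics. The snippet below illustrates that general form only; the exact functional form and parameter values used in the paper are not reproduced here and should be treated as assumptions.

        import numpy as np

        def q_exp(x, q):
            # Tsallis q-exponential; reduces to the ordinary exponential as q -> 1
            if np.isclose(q, 1.0):
                return np.exp(x)
            return np.maximum(1.0 + (1.0 - q) * x, 0.0) ** (1.0 / (1.0 - q))

        def social_discounted_value(v0, social_distance, k, q):
            # Assumed q-exponential social discounting: V(D) = V(0) / exp_q(k * D)
            return v0 / q_exp(k * social_distance, q)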

  1. Vocational Teachers and Professionalism - A Model Based on Empirical Analyses

    DEFF Research Database (Denmark)

    Duch, Henriette Skjærbæk; Andreasen, Karen E

    Vocational Teachers and Professionalism - A Model Based on Empirical Analyses. Several theorists have developed models to illustrate the processes of adult learning and professional development (e.g. Illeris, Argyris, Engeström; Wahlgren & Aarkorg, Kolb and Wenger). Models can sometimes be criticized...... emphasis on the adult employee, the organization, its surroundings as well as other contextual factors. Our concern is adult vocational teachers attending a pedagogical course and teaching at vocational colleges. The aim of the paper is to discuss different models and develop a model concerning teachers...... at vocational colleges based on empirical data in a specific context, a vocational teacher-training course in Denmark. By offering a basis and concepts for the analysis of practice, such a model is meant to support the development of vocational teachers' professionalism at courses and in organizational contexts...

  2. [Model-based biofuels system analysis: a review].

    Science.gov (United States)

    Chang, Shiyan; Zhang, Xiliang; Zhao, Lili; Ou, Xunmin

    2011-03-01

    Model-based system analysis is an important tool for evaluating the potential and impacts of biofuels, and for drafting biofuels technology roadmaps and targets. The broad reach of the biofuels supply chain requires that biofuels system analyses span a range of disciplines, including agriculture/forestry, energy, economics, and the environment. Here we reviewed various models developed for or applied to modeling biofuels, and presented a critical analysis of Agriculture/Forestry System Models, Energy System Models, Integrated Assessment Models, Micro-level Cost, Energy and Emission Calculation Models, and Specific Macro-level Biofuel Models. We focused on the models' strengths, weaknesses, and applicability, facilitating the selection of a suitable type of model for specific issues. Such an analysis was a prerequisite for future biofuels system modeling, and represented a valuable resource for researchers and policy makers.

  3. Neuro-fuzzy system modeling based on automatic fuzzy clustering

    Institute of Scientific and Technical Information of China (English)

    Yuangang TANG; Fuchun SUN; Zengqi SUN

    2005-01-01

    A neuro-fuzzy system model based on automatic fuzzy clustering is proposed. A hybrid model identification algorithm is also developed to decide the model structure and model parameters. The algorithm mainly includes three parts: 1) automatic fuzzy C-means (AFCM), which is applied to generate fuzzy rules automatically and then fix the size of the neuro-fuzzy network, by which the complexity of system design is reduced greatly at the price of some fitting capability; 2) recursive least squares estimation (RLSE), which is used to update the parameters of the Takagi-Sugeno model employed to describe the behavior of the system; 3) a gradient descent algorithm, proposed for the fuzzy values according to the back-propagation algorithm of neural networks. Finally, modeling the dynamical equation of a two-link manipulator with the proposed approach is illustrated to validate the feasibility of the method.
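
    The AFCM step builds on the standard fuzzy C-means iteration (alternating membership and centre updates); a compact version of that core loop is sketched below. The automatic selection of the number of clusters, and the RLSE and gradient-descent stages, are not reproduced here.

        import numpy as np

        def fuzzy_c_means(X, c, m=2.0, n_iter=100, tol=1e-5, seed=0):
            # X: (n_samples, n_features); returns cluster centres and membership matrix U
            rng = np.random.default_rng(seed)
            U = rng.random((X.shape[0], c))
            U /= U.sum(axis=1, keepdims=True)              # memberships sum to one per sample
            for _ in range(n_iter):
                Um = U ** m
                centres = (Um.T @ X) / Um.sum(axis=0)[:, None]
                d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
                U_new = d ** (-2.0 / (m - 1.0))            # standard FCM membership update
                U_new /= U_new.sum(axis=1, keepdims=True)
                if np.abs(U_new - U).max() < tol:
                    break
                U = U_new
            return centres, U_new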

  4. Numerical Analysis of Modeling Based on Improved Elman Neural Network

    Directory of Open Access Journals (Sweden)

    Shao Jie

    2014-01-01

    A model based on the improved Elman neural network (IENN) is proposed to analyze nonlinear circuits with memory effects. The hidden layer neurons are activated by a group of Chebyshev orthogonal basis functions instead of sigmoid functions in this model. The error curves of the sum of squared errors (SSE) varying with the number of hidden neurons and the iteration step are studied to determine the number of hidden layer neurons. Simulation results for the half-bridge class-D power amplifier (CDPA) with a two-tone signal and broadband signals as input have shown that the proposed behavioral model can reconstruct the system of CDPAs accurately and depict the memory effect of CDPAs well. Compared with the Volterra-Laguerre (VL) model, the Chebyshev neural network (CNN) model, and the basic Elman neural network (BENN) model, the proposed model has better performance.

  5. Numeral eddy current sensor modelling based on genetic neural network

    International Nuclear Information System (INIS)

    Yu Along

    2008-01-01

    This paper presents a method for modelling a numeral eddy current sensor based on a genetic neural network, in order to address the sensor's nonlinearity. The principle and algorithms of the genetic neural network are introduced. In this method, the nonlinear model parameters of the numeral eddy current sensor are optimized by the genetic neural network (GNN) according to measurement data, so the method retains both the global searching ability of the genetic algorithm and the good local searching ability of the neural network. The nonlinear model has the advantages of strong robustness, on-line modelling and high precision. The maximum nonlinearity error can be reduced to 0.037% by using the GNN, whereas the maximum nonlinearity error is 0.075% using the least squares method

  6. Driver's mental workload prediction model based on physiological indices.

    Science.gov (United States)

    Yan, Shengyuan; Tran, Cong Chi; Wei, Yingying; Habiyaremye, Jean Luc

    2017-09-15

    Developing an early warning model to predict the driver's mental workload (MWL) is critical and helpful, especially for new or less experienced drivers. The present study aims to investigate the correlation between new drivers' MWL and their work performance, measured by the number of errors. Additionally, the group method of data handling is used to establish a predictive model of the driver's MWL based on a subjective rating (NASA task load index [NASA-TLX]) and six physiological indices. The results indicate that the NASA-TLX and the number of errors are positively correlated, and the predictive model shows the validity of the proposed approach with an R² value of 0.745. The proposed model is expected to provide new drivers with a reference value of their MWL from the physiological indices, so that driving lesson plans can be designed to sustain an appropriate MWL as well as improve the driver's work performance.
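
    The group method of data handling builds layers of pairwise quadratic (Ivakhnenko) polynomials and ranks them by error on a separate validation split. The sketch below implements one such layer for illustration, with the six physiological indices assumed to be the columns of X; it is not the study's fitted model.

        import numpy as np
        from itertools import combinations

        def quad_terms(a, b):
            # Ivakhnenko polynomial terms for one pair of inputs
            return np.column_stack([np.ones_like(a), a, b, a * b, a ** 2, b ** 2])

        def gmdh_layer(X_tr, y_tr, X_va, y_va, keep=3):
            # Fit every feature pair on the training split, rank by validation MSE
            candidates = []
            for i, j in combinations(range(X_tr.shape[1]), 2):
                coef, *_ = np.linalg.lstsq(quad_terms(X_tr[:, i], X_tr[:, j]), y_tr, rcond=None)
                pred = quad_terms(X_va[:, i], X_va[:, j]) @ coef
                mse = float(np.mean((y_va - pred) ** 2))
                candidates.append((mse, (i, j), coef))
            candidates.sort(key=lambda item: item[0])
            return candidates[:keep]     # each entry: (validation MSE, feature pair, coefficients)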

  7. Model-based gene set analysis for Bioconductor.

    Science.gov (United States)

    Bauer, Sebastian; Robinson, Peter N; Gagneur, Julien

    2011-07-01

    Gene Ontology and other forms of gene-category analysis play a major role in the evaluation of high-throughput experiments in molecular biology. Single-category enrichment analysis procedures such as Fisher's exact test tend to flag large numbers of redundant categories as significant, which can complicate interpretation. We have recently developed an approach called model-based gene set analysis (MGSA), that substantially reduces the number of redundant categories returned by the gene-category analysis. In this work, we present the Bioconductor package mgsa, which makes the MGSA algorithm available to users of the R language. Our package provides a simple and flexible application programming interface for applying the approach. The mgsa package has been made available as part of Bioconductor 2.8. It is released under the conditions of the Artistic license 2.0. peter.robinson@charite.de; julien.gagneur@embl.de.

  8. Model-based fault diagnosis in PEM fuel cell systems

    Energy Technology Data Exchange (ETDEWEB)

    Escobet, T; de Lira, S; Puig, V; Quevedo, J [Automatic Control Department (ESAII), Universitat Politecnica de Catalunya (UPC), Rambla Sant Nebridi 10, 08222 Terrassa (Spain); Feroldi, D; Riera, J; Serra, M [Institut de Robotica i Informatica Industrial (IRI), Consejo Superior de Investigaciones Cientificas (CSIC), Universitat Politecnica de Catalunya (UPC) Parc Tecnologic de Barcelona, Edifici U, Carrer Llorens i Artigas, 4-6, Planta 2, 08028 Barcelona (Spain)

    2009-07-01

    In this work, a model-based fault diagnosis methodology for PEM fuel cell systems is presented. The methodology is based on computing residuals, indicators that are obtained by comparing measured inputs and outputs with analytical relationships obtained through system modelling. The innovation of this methodology lies in the characterization of the relative residual fault sensitivity. To illustrate the results, a non-linear fuel cell simulator proposed in the literature is used, with modifications, to include a set of fault scenarios proposed in this work. Finally, the diagnosis results corresponding to these fault scenarios are presented. It is remarkable that with this methodology it is possible to diagnose and isolate all the faults in the proposed set, in contrast with other well-known methodologies which use the binary signature matrix of analytical residuals and faults. (author)
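
    The generic mechanics of residual-based diagnosis are easy to outline: compute residuals from model predictions, threshold them, and match the fired pattern against fault signatures. The sketch below shows only this generic scheme; the paper's relative residual fault sensitivity characterization is not reproduced, and the signature table is a placeholder.

        import numpy as np

        def residuals(y_measured, y_model):
            # analytical-redundancy residuals: measured outputs minus model-predicted outputs
            return np.asarray(y_measured, dtype=float) - np.asarray(y_model, dtype=float)

        def isolate_fault(r, signatures, threshold):
            # signatures: dict mapping fault name -> expected 0/1 residual pattern
            fired = (np.abs(r) > threshold).astype(int)
            scores = {fault: int(np.sum(fired == np.asarray(sig)))
                      for fault, sig in signatures.items()}
            return max(scores, key=scores.get), fired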

  9. An Emotional Agent Model Based on Granular Computing

    Directory of Open Access Journals (Sweden)

    Jun Hu

    2012-01-01

    Affective computing is of great significance for intelligent information processing and for harmonious communication between human beings and computers. A new emotional agent model is proposed in this paper to give agents the ability to handle emotions, based on granular computing theory and the traditional BDI agent model. Firstly, a new emotion knowledge base based on granular computing for emotion expression is presented in the model. Secondly, a new emotional reasoning algorithm based on granular computing is proposed. Thirdly, a new emotional agent model based on granular computing is presented. Finally, based on the model, an emotional agent for patient assistance in a hospital is realized; experimental results show that it handles simple emotions efficiently.

  10. Storm surge model based on variational data assimilation method

    Directory of Open Access Journals (Sweden)

    Shi-li Huang

    2010-06-01

    By combining computation and observation information, the variational data assimilation method has the ability to eliminate errors caused by the uncertainty of parameters in practical forecasting. It was applied to a storm surge model based on unstructured grids with high spatial resolution, with the aim of improving the forecasting accuracy of the storm surge. By controlling the wind stress drag coefficient, the variational model was developed and validated through data assimilation tests in an actual storm surge induced by a typhoon. In the data assimilation tests, the model accurately identified the wind stress drag coefficient and obtained results close to the true state. Then, the actual storm surge induced by Typhoon 0515 was forecast by the developed model, and the results demonstrate its efficiency in practical application.
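
    In outline, the assimilation adjusts the wind stress drag coefficient so that a misfit cost between modelled and observed surge is minimised. The toy sketch below replaces the adjoint-based gradient descent of a full variational scheme with a bounded scalar minimisation, and `surge_model` is a placeholder for the unstructured-grid surge model.

        import numpy as np
        from scipy.optimize import minimize_scalar

        def misfit(cd, observations, surge_model):
            # J(Cd): squared misfit between modelled and observed surge levels
            simulated = surge_model(cd)          # placeholder: surge at the observation points
            return float(np.sum((simulated - np.asarray(observations)) ** 2))

        def assimilate_drag_coefficient(observations, surge_model, bounds=(5e-4, 5e-3)):
            result = minimize_scalar(misfit, bounds=bounds, method="bounded",
                                     args=(observations, surge_model))
            return result.x                      # calibrated wind stress drag coefficient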

  11. GENERALISED MODEL BASED CONFIDENCE INTERVALS IN TWO STAGE CLUSTER SAMPLING

    Directory of Open Access Journals (Sweden)

    Christopher Ouma Onyango

    2010-09-01

    Chambers and Dorfman (2002) constructed bootstrap confidence intervals in model-based estimation for finite population totals, assuming that auxiliary values are available throughout a target population and that the auxiliary values are independent. They also assumed that the cluster sizes are known throughout the target population. We now extend this to two-stage sampling, in which the cluster sizes are known only for the sampled clusters, and we therefore predict the unobserved part of the population total. Jan and Elinor (2008) have done similar work, but unlike them, we use a general model in which the auxiliary values are not necessarily independent. We demonstrate that the asymptotic properties of our proposed estimator and its coverage rates are better than those constructed under the model-assisted local polynomial regression model.

  12. Algorithms and procedures in the model based control of accelerators

    International Nuclear Information System (INIS)

    Bozoki, E.

    1987-10-01

    The overall design of a Model Based Control system was presented. The system consists of PLUG-IN MODULES, governed by a SUPERVISORY PROGRAM and communicating via SHARED DATA FILES. Models can be added or replaced without affecting the overall system. There can be more than one module (algorithm) to perform the same task: the user can choose the most appropriate algorithm or can compare the results obtained using different algorithms. Calculations, algorithms, file reads and writes, etc., which are used in more than one module, will be placed in a subroutine library. This feature will simplify the maintenance of the system. A partial list of modules is presented, specifying the tasks they perform. 19 refs., 1 fig

  13. Model-based estimation for dynamic cardiac studies using ECT

    International Nuclear Information System (INIS)

    Chiao, P.C.; Rogers, W.L.; Clinthorne, N.H.; Fessler, J.A.; Hero, A.O.

    1994-01-01

    In this paper, the authors develop a strategy for joint estimation of physiological parameters and myocardial boundaries using ECT (Emission Computed Tomography). The authors construct an observation model to relate parameters of interest to the projection data and to account for limited ECT system resolution and measurement noise. The authors then use a maximum likelihood (ML) estimator to jointly estimate all the parameters directly from the projection data without reconstruction of intermediate images. The authors also simulate myocardial perfusion studies based on a simplified heart model to evaluate the performance of the model-based joint ML estimator and compare this performance to the Cramer-Rao lower bound. Finally, model assumptions and potential uses of the joint estimation strategy are discussed

  14. Model-based estimation for dynamic cardiac studies using ECT.

    Science.gov (United States)

    Chiao, P C; Rogers, W L; Clinthorne, N H; Fessler, J A; Hero, A O

    1994-01-01

    The authors develop a strategy for joint estimation of physiological parameters and myocardial boundaries using ECT (emission computed tomography). They construct an observation model to relate parameters of interest to the projection data and to account for limited ECT system resolution and measurement noise. The authors then use a maximum likelihood (ML) estimator to jointly estimate all the parameters directly from the projection data without reconstruction of intermediate images. They also simulate myocardial perfusion studies based on a simplified heart model to evaluate the performance of the model-based joint ML estimator and compare this performance to the Cramer-Rao lower bound. Finally, the authors discuss model assumptions and potential uses of the joint estimation strategy.

  15. A General Accelerated Degradation Model Based on the Wiener Process.

    Science.gov (United States)

    Liu, Le; Li, Xiaoyang; Sun, Fuqiang; Wang, Ning

    2016-12-06

    Accelerated degradation testing (ADT) is an efficient tool for conducting material service reliability and safety evaluations by analyzing performance degradation data. Traditional stochastic process models are mainly for linear or linearized degradation paths. However, those methods are not applicable in situations where the degradation processes cannot be linearized. Hence, in this paper, a general ADT model based on the Wiener process is proposed to solve the problem of accelerated degradation data analysis. The general model can consider the unit-to-unit variation and temporal variation of the degradation process, and is suitable for both linear and nonlinear ADT analyses with single or multiple acceleration variables. The statistical inference is given to estimate the unknown parameters in both constant-stress and step-stress ADT. A simulation example and two real applications demonstrate that the proposed method can yield reliable lifetime evaluation results compared with the existing linear and time-scale-transformation Wiener processes in both linear and nonlinear ADT analyses.
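
    For a single stress level and a linear time scale, the Wiener degradation model and its maximum-likelihood estimates have simple closed forms; the sketch below simulates a path X(t) = mu*t + sigma*B(t) and recovers the drift and diffusion from its increments. The time-scale transformation and the acceleration-variable link functions of the general model are omitted.

        import numpy as np

        def simulate_wiener_path(mu, sigma, dt, n_steps, seed=0):
            # degradation path X(t) = mu * t + sigma * B(t), sampled every dt
            rng = np.random.default_rng(seed)
            increments = rng.normal(mu * dt, sigma * np.sqrt(dt), size=n_steps)
            return np.concatenate(([0.0], np.cumsum(increments)))

        def wiener_mle(path, dt):
            # closed-form MLEs of drift and diffusion from one equally spaced path
            dx = np.diff(path)
            mu_hat = dx.sum() / (len(dx) * dt)
            sigma2_hat = np.mean((dx - mu_hat * dt) ** 2) / dt
            return mu_hat, float(np.sqrt(sigma2_hat))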

  16. Polynomial fuzzy model-based approach for underactuated surface vessels

    DEFF Research Database (Denmark)

    Khooban, Mohammad Hassan; Vafamand, Navid; Dragicevic, Tomislav

    2018-01-01

    The main goal of this study is to introduce a new polynomial fuzzy model-based structure for a class of marine systems with non-linear and polynomial dynamics. The suggested technique relies on polynomial Takagi–Sugeno (T–S) fuzzy modelling, a polynomial dynamic parallel distributed compensation and a sum-of-squares (SOS) decomposition. The new proposed approach is a generalisation of the standard T–S fuzzy models and linear matrix inequalities, which indicates its effectiveness in decreasing the tracking time and increasing the efficiency of the robust tracking control problem for an underactuated surface vessel (USV). Additionally, in order to overcome the USV control challenges, including the USV un-modelled dynamics, complex nonlinear dynamics, external disturbances and parameter uncertainties, the polynomial fuzzy model representation is adopted. Moreover, the USV-based control structure ...

  17. Mechatronic Model Based Computed Torque Control of a Parallel Manipulator

    Directory of Open Access Journals (Sweden)

    Zhiyong Yang

    2008-11-01

    With their high speed and accuracy, parallel manipulators have wide application in industry, but many difficulties still exist in the actual control process because of time-varying and coupled dynamics. Unfortunately, present-day commercial controllers cannot provide satisfying performance, since they offer single-axis linear control only. Therefore, for a novel 2-DOF (degree of freedom) parallel manipulator called Diamond 600, a motor-mechanism coupling dynamic model based control scheme employing the computed torque control algorithm is presented in this paper. First, the integrated dynamic coupling model is deduced according to the equivalent torques between the mechanical structure and the PM (permanent magnet) servomotor. Second, the computed torque controller is described in detail for the above proposed model. Finally, a series of numerical simulations and experiments are carried out to test the effectiveness of the system, and the results verify its favourable tracking ability and robustness.

  18. Mechatronic Model Based Computed Torque Control of a Parallel Manipulator

    Directory of Open Access Journals (Sweden)

    Zhiyong Yang

    2008-03-01

    With their high speed and accuracy, parallel manipulators have wide application in industry, but many difficulties still exist in the actual control process because of time-varying and coupled dynamics. Unfortunately, present-day commercial controllers cannot provide satisfying performance, since they offer single-axis linear control only. Therefore, for a novel 2-DOF (degree of freedom) parallel manipulator called Diamond 600, a motor-mechanism coupling dynamic model based control scheme employing the computed torque control algorithm is presented in this paper. First, the integrated dynamic coupling model is deduced according to the equivalent torques between the mechanical structure and the PM (permanent magnet) servomotor. Second, the computed torque controller is described in detail for the above proposed model. Finally, a series of numerical simulations and experiments are carried out to test the effectiveness of the system, and the results verify its favourable tracking ability and robustness.

  19. Automated and model-based assembly of an anamorphic telescope

    Science.gov (United States)

    Holters, Martin; Dirks, Sebastian; Stollenwerk, Jochen; Loosen, Peter

    2018-02-01

    Since the first usage of optical glasses there has been an increasing demand for optical systems which are highly customized for a wide field of applications. To meet the challenge of producing so many unique systems, the development of new techniques and approaches has risen in importance. However, the assembly of precision optical systems with lot sizes of one up to a few tens of systems is still dominated by manual labor. In contrast, highly adaptive and model-based approaches may offer a solution for manufacturing with a high degree of automation and high throughput while maintaining high precision. In this work a model-based automated assembly approach based on ray-tracing is presented. This process runs autonomously and accounts for a wide range of functionality. It first identifies the sequence for an optimized assembly and second generates and matches intermediate figures of merit to predict the overall optical functionality of the optical system. This process also takes into account the generation of a digital twin of the optical system, by mapping key performance indicators such as the first and second moments of intensity into the optical model. This approach is verified by the automatic assembly of an anamorphic telescope within an assembly cell. By continuously measuring the key performance indicators and mapping them into the optical model, the quality of the digital twin is determined. Moreover, by measuring the optical quality and geometrical parameters of the telescope, the precision of this approach is determined. Finally, the productivity of the process is evaluated by monitoring the speed of the different steps of the process.
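
    The first and second moments of intensity used as key performance indicators are the beam centroid and its spread; a short sketch of how they can be computed from a camera image follows.

        import numpy as np

        def intensity_moments(image):
            # centroid (first moments) and RMS widths (second central moments) of an intensity map
            I = np.asarray(image, dtype=float)
            total = I.sum()
            y, x = np.indices(I.shape)
            cx = (x * I).sum() / total
            cy = (y * I).sum() / total
            var_x = (((x - cx) ** 2) * I).sum() / total
            var_y = (((y - cy) ** 2) * I).sum() / total
            return (cx, cy), (np.sqrt(var_x), np.sqrt(var_y))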

  20. MFAM: Multiple Frequency Adaptive Model-Based Indoor Localization Method.

    Science.gov (United States)

    Tuta, Jure; Juric, Matjaz B

    2018-03-24

    This paper presents MFAM (Multiple Frequency Adaptive Model-based localization method), a novel model-based indoor localization method that is capable of using multiple wireless signal frequencies simultaneously. It utilizes an indoor architectural model and the physical properties of wireless signal propagation through objects and space. The motivation for developing a multiple-frequency localization method lies in the future Wi-Fi standards (e.g., 802.11ah) and the growing number of various wireless signals present in buildings (e.g., Wi-Fi, Bluetooth, ZigBee, etc.). Current indoor localization methods mostly rely on a single wireless signal type and often require many devices to achieve the necessary accuracy. MFAM utilizes multiple wireless signal types and improves the localization accuracy over the usage of a single frequency. It continuously monitors signal propagation through space and adapts the model according to the changes indoors. Using multiple signal sources lowers the required number of access points for a specific signal type while utilizing signals already present indoors. Due to the unavailability of 802.11ah hardware, we have evaluated the proposed method with similar signals; we have used 2.4 GHz Wi-Fi and 868 MHz HomeMatic home automation signals. We have performed the evaluation in a modern two-bedroom apartment and measured a mean localization error of 2.0 to 2.3 m and a median error of 2.0 to 2.2 m. Based on our evaluation results, using two different signals improves the localization accuracy by 18% in comparison to the 2.4 GHz Wi-Fi-only approach. Additional signals would improve the accuracy even further. We have shown that MFAM provides better accuracy than competing methods, while having several advantages for real-world usage.
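
    A model-based RSS localization of this kind can be outlined as inverting a path-loss model per access point and fusing all observations from both frequency bands in one least-squares position fit. The log-distance path-loss form and its parameters below are generic assumptions, not the MFAM propagation model.

        import numpy as np
        from scipy.optimize import least_squares

        def predicted_rss(pos, ap_positions, p0, n):
            # log-distance path loss: RSS = P0 - 10 n log10(d)
            d = np.maximum(np.linalg.norm(pos - ap_positions, axis=1), 0.1)
            return p0 - 10.0 * n * np.log10(d)

        def locate(measurements, x0=(0.0, 0.0)):
            # measurements: one (ap_positions, rss_values, p0, n) tuple per signal type,
            # e.g. 2.4 GHz Wi-Fi and 868 MHz home automation, each with its own parameters
            def residual(pos):
                return np.concatenate([
                    predicted_rss(pos, np.asarray(aps, dtype=float), p0, n) - np.asarray(rss, dtype=float)
                    for aps, rss, p0, n in measurements])
            return least_squares(residual, np.asarray(x0, dtype=float)).x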

  1. MFAM: Multiple Frequency Adaptive Model-Based Indoor Localization Method

    Directory of Open Access Journals (Sweden)

    Jure Tuta

    2018-03-01

    This paper presents MFAM (Multiple Frequency Adaptive Model-based localization method), a novel model-based indoor localization method that is capable of using multiple wireless signal frequencies simultaneously. It utilizes an indoor architectural model and the physical properties of wireless signal propagation through objects and space. The motivation for developing a multiple-frequency localization method lies in the future Wi-Fi standards (e.g., 802.11ah) and the growing number of various wireless signals present in buildings (e.g., Wi-Fi, Bluetooth, ZigBee, etc.). Current indoor localization methods mostly rely on a single wireless signal type and often require many devices to achieve the necessary accuracy. MFAM utilizes multiple wireless signal types and improves the localization accuracy over the usage of a single frequency. It continuously monitors signal propagation through space and adapts the model according to the changes indoors. Using multiple signal sources lowers the required number of access points for a specific signal type while utilizing signals already present indoors. Due to the unavailability of 802.11ah hardware, we have evaluated the proposed method with similar signals; we have used 2.4 GHz Wi-Fi and 868 MHz HomeMatic home automation signals. We have performed the evaluation in a modern two-bedroom apartment and measured a mean localization error of 2.0 to 2.3 m and a median error of 2.0 to 2.2 m. Based on our evaluation results, using two different signals improves the localization accuracy by 18% in comparison to the 2.4 GHz Wi-Fi-only approach. Additional signals would improve the accuracy even further. We have shown that MFAM provides better accuracy than competing methods, while having several advantages for real-world usage.

  2. Personality disorders in previously detained adolescent females: a prospective study

    NARCIS (Netherlands)

    Krabbendam, A.; Colins, O.F.; Doreleijers, T.A.H.; van der Molen, E.; Beekman, A.T.F.; Vermeiren, R.R.J.M.

    2015-01-01

    This longitudinal study investigated the predictive value of trauma and mental health problems for the development of antisocial personality disorder (ASPD) and borderline personality disorder (BPD) in previously detained women. The participants were 229 detained adolescent females who were assessed

  3. Payload specialist Reinhard Furrer show evidence of previous blood sampling

    Science.gov (United States)

    1985-01-01

    Payload specialist Reinhard Furrer shows evidence of previous blood sampling while Wubbo J. Ockels, Dutch payload specialist (only partially visible), extends his right arm after a sample has been taken. Both men show bruises on their arms.

  4. Choice of contraception after previous operative delivery at a family ...

    African Journals Online (AJOL)

    Choice of contraception after previous operative delivery at a family planning clinic in Northern Nigeria. Amina Mohammed‑Durosinlorun, Joel Adze, Stephen Bature, Caleb Mohammed, Matthew Taingson, Amina Abubakar, Austin Ojabo, Lydia Airede ...

  5. Previous utilization of service does not improve timely booking in ...

    African Journals Online (AJOL)

    Previous utilization of service does not improve timely booking in antenatal care: Cross sectional study ... Journal Home > Vol 24, No 3 (2010) > ... Results: Past experience on antenatal care service utilization did not come out as a predictor for ...

  6. A previous hamstring injury affects kicking mechanics in soccer players.

    Science.gov (United States)

    Navandar, Archit; Veiga, Santiago; Torres, Gonzalo; Chorro, David; Navarro, Enrique

    2018-01-10

    Although the kicking skill is influenced by limb dominance and sex, how a previous hamstring injury affects kicking has not been studied in detail. Thus, the objective of this study was to evaluate the effect of sex and limb dominance on kicking in limbs with and without a previous hamstring injury. 45 professional players (males: n=19, previously injured players=4, age=21.16 ± 2.00 years; females: n=19, previously injured players=10, age=22.15 ± 4.50 years) performed 5 kicks each with their preferred and non-preferred limb at a target 7 m away, which were recorded with a three-dimensional motion capture system. Kinematic and kinetic variables were extracted for the backswing, leg cocking, leg acceleration and follow through phases. A shorter backswing (20.20 ± 3.49% vs 25.64 ± 4.57%), and differences in knee flexion angle (58 ± 10° vs 72 ± 14°) and hip flexion velocity (8 ± 0 rad/s vs 10 ± 2 rad/s) were observed in previously injured, non-preferred limb kicks for females. A lower peak hip linear velocity (3.50 ± 0.84 m/s vs 4.10 ± 0.45 m/s) was observed in previously injured, preferred limb kicks of females. These differences occurred in the backswing and leg-cocking phases where the hamstring muscles were the most active. A variation in the functioning of the hamstring muscles and that of the gluteus maximus and iliopsoas in the case of a previous injury could account for the differences observed in the kicking pattern. Therefore, the effects of a previous hamstring injury must be considered while designing rehabilitation programs to re-educate kicking movement.

  7. Secondary recurrent miscarriage is associated with previous male birth.

    LENUS (Irish Health Repository)

    Ooi, Poh Veh

    2012-01-31

    Secondary recurrent miscarriage (RM) is defined as three or more consecutive pregnancy losses after delivery of a viable infant. Previous reports suggest that a firstborn male child is associated with less favourable subsequent reproductive potential, possibly due to maternal immunisation against male-specific minor histocompatibility antigens. In a retrospective cohort study of 85 cases of secondary RM we aimed to determine if secondary RM was associated with (i) gender of previous child, maternal age, or duration of miscarriage history, and (ii) increased risk of pregnancy complications. Fifty-three women (62.0%; 53/85) gave birth to a male child prior to RM compared to 32 (38.0%; 32/85) who gave birth to a female child (p=0.002). The majority (91.7%; 78/85) had uncomplicated, term deliveries and normal birth weight neonates, with one quarter of the women previously delivered by Caesarean section. All had routine RM investigations and 19.0% (16/85) had an abnormal result. Fifty-seven women conceived again and 33.3% (19/57) miscarried, but there was no significant difference in failure rates between those with a previous male or female child (13/32 vs. 6/25, p=0.2). When patients with abnormal results were excluded, or when women with only one previous child were considered, there was still no difference in these rates. A previous male birth may be associated with an increased risk of secondary RM but numbers preclude concluding whether this increases recurrence risk. The suggested association with previous male birth provides a basis for further investigations at a molecular level.

  8. Secondary recurrent miscarriage is associated with previous male birth.

    LENUS (Irish Health Repository)

    Ooi, Poh Veh

    2011-01-01

    Secondary recurrent miscarriage (RM) is defined as three or more consecutive pregnancy losses after delivery of a viable infant. Previous reports suggest that a firstborn male child is associated with less favourable subsequent reproductive potential, possibly due to maternal immunisation against male-specific minor histocompatibility antigens. In a retrospective cohort study of 85 cases of secondary RM we aimed to determine if secondary RM was associated with (i) gender of previous child, maternal age, or duration of miscarriage history, and (ii) increased risk of pregnancy complications. Fifty-three women (62.0%; 53/85) gave birth to a male child prior to RM compared to 32 (38.0%; 32/85) who gave birth to a female child (p=0.002). The majority (91.7%; 78/85) had uncomplicated, term deliveries and normal birth weight neonates, with one quarter of the women previously delivered by Caesarean section. All had routine RM investigations and 19.0% (16/85) had an abnormal result. Fifty-seven women conceived again and 33.3% (19/57) miscarried, but there was no significant difference in failure rates between those with a previous male or female child (13/32 vs. 6/25, p=0.2). When patients with abnormal results were excluded, or when women with only one previous child were considered, there was still no difference in these rates. A previous male birth may be associated with an increased risk of secondary RM but numbers preclude concluding whether this increases recurrence risk. The suggested association with previous male birth provides a basis for further investigations at a molecular level.

  9. Model-based system engineering approach for the Euclid mission to manage scientific and technical complexity

    Science.gov (United States)

    Lorenzo Alvarez, Jose; Metselaar, Harold; Amiaux, Jerome; Saavedra Criado, Gonzalo; Gaspar Venancio, Luis M.; Salvignol, Jean-Christophe; Laureijs, René J.; Vavrek, Roland

    2016-08-01

    In recent years, the systems engineering field has been coming to terms with a paradigm change in the approach to complexity management. Different strategies have been proposed to cope with highly interrelated systems, systems of systems and collaborative system engineering, and a significant effort is being invested in standardization and ontology definition. In particular, Model Based System Engineering (MBSE) intends to introduce methodologies for a systematic system definition, development, validation, deployment, operation and decommission, based on logical and visual relationship mapping, rather than traditional 'document based' information management. The practical implementation in real large-scale projects is not uniform across fields. In space science missions, the usage has been limited to subsystems or sample projects, with modeling being performed 'a-posteriori' in many instances. The main hurdle for the introduction of MBSE practices in new projects is still the difficulty to demonstrate their added value to a project and whether their benefit is commensurate with the level of effort required to put them in place. In this paper we present the implemented Euclid system modeling activities, and an analysis of the benefits and limitations identified to support in particular requirement break-down and allocation, and verification planning at mission level.

  10. Development of Acoustic Model-Based Iterative Reconstruction Technique for Thick-Concrete Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Almansouri, Hani [Purdue University; Clayton, Dwight A [ORNL; Kisner, Roger A [ORNL; Polsky, Yarom [ORNL; Bouman, Charlie [Purdue University; Santos-Villalobos, Hector J [ORNL

    2015-01-01

    Ultrasound signals have been used extensively for non-destructive evaluation (NDE). However, typical reconstruction techniques, such as the synthetic aperture focusing technique (SAFT), are limited to quasi-homogenous thin media. New ultrasonic systems and reconstruction algorithms are needed for one-sided NDE of non-homogenous thick objects. An application example space is imaging of reinforced concrete structures for commercial nuclear power plants (NPPs). These structures provide important foundation, support, shielding, and containment functions. Identification and management of aging and degradation of concrete structures is fundamental to the proposed long-term operation of NPPs. Another example is geothermal and oil/gas production wells. These multi-layered structures are composed of steel, cement, and several types of soil and rocks. Ultrasound systems with greater penetration range and image quality will allow for better monitoring of the well's health and prediction of high-pressure hydraulic fracturing of the rock. These application challenges need to be addressed with an integrated imaging approach, where the application, hardware, and reconstruction software are highly integrated and optimized. Therefore, we are developing an ultrasonic system with Model-Based Iterative Reconstruction (MBIR) as the image reconstruction backbone. As the first implementation of MBIR for ultrasonic signals, this paper documents the algorithm and shows reconstruction results for synthetically generated data.

  11. Development of Acoustic Model-Based Iterative Reconstruction Technique for Thick-Concrete Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Almansouri, Hani [Purdue University; Clayton, Dwight A [ORNL; Kisner, Roger A [ORNL; Polsky, Yarom [ORNL; Bouman, Charlie [Purdue University; Santos-Villalobos, Hector J [ORNL

    2016-01-01

    Ultrasound signals have been used extensively for non-destructive evaluation (NDE). However, typical reconstruction techniques, such as the synthetic aperture focusing technique (SAFT), are limited to quasi-homogenous thin media. New ultrasonic systems and reconstruction algorithms are needed for one-sided NDE of non-homogenous thick objects. An application example space is imaging of reinforced concrete structures for commercial nuclear power plants (NPPs). These structures provide important foundation, support, shielding, and containment functions. Identification and management of aging and degradation of concrete structures is fundamental to the proposed long-term operation of NPPs. Another example is geothermal and oil/gas production wells. These multi-layered structures are composed of steel, cement, and several types of soil and rocks. Ultrasound systems with greater penetration range and image quality will allow for better monitoring of the well's health and prediction of high-pressure hydraulic fracturing of the rock. These application challenges need to be addressed with an integrated imaging approach, where the application, hardware, and reconstruction software are highly integrated and optimized. Therefore, we are developing an ultrasonic system with Model-Based Iterative Reconstruction (MBIR) as the image reconstruction backbone. As the first implementation of MBIR for ultrasonic signals, this paper documents the algorithm and shows reconstruction results for synthetically generated data.

  12. Development of acoustic model-based iterative reconstruction technique for thick-concrete imaging

    Science.gov (United States)

    Almansouri, Hani; Clayton, Dwight; Kisner, Roger; Polsky, Yarom; Bouman, Charles; Santos-Villalobos, Hector

    2016-02-01

    Ultrasound signals have been used extensively for non-destructive evaluation (NDE). However, typical reconstruction techniques, such as the synthetic aperture focusing technique (SAFT), are limited to quasi-homogenous thin media. New ultrasonic systems and reconstruction algorithms are needed for one-sided NDE of non-homogenous thick objects. An application example space is imaging of reinforced concrete structures for commercial nuclear power plants (NPPs). These structures provide important foundation, support, shielding, and containment functions. Identification and management of aging and degradation of concrete structures is fundamental to the proposed long-term operation of NPPs. Another example is geothermal and oil/gas production wells. These multi-layered structures are composed of steel, cement, and several types of soil and rocks. Ultrasound systems with greater penetration range and image quality will allow for better monitoring of the well's health and prediction of high-pressure hydraulic fracturing of the rock. These application challenges need to be addressed with an integrated imaging approach, where the application, hardware, and reconstruction software are highly integrated and optimized. Therefore, we are developing an ultrasonic system with Model-Based Iterative Reconstruction (MBIR) as the image reconstruction backbone. As the first implementation of MBIR for ultrasonic signals, this paper documents the algorithm and shows reconstruction results for synthetically generated data.
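
    The abstracts above describe MBIR only in general terms. As a purely illustrative aid, the sketch below shows the generic MBIR formulation, minimising a data-fidelity term plus a regularising prior, on a small synthetic linear problem solved by gradient descent. The forward matrix, regulariser and parameters are invented for illustration and are not the acoustic forward model or prior of these papers.

```python
# Minimal sketch of the generic MBIR formulation: minimise
#   ||y - A x||^2 + beta * ||D x||^2
# where A is a purely illustrative linear forward model and D a
# finite-difference regulariser.
import numpy as np

rng = np.random.default_rng(0)
n_pixels, n_meas = 64, 48
A = rng.normal(size=(n_meas, n_pixels))          # stand-in forward model
x_true = np.zeros(n_pixels); x_true[20:40] = 1.0 # simple "defect" profile
y = A @ x_true + 0.05 * rng.normal(size=n_meas)  # synthetic measurements

D = np.eye(n_pixels) - np.eye(n_pixels, k=1)     # first-difference operator
beta = 1.0

def cost(x):
    return np.sum((y - A @ x) ** 2) + beta * np.sum((D @ x) ** 2)

# Gradient descent on the quadratic cost (a closed-form solve would also work).
x = np.zeros(n_pixels)
step = 0.5 / np.linalg.norm(A.T @ A + beta * D.T @ D, 2)
for _ in range(500):
    grad = -2 * A.T @ (y - A @ x) + 2 * beta * D.T @ (D @ x)
    x -= step * grad

print("final cost:", round(cost(x), 3))
```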

  13. An Enhanced Preventive Maintenance Optimization Model Based on a Three-Stage Failure Process

    Directory of Open Access Journals (Sweden)

    Ruifeng Yang

    2015-01-01

    Full Text Available Nuclear power plants are highly complex systems and the issues related to their safety are of primary importance. Probabilistic safety assessment is regarded as the most widespread methodology for studying the safety of nuclear power plants. As maintenance is one of the most important factors affecting reliability and safety, an enhanced preventive maintenance optimization model based on a three-stage failure process is proposed. Preventive maintenance is still a dominant maintenance policy due to its easy implementation. In order to correspond to the three-color scheme commonly used in practice, the lifetime of the system before failure is divided into three stages, namely, the normal, minor defective, and severe defective stages. When the minor defective stage is identified, two measures are considered for comparison: the first halves the inspection interval only when the minor defective stage is identified for the first time; the second halves the subsequent inspection interval whenever the minor defective stage is identified. Maintenance is implemented immediately once the severe defective stage is identified. The inspection interval is optimized by minimizing the expected cost per unit time. Finally, a numerical example is presented to illustrate the effectiveness of the proposed models.
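
    The cost criterion in this record is only described verbally. As an illustrative aid, the sketch below estimates the expected cost per unit time of a candidate inspection interval by Monte Carlo simulation of a three-stage (normal, minor defective, severe defective) failure process with a "halve the interval once the minor stage is found" rule, and grid-searches the interval. The exponential sojourn times, cost figures and the exact halving rule are assumptions for illustration, not the paper's analytical model.

```python
# Monte-Carlo sketch of the renewal-reward evaluation behind a three-stage
# inspection policy. Sojourn times are taken as exponential and all rates and
# costs are hypothetical; the paper derives the expected cost per unit time
# analytically, whereas this sketch only estimates it by simulation.
import numpy as np

rng = np.random.default_rng(1)

def cycle_cost_and_length(T, mean_stages=(100.0, 30.0, 10.0),
                          c_insp=1.0, c_maint=50.0, c_fail=500.0):
    t_normal, t_minor, t_severe = rng.exponential(mean_stages)
    minor_start = t_normal
    severe_start = t_normal + t_minor
    fail_time = severe_start + t_severe

    t, cost, interval = 0.0, 0.0, T
    while True:
        t_next = t + interval
        if fail_time <= t_next:                  # failure before next inspection
            return cost + c_fail, fail_time
        cost += c_insp                           # inspection carried out
        if t_next >= severe_start:               # severe stage found -> maintain
            return cost + c_maint, t_next
        if t_next >= minor_start:                # minor stage found -> halve interval
            interval = T / 2.0
        t = t_next

def expected_cost_rate(T, n_cycles=5000):
    totals = np.array([cycle_cost_and_length(T) for _ in range(n_cycles)])
    return totals[:, 0].sum() / totals[:, 1].sum()   # renewal-reward ratio

candidates = [5.0, 10.0, 15.0, 20.0, 30.0]
rates = {T: expected_cost_rate(T) for T in candidates}
best = min(rates, key=rates.get)
print({T: round(r, 3) for T, r in rates.items()}, "-> best interval:", best)
```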

  14. Wind farms providing secondary frequency regulation: Evaluating the performance of model-based receding horizon control

    International Nuclear Information System (INIS)

    Shapiro, Carl R.; Meneveau, Charles; Gayme, Dennice F.; Meyers, Johan

    2016-01-01

    We investigate the use of wind farms to provide secondary frequency regulation for a power grid. Our approach uses model-based receding horizon control of a wind farm that is tested using a large eddy simulation (LES) framework. In order to enable real-time implementation, the control actions are computed based on a time-varying one-dimensional wake model. This model describes wake advection and interactions, both of which play an important role in wind farm power production. This controller is implemented in an LES model of an 84-turbine wind farm represented by actuator disk turbine models. Differences between the velocities at each turbine predicted by the wake model and measured in LES are used for closed-loop feedback. The controller is tested on two types of regulation signals, “RegA” and “RegD”, obtained from PJM, an independent system operator in the eastern United States. Composite performance scores, which are used by PJM to qualify plants for regulation, are used to evaluate the performance of the controlled wind farm. Our results demonstrate that the controlled wind farm consistently performs well, passing the qualification threshold for all fast-acting RegD signals. For the RegA signal, which changes over slower time scales, the controlled wind farm's average performance surpasses the threshold, but further work is needed to enable the controlled system to achieve qualifying performance all of the time. (paper)
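
    As a rough illustration of the receding horizon idea described above, the sketch below runs a generic model predictive control loop: at each step a short-horizon optimisation chooses set-points that track a regulation signal, only the first move is applied, and the horizon slides forward. A first-order lag is used as a stand-in plant; the one-dimensional wake model, LES testbed and PJM signals of the paper are not reproduced, and all constants are hypothetical.

```python
# Minimal receding-horizon (model predictive control) sketch tracking a toy
# regulation signal with a hypothetical first-order surrogate plant.
import numpy as np
from scipy.optimize import minimize

dt, horizon, alpha = 4.0, 10, 0.3         # step [s], horizon steps, plant lag
def plant_step(p, u):                     # surrogate "farm power" dynamics
    return p + alpha * (u - p)

def mpc_move(p_now, reference):
    def cost(u_seq):
        p, c = p_now, 0.0
        for k, u in enumerate(u_seq):
            p = plant_step(p, u)
            c += (p - reference[k]) ** 2 + 0.01 * u ** 2   # tracking + effort
        return c
    u0 = np.full(horizon, p_now)
    res = minimize(cost, u0, method="L-BFGS-B", bounds=[(0.0, 1.0)] * horizon)
    return res.x[0]                        # apply only the first move

# Track a toy regulation signal normalised to [0, 1]
t = np.arange(0, 300, dt)
reg = 0.5 + 0.4 * np.sin(2 * np.pi * t / 120.0)
p = 0.5
for k in range(len(t) - horizon):
    u = mpc_move(p, reg[k:k + horizon])
    p = plant_step(p, u)                   # closed-loop feedback with the plant
print("final tracking error:", round(abs(p - reg[len(t) - horizon]), 4))
```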

  15. Digital Model-Based Engineering: Expectations, Prerequisites, and Challenges of Infusion

    Science.gov (United States)

    Hale, J. P.; Zimmerman, P.; Kukkala, G.; Guerrero, J.; Kobryn, P.; Puchek, B.; Bisconti, M.; Baldwin, C.; Mulpuri, M.

    2017-01-01

    Digital model-based engineering (DMbE) is the use of digital artifacts, digital environments, and digital tools in the performance of engineering functions. DMbE is intended to allow an organization to progress from documentation-based engineering methods to digital methods that may provide greater flexibility, agility, and efficiency. The term 'DMbE' was developed as part of an effort by the Model-Based Systems Engineering (MBSE) Infusion Task team to identify what government organizations might expect in the course of moving to or infusing MBSE into their organizations. The Task team was established by the Interagency Working Group on Engineering Complex Systems, an informal collaboration among government systems engineering organizations. This Technical Memorandum (TM) discusses the work of the MBSE Infusion Task team to date. The Task team identified prerequisites, expectations, initial challenges, and recommendations for areas of study to pursue, as well as examples of efforts already in progress. The team identified the following five expectations associated with DMbE infusion, discussed further in this TM: (1) Informed decision making through increased transparency, and greater insight. (2) Enhanced communication. (3) Increased understanding for greater flexibility/adaptability in design. (4) Increased confidence that the capability will perform as expected. (5) Increased efficiency. The team identified the following seven challenges an organization might encounter when looking to infuse DMbE: (1) Assessing value added to the organization. Not all DMbE practices will be applicable to every situation in every organization, and not all implementations will have positive results. (2) Overcoming organizational and cultural hurdles. (3) Adopting contractual practices and technical data management. (4) Redefining configuration management. The DMbE environment changes the range of configuration information to be managed to include performance and design models

  16. Model-Based Methods for Fault Diagnosis: Some Guide-Lines

    DEFF Research Database (Denmark)

    Patton, R.J.; Chen, J.; Nielsen, S.B.

    1995-01-01

    This paper provides a review of model-based fault diagnosis techniques. Starting from basic principles, the properties...

  17. Model-based Compositional Design of Networked Control Systems

    Science.gov (United States)

    2013-12-01

    with event scheduling. In particular, the C++ source code for the ns-2 simulator scheduler is modified to implement the time synchronization mechanism...pump: Security attacks and defenses for a diabetes therapy system. In: 2011 13th IEEE International Conference In e-Health Networking Applications and

  18. Erlotinib-induced rash spares previously irradiated skin

    International Nuclear Information System (INIS)

    Lips, Irene M.; Vonk, Ernest J.A.; Koster, Mariska E.Y.; Houwing, Ronald H.

    2011-01-01

    Erlotinib is an epidermal growth factor receptor inhibitor prescribed to patients with locally advanced or metastasized non-small cell lung carcinoma after failure of at least one earlier chemotherapy treatment. Approximately 75% of the patients treated with erlotinib develop acneiform skin rashes. A patient treated with erlotinib 3 months after finishing concomitant treatment with chemotherapy and radiotherapy for non-small cell lung cancer is presented. Unexpectedly, the part of the skin that had been included in his previous radiotherapy field was completely spared from the erlotinib-induced acneiform skin rash. The exact mechanism of erlotinib-induced rash sparing in previously irradiated skin is unclear. The underlying mechanism of this phenomenon needs to be explored further, because the number of patients being treated with a combination of both therapeutic modalities is increasing. The therapeutic effect of erlotinib in the area of the previously irradiated lesion should be assessed. (orig.)

  19. Reasoning with Previous Decisions: Beyond the Doctrine of Precedent

    DEFF Research Database (Denmark)

    Komárek, Jan

    2013-01-01

    …law method’, but they are no less rational and intellectually sophisticated. The reason for the rather conceited attitude of some comparatists is in the dominance of the common law paradigm of precedent and the accompanying ‘case law method’. If we want to understand how courts and lawyers in different jurisdictions use previous judicial decisions in their argument, we need to move beyond the concept of precedent to a wider notion, which would embrace practices and theories in legal systems outside the Common law tradition. This article presents the concept of ‘reasoning with previous decisions’ as such an alternative and develops its basic models. The article first points out several shortcomings inherent in limiting the inquiry into reasoning with previous decisions by the common law paradigm (1). On the basis of numerous examples provided in section (1), I will present two basic models of reasoning…

  20. [Prevalence of previously diagnosed diabetes mellitus in Mexico].

    Science.gov (United States)

    Rojas-Martínez, Rosalba; Basto-Abreu, Ana; Aguilar-Salinas, Carlos A; Zárate-Rojas, Emiliano; Villalpando, Salvador; Barrientos-Gutiérrez, Tonatiuh

    2018-01-01

    To compare the prevalence of previously diagnosed diabetes in 2016 with previous national surveys and to describe treatment and its complications. Mexico's national surveys Ensa 2000, Ensanut 2006, 2012 and 2016 were used. For 2016, logistic regression models and measures of central tendency and dispersion were obtained. The prevalence of previously diagnosed diabetes in 2016 was 9.4%. The increase of 2.2% relative to 2012 was not significant and only observed in patients older than 60 years. While preventive measures have increased, the access to medical treatment and lifestyle has not changed. The treatment has been modified, with an increase in insulin and a decrease in hypoglycaemic agents. Population aging, lack of screening actions and the increase in diabetes complications will lead to an increase in the burden of disease. Policy measures targeting primary and secondary prevention of diabetes are crucial.

  1. Using a symbolic process model as input for model-based fMRI analysis : Locating the neural correlates of problem state replacements

    NARCIS (Netherlands)

    Borst, J.P.; Taatgen, N.A.; Van Rijn, D.H.

    2011-01-01

    In this paper, a model-based analysis method for fMRI is used with a high-level symbolic process model. Participants performed a triple-task in which intermediate task information needs to be updated frequently. Previous work has shown that the associated resource - the problem state resource - acts

  2. Model-based cartilage thickness measurement in the submillimeter range

    International Nuclear Information System (INIS)

    Streekstra, G. J.; Strackee, S. D.; Maas, M.; Wee, R. ter; Venema, H. W.

    2007-01-01

    Current methods of image-based thickness measurement in thin sheet structures utilize second derivative zero crossings to locate the layer boundaries. It is generally acknowledged that the nonzero width of the point spread function (PSF) limits the accuracy of this measurement procedure. We propose a model-based method that strongly reduces PSF-induced bias by incorporating the PSF into the thickness estimation method. We estimated the bias in thickness measurements in simulated thin sheet images as obtained from second derivative zero crossings. To gain insight into the range of sheet thickness where our method is expected to yield improved results, sheet thickness was varied between 0.15 and 1.2 mm with an assumed PSF as present in the high-resolution modes of current computed tomography (CT) scanners [full width at half maximum (FWHM) 0.5-0.8 mm]. Our model-based method was evaluated in practice by measuring layer thickness from CT images of a phantom mimicking two parallel cartilage layers in an arthrography procedure. CT arthrography images of cadaver wrists were also evaluated, and thickness estimates were compared to those obtained from high-resolution anatomical sections that served as a reference. The thickness estimates from the simulated images reveal that the method based on second derivative zero crossings shows considerable bias for layers in the submillimeter range. This bias is negligible for sheet thickness larger than 1 mm, where the size of the sheet is more than twice the FWHM of the PSF but can be as large as 0.2 mm for a 0.5 mm sheet. The results of the phantom experiments show that the bias is effectively reduced by our method. The deviations from the true thickness, due to random fluctuations induced by quantum noise in the CT images, are of the order of 3% for a standard wrist imaging protocol. In the wrist the submillimeter thickness estimates from the CT arthrography images correspond within 10% to those estimated from the anatomical
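
    To make the contrast with second-derivative zero crossings concrete, the following sketch fits a sheet model (a rectangle of unknown thickness convolved with a Gaussian PSF of known width) directly to a simulated one-dimensional intensity profile; because the PSF is part of the model, the thickness estimate is not biased by its finite width. The profile, noise level and PSF width are hypothetical, and the example is one-dimensional, unlike the CT arthrography data of the study.

```python
# Sketch of the model-based idea: fit a sheet model (rectangle of unknown
# thickness convolved with the scanner PSF) to the measured profile instead of
# locating second-derivative zero crossings. All parameters are hypothetical.
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erf

def sheet_profile(x, centre, thickness, amplitude, background, sigma):
    """Rectangle of width `thickness` convolved with a Gaussian PSF (closed form)."""
    a, b = centre - thickness / 2.0, centre + thickness / 2.0
    return background + 0.5 * amplitude * (erf((x - a) / (np.sqrt(2) * sigma))
                                           - erf((x - b) / (np.sqrt(2) * sigma)))

# Simulate a 0.5 mm sheet imaged with a PSF of FWHM 0.7 mm (sigma ~ 0.3 mm)
sigma_psf = 0.7 / 2.355
x = np.arange(-3.0, 3.0, 0.1)                      # 0.1 mm sampling
rng = np.random.default_rng(2)
y = sheet_profile(x, 0.0, 0.5, 100.0, 10.0, sigma_psf) + rng.normal(0, 1.0, x.size)

# Fit with the PSF width held fixed (assumed known from a calibration scan)
model = lambda x, c, t, A, b: sheet_profile(x, c, t, A, b, sigma_psf)
popt, _ = curve_fit(model, x, y, p0=[0.1, 1.0, 80.0, 0.0])
print("estimated thickness [mm]:", round(popt[1], 3))
```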

  3. Cardiovascular magnetic resonance in adults with previous cardiovascular surgery.

    Science.gov (United States)

    von Knobelsdorff-Brenkenhoff, Florian; Trauzeddel, Ralf Felix; Schulz-Menger, Jeanette

    2014-03-01

    Cardiovascular magnetic resonance (CMR) is a versatile non-invasive imaging modality that serves a broad spectrum of indications in clinical cardiology and has proven evidence. Most of the numerous applications are appropriate in patients with previous cardiovascular surgery in the same manner as in non-surgical subjects. However, some specifics have to be considered. This review article is intended to provide information about the application of CMR in adults with previous cardiovascular surgery. In particular, the two main scenarios, i.e. following coronary artery bypass surgery and following heart valve surgery, are highlighted. Furthermore, several pictorial descriptions of other potential indications for CMR after cardiovascular surgery are given.

  4. Researches of fruit quality prediction model based on near infrared spectrum

    Science.gov (United States)

    Shen, Yulin; Li, Lian

    2018-04-01

    With the improvement in standards for food quality and safety, people pay more attention to the internal quality of fruits; the measurement of fruit internal quality is therefore increasingly important. In general, nondestructive soluble solid content (SSC) and total acid content (TAC) analysis of fruits is vital and effective for quality measurement in global fresh produce markets, so in this paper we aim to establish a novel fruit internal quality prediction model based on SSC and TAC for near infrared spectra. First, fruit quality prediction models based on PCA + BP neural network, PCA + GRNN network, PCA + BP AdaBoost strong classifier, PCA + ELM and PCA + LS-SVM classifier are designed and implemented. Then, in the NSCT domain, the median filter and the Savitzky-Golay filter are used to preprocess the spectral signal, and the Kennard-Stone algorithm is used to automatically select the training and test samples. Third, the optimal models are obtained by comparing 15 kinds of prediction model under a multi-classifier competition mechanism; specifically, non-parametric estimation is introduced to measure the effectiveness of the proposed models, and the reliability and variance of the non-parametric evaluation of each prediction model, together with the estimated value and confidence interval, serve as references for assessing the prediction results. The experimental results demonstrate that this approach can better achieve an optimal evaluation of the internal quality of fruit. Finally, cat swarm optimization is employed to optimize the two best models obtained from the non-parametric estimation; empirical testing indicates that the proposed method can provide more accurate and effective results than other forecasting methods.
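
    As a small, hedged illustration of one of the model families listed above (PCA followed by a BP-type neural network), the sketch below builds a PCA + MLP regression pipeline on synthetic near-infrared spectra with Savitzky-Golay smoothing as preprocessing. The data are simulated, scikit-learn components stand in for the paper's implementations, and the NSCT-domain filtering, Kennard-Stone sample selection, multi-classifier competition and cat swarm optimization steps are not reproduced.

```python
# Sketch of a PCA + neural-network regression for predicting soluble solid
# content from NIR spectra, on synthetic data with scikit-learn stand-ins.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.metrics import r2_score

rng = np.random.default_rng(3)
n_samples, n_wavelengths = 200, 300
ssc = rng.uniform(8.0, 16.0, n_samples)                       # synthetic SSC values
basis = rng.normal(size=n_wavelengths)
spectra = np.outer(ssc, basis) + rng.normal(0, 0.5, (n_samples, n_wavelengths))

# Savitzky-Golay smoothing as a simple spectral preprocessing step
spectra = savgol_filter(spectra, window_length=11, polyorder=2, axis=1)

X_train, X_test, y_train, y_test = train_test_split(spectra, ssc, random_state=0)
model = make_pipeline(PCA(n_components=10),
                      MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000,
                                   random_state=0))
model.fit(X_train, y_train)
print("R^2 on held-out samples:", round(r2_score(y_test, model.predict(X_test)), 3))
```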

  5. Optimal chemotherapy for leukemia: a model-based strategy for individualized treatment.

    Directory of Open Access Journals (Sweden)

    Devaraj Jayachandran

    Full Text Available Acute Lymphoblastic Leukemia, commonly known as ALL, is a predominant form of cancer during childhood. With the advent of modern healthcare support, the 5-year survival rate has been impressive in the recent past. However, long-term ALL survivors embattle several treatment-related medical and socio-economic complications due to excessive and inordinate chemotherapy doses received during treatment. In this work, we present a model-based approach to personalize 6-Mercaptopurine (6-MP) treatment for childhood ALL with a provision for incorporating the pharmacogenomic variations among patients. Semi-mechanistic mathematical models were developed and validated for (i) 6-MP metabolism, (ii) red blood cell mean corpuscular volume (MCV) dynamics, a surrogate marker for treatment efficacy, and (iii) leukopenia, a major side-effect. With the constraint of getting limited data from clinics, a global sensitivity analysis based model reduction technique was employed to reduce the parameter space arising from semi-mechanistic models. The reduced, sensitive parameters were used to individualize the average patient model to a specific patient so as to minimize the model uncertainty. Models fit the data well and mimic diverse behavior observed among patients with minimum parameters. The model was validated with real patient data obtained from literature and Riley Hospital for Children in Indianapolis. Patient models were used to optimize the dose for an individual patient through nonlinear model predictive control. The implementation of our approach in clinical practice is realizable with routinely measured complete blood counts (CBC) and a few additional metabolite measurements. The proposed approach promises to achieve model-based individualized treatment to a specific patient, as opposed to a standard-dose-for-all, and to prescribe an optimal dose for a desired outcome with minimum side-effects.

  6. The Methodical Approach to Assessment of Enterprise Activity on the Basis of its Models, Based on the Balanced Scorecard

    Directory of Open Access Journals (Sweden)

    Minenkova Olena V.

    2017-12-01

    Full Text Available The article proposes a methodical approach to the assessment of enterprise activity on the basis of models built on the balanced scorecard. The content of the approach is presented and its components are formed: tasks, input information, the list of methods and models, and results. Implementation of this methodical approach improves management and increases the results of enterprise activity. The place of assessment models in the management of enterprise activity and in the formation of managerial decisions has been defined. Recommendations on decision-making procedures to increase the efficiency of the enterprise have been provided.

  7. Multi-binding site model-based curve-fitting program for the computation of RIA data

    International Nuclear Information System (INIS)

    Malan, P.G.; Ekins, R.P.; Cox, M.G.; Long, E.M.R.

    1977-01-01

    In this paper, a comparison will be made of model-based and empirical curve-fitting procedures. The implementation of a multiple binding-site curve-fitting model which will successfully fit a wide range of assay data, and which can be run on a mini-computer is described. The latter sophisticated model also provides estimates of binding site concentrations and the values of the respective equilibrium constants present: the latter have been used for refining assay conditions using computer optimisation techniques. (orig./AJ) [de

  8. Estimation of pump operational state with model-based methods

    International Nuclear Information System (INIS)

    Ahonen, Tero; Tamminen, Jussi; Ahola, Jero; Viholainen, Juha; Aranto, Niina; Kestilae, Juha

    2010-01-01

    Pumps are widely used in industry, and they account for 20% of the industrial electricity consumption. Since the speed variation is often the most energy-efficient method to control the head and flow rate of a centrifugal pump, frequency converters are used with induction motor-driven pumps. Although a frequency converter can estimate the operational state of an induction motor without external measurements, the state of a centrifugal pump or other load machine is not typically considered. The pump is, however, usually controlled on the basis of the required flow rate or output pressure. As the pump operational state can be estimated with a general model having adjustable parameters, external flow rate or pressure measurements are not necessary to determine the pump flow rate or output pressure. Hence, external measurements could be replaced with an adjustable model for the pump that uses estimates of the motor operational state. Besides control purposes, modelling the pump operation can provide useful information for energy auditing and optimization purposes. In this paper, two model-based methods for pump operation estimation are presented. Factors affecting the accuracy of the estimation methods are analyzed. The applicability of the methods is verified by laboratory measurements and tests in two pilot installations. Test results indicate that the estimation methods can be applied to the analysis and control of pump operation. The accuracy of the methods is sufficient for auditing purposes, and the methods can inform the user if the pump is driven inefficiently.
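
    A common way to realise the adjustable-model idea described above is to scale the pump's nominal characteristic curve with the affinity laws and invert it using the quantities estimated by the frequency converter. The sketch below does this for a power-versus-flow curve; the polynomial coefficients, speeds and powers are hypothetical, and the estimation methods evaluated in the paper may use different curves and corrections.

```python
# Sketch of a model-based pump flow estimate from converter quantities only:
# the nominal-speed power curve P(Q) is scaled to the actual speed with the
# affinity laws and inverted numerically. Curve coefficients are hypothetical.
import numpy as np
from scipy.optimize import brentq

P_COEFFS = [1.2e-3, 0.15, 4.0]          # P(Q) [kW] at nominal speed, Q in l/s
N_NOM = 1450.0                          # nominal speed [rpm]

def power_at_nominal(q):
    return np.polyval(P_COEFFS, q)

def estimate_flow(p_meas_kw, n_rpm, q_max=60.0):
    """Invert the speed-scaled power curve: P(Q, n) = (n/n0)^3 * P(Q * n0/n)."""
    ratio = n_rpm / N_NOM
    f = lambda q: ratio ** 3 * power_at_nominal(q / ratio) - p_meas_kw
    return brentq(f, 0.0, q_max)        # root of the monotone curve on [0, q_max]

# Example: motor estimates report 1320 rpm and 6.8 kW shaft power
print("estimated flow rate [l/s]:", round(estimate_flow(6.8, 1320.0), 2))
```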

  9. Small Area Model-Based Estimators Using Big Data Sources

    Directory of Open Access Journals (Sweden)

    Marchetti Stefano

    2015-06-01

    Full Text Available The timely, accurate monitoring of social indicators, such as poverty or inequality, on a fine-grained spatial and temporal scale is a crucial tool for understanding social phenomena and policymaking, but poses a great challenge to official statistics. This article argues that an interdisciplinary approach, combining the body of statistical research in small area estimation with the body of research in social data mining based on Big Data, can provide novel means to tackle this problem successfully. Big Data derived from the digital crumbs that humans leave behind in their daily activities are in fact providing ever more accurate proxies of social life. Social data mining from these data, coupled with advanced model-based techniques for fine-grained estimates, has the potential to provide a novel microscope through which to view and understand social complexity. This article suggests three ways to use Big Data together with small area estimation techniques, and shows how Big Data has the potential to mirror aspects of well-being and other socioeconomic phenomena.

  10. Model-based processing for underwater acoustic arrays

    CERN Document Server

    Sullivan, Edmund J

    2015-01-01

    This monograph presents a unified approach to model-based processing for underwater acoustic arrays. The use of physical models in passive array processing is not a new idea, but it has been used on a case-by-case basis, and as such, lacks any unifying structure. This work views all such processing methods as estimation procedures, which then can be unified by treating them all as a form of joint estimation based on a Kalman-type recursive processor, which can be recursive either in space or time, depending on the application. This is done for three reasons. First, the Kalman filter provides a natural framework for the inclusion of physical models in a processing scheme. Second, it allows poorly known model parameters to be jointly estimated along with the quantities of interest. This is important, since in certain areas of array processing already in use, such as those based on matched-field processing, the so-called mismatch problem either degrades performance or, indeed, prevents any solution at all. Third...

  11. Artificial emotional model based on finite state machine

    Institute of Scientific and Technical Information of China (English)

    MENG Qing-mei; WU Wei-guo

    2008-01-01

    According to basic emotional theory, an artificial emotional model based on a finite state machine (FSM) is presented. In the finite state machine model of emotion, the emotional space includes the basic emotional space and multiple emotional spaces. The emotion-switching diagram was defined and the transition function was developed using a Markov chain and a linear interpolation algorithm. The simulation model was built with the Stateflow and Simulink toolboxes on the Matlab platform, and it comprises three subsystems: an input subsystem, an emotion subsystem and a behavior subsystem. In the emotional subsystem, the responses of different personalities to external stimuli were described by defining a personal space. The model takes states from an emotional space and updates its state depending on its current state and the state of its input (also an emotion state). The simulation model realizes the process of switching the emotion from the neutral state to the other basic emotions. The simulation result is shown to correspond to the emotion-switching behavior of human beings.
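
    A minimal sketch of the finite-state-machine idea is given below: emotion states, external stimuli and Markov-style transition probabilities drive state switching from the neutral state to other basic emotions. The states, stimuli and probabilities are hypothetical, and the personality spaces and interpolation of emotion intensities described in the record are omitted.

```python
# Minimal finite-state-machine sketch of emotion switching driven by stimuli,
# with a Markov-style transition table. All states and probabilities are
# hypothetical placeholders.
import random

STATES = ["neutral", "happy", "sad", "angry"]

# transitions[stimulus][current_state] -> list of (next_state, probability)
TRANSITIONS = {
    "praise":  {"neutral": [("happy", 0.8), ("neutral", 0.2)],
                "sad":     [("neutral", 0.6), ("sad", 0.4)]},
    "insult":  {"neutral": [("angry", 0.7), ("sad", 0.3)],
                "happy":   [("neutral", 0.5), ("angry", 0.5)]},
}

def step(state, stimulus, rng=random):
    options = TRANSITIONS.get(stimulus, {}).get(state)
    if not options:                       # unspecified pairs keep the state
        return state
    r, acc = rng.random(), 0.0
    for nxt, p in options:
        acc += p
        if r < acc:
            return nxt
    return options[-1][0]

random.seed(0)
state = "neutral"
for stimulus in ["praise", "insult", "insult", "praise"]:
    state = step(state, stimulus)
    print(stimulus, "->", state)
```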

  12. Model-Based Approaches for Teaching and Practicing Personality Assessment.

    Science.gov (United States)

    Blais, Mark A; Hopwood, Christopher J

    2017-01-01

    Psychological assessment is a complex professional skill. Competence in assessment requires an extensive knowledge of personality, neuropsychology, social behavior, and psychopathology, a background in psychometrics, familiarity with a range of multimethod tools, cognitive flexibility, skepticism, and interpersonal sensitivity. This complexity makes assessment a challenge to teach and learn, particularly as the investment of resources and time in assessment has waned in psychological training programs over the last few decades. In this article, we describe 3 conceptual models that can assist teaching and learning psychological assessments. The transtheoretical model of personality provides a personality systems-based framework for understanding how multimethod assessment data relate to major personality systems and can be combined to describe and explain complex human behavior. The quantitative psychopathology-personality trait model is an empirical model based on the hierarchical organization of individual differences. Application of this model can help students understand diagnostic comorbidity and symptom heterogeneity, focus on more meaningful high-order domains, and identify the most effective assessment tools for addressing a given question. The interpersonal situation model is rooted in interpersonal theory and can help students connect test data to here-and-now interactions with patients. We conclude by demonstrating the utility of these models using a case example.

  13. Model-based magnetization retrieval from holographic phase images

    Energy Technology Data Exchange (ETDEWEB)

    Röder, Falk, E-mail: f.roeder@hzdr.de [Helmholtz-Zentrum Dresden-Rossendorf, Institut für Ionenstrahlphysik und Materialforschung, Bautzner Landstr. 400, D-01328 Dresden (Germany); Triebenberg Labor, Institut für Strukturphysik, Technische Universität Dresden, D-01062 Dresden (Germany); Vogel, Karin [Triebenberg Labor, Institut für Strukturphysik, Technische Universität Dresden, D-01062 Dresden (Germany); Wolf, Daniel [Helmholtz-Zentrum Dresden-Rossendorf, Institut für Ionenstrahlphysik und Materialforschung, Bautzner Landstr. 400, D-01328 Dresden (Germany); Triebenberg Labor, Institut für Strukturphysik, Technische Universität Dresden, D-01062 Dresden (Germany); Hellwig, Olav [Helmholtz-Zentrum Dresden-Rossendorf, Institut für Ionenstrahlphysik und Materialforschung, Bautzner Landstr. 400, D-01328 Dresden (Germany); AG Magnetische Funktionsmaterialien, Institut für Physik, Technische Universität Chemnitz, D-09126 Chemnitz (Germany); HGST, A Western Digital Company, 3403 Yerba Buena Rd., San Jose, CA 95135 (United States); Wee, Sung Hun [HGST, A Western Digital Company, 3403 Yerba Buena Rd., San Jose, CA 95135 (United States); Wicht, Sebastian; Rellinghaus, Bernd [IFW Dresden, Institute for Metallic Materials, P.O. Box 270116, D-01171 Dresden (Germany)

    2017-05-15

    The phase shift of the electron wave is a useful measure for the projected magnetic flux density of magnetic objects at the nanometer scale. More important for materials science, however, is the knowledge about the magnetization in a magnetic nano-structure. As demonstrated here, a dominating presence of stray fields prohibits a direct interpretation of the phase in terms of magnetization modulus and direction. We therefore present a model-based approach for retrieving the magnetization by considering the projected shape of the nano-structure and assuming a homogeneous magnetization therein. We apply this method to FePt nano-islands epitaxially grown on a SrTiO3 substrate, which indicates an inclination of their magnetization direction relative to the structural easy magnetic [001] axis. By means of this real-world example, we discuss prospects and limits of this approach. - Highlights: • Retrieval of the magnetization from holographic phase images. • Magnetostatic model constructed for a magnetic nano-structure. • Decomposition into homogeneously magnetized components. • Discretization of each component by elementary cuboids. • Analytic solution for the phase of a magnetized cuboid considered. • Fitting a set of magnetization vectors to experimental phase images.

  14. Dynamic model based on Bayesian method for energy security assessment

    International Nuclear Information System (INIS)

    Augutis, Juozas; Krikštolaitis, Ričardas; Pečiulytė, Sigita; Žutautaitė, Inga

    2015-01-01

    Highlights: • Methodology for dynamic indicator model construction and forecasting of indicators. • Application of the dynamic indicator model to energy system development scenarios. • Expert judgement involvement using the Bayesian method. - Abstract: The methodology for dynamic indicator model construction and the forecasting of indicators for the assessment of the energy security level is presented in this article. An indicator is a special index, which provides numerical values for factors that are important for the investigated area. In real life, models of different processes take into account various factors that are time-dependent and dependent on each other. Thus, it is advisable to construct a dynamic model in order to describe these dependences. The energy security indicators are used as factors in the dynamic model. Usually, the values of indicators are obtained from statistical data. The developed dynamic model enables forecasting of the indicators’ variation while taking into account changes in the system configuration. Energy system development is usually based on the construction of a new object. Since the parameters of the changes introduced by the new system are not exactly known, information about their influence on the indicators cannot be incorporated into the model by deterministic methods. Thus, the dynamic indicator model based on historical data is adjusted by a probabilistic model of the influence of new factors on the indicators using the Bayesian method.

  15. RISK LOAN PORTFOLIO OPTIMIZATION MODEL BASED ON CVAR RISK MEASURE

    Directory of Open Access Journals (Sweden)

    Ming-Chang LEE

    2015-07-01

    Full Text Available In order to meet commercial banks' liquidity, safety and profitability requirements, loan portfolio optimization decisions based on risk analysis support the rational allocation of assets. Risk analysis and asset allocation are key technologies of banking and risk management. The aim of this paper is to build a loan portfolio optimization model based on risk analysis. A loan portfolio rate-of-return optimization decision model with Value-at-Risk (VaR) and Conditional Value-at-Risk (CVaR) constraints reflects the bank's risk tolerance and gives direct control of the bank's potential loss. The paper analyzes a general risk management model applied to portfolio problems with VaR and CVaR risk measures using the Lagrangian algorithm, and solves this difficult problem by matrix operations. With the proposed method it is easy to see that the solution of the portfolio problem with the VaR and CVaR risk model traces a hyperbola in mean-standard deviation space, and the calculations are straightforward.
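
    For readers unfamiliar with the two risk measures constrained in this model, the sketch below computes scenario-based VaR and CVaR for a hypothetical loan portfolio from simulated default losses. The exposures, default probabilities and loss-given-default value are invented for illustration, and the Lagrangian/matrix optimisation of the paper is not reproduced.

```python
# Sketch of scenario-based VaR and CVaR for a loan portfolio: losses are
# simulated for hypothetical loans and the risk measures are read off the
# empirical loss distribution.
import numpy as np

rng = np.random.default_rng(4)
n_scenarios, n_loans = 100000, 5
exposures = np.array([10.0, 20.0, 15.0, 30.0, 25.0])     # million, hypothetical
default_prob = np.array([0.02, 0.01, 0.03, 0.015, 0.02])
loss_given_default = 0.45

defaults = rng.random((n_scenarios, n_loans)) < default_prob
losses = (defaults * exposures * loss_given_default).sum(axis=1)

def var_cvar(losses, alpha=0.99):
    var = np.quantile(losses, alpha)                      # Value-at-Risk
    cvar = losses[losses >= var].mean()                   # expected tail loss
    return var, cvar

var99, cvar99 = var_cvar(losses)
print(f"portfolio 99% VaR = {var99:.2f}, 99% CVaR = {cvar99:.2f} (million)")
```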

  16. A General Accelerated Degradation Model Based on the Wiener Process

    Directory of Open Access Journals (Sweden)

    Le Liu

    2016-12-01

    Full Text Available Accelerated degradation testing (ADT) is an efficient tool to conduct material service reliability and safety evaluations by analyzing performance degradation data. Traditional stochastic process models are mainly for linear or linearization degradation paths. However, those methods are not applicable for the situations where the degradation processes cannot be linearized. Hence, in this paper, a general ADT model based on the Wiener process is proposed to solve the problem for accelerated degradation data analysis. The general model can consider the unit-to-unit variation and temporal variation of the degradation process, and is suitable for both linear and nonlinear ADT analyses with single or multiple acceleration variables. The statistical inference is given to estimate the unknown parameters in both constant stress and step stress ADT. The simulation example and two real applications demonstrate that the proposed method can yield reliable lifetime evaluation results compared with the existing linear and time-scale transformation Wiener processes in both linear and nonlinear ADT analyses.
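
    The following sketch illustrates the kind of model the record refers to: a Wiener degradation process with a nonlinear time scale Lambda(t) = t^b, simulated for several units and fitted by closed-form maximum likelihood from the Gaussian increments. The parameter values are hypothetical, b is assumed known, and the link to acceleration variables and step-stress testing is not reproduced.

```python
# Sketch of a Wiener-process degradation model with nonlinear time scale
# Lambda(t) = t**b:  X(t) = mu * Lambda(t) + sigma * B(Lambda(t)).
# Paths are simulated and (mu, sigma) recovered by closed-form MLE.
import numpy as np

rng = np.random.default_rng(5)
mu_true, sigma_true, b = 0.8, 0.3, 1.4
t = np.linspace(0.0, 10.0, 51)
lam = t ** b
d_lam = np.diff(lam)

n_units = 20
# Independent Gaussian increments: dX ~ N(mu * dLambda, sigma^2 * dLambda)
increments = rng.normal(mu_true * d_lam, sigma_true * np.sqrt(d_lam),
                        size=(n_units, d_lam.size))
paths = np.cumsum(increments, axis=1)            # simulated degradation paths

# Closed-form maximum likelihood from the increments (b assumed known)
dx = increments.ravel()
dl = np.tile(d_lam, n_units)
mu_hat = dx.sum() / dl.sum()
sigma_hat = np.sqrt(np.mean((dx - mu_hat * dl) ** 2 / dl))
print("mu_hat =", round(mu_hat, 3), "sigma_hat =", round(sigma_hat, 3))
```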

  17. Parallelization of the model-based iterative reconstruction algorithm DIRA

    International Nuclear Information System (INIS)

    Oertenberg, A.; Sandborg, M.; Alm Carlsson, G.; Malusek, A.; Magnusson, M.

    2016-01-01

    New paradigms for parallel programming have been devised to simplify software development on multi-core processors and many-core graphical processing units (GPU). Despite their obvious benefits, the parallelization of existing computer programs is not an easy task. In this work, the use of the Open Multiprocessing (OpenMP) and Open Computing Language (OpenCL) frameworks is considered for the parallelization of the model-based iterative reconstruction algorithm DIRA with the aim to significantly shorten the code's execution time. Selected routines were parallelized using OpenMP and OpenCL libraries; some routines were converted from MATLAB to C and optimised. Parallelization of the code with the OpenMP was easy and resulted in an overall speedup of 15 on a 16-core computer. Parallelization with OpenCL was more difficult owing to differences between the central processing unit and GPU architectures. The resulting speedup was substantially lower than the theoretical peak performance of the GPU; the cause was explained. (authors)

  18. Model-Based Recursive Partitioning for Subgroup Analyses.

    Science.gov (United States)

    Seibold, Heidi; Zeileis, Achim; Hothorn, Torsten

    2016-05-01

    The identification of patient subgroups with differential treatment effects is the first step towards individualised treatments. A current draft guideline by the EMA discusses potentials and problems in subgroup analyses and formulated challenges to the development of appropriate statistical procedures for the data-driven identification of patient subgroups. We introduce model-based recursive partitioning as a procedure for the automated detection of patient subgroups that are identifiable by predictive factors. The method starts with a model for the overall treatment effect as defined for the primary analysis in the study protocol and uses measures for detecting parameter instabilities in this treatment effect. The procedure produces a segmented model with differential treatment parameters corresponding to each patient subgroup. The subgroups are linked to predictive factors by means of a decision tree. The method is applied to the search for subgroups of patients suffering from amyotrophic lateral sclerosis that differ with respect to their Riluzole treatment effect, the only currently approved drug for this disease.

  19. Model-based Quantile Regression for Discrete Data

    KAUST Repository

    Padellini, Tullia

    2018-04-10

    Quantile regression is a class of methods devoted to the modelling of conditional quantiles. In a Bayesian framework quantile regression has typically been carried out exploiting the Asymmetric Laplace Distribution as a working likelihood. Despite the fact that this leads to a proper posterior for the regression coefficients, the resulting posterior variance is however affected by an unidentifiable parameter, hence any inferential procedure besides point estimation is unreliable. We propose a model-based approach for quantile regression that considers quantiles of the generating distribution directly, and thus allows for a proper uncertainty quantification. We then create a link between quantile regression and generalised linear models by mapping the quantiles to the parameter of the response variable, and we exploit it to fit the model with R-INLA. We also extend it to the case of discrete responses, where there is no 1-to-1 relationship between quantiles and the distribution's parameter, by introducing continuous generalisations of the most common discrete variables (Poisson, Binomial and Negative Binomial) to be exploited in the fitting.

  20. Model-Based Systems Engineering in Concurrent Engineering Centers

    Science.gov (United States)

    Iwata, Curtis; Infeld, Samantha; Bracken, Jennifer Medlin; McGuire, Melissa; McQuirk, Christina; Kisdi, Aron; Murphy, Jonathan; Cole, Bjorn; Zarifian, Pezhman

    2015-01-01

    Concurrent Engineering Centers (CECs) are specialized facilities with a goal of generating and maturing engineering designs by enabling rapid design iterations. This is accomplished by co-locating a team of experts (either physically or virtually) in a room with a narrow design goal and a limited timeline of a week or less. The systems engineer uses a model of the system to capture the relevant interfaces and manage the overall architecture. A single model that integrates other design information and modeling allows the entire team to visualize the concurrent activity and identify conflicts more efficiently, potentially resulting in a systems model that will continue to be used throughout the project lifecycle. Performing systems engineering using such a system model is the definition of model-based systems engineering (MBSE); therefore, CECs evolving their approach to incorporate advances in MBSE are more successful in reducing time and cost needed to meet study goals. This paper surveys space mission CECs that are in the middle of this evolution, and the authors share their experiences in order to promote discussion within the community.

  1. Coastal aquifer management under parameter uncertainty: Ensemble surrogate modeling based simulation-optimization

    Science.gov (United States)

    Janardhanan, S.; Datta, B.

    2011-12-01

    Surrogate models are widely used to develop computationally efficient simulation-optimization models to solve complex groundwater management problems. Artificial intelligence based models are most often used for this purpose where they are trained using predictor-predictand data obtained from a numerical simulation model. Most often this is implemented with the assumption that the parameters and boundary conditions used in the numerical simulation model are perfectly known. However, in most practical situations these values are uncertain. Under these circumstances the application of such approximation surrogates becomes limited. In our study we develop a surrogate model based coupled simulation optimization methodology for determining optimal pumping strategies for coastal aquifers considering parameter uncertainty. An ensemble surrogate modeling approach is used along with multiple realization optimization. The methodology is used to solve a multi-objective coastal aquifer management problem considering two conflicting objectives. Hydraulic conductivity and the aquifer recharge are considered as uncertain values. Three dimensional coupled flow and transport simulation model FEMWATER is used to simulate the aquifer responses for a number of scenarios corresponding to Latin hypercube samples of pumping and uncertain parameters to generate input-output patterns for training the surrogate models. Non-parametric bootstrap sampling of this original data set is used to generate multiple data sets which belong to different regions in the multi-dimensional decision and parameter space. These data sets are used to train and test multiple surrogate models based on genetic programming. The ensemble of surrogate models is then linked to a multi-objective genetic algorithm to solve the pumping optimization problem. Two conflicting objectives, viz, maximizing total pumping from beneficial wells and minimizing the total pumping from barrier wells for hydraulic control of
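
    To make the ensemble-surrogate workflow concrete, the hedged sketch below trains several surrogate regressors on bootstrap resamples of simulator input-output data and optimises a conservative (worst-case) ensemble prediction in place of the expensive simulation. A cheap analytic function stands in for the FEMWATER model, a single penalised objective replaces the multi-objective genetic algorithm, and all names and numbers are purely illustrative.

```python
# Sketch of the ensemble-surrogate idea: bootstrap resamples of simulator
# input-output data train several surrogates, and the conservative ensemble
# prediction is optimised instead of the expensive simulation.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from scipy.optimize import minimize_scalar

def expensive_simulator(pumping):        # stand-in for the numerical model
    return np.sin(0.3 * pumping) + 0.05 * pumping   # e.g. a salinity response

rng = np.random.default_rng(6)
X = rng.uniform(0.0, 20.0, 60)
y = expensive_simulator(X) + rng.normal(0, 0.02, X.size)

# Bootstrap-trained ensemble of surrogates
ensemble = []
for seed in range(10):
    idx = rng.integers(0, X.size, X.size)          # bootstrap sample
    m = GradientBoostingRegressor(random_state=seed).fit(X[idx, None], y[idx])
    ensemble.append(m)

def worst_case_salinity(pumping):
    preds = [m.predict([[pumping]])[0] for m in ensemble]
    return max(preds)                    # conservative ensemble prediction

# Maximise pumping subject to a salinity constraint via a simple penalty
objective = lambda q: -q + 1e3 * max(0.0, worst_case_salinity(q) - 0.8)
res = minimize_scalar(objective, bounds=(0.0, 20.0), method="bounded")
print("near-optimal pumping rate:", round(res.x, 2))
```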

  2. Squamous cell carcinoma arising in previously burned or irradiated skin

    International Nuclear Information System (INIS)

    Edwards, M.J.; Hirsch, R.M.; Broadwater, J.R.; Netscher, D.T.; Ames, F.C.

    1989-01-01

    Squamous cell carcinoma (SCC) arising in previously burned or irradiated skin was reviewed in 66 patients treated between 1944 and 1986. Healing of the initial injury was complicated in 70% of patients. Mean interval from initial injury to diagnosis of SCC was 37 years. The overwhelming majority of patients presented with a chronic intractable ulcer in previously injured skin. The regional relapse rate after surgical excision was very high, 58% of all patients. Predominant patterns of recurrence were in local skin and regional lymph nodes (93% of recurrences). Survival rates at 5, 10, and 20 years were 52%, 34%, and 23%, respectively. Five-year survival rates in previously burned and irradiated patients were not significantly different (53% and 50%, respectively). This review, one of the largest reported series, better defines SCC arising in previously burned or irradiated skin as a locally aggressive disease that is distinct from SCC arising in sunlight-damaged skin. An increased awareness of the significance of chronic ulceration in scar tissue may allow earlier diagnosis. Regional disease control and survival depend on surgical resection of all known disease and may require radical lymph node dissection or amputation

  3. Outcome Of Pregnancy Following A Previous Lower Segment ...

    African Journals Online (AJOL)

    Background: A previous caesarean section is an important variable that influences patient management in subsequent pregnancies. A trial of vaginal delivery in such patients is a feasible alternative to a secondary section, thus helping to reduce the caesarean section rate and its associated co-morbidities. Objective: To ...

  4. 24 CFR 1710.552 - Previously accepted state filings.

    Science.gov (United States)

    2010-04-01

    ... of Substantially Equivalent State Law § 1710.552 Previously accepted state filings. (a) Materials... and contracts or agreements contain notice of purchaser's revocation rights. In addition see § 1715.15..., unless the developer is obligated to do so in the contract. (b) If any such filing becomes inactive or...

  5. The job satisfaction of principals of previously disadvantaged schools

    African Journals Online (AJOL)

    The aim of this study was to identify influences on the job satisfaction of previously disadvantaged ..... I am still riding the cloud … I hope it lasts. .... as a way of creating a climate and culture in schools where individuals are willing to explore.

  6. Haemophilus influenzae type f meningitis in a previously healthy boy

    DEFF Research Database (Denmark)

    Ronit, Andreas; Berg, Ronan M G; Bruunsgaard, Helle

    2013-01-01

    Non-serotype b strains of Haemophilus influenzae are extremely rare causes of acute bacterial meningitis in immunocompetent individuals. We report a case of acute bacterial meningitis in a 14-year-old boy, who was previously healthy and had been immunised against H influenzae serotype b (Hib...

  7. Research Note Effects of previous cultivation on regeneration of ...

    African Journals Online (AJOL)

    We investigated the effects of previous cultivation on regeneration potential under miombo woodlands in a resettlement area, a spatial product of Zimbabwe's land reforms. We predicted that cultivation would affect population structure, regeneration, recruitment and potential grazing capacity of rangelands. Plant attributes ...

  8. Cryptococcal meningitis in a previously healthy child | Chimowa ...

    African Journals Online (AJOL)

    An 8-year-old previously healthy female presented with a 3 weeks history of headache, neck stiffness, deafness, fever and vomiting and was diagnosed with cryptococcal meningitis. She had documented hearing loss and was referred to tertiary-level care after treatment with fluconazole did not improve her neurological ...

  9. Investigation of previously derived Hyades, Coma, and M67 reddenings

    International Nuclear Information System (INIS)

    Taylor, B.J.

    1980-01-01

    New Hyades polarimetry and field star photometry have been obtained to check the Hyades reddening, which was found to be nonzero in a previous paper. The new Hyades polarimetry implies essentially zero reddening; this is also true of polarimetry published by Behr (which was incorrectly interpreted in the previous paper). Four photometric techniques which are presumed to be insensitive to blanketing are used to compare the Hyades to nearby field stars; these four techniques also yield essentially zero reddening. When all of these results are combined with others which the author has previously published and a simultaneous solution for the Hyades, Coma, and M67 reddenings is made, the results are E (B-V) =3 +- 2 (sigma) mmag, -1 +- 3 (sigma) mmag, and 46 +- 6 (sigma) mmag, respectively. No support for a nonzero Hyades reddening is offered by the new results. When the newly obtained reddenings for the Hyades, Coma, and M67 are compared with results from techniques given by Crawford and by users of the David Dunlap Observatory photometric system, no differences between the new and other reddenings are found which are larger than about 2 sigma. The author had previously found that the M67 main-sequence stars have about the same blanketing as that of Coma and less blanketing than the Hyades; this conclusion is essentially unchanged by the revised reddenings

  10. Rapid fish stock depletion in previously unexploited seamounts: the ...

    African Journals Online (AJOL)

    Rapid fish stock depletion in previously unexploited seamounts: the case of Beryx splendens from the Sierra Leone Rise (Gulf of Guinea) ... A spectral analysis and red-noise spectra procedure (REDFIT) algorithm was used to identify the red-noise spectrum from the gaps in the observed time-series of catch per unit effort by ...

  11. 18 CFR 154.302 - Previously submitted material.

    Science.gov (United States)

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Previously submitted material. 154.302 Section 154.302 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY... concurrently with the rate change filing. There must be furnished to the Director, Office of Energy Market...

  12. Process cells dismantling of EUREX plant: previous activities

    International Nuclear Information System (INIS)

    Gili, M.

    1998-01-01

    In the '98-'99 period some process cells of the EUREX plant will be dismantled, in order to make room for the liquid waste conditioning plant 'CORA'. This report summarizes the previous activities (plant rinsing campaigns and inactive Cell 014 dismantling) carried out in the past three years and the experience gained from them

  13. The job satisfaction of principals of previously disadvantaged schools

    African Journals Online (AJOL)

    The aim of this study was to identify influences on the job satisfaction of previously disadvantaged school principals in North-West Province. Evans's theory of job satisfaction, morale and motivation was useful as a conceptual framework. A mixedmethods explanatory research design was important in discovering issues with ...

  14. Obstructive pulmonary disease in patients with previous tuberculosis ...

    African Journals Online (AJOL)

    Obstructive pulmonary disease in patients with previous tuberculosis: Pathophysiology of a community-based cohort. B.W. Allwood, R Gillespie, M Galperin-Aizenberg, M Bateman, H Olckers, L Taborda-Barata, G.L. Calligaro, Q Said-Hartley, R van Zyl-Smit, C.B. Cooper, E van Rikxoort, J Goldin, N Beyers, E.D. Bateman ...

  15. Abiraterone in metastatic prostate cancer without previous chemotherapy

    NARCIS (Netherlands)

    Ryan, Charles J.; Smith, Matthew R.; de Bono, Johann S.; Molina, Arturo; Logothetis, Christopher J.; de Souza, Paul; Fizazi, Karim; Mainwaring, Paul; Piulats, Josep M.; Ng, Siobhan; Carles, Joan; Mulders, Peter F. A.; Basch, Ethan; Small, Eric J.; Saad, Fred; Schrijvers, Dirk; van Poppel, Hendrik; Mukherjee, Som D.; Suttmann, Henrik; Gerritsen, Winald R.; Flaig, Thomas W.; George, Daniel J.; Yu, Evan Y.; Efstathiou, Eleni; Pantuck, Allan; Winquist, Eric; Higano, Celestia S.; Taplin, Mary-Ellen; Park, Youn; Kheoh, Thian; Griffin, Thomas; Scher, Howard I.; Rathkopf, Dana E.; Boyce, A.; Costello, A.; Davis, I.; Ganju, V.; Horvath, L.; Lynch, R.; Marx, G.; Parnis, F.; Shapiro, J.; Singhal, N.; Slancar, M.; van Hazel, G.; Wong, S.; Yip, D.; Carpentier, P.; Luyten, D.; de Reijke, T.

    2013-01-01

    Abiraterone acetate, an androgen biosynthesis inhibitor, improves overall survival in patients with metastatic castration-resistant prostate cancer after chemotherapy. We evaluated this agent in patients who had not received previous chemotherapy. In this double-blind study, we randomly assigned

  16. Using A Model-Based Systems Engineering Approach For Exploration Medical System Development

    Science.gov (United States)

    Hanson, A.; Mindock, J.; McGuire, K.; Reilly, J.; Cerro, J.; Othon, W.; Rubin, D.; Urbina, M.; Canga, M.

    2017-01-01

    NASA's Human Research Program's Exploration Medical Capabilities (ExMC) element is defining the medical system needs for exploration class missions. ExMC's Systems Engineering (SE) team will play a critical role in successful design and implementation of the medical system into exploration vehicles. The team's mission is to "Define, develop, validate, and manage the technical system design needed to implement exploration medical capabilities for Mars and test the design in a progression of proving grounds." Development of the medical system is being conducted in parallel with exploration mission architecture and vehicle design development. Successful implementation of the medical system in this environment will require a robust systems engineering approach to enable technical communication across communities to create a common mental model of the emergent engineering and medical systems. Model-Based Systems Engineering (MBSE) improves shared understanding of system needs and constraints between stakeholders and offers a common language for analysis. The ExMC SE team is using MBSE techniques to define operational needs, decompose requirements and architecture, and identify medical capabilities needed to support human exploration. Systems Modeling Language (SysML) is the specific language the SE team is utilizing, within an MBSE approach, to model the medical system functional needs, requirements, and architecture. Modeling methods are being developed through the practice of MBSE within the team, and tools are being selected to support meta-data exchange as integration points to other system models are identified. Use of MBSE is supporting the development of relationships across disciplines and NASA Centers to build trust and enable teamwork, enhance visibility of team goals, foster a culture of unbiased learning and serving, and be responsive to customer needs. The MBSE approach to medical system design offers a paradigm shift toward greater integration between

  17. Model-based framework for multi-axial real-time hybrid simulation testing

    Science.gov (United States)

    Fermandois, Gaston A.; Spencer, Billie F.

    2017-10-01

    Real-time hybrid simulation is an efficient and cost-effective dynamic testing technique for performance evaluation of structural systems subjected to earthquake loading with rate-dependent behavior. A loading assembly with multiple actuators is required to impose realistic boundary conditions on physical specimens. However, such a testing system is expected to exhibit significant dynamic coupling of the actuators and suffer from time lags that are associated with the dynamics of the servo-hydraulic system, as well as control-structure interaction (CSI). One approach to reducing experimental errors considers a multi-input, multi-output (MIMO) controller design, yielding accurate reference tracking and noise rejection. In this paper, a framework for multi-axial real-time hybrid simulation (maRTHS) testing is presented. The methodology employs a real-time feedback-feedforward controller for multiple actuators commanded in Cartesian coordinates. Kinematic transformations between actuator space and Cartesian space are derived for all six degrees of freedom of the moving platform. Then, a frequency domain identification technique is used to develop an accurate MIMO transfer function of the system. Further, a Cartesian-domain model-based feedforward-feedback controller is implemented for time lag compensation and to increase the robustness of the reference tracking for given model uncertainty. The framework is implemented using the 1/5th-scale Load and Boundary Condition Box (LBCB) located at the University of Illinois at Urbana-Champaign. To demonstrate the efficacy of the proposed methodology, a single-story frame subjected to earthquake loading is tested. One of the columns in the frame is represented physically in the laboratory as a cantilevered steel column. For real-time execution, the numerical substructure, kinematic transformations, and controllers are implemented on a digital signal processor. Results show excellent performance of the maRTHS framework when six

  18. ENVIRONMENTAL RESPONSIBILITY MODEL BASED ON ISO 14000 MANAGEMENT SYSTEMS

    Directory of Open Access Journals (Sweden)

    Catalina SITNIKOV

    2012-01-01

    Full Text Available Worldwide, corporations and their stakeholders are increasingly conscious of the need for environmental management, socially responsible behaviour, and sustainable growth and development. International standards are becoming more significant as corporations work towards common environmental management practices. ISO 14001 is the first and broadest standard aimed at a more responsible approach by corporations, and the world's most widely acknowledged framework for environmental management systems, assisting corporations to better manage the effect of their activities on the environment. This article studies ISO 14001 implementation and its effects on environmental responsibility. A model is built that covers the environmental management system and the components of organizational culture able to influence the implementation of environmental standards.

  19. Reoperative sentinel lymph node biopsy after previous mastectomy.

    Science.gov (United States)

    Karam, Amer; Stempel, Michelle; Cody, Hiram S; Port, Elisa R

    2008-10-01

    Sentinel lymph node (SLN) biopsy is the standard of care for axillary staging in breast cancer, but many clinical scenarios questioning the validity of SLN biopsy remain. Here we describe our experience with reoperative-SLN (re-SLN) biopsy after previous mastectomy. Review of the SLN database from September 1996 to December 2007 yielded 20 procedures done in the setting of previous mastectomy. SLN biopsy was performed using radioisotope with or without blue dye injection superior to the mastectomy incision, in the skin flap in all patients. In 17 of 20 patients (85%), re-SLN biopsy was performed for local or regional recurrence after mastectomy. Re-SLN biopsy was successful in 13 of 20 patients (65%) after previous mastectomy. Of the 13 patients, 2 had positive re-SLN, and completion axillary dissection was performed, with 1 having additional positive nodes. In the 11 patients with negative re-SLN, 2 patients underwent completion axillary dissection demonstrating additional negative nodes. One patient with a negative re-SLN experienced chest wall recurrence combined with axillary recurrence 11 months after re-SLN biopsy. All others remained free of local or axillary recurrence. Re-SLN biopsy was unsuccessful in 7 of 20 patients (35%). In three of seven patients, axillary dissection was performed, yielding positive nodes in two of the three. The remaining four of seven patients all had previous modified radical mastectomy, so underwent no additional axillary surgery. In this small series, re-SLN was successful after previous mastectomy, and this procedure may play some role when axillary staging is warranted after mastectomy.

  20. Towards a Development Environment for Model Based Test Design

    OpenAIRE

    Jing, Han

    2008-01-01

    Within the UP IP I&V organization there is a strong focus on increasing the ability to predict product quality in a cost-efficient way. Test automation has therefore been an important enabler for us. The IP test design environment is continuously evolving, and the investigations will show which improvements are most important to implement in the short and long term. In Ericsson UP IP I&V, the test automation framework environments are used to complete certain processes by automated methods, f...

  1. The Affordable Care Act, Insurance Coverage, and Health Care Utilization of Previously Incarcerated Young Men: 2008-2015.

    Science.gov (United States)

    Winkelman, Tyler N A; Choi, HwaJung; Davis, Matthew M

    2017-05-01

    To estimate health insurance and health care utilization patterns among previously incarcerated men following implementation of the Affordable Care Act's (ACA's) Medicaid expansion and Marketplace plans in 2014. We performed serial cross-sectional analyses using data from the National Survey of Family Growth between 2008 and 2015. Our sample included men aged 18 to 44 years with (n = 3476) and without (n = 8702) a history of incarceration. Uninsurance declined significantly among previously incarcerated men after ACA implementation (-5.9 percentage points; 95% confidence interval [CI] = -11.5, -0.4), primarily because of an increase in private insurance (6.8 percentage points; 95% CI = 0.1, 13.3). Previously incarcerated men accounted for a large proportion of the remaining uninsured (38.6%) in 2014 to 2015. Following ACA implementation, previously incarcerated men continued to be significantly less likely to report a regular source of primary care and more likely to report emergency department use than were never-incarcerated peers. Health insurance coverage improved among previously incarcerated men following ACA implementation. However, these men account for a substantial proportion of the remaining uninsured. Previously incarcerated men continue to lack primary care and frequently utilize acute care services.

  2. Equivalent charge source model based iterative maximum neighbor weight for sparse EEG source localization.

    Science.gov (United States)

    Xu, Peng; Tian, Yin; Lei, Xu; Hu, Xiao; Yao, Dezhong

    2008-12-01

    How to localize the neural electric activities within the brain effectively and precisely from scalp electroencephalogram (EEG) recordings is a critical issue for current studies in clinical neurology and cognitive neuroscience. In this paper, based on the charge source model and the iterative re-weighting strategy, we propose a new maximum-neighbor-weight-based iterative sparse source imaging method, termed CMOSS (Charge source model based Maximum neighbOr weight Sparse Solution). Unlike the weight used in the focal underdetermined system solver (FOCUSS), where the weight for each point in the discrete solution space is updated independently across iterations, the newly designed weight for each point in each iteration is determined by the source solution of the previous iteration at both that point and its neighbors. With this new weight, the next iteration has a better chance of rectifying the local source location bias present in the previous iteration's solution. Simulation studies comparing CMOSS with FOCUSS and LORETA for various source configurations were conducted on a realistic 3-shell head model, and the results confirmed the validity of CMOSS for sparse EEG source localization. Finally, CMOSS was applied to localize sources elicited in a visual stimulus experiment, and the result was consistent with the source areas involved in visual processing reported in previous studies.
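
    To make the re-weighting idea concrete, here is a small FOCUSS-style sketch in which each point's weight is taken from the maximum of its own and its neighbors' previous estimates, in the spirit of the maximum-neighbor-weight scheme described above. The lead-field matrix, neighborhood definition, and source configuration are all invented for illustration and do not reproduce the published CMOSS algorithm.

```python
# Illustrative sketch (not the published CMOSS code): FOCUSS-style iterative
# re-weighting for an underdetermined EEG source problem, where the weight of
# each source point is formed from its own and its neighbors' previous estimates.
import numpy as np

rng = np.random.default_rng(1)
n_sensors, n_sources = 32, 200
L = rng.normal(size=(n_sensors, n_sources))        # lead-field (forward) matrix
x_true = np.zeros(n_sources)
x_true[[40, 41, 120]] = [1.0, 0.8, -0.9]           # sparse source configuration
b = L @ x_true + 0.01 * rng.normal(size=n_sensors) # scalp measurements

def neighbors(i, n, width=1):
    """Hypothetical 1-D neighborhood; a real mesh would use adjacency of grid points."""
    return range(max(0, i - width), min(n, i + width + 1))

x = np.ones(n_sources)
for _ in range(20):
    # Each weight takes the maximum magnitude over the point and its neighbors.
    w = np.array([max(abs(x[j]) for j in neighbors(i, n_sources)) for i in range(n_sources)])
    W = np.diag(w)
    # Weighted minimum-norm solution: x = W (L W)^+ b
    x = W @ np.linalg.pinv(L @ W) @ b

print("recovered support:", np.flatnonzero(np.abs(x) > 0.1 * np.abs(x).max()))
```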

  3. Dynamic model based on voltage transfer curve for pattern formation in dielectric barrier glow discharge

    Energy Technology Data Exchange (ETDEWEB)

    Li, Ben; He, Feng; Ouyang, Jiting, E-mail: jtouyang@bit.edu.cn [School of Physics, Beijing Institute of Technology, Beijing 100081 (China); Duan, Xiaoxi [Research Center of Laser Fusion, CAEP, Mianyang 621900 (China)

    2015-12-15

    Simulation work is very important for understanding the formation of self-organized discharge patterns. Previous works have adapted various models from other systems to simulate discharge patterns, but most of these models are complicated and time-consuming. In this paper, we introduce a convenient phenomenological dynamic model based on the basic dynamic process of glow discharge and the voltage transfer curve (VTC) to study dielectric barrier glow discharge (DBGD) patterns. The VTC is an important characteristic of the DBGD: it plots the change of wall voltage after a discharge as a function of the initial total gap voltage. In the modeling, the combined effect of the discharge conditions is captured by the VTC, and the activation-inhibition effect is expressed by a spatial interaction term. In addition, the model reduces the dimensionality of the system by considering only the integrated effect of current flow. All of this greatly facilitates the construction of the model. Numerical simulations turn out to be in good accordance with our previous fluid modeling and experimental results.

  4. Development Of Entrepreneur Learning Model Based On Problem Based Learning To Increase Competency Independence And Creativity Students Of Industrial Engineering

    Directory of Open Access Journals (Sweden)

    Leola Dewiyani

    2017-10-01

    Full Text Available Currently it is undeniable that competition for jobs is very tight, and universities have an important role in producing human resources that can compete globally, not least the Department of Industrial Engineering, Faculty of Engineering, Muhammadiyah University of Jakarta (FT UMJ). The problem addressed here is based on an analysis of the track record of graduates: the researchers found that 60 percent of Industrial Engineering FT UMJ students work in positions that do not match their level of education, so their income remains below standard. This study aims to improve the competence of students of the Industrial Engineering Department FT UMJ in entrepreneurship courses, particularly through the development of a learning model based on Problem Based Learning. The specific targets of this research are to identify and analyze the need to implement an entrepreneurship learning model based on Problem Based Learning, and to design and develop such a model to improve the competence, independence and creativity of Industrial Engineering students of FT UMJ in the Entrepreneurship course. To achieve these objectives the research uses the research and development (R&D) method. The products of this research are the detailed design of the entrepreneurship learning model based on Problem Based Learning and international journal publications

  5. Experience of Implementing Moisture Sorption Control in Historical Archives

    Directory of Open Access Journals (Sweden)

    P. Zítek

    2006-01-01

    Full Text Available This paper deals with a novel approach to inhibiting the harmful impact of moisture sorption in old art works and historical exhibits preserved in remote historic buildings that are in use as depositories or exhibition rooms for cultural heritage collections. It is a sequel to the previous work presented in [2], where the principle of moisture sorption stabilization was explained. Sorption isotherm investigations and EMC control implementation in historical buildings not provided with heating are the main concern in this paper. The proposed microclimate adjustment consists in leaving the interior temperature to run almost its spontaneous yearly cycle, while the air humidity is maintained in a specific relationship to the current interior temperature. The interior air humidity is modestly adjusted to protect historical exhibits and art works from harmful variations in the content of absorbed moisture, which would otherwise arise owing to the interior temperature drifts. Since direct measurements of moisture content are not feasible, the air humidity is controlled via a model-based principle. Two long-term implementations of the proposed microclimate control have already proved that it can permanently maintain a constant moisture content in the preserved exhibits. 

  6. Three Dimensional Dynamic Model Based Wind Field Reconstruction from Lidar Data

    International Nuclear Information System (INIS)

    Raach, Steffen; Schlipf, David; Haizmann, Florian; Cheng, Po Wen

    2014-01-01

    Using the inflowing horizontal and vertical wind shears for individual pitch control is a promising method if blade bending measurements are not available. Due to the limited information provided by a lidar system, the reconstruction of shears in real time is a challenging task, especially for the horizontal shear in the presence of changing wind direction. The internal model principle has been shown to be a promising approach for estimating the shears and directions in 10-minute averages with real measurement data. The static model based wind vector field reconstruction is extended in this work by taking into account a dynamic reconstruction model based on Taylor's Frozen Turbulence Hypothesis. The presented method provides time series over several seconds of the wind speed, shears and direction, which can be directly used in advanced optimal preview control. Therefore, this work is an important step towards the application of preview individual blade pitch control under realistic wind conditions. The method is tested using a turbulent wind field and a detailed lidar simulator. For the simulation, the turbulent wind field structure is flowing towards the lidar system and is continuously misaligned with respect to the horizontal axis of the wind turbine. Taylor's Frozen Turbulence Hypothesis is taken into account to model the wind evolution. For the reconstruction, the structure is discretized into several stages where each stage is reduced to an effective wind speed, superposed with a linear horizontal and vertical wind shear. Previous lidar measurements are shifted, again using Taylor's Hypothesis. The wind field reconstruction problem is then formulated as a nonlinear optimization problem, which minimizes the residual between the assumed wind model and the lidar measurements to obtain the misalignment angle and the effective wind speed and the wind shears for each stage. This method shows good results in reconstructing the wind characteristics of a three
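
    A minimal sketch of the kind of nonlinear least-squares reconstruction described above is shown below, assuming a single stage parameterized by an effective wind speed, linear horizontal and vertical shears, and a misalignment angle, fitted to synthetic line-of-sight lidar measurements. The geometry, noise levels, and parameter values are hypothetical and do not reproduce the authors' formulation.

```python
# Minimal sketch (assumed setup, not the authors' implementation): reconstruct an
# effective wind speed, linear horizontal/vertical shears, and a misalignment angle
# for one "stage" from line-of-sight lidar measurements by nonlinear least squares.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(2)

# Hypothetical lidar focus points (x toward the turbine, y lateral, z vertical).
pts = rng.uniform(-1.0, 1.0, size=(30, 3))
pts[:, 0] = rng.uniform(2.0, 4.0, size=30)            # beams point upstream
unit = pts / np.linalg.norm(pts, axis=1, keepdims=True)

def wind_vector(params, p):
    u0, dh, dv, yaw = params
    u = u0 + dh * p[:, 1] + dv * p[:, 2]               # speed with linear shears
    return np.column_stack([u * np.cos(yaw), u * np.sin(yaw), np.zeros(len(p))])

def los_speed(params, p, e):
    return np.sum(wind_vector(params, p) * e, axis=1)  # projection on beam direction

true = np.array([10.0, 0.8, 1.5, np.deg2rad(8.0)])
meas = los_speed(true, pts, unit) + 0.05 * rng.normal(size=len(pts))

fit = least_squares(lambda q: los_speed(q, pts, unit) - meas, x0=[8.0, 0.0, 0.0, 0.0])
print("estimated [u0, dh, dv]:", fit.x[:3], "yaw [deg]:", np.rad2deg(fit.x[3]))
```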

  7. Whole vertebral bone segmentation method with a statistical intensity-shape model based approach

    Science.gov (United States)

    Hanaoka, Shouhei; Fritscher, Karl; Schuler, Benedikt; Masutani, Yoshitaka; Hayashi, Naoto; Ohtomo, Kuni; Schubert, Rainer

    2011-03-01

    An automatic segmentation algorithm for the vertebrae in human body CT images is presented. In particular, we focused on constructing and utilizing four different statistical intensity-shape combined models for the cervical, upper/lower thoracic and lumbar vertebrae, respectively. For this purpose, two previously reported methods were combined: a deformable model-based initial segmentation method and a statistical shape-intensity model-based precise segmentation method. The former is used as a pre-processing step to detect the position and orientation of each vertebra, which determines the initial condition for the latter precise segmentation method. The precise segmentation method needs prior knowledge of both the intensities and the shapes of the objects. After PCA of such shape-intensity representations obtained from training image sets, vertebrae were parametrically modeled as a linear combination of the principal component vectors. The segmentation of each target vertebra was performed by fitting this parametric model to the target image by maximum a posteriori estimation, combined with the geodesic active contour method. In experiments using 10 cases, the initial segmentation was successful in 6 cases and only partially failed in 4 cases (2 in the cervical area and 2 in the lumbo-sacral). In the precise segmentation, the mean error distances were 2.078, 1.416, 0.777, 0.939 mm for cervical, upper and lower thoracic, lumbar spines, respectively. In conclusion, our automatic segmentation algorithm for the vertebrae in human body CT images showed a fair performance for cervical, thoracic and lumbar vertebrae.
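
    The following sketch illustrates the general statistical-model idea (PCA of training shape vectors followed by a MAP-style estimate of the mode weights) under invented data; it omits the intensity component, the geodesic active contour step, and everything else specific to the published method.

```python
# Illustrative sketch (assumed data layout, not the published method): build a
# PCA shape model from training shape vectors and fit it to a new target by a
# MAP-like regularized least-squares estimate of the principal-component weights.
import numpy as np

rng = np.random.default_rng(3)
n_train, n_values = 15, 300               # 15 training shapes, 100 landmarks x 3 coordinates
shapes = rng.normal(size=(n_train, n_values))

mean = shapes.mean(axis=0)
U, s, Vt = np.linalg.svd(shapes - mean, full_matrices=False)
n_modes = 5
P = Vt[:n_modes].T                        # principal modes (columns)
var = (s[:n_modes] ** 2) / (n_train - 1)  # variance captured by each mode

# Target shape = mean + P @ b_true + noise; recover b by MAP estimation.
b_true = rng.normal(size=n_modes) * np.sqrt(var)
target = mean + P @ b_true + 0.1 * rng.normal(size=n_values)

sigma2 = 0.1 ** 2                         # assumed observation noise variance
# MAP estimate with Gaussian prior b ~ N(0, diag(var)):
A = P.T @ P / sigma2 + np.diag(1.0 / var)
b_map = np.linalg.solve(A, P.T @ (target - mean) / sigma2)
print("true b:", np.round(b_true, 2))
print("MAP  b:", np.round(b_map, 2))
```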

  8. Cortical and hippocampal correlates of deliberation during model-based decisions for rewards in humans.

    Directory of Open Access Journals (Sweden)

    Aaron M Bornstein

    Full Text Available How do we use our memories of the past to guide decisions we've never had to make before? Although extensive work describes how the brain learns to repeat rewarded actions, decisions can also be influenced by associations between stimuli or events not directly involving reward - such as when planning routes using a cognitive map or chess moves using predicted countermoves - and these sorts of associations are critical when deciding among novel options. This process is known as model-based decision making. While the learning of environmental relations that might support model-based decisions is well studied, and separately this sort of information has been inferred to impact decisions, there is little evidence concerning the full cycle by which such associations are acquired and drive choices. Of particular interest is whether decisions are directly supported by the same mnemonic systems characterized for relational learning more generally, or instead rely on other, specialized representations. Here, building on our previous work, which isolated dual representations underlying sequential predictive learning, we directly demonstrate that one such representation, encoded by the hippocampal memory system and adjacent cortical structures, supports goal-directed decisions. Using interleaved learning and decision tasks, we monitor predictive learning directly and also trace its influence on decisions for reward. We quantitatively compare the learning processes underlying multiple behavioral and fMRI observables using computational model fits. Across both tasks, a quantitatively consistent learning process explains reaction times, choices, and both expectation- and surprise-related neural activity. The same hippocampal and ventral stream regions engaged in anticipating stimuli during learning are also engaged in proportion to the difficulty of decisions. These results support a role for predictive associations learned by the hippocampal memory system to

  9. Model-Based Engineering and Manufacturing CAD/CAM Benchmark

    International Nuclear Information System (INIS)

    Domm, T.D.; Underwood, R.S.

    1999-01-01

    The Benchmark Project was created from a desire to identify best practices and improve the overall efficiency and performance of the Y-12 Plant's systems and personnel supporting the manufacturing mission. The mission of the benchmark team was to search out industry leaders in manufacturing and evaluate their engineering practices and processes to determine direction and focus for Y-12 modernization efforts. The companies visited included several large established companies and a new, small, high-tech machining firm. As a result of this effort, changes are recommended that will enable Y-12 to become a more responsive, cost-effective manufacturing facility capable of supporting the needs of the Nuclear Weapons Complex (NWC) and Work For Others into the 21st century. The benchmark team identified key areas of interest, both focused and general. The focus areas included Human Resources, Information Management, Manufacturing Software Tools, and Standards/Policies and Practices. Areas of general interest included Infrastructure, Computer Platforms and Networking, and Organizational Structure. The method for obtaining the desired information in these areas centered on the creation of a benchmark questionnaire. The questionnaire was used throughout each of the visits as the basis for information gathering. The results of this benchmark showed that all companies are moving in the direction of model-based engineering and manufacturing. There was evidence that many companies are trying to grasp how to manage current and legacy data. In terms of engineering design software tools, the companies contacted were using both 3-D solid modeling and surfaced wire-frame models. The manufacturing computer tools were varied, with most companies using more than one software product to generate machining data and none currently performing model-based manufacturing (MBM) from a common model. The majority of companies were closer to identifying or using a single computer-aided design (CAD) system

  10. [Fatal amnioinfusion with previous choriocarcinoma in a parturient woman].

    Science.gov (United States)

    Hrgović, Z; Bukovic, D; Mrcela, M; Hrgović, I; Siebzehnrübl, E; Karelovic, D

    2004-04-01

    The case of a 36-year-old tercipara who developed choriocarcinoma in a previous pregnancy is described. During labour at term the patient suffered cardiac arrest, so resuscitation and caesarean section were performed. A male newborn was delivered in good condition, but despite intensive therapy and resuscitation the parturient died with a picture of disseminated intravascular coagulopathy (DIC). At autopsy and on histology there was no sign of malignant disease, so it was not possible to connect the previous choriocarcinoma with the amniotic fluid embolism. The site of the choriocarcinoma may have been a "locus minoris resistentiae" that later resulted in a failure of placentation, but this was hard to prove. At autopsy we found pulmonary embolism with microthrombosis of the terminal circulation and punctiform bleeding in the mucosa, consistent with DIC.

  11. Challenging previous conceptions of vegetarianism and eating disorders.

    Science.gov (United States)

    Fisak, B; Peterson, R D; Tantleff-Dunn, S; Molnar, J M

    2006-12-01

    The purpose of this study was to replicate and expand upon previous research that has examined the potential association between vegetarianism and disordered eating. Limitations of previous research studies are addressed, including possible low reliability of measures of eating pathology within vegetarian samples, use of only a few dietary restraint measures, and a paucity of research examining potential differences in body image and food choice motives of vegetarians versus nonvegetarians. Two hundred and fifty-six college students completed a number of measures of eating pathology and body image, and a food choice motives questionnaire. Interestingly, no significant differences were found between vegetarians and nonvegetarians in measures of eating pathology or body image. However, significant differences in food choice motives were found. Implications for both researchers and clinicians are discussed.

  12. Previously unreported abnormalities in Wolfram Syndrome Type 2.

    Science.gov (United States)

    Akturk, Halis Kaan; Yasa, Seda

    2017-01-01

    Wolfram syndrome (WFS) is a rare autosomal recessive disease with non-autoimmune childhood onset insulin dependent diabetes and optic atrophy. WFS type 2 (WFS2) differs from WFS type 1 (WFS1) by upper intestinal ulcers, bleeding tendency and the lack of diabetes insipidus. Lifespan is short due to related comorbidities. Only a few families have been reported with this syndrome with the CISD2 mutation. Here we report two siblings with a clinical diagnosis of WFS2, previously misdiagnosed with type 1 diabetes mellitus and diabetic retinopathy-related blindness. We report possible additional clinical and laboratory findings that have not been previously reported, such as asymptomatic hypoparathyroidism, osteomalacia, growth hormone (GH) deficiency and hepatomegaly. Even though not currently a requirement for the diagnosis of WFS2, our case series confirms hypogonadotropic hypogonadism to be also a feature of this syndrome, as reported before. © Polish Society for Pediatric Endocrinology and Diabetology.

  13. Model Based Mission Assurance in a Model Based Systems Engineering (MBSE) Framework: State-of-the-Art Assessment

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.

    2016-01-01

    This report explores the current state of the art of Safety and Mission Assurance (S&MA) in projects that have shifted towards Model Based Systems Engineering (MBSE). Its goal is to provide insight into how NASA's Office of Safety and Mission Assurance (OSMA) should respond to this shift. In MBSE, systems engineering information is organized and represented in models: rigorous computer-based representations, which collectively make many activities easier to perform, less error prone, and scalable. S&MA practices must shift accordingly. The "Objective Structure Hierarchies" recently developed by OSMA provide the framework for understanding this shift. Although the objectives themselves will remain constant, S&MA practices (activities, processes, tools) to achieve them are subject to change. This report presents insights derived from literature studies and interviews. The literature studies gleaned assurance implications from reports of space-related applications of MBSE. The interviews with knowledgeable S&MA and MBSE personnel discovered concerns and ideas for how assurance may adapt. Preliminary findings and observations are presented on the state of practice of S&MA with respect to MBSE, how it is already changing, and how it is likely to change further. Finally, recommendations are provided on how to foster the evolution of S&MA to best fit with MBSE.

  14. Adaptive surrogate model based multiobjective optimization for coastal aquifer management

    Science.gov (United States)

    Song, Jian; Yang, Yun; Wu, Jianfeng; Wu, Jichun; Sun, Xiaomin; Lin, Jin

    2018-06-01

    In this study, a novel surrogate model assisted multiobjective memetic algorithm (SMOMA) is developed for optimal pumping strategies of large-scale coastal groundwater problems. The proposed SMOMA integrates an efficient data-driven surrogate model with an improved non-dominated sorted genetic algorithm-II (NSGAII) that employs a local search operator to accelerate its convergence in optimization. The surrogate model based on Kernel Extreme Learning Machine (KELM) is developed and evaluated as an approximate simulator to generate the patterns of regional groundwater flow and salinity levels in coastal aquifers for reducing huge computational burden. The KELM model is adaptively trained during evolutionary search to satisfy desired fidelity level of surrogate so that it inhibits error accumulation of forecasting and results in correctly converging to true Pareto-optimal front. The proposed methodology is then applied to a large-scale coastal aquifer management in Baldwin County, Alabama. Objectives of minimizing the saltwater mass increase and maximizing the total pumping rate in the coastal aquifers are considered. The optimal solutions achieved by the proposed adaptive surrogate model are compared against those solutions obtained from one-shot surrogate model and original simulation model. The adaptive surrogate model does not only improve the prediction accuracy of Pareto-optimal solutions compared with those by the one-shot surrogate model, but also maintains the equivalent quality of Pareto-optimal solutions compared with those by NSGAII coupled with original simulation model, while retaining the advantage of surrogate models in reducing computational burden up to 94% of time-saving. This study shows that the proposed methodology is a computationally efficient and promising tool for multiobjective optimizations of coastal aquifer managements.
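
    As a rough illustration of how a KELM-type surrogate can replace expensive simulator calls inside a multiobjective search, the sketch below trains a kernel-based surrogate (formulated here as a regularized kernel solve) on hypothetical pumping-rate samples and then queries it cheaply for many candidates. The kernel, regularization constant, data, and objective are all assumptions, and the adaptive retraining and NSGA-II coupling of the paper are not reproduced.

```python
# Minimal sketch (assumed formulation): a Kernel Extreme Learning Machine (KELM)
# style surrogate, which amounts to a kernel ridge-type solve, trained on
# simulator samples and then queried cheaply inside a multiobjective search.
import numpy as np

rng = np.random.default_rng(4)

def rbf_kernel(A, B, gamma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Hypothetical training data: pumping rates -> saltwater mass increase.
X = rng.uniform(0.0, 1.0, size=(100, 4))
y = np.sin(X @ np.array([3.0, -2.0, 1.0, 0.5])) + 0.02 * rng.normal(size=100)

C = 100.0                                   # regularization constant (assumed)
K = rbf_kernel(X, X)
beta = np.linalg.solve(K + np.eye(len(X)) / C, y)

def surrogate(Xq):
    return rbf_kernel(Xq, X) @ beta

# Cheap evaluation of many candidate pumping strategies (e.g., inside NSGA-II):
candidates = rng.uniform(0.0, 1.0, size=(1000, 4))
scores = surrogate(candidates)
print("best candidate:", candidates[np.argmin(scores)])
```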

  15. Medical Device Integration Model Based on the Internet of Things

    Science.gov (United States)

    Hao, Aiyu; Wang, Ling

    2015-01-01

    At present, hospitals in our country have basically established the HIS system, which manages registration, treatment, and charging of patients, among many other functions. During treatment, patients need to use medical devices repeatedly to acquire all sorts of inspection data. Currently, the output data of the medical devices are often manually entered into the information system, which is error-prone and can cause mismatches between inspection reports and patients. For some small hospitals whose information infrastructure is still relatively weak, the information generated by the devices is still presented in the form of paper reports. When doctors or patients want to have access to the data at a given time again, they can only look at the paper files. Data integration between medical devices has long been a difficult problem for the medical information system, because the data from medical devices lack mandatory unified global standards and the devices are highly heterogeneous. In order to protect their own interests, manufacturers use proprietary protocols, etc., thus causing medical devices to remain the "lonely island" of the hospital information system. Besides, unfocused application of the data leads to failure to achieve a reasonable distribution of medical resources. With the deepening of IT construction in hospitals, medical information systems are bound to develop towards mobile applications, intelligent analysis, and interconnection and interworking, on the premise that there is an effective medical device integration (MDI) technology. To this end, this paper presents an MDI model based on the Internet of Things (IoT). Through abstract classification, this model is able to extract the common characteristics of the devices, resolve the heterogeneous differences between them, and employ a unified protocol to integrate data between devices. And by means of IoT technology, it realizes an interconnection network of devices and conducts associate matching

  16. A satellite and model based flood inundation climatology of Australia

    Science.gov (United States)

    Schumann, G.; Andreadis, K.; Castillo, C. J.

    2013-12-01

    To date there is no coherent and consistent database on observed or simulated flood event inundation and magnitude at large scales (continental to global). The only compiled data set showing a consistent history of flood inundation area and extent at a near global scale is provided by the MODIS-based Dartmouth Flood Observatory. However, MODIS satellite imagery is only available from 2000 and is hampered by a number of issues associated with flood mapping using optical images (e.g. classification algorithms, cloud cover, vegetation). Here, we present for the first time a proof-of-concept study in which we employ a computationally efficient 2-D hydrodynamic model (LISFLOOD-FP) complemented with a sub-grid channel formulation to generate a complete flood inundation climatology of the past 40 years (1973-2012) for the entire Australian continent. The model was built completely from freely available SRTM-derived data, including channel widths, bank heights and floodplain topography, which was corrected for vegetation canopy height using a global ICESat canopy dataset. Channel hydraulics were resolved using actual channel data and bathymetry was estimated within the model using hydraulic geometry. On the floodplain, the model simulated the flow paths and inundation variables at a 1 km resolution. The developed model was run over a period of 40 years and a floodplain inundation climatology was generated and compared to satellite flood event observations. Our proof-of-concept study demonstrates that this type of model can reliably simulate past flood events with reasonable accuracies both in time and space. The Australian model was forced with both observed flow climatology and VIC-simulated flows in order to assess the feasibility of a model-based flood inundation climatology at the global scale.

  17. Model-based setup assistant for progressive tools

    Science.gov (United States)

    Springer, Robert; Gräler, Manuel; Homberg, Werner; Henke, Christian; Trächtler, Ansgar

    2018-05-01

    In the field of production systems, globalization and technological progress lead to increasing requirements regarding part quality, delivery time and costs. Hence, today's production is challenged much more than a few years ago: it has to be very flexible and produce economically small batch sizes to satisfy consumer's demands and avoid unnecessary stock. Furthermore, a trend towards increasing functional integration continues to lead to an ongoing miniaturization of sheet metal components. In the industry of electric connectivity for example, the miniaturized connectors are manufactured by progressive tools, which are usually used for very large batches. These tools are installed in mechanical presses and then set up by a technician, who has to manually adjust a wide range of punch-bending operations. Disturbances like material thickness, temperatures, lubrication or tool wear complicate the setup procedure. In prospect of the increasing demand of production flexibility, this time-consuming process has to be handled more and more often. In this paper, a new approach for a model-based setup assistant is proposed as a solution, which is exemplarily applied in combination with a progressive tool. First, progressive tools, more specifically, their setup process is described and based on that, the challenges are pointed out. As a result, a systematic process to set up the machines is introduced. Following, the process is investigated with an FE-Analysis regarding the effects of the disturbances. In the next step, design of experiments is used to systematically develop a regression model of the system's behaviour. This model is integrated within an optimization in order to calculate optimal machine parameters and the following necessary adjustment of the progressive tool due to the disturbances. Finally, the assistant is tested in a production environment and the results are discussed.
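
    The regression-plus-optimization step described above can be sketched as follows: a simple interaction regression model is fitted to invented design-of-experiments data relating a press adjustment and a disturbance to a quality error, and the recommended adjustment is then obtained by minimizing the predicted error for a measured disturbance. All variable names and values are hypothetical and not taken from the paper.

```python
# Illustrative sketch (hypothetical quantities): fit a regression model of a
# quality characteristic from design-of-experiments data, then compute the press
# adjustment that compensates a measured disturbance (e.g., material thickness).
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(5)

# DoE data: columns = [ram adjustment (mm), material thickness deviation (mm)],
# response = bending-angle error (degrees); values are invented for illustration.
X = rng.uniform(-0.5, 0.5, size=(50, 2))
y = 4.0 * X[:, 0] - 2.5 * X[:, 1] + 1.2 * X[:, 0] * X[:, 1] + 0.05 * rng.normal(size=50)

def features(adj, thick):
    """Quadratic-interaction design matrix: intercept, adj, thick, adj*thick."""
    adj, thick = np.atleast_1d(adj), np.atleast_1d(thick)
    return np.column_stack([np.ones(adj.size), adj, thick, adj * thick])

coef, *_ = np.linalg.lstsq(features(X[:, 0], X[:, 1]), y, rcond=None)

def predicted_error(adj, thick):
    return features(adj, thick) @ coef

# Setup assistant: given the measured thickness deviation, pick the adjustment
# that drives the predicted bending-angle error toward zero.
measured_thickness_dev = 0.2
res = minimize_scalar(lambda a: predicted_error(a, measured_thickness_dev)[0] ** 2,
                      bounds=(-0.5, 0.5), method="bounded")
print("recommended ram adjustment [mm]:", round(res.x, 3))
```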

  18. Background-Modeling-Based Adaptive Prediction for Surveillance Video Coding.

    Science.gov (United States)

    Zhang, Xianguo; Huang, Tiejun; Tian, Yonghong; Gao, Wen

    2014-02-01

    The exponential growth of surveillance videos presents an unprecedented challenge for high-efficiency surveillance video coding technology. Compared with the existing coding standards that were basically developed for generic videos, surveillance video coding should be designed to make the best use of the special characteristics of surveillance videos (e.g., relative static background). To do so, this paper first conducts two analyses on how to improve the background and foreground prediction efficiencies in surveillance video coding. Following the analysis results, we propose a background-modeling-based adaptive prediction (BMAP) method. In this method, all blocks to be encoded are firstly classified into three categories. Then, according to the category of each block, two novel inter predictions are selectively utilized, namely, the background reference prediction (BRP) that uses the background modeled from the original input frames as the long-term reference and the background difference prediction (BDP) that predicts the current data in the background difference domain. For background blocks, the BRP can effectively improve the prediction efficiency using the higher quality background as the reference; whereas for foreground-background-hybrid blocks, the BDP can provide a better reference after subtracting its background pixels. Experimental results show that the BMAP can achieve at least twice the compression ratio on surveillance videos as AVC (MPEG-4 Advanced Video Coding) high profile, yet with a slightly additional encoding complexity. Moreover, for the foreground coding performance, which is crucial to the subjective quality of moving objects in surveillance videos, BMAP also obtains remarkable gains over several state-of-the-art methods.
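
    A schematic sketch of the block-classification step is given below: each block is compared against a modeled background and routed to background reference prediction, background difference prediction, or ordinary inter prediction depending on its deviation. The thresholds, block size, and synthetic frame are assumptions; the actual BMAP codec logic is far more elaborate.

```python
# Schematic sketch (not the codec implementation): classify each block against a
# modeled background and route it to background reference prediction (BRP),
# background difference prediction (BDP), or ordinary inter prediction.
import numpy as np

rng = np.random.default_rng(6)
H, W, B = 64, 64, 16                          # frame size and block size (hypothetical)

background = rng.integers(0, 256, size=(H, W)).astype(np.float64)  # modeled background
frame = background.copy()
frame[20:36, 20:36] += 60.0                   # synthetic foreground object
frame += rng.normal(scale=1.0, size=frame.shape)                   # sensor noise

def classify_block(block, bg_block, t_low=2.0, t_high=20.0):
    mad = np.mean(np.abs(block - bg_block))   # mean absolute difference to background
    if mad < t_low:
        return "background"                   # -> BRP: predict from the modeled background
    if mad > t_high:
        return "foreground"                   # -> ordinary inter prediction
    return "hybrid"                           # -> BDP: code the block minus its background

counts = {"background": 0, "foreground": 0, "hybrid": 0}
for by in range(0, H, B):
    for bx in range(0, W, B):
        blk = frame[by:by + B, bx:bx + B]
        bg = background[by:by + B, bx:bx + B]
        counts[classify_block(blk, bg)] += 1  # a real encoder would now code the residual
print(counts)
```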

  19. Model-based sensor-augmented pump therapy.

    Science.gov (United States)

    Grosman, Benyamin; Voskanyan, Gayane; Loutseiko, Mikhail; Roy, Anirban; Mehta, Aloke; Kurtz, Natalie; Parikh, Neha; Kaufman, Francine R; Mastrototaro, John J; Keenan, Barry

    2013-03-01

    In insulin pump therapy, optimization of bolus and basal insulin dose settings is a challenge. We introduce a new algorithm that provides individualized basal rates and new carbohydrate ratio and correction factor recommendations. The algorithm utilizes a mathematical model of blood glucose (BG) as a function of carbohydrate intake and delivered insulin, which includes individualized parameters derived from sensor BG and insulin delivery data downloaded from a patient's pump. A mathematical model of BG as a function of carbohydrate intake and delivered insulin was developed. The model includes fixed parameters and several individualized parameters derived from the subject's BG measurements and pump data. Performance of the new algorithm was assessed using n = 4 diabetic canine experiments over a 32 h duration. In addition, 10 in silico adults from the University of Virginia/Padova type 1 diabetes mellitus metabolic simulator were tested. The percentage of time in glucose range 80-180 mg/dl was 86%, 85%, 61%, and 30% using model-based therapy and [78%, 100%] (brackets denote multiple experiments conducted under the same therapy and animal model), [75%, 67%], 47%, and 86% for the control experiments for dogs 1 to 4, respectively. The BG measurements obtained in the simulation using our individualized algorithm were in 61-231 mg/dl min-max envelope, whereas use of the simulator's default treatment resulted in BG measurements 90-210 mg/dl min-max envelope. The study results demonstrate the potential of this method, which could serve as a platform for improving, facilitating, and standardizing insulin pump therapy based on a single download of data. © 2013 Diabetes Technology Society.

  20. Comparison between two photovoltaic module models based on transistors

    Science.gov (United States)

    Saint-Eve, Frédéric; Sawicki, Jean-Paul; Petit, Pierre; Maufay, Fabrice; Aillerie, Michel

    2018-05-01

    The main objective of this paper is to verify whether the behavior of an unshaded photovoltaic (PV) module can be simulated by a simple electronic circuit with very few components. In particular, two models based on well-tried elementary structures are analyzed: the Darlington structure in the first model and voltage regulation with a programmable Zener diode in the second. Specifications extracted from the behavior of a real I-V characteristic of a panel are considered and the principal electrical variables are deduced. The two models are expected to match the open circuit voltage, maximum power point (MPP) and short circuit current, without neglecting realistic current slopes on both sides of the MPP. Robustness under varying irradiance is considered as an additional fundamental property. For both models, two simulations are done to identify the influence of some parameters. In the first model, a parameter that allows adjusting the current slope on the left side of the MPP also proves to be important for the calculation of the open circuit voltage. Moreover, this model does not allow a full adjustment of the I-V characteristic, and the MPP moves significantly away from the real value when irradiance increases. In contrast, the second model seems to have only advantages: the open circuit voltage is easy to calculate, the current slopes are realistic, and there appears to be good robustness when irradiance variations are simulated by adjusting the short circuit current of the PV module. We have shown that these two simplified models should enable reliable and easier simulations of complex PV architectures integrating many different devices, such as PV modules or other renewable energy sources and storage capacities coupled in parallel.

  1. Previous climatic alterations are caused by the sun

    International Nuclear Information System (INIS)

    Groenaas, Sigbjoern

    2003-01-01

    The article surveys the scientific results of previous research into the contribution of the sun to climatic alterations. The author concludes that there is evidence of eight cold periods after the last ice age and that these alterations were largely due to climatic effects of the sun. However, such effects account for only a fraction of the registered global warming. It is assumed that human activities contribute the rest of the greenhouse effect

  2. Influence of previous knowledge in Torrance tests of creative thinking

    OpenAIRE

    Aranguren, María; Consejo Nacional de Investigaciones Científicas y Técnicas CONICET

    2015-01-01

    The aim of this work is to analyze the influence of study field, expertise and participation in recreational activities on Torrance Tests of Creative Thinking (TTCT, 1974) performance. Several hypotheses were postulated to explore the possible effects of previous knowledge on university students' TTCT verbal and TTCT figural outcomes. Participants in this study included 418 students from five study fields: Psychology; Philosophy and Literature; Music; Engineering; and Journalism and Advertisin...

  3. Analysis of previous screening examinations for patients with breast cancer

    International Nuclear Information System (INIS)

    Lee, Eun Hye; Cha, Joo Hee; Han, Dae Hee; Choi, Young Ho; Hwang, Ki Tae; Ryu, Dae Sik; Kwak, Jin Ho; Moon, Woo Kyung

    2007-01-01

    We wanted to improve the quality of subsequent screening by reviewing the previous screening examinations of breast cancer patients. Twenty-four breast cancer patients who had undergone previous screening were enrolled. All 24 underwent mammography and 15 also underwent sonography. We reviewed the screening retrospectively according to the BI-RADS criteria and categorized the results into false negative, true negative, true positive and occult cancers. We also categorized the causes of false negative cancers into misperception, misinterpretation and technical factors, and then analyzed the contributing factors. Review of the previous screening revealed 66.7% (16/24) false negative, 25.0% (6/24) true negative, and 8.3% (2/24) true positive cancers. False negative cancers were caused by the mammogram in 56.3% (9/16) and by the sonogram in 43.7% (7/16). Among the false negative cases, all instances of misperception were related to mammograms and were attributed to dense breasts, a lesion located at the edge of the glandular tissue or the image, and findings seen on one view only. Almost all misinterpretations were related to sonograms and attributed to loose application of the final assessment. To improve the quality of breast screening, it is essential to overcome the main causes of false negative examinations, including misperception and misinterpretation. We need systematic education and strict application of the final assessment categories of BI-RADS. For effective communication among physicians, it is also necessary to properly educate them about BI-RADS

  4. Wind farms providing secondary frequency regulation: evaluating the performance of model-based receding horizon control

    Directory of Open Access Journals (Sweden)

    C. R. Shapiro

    2018-01-01

    Full Text Available This paper is an extended version of our paper presented at the 2016 TORQUE conference (Shapiro et al., 2016. We investigate the use of wind farms to provide secondary frequency regulation for a power grid using a model-based receding horizon control framework. In order to enable real-time implementation, the control actions are computed based on a time-varying one-dimensional wake model. This model describes wake advection and wake interactions, both of which play an important role in wind farm power production. In order to test the control strategy, it is implemented in a large-eddy simulation (LES model of an 84-turbine wind farm using the actuator disk turbine representation. Rotor-averaged velocity measurements at each turbine are used to provide feedback for error correction. The importance of including the dynamics of wake advection in the underlying wake model is tested by comparing the performance of this dynamic-model control approach to a comparable static-model control approach that relies on a modified Jensen model. We compare the performance of both control approaches using two types of regulation signals, RegA and RegD, which are used by PJM, an independent system operator in the eastern United States. The poor performance of the static-model control relative to the dynamic-model control demonstrates that modeling the dynamics of wake advection is key to providing the proposed type of model-based coordinated control of large wind farms. We further explore the performance of the dynamic-model control via composite performance scores used by PJM to qualify plants for regulation services or markets. Our results demonstrate that the dynamic-model-controlled wind farm consistently performs well, passing the qualification threshold for all fast-acting RegD signals. For the RegA signal, which changes over slower timescales, the dynamic-model control leads to average performance that surpasses the qualification threshold, but further
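
    The receding-horizon idea can be illustrated with a deliberately simplified toy loop: a first-order lag stands in for the one-dimensional wake-advection model, and a brute-force search over candidate farm-level power offsets replaces the full model-based optimization; only the first move of each plan is applied before re-planning. All parameters and the regulation signal are invented and unrelated to the wind farm or signals studied in the paper.

```python
# Toy sketch (greatly simplified, hypothetical parameters): a receding-horizon
# loop commanding a farm-level power offset to track a regulation signal, using
# a first-order lag as the internal model and a brute-force search over
# candidate offsets instead of a full optimizer.
import numpy as np

dt, horizon, tau = 1.0, 10, 20.0               # time step [s], horizon steps, wake lag [s]
alpha = dt / tau

def advance(power, command):
    """First-order lag standing in for wake advection dynamics."""
    return power + alpha * (command - power)

rng = np.random.default_rng(7)
reg_signal = 5.0 * np.sin(0.02 * np.arange(300))   # demanded power deviation [MW]

power, command = 0.0, 0.0
tracked = []
candidates = np.linspace(-6.0, 6.0, 25)
for t in range(300):
    # Predict tracking error over the horizon for each candidate command.
    best, best_cost = command, np.inf
    for c in candidates:
        p, cost = power, 0.0
        for k in range(horizon):
            p = advance(p, c)
            ref = reg_signal[min(t + k, len(reg_signal) - 1)]
            cost += (p - ref) ** 2
        if cost < best_cost:
            best, best_cost = c, cost
    command = best                              # apply first move, then re-plan
    power = advance(power, command) + 0.05 * rng.normal()
    tracked.append(power)

rmse = np.sqrt(np.mean((np.array(tracked) - reg_signal) ** 2))
print("RMS tracking error [MW]:", round(float(rmse), 3))
```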

  5. Model-based visual tracking the OpenTL framework

    CERN Document Server

    Panin, Giorgio

    2011-01-01

    This book has two main goals: to provide a unified and structured overview of this growing field, as well as to propose a corresponding software framework, the OpenTL library, developed by the author and his working group at TUM-Informatik. The main objective of this work is to show how most real-world application scenarios can be naturally cast into a common description vocabulary, and therefore implemented and tested in a fully modular and scalable way, through the definition of a layered, object-oriented software architecture. The resulting architecture covers in a seamless way all processin

  6. Model-based multi-fringe interferometry using Zernike polynomials

    Science.gov (United States)

    Gu, Wei; Song, Weihong; Wu, Gaofeng; Quan, Haiyang; Wu, Yongqian; Zhao, Wenchuan

    2018-06-01

    In this paper, a general phase retrieval method is proposed, which is based on one single interferogram with a small amount of fringes (either tilt or power). Zernike polynomials are used to characterize the phase to be measured; the phase distribution is reconstructed by a non-linear least squares method. Experiments show that the proposed method can obtain satisfactory results compared to the standard phase-shifting interferometry technique. Additionally, the retrace errors of proposed method can be neglected because of the few fringes; it does not need any auxiliary phase shifting facilities (low cost) and it is easy to implement without the process of phase unwrapping.
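
    A minimal sketch of this kind of model-based fit is shown below: the interferogram is modeled as I = a + b*cos(phi), with phi expanded in a few low-order Zernike-like terms, and the coefficients are recovered by nonlinear least squares from a single synthetic few-fringe interferogram. The basis truncation, starting guess, and noise level are assumptions, not the authors' settings.

```python
# Minimal sketch (low-order Zernike terms only, synthetic data): recover phase
# coefficients from a single few-fringe interferogram by fitting the fringe model
# I = a + b*cos(phi), phi = sum_j c_j Z_j, with nonlinear least squares.
import numpy as np
from scipy.optimize import least_squares

n = 64
yy, xx = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n), indexing="ij")
r2 = xx ** 2 + yy ** 2
mask = r2 <= 1.0                               # unit-circle pupil

# First few Zernike-like basis terms: piston, x tilt, y tilt, defocus.
Z = np.stack([np.ones_like(xx), xx, yy, 2.0 * r2 - 1.0])[:, mask]

def intensity(params):
    a, b = params[0], params[1]
    phi = params[2:] @ Z                       # phase as a Zernike expansion
    return a + b * np.cos(phi)

rng = np.random.default_rng(8)
true = np.array([0.5, 0.4, 0.3, 2.0, 0.0, 1.1])          # a, b, piston, tilts, defocus
data = intensity(true) + 0.01 * rng.normal(size=mask.sum())

# Fit from a reasonable starting guess (a local optimizer needs one because of
# the cosine's sign and 2*pi ambiguities).
fit = least_squares(lambda q: intensity(q) - data,
                    x0=np.array([0.5, 0.4, 0.0, 1.5, 0.0, 0.8]))
print("recovered Zernike coefficients:", np.round(fit.x[2:], 3))
```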

  7. Model-based development and testing of advertising messages

    DEFF Research Database (Denmark)

    Bech-Larsen, Tino

    2000-01-01

    The implementation of valid and comprehensible guidelines for message development potentially enhances the effects of advertising messages and improves the possibility of measuring such effects. Moreover, such guidelines also have potential implications for the managerial communication processes...... (client-agency and intra-agency) involved in the development of advertising messages. The purpose of the study described in this paper is to compare the development and effects of two campaign proposals, with the common aim of increasing the consumption of apples among young Danes (18 to 35 years of age......-test with the target group (n=500), as well as interviews with the involved advertising agency and client staff....

  8. Hybrid network defense model based on fuzzy evaluation.

    Science.gov (United States)

    Cho, Ying-Chiang; Pan, Jen-Yi

    2014-01-01

    With sustained and rapid developments in the field of information technology, the issue of network security has become increasingly prominent. The theme of this study is network data security, with the test subject being a classified and sensitive network laboratory that belongs to the academic network. The analysis is based on the deficiencies and potential risks of the network's existing defense technology, characteristics of cyber attacks, and network security technologies. Subsequently, a distributed network security architecture using the technology of an intrusion prevention system is designed and implemented. In this paper, first, the overall design approach is presented. This design is used as the basis to establish a network defense model, an improvement over the traditional single-technology model that addresses the latter's inadequacies. Next, a distributed network security architecture is implemented, comprising a hybrid firewall, intrusion detection, virtual honeynet projects, and connectivity and interactivity between these three components. Finally, the proposed security system is tested. A statistical analysis of the test results verifies the feasibility and reliability of the proposed architecture. The findings of this study will potentially provide new ideas and stimuli for future designs of network security architecture.

  9. Model Based Optimal Sensor Network Design for Condition Monitoring in an IGCC Plant

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, Rajeeva; Kumar, Aditya; Dai, Dan; Seenumani, Gayathri; Down, John; Lopez, Rodrigo

    2012-12-31

    these two formulations were developed and validated. For a given OSP problem, the computational efficiency largely depends on the “size” of the problem. Initially, a simplified 1-D gasifier model assuming axial and azimuthal symmetry was used to test the various OSP algorithms. Finally, these algorithms were used to design the optimal sensor network for condition monitoring of IGCC gasifier refractory wear and RSC fouling. The sensor types and locations obtained as the solution to the OSP problem were validated using a model-based sensing approach. The OSP algorithm has been developed in modular form and packaged as a software tool for OSP design, in which a designer can explore the various OSP design algorithms in a user-friendly way. The OSP software tool is implemented in-house in Matlab/Simulink©. The tool also uses a few optimization routines that are freely available on the World Wide Web. In addition, a modular Extended Kalman Filter (EKF) block has been developed in Matlab/Simulink© that can be used for model-based sensing of important process variables that are not directly measured, by combining the online sensors with model-based estimation once the hardware sensors and their locations have been finalized. The OSP algorithm details and the results of applying these algorithms to obtain optimal sensor locations for condition monitoring of the gasifier refractory wear and RSC fouling profile are summarized in this final report.
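
    The report's EKF block is a Matlab/Simulink© module; purely to illustrate the predict/update structure that such model-based sensing relies on, here is a generic discrete-time extended Kalman filter sketch with an invented two-state process model.

      # Generic EKF sketch (illustrative only, not the packaged OSP/EKF tool).
      import numpy as np

      def ekf_step(x, P, u, z, f, h, F_jac, H_jac, Q, R):
          """One predict/update cycle; f, h are process/measurement models and
          F_jac, H_jac their Jacobians at the current estimate."""
          x_pred = f(x, u)                         # predict with the process model
          F = F_jac(x, u)
          P_pred = F @ P @ F.T + Q
          y = z - h(x_pred)                        # innovation from the hardware sensor
          H = H_jac(x_pred)
          K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
          return x_pred + K @ y, (np.eye(len(x)) - K @ H) @ P_pred

      # toy 2-state example: infer an unmeasured state x[1] from a measured x[0]
      f = lambda x, u: np.array([0.9*x[0] + 0.1*x[1] + u, 0.02*x[0] + 0.98*x[1]])
      h = lambda x: np.array([x[0]])
      F_jac = lambda x, u: np.array([[0.9, 0.1], [0.02, 0.98]])
      H_jac = lambda x: np.array([[1.0, 0.0]])
      Q, R = 1e-3 * np.eye(2), np.array([[0.05]])
      x, P = np.zeros(2), np.eye(2)
      rng = np.random.default_rng(0)
      for k in range(50):
          z = np.array([5.0 + 0.2 * rng.standard_normal()])   # fake sensor reading
          x, P = ekf_step(x, P, 1.0, z, f, h, F_jac, H_jac, Q, R)
      print("estimated states:", np.round(x, 2))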

  10. Drawing-to-learn: a framework for using drawings to promote model-based reasoning in biology.

    Science.gov (United States)

    Quillin, Kim; Thomas, Stephen

    2015-03-02

    The drawing of visual representations is important for learners and scientists alike, such as the drawing of models to enable visual model-based reasoning. Yet few biology instructors recognize drawing as a teachable science process skill, as reflected by its absence in the Vision and Change report's Modeling and Simulation core competency. Further, the diffuse research on drawing can be difficult to access, synthesize, and apply to classroom practice. We have created a framework of drawing-to-learn that defines drawing, categorizes the reasons for using drawing in the biology classroom, and outlines a number of interventions that can help instructors create an environment conducive to student drawing in general and visual model-based reasoning in particular. The suggested interventions are organized to address elements of affect, visual literacy, and visual model-based reasoning, with specific examples cited for each. Further, a Blooming tool for drawing exercises is provided, as are suggestions to help instructors address possible barriers to implementing and assessing drawing-to-learn in the classroom. Overall, the goal of the framework is to increase the visibility of drawing as a skill in biology and to promote the research and implementation of best practices.

  11. Moyamoya disease in a child with previous acute necrotizing encephalopathy

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Taik-Kun; Cha, Sang Hoon; Chung, Kyoo Byung; Kim, Jung Hyuck; Kim, Baek Hyun; Chung, Hwan Hoon [Department of Diagnostic Radiology, Korea University College of Medicine, Ansan Hospital, 516 Kojan-Dong, Ansan City, Kyungki-Do 425-020 (Korea); Eun, Baik-Lin [Department of Pediatrics, Korea University College of Medicine, Seoul (Korea)

    2003-09-01

    A previously healthy 24-day-old boy presented with a 2-day history of fever and had a convulsion on the day of admission. MRI showed abnormal signal in the thalami, caudate nuclei and central white matter. Acute necrotising encephalopathy was diagnosed, other causes having been excluded after biochemical and haematological analysis of blood, urine and CSF. He recovered, but with spastic quadriparesis. At the age of 28 months, he suffered sudden deterioration of consciousness and motor weakness of his right limbs. MRI was consistent with an acute cerebrovascular accident. Angiography showed bilateral middle cerebral artery stenosis or frank occlusion with numerous lenticulostriate collateral vessels consistent with moyamoya disease. (orig.)

  12. MCNP HPGe detector benchmark with previously validated Cyltran model.

    Science.gov (United States)

    Hau, I D; Russ, W R; Bronson, F

    2009-05-01

    An exact copy of the detector model generated for Cyltran was reproduced as an MCNP input file, and the detection efficiency was calculated with the same methodology used in previous experimental measurements and simulations of a 280 cm(3) HPGe detector. Below 1000 keV the MCNP data agreed with the Cyltran results within 0.5%, while above this energy the difference between MCNP and Cyltran increased to about 6% at 4800 keV, depending on the electron cut-off energy.

  13. Model-based PEEP optimisation in mechanical ventilation

    Directory of Open Access Journals (Sweden)

    Chiew Yeong Shiong

    2011-12-01

    Full Text Available Abstract Background Acute Respiratory Distress Syndrome (ARDS) patients require mechanical ventilation (MV) for breathing support. Patient-specific PEEP is encouraged for treating different patients, but there is no well-established method for optimal PEEP selection. Methods A study of 10 patients diagnosed with ALI/ARDS who underwent a recruitment manoeuvre is carried out. Airway pressure and flow data are used to identify patient-specific constant lung elastance (Elung) and time-variant dynamic lung elastance (Edrs) at each PEEP level (increments of 5 cmH2O), for a single-compartment linear lung model, using integral-based methods. Optimal PEEP is estimated using the Elung versus PEEP curve, the Edrs-Pressure curve and the Edrs Area, at minimum elastance (maximum compliance) and at the inflection of the curves (diminishing return). Results are compared to clinically selected PEEP values. The trials and use of the data were approved by the New Zealand South Island Regional Ethics Committee. Results Median absolute percentage fitting error to the data when estimating time-variant Edrs is 0.9% [IQR: 0.5-2.4] and 5.6% [IQR: 1.8-11.3] when estimating constant Elung. Both Elung and Edrs decrease with PEEP to a minimum before rising, indicating potential over-inflation. Median Edrs over all patients across all PEEP values was 32.2 cmH2O/l [IQR: 26.1-46.6], reflecting the heterogeneity of ALI/ARDS patients and their response to PEEP, which complicates standard approaches to PEEP selection. All Edrs-Pressure curves have a clear inflection point before minimum Edrs, making PEEP selection straightforward. Model-based selected PEEP using the proposed metrics was higher than the clinically selected values in 7/10 cases. Conclusion Continuous monitoring of the patient-specific Elung and Edrs and minimally invasive PEEP titration provide a unique, patient-specific and physiologically relevant metric to optimize PEEP selection with minimal disruption of MV therapy.
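
    As a concrete illustration of the integral-based identification the Methods describe, the sketch below fits the single-compartment model P(t) = E·V(t) + R·Q(t) + P0 to synthetic airway pressure/flow data by integrating the model equation and solving a linear least-squares problem. The waveform, parameter values and function names are assumptions, not patient data from the study.

      # Hedged sketch of integral-based identification of lung elastance E and
      # resistance R from airway pressure and flow (synthetic breath, assumed values).
      import numpy as np

      def identify_E_R(t, pressure, flow):
          V = np.concatenate(([0.0], np.cumsum(0.5*(flow[1:] + flow[:-1])*np.diff(t))))
          # integrate both sides of P = E*V + R*Q + P0 to suppress measurement noise:
          #   int P dt = E*int V dt + R*V + P0*t
          IP = np.cumsum(0.5*(pressure[1:] + pressure[:-1])*np.diff(t))
          IV = np.cumsum(0.5*(V[1:] + V[:-1])*np.diff(t))
          A = np.column_stack([IV, V[1:], t[1:]])
          E, R, P0 = np.linalg.lstsq(A, IP, rcond=None)[0]
          return E, R, P0

      # synthetic breath with E = 30 cmH2O/l, R = 10 cmH2O.s/l, PEEP = 10 cmH2O
      t = np.linspace(0, 2, 400)
      Q = np.where(t < 1.0, 0.5, -0.5)                      # square-wave flow [l/s]
      V = np.concatenate(([0.0], np.cumsum(0.5*(Q[1:] + Q[:-1])*np.diff(t))))
      P = 30*V + 10*Q + 10 + 0.2*np.random.default_rng(1).standard_normal(t.size)
      print(np.round(identify_E_R(t, P, Q), 1))             # expect roughly (30, 10, 10)

    Repeating such a fit at each PEEP step and looking for the PEEP at which the identified elastance is minimal is the essence of the selection metric the abstract describes.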

  14. Uncertainties in model-based outcome predictions for treatment planning

    International Nuclear Information System (INIS)

    Deasy, Joseph O.; Chao, K.S. Clifford; Markman, Jerry

    2001-01-01

    Purpose: Model-based treatment-plan-specific outcome predictions (such as normal tissue complication probability [NTCP] or the relative reduction in salivary function) are typically presented without reference to underlying uncertainties. We provide a method to assess the reliability of treatment-plan-specific dose-volume outcome model predictions. Methods and Materials: A practical method is proposed for evaluating model prediction based on the original input data together with bootstrap-based estimates of parameter uncertainties. The general framework is applicable to continuous variable predictions (e.g., prediction of long-term salivary function) and dichotomous variable predictions (e.g., tumor control probability [TCP] or NTCP). Using bootstrap resampling, a histogram of the likelihood of alternative parameter values is generated. For a given patient and treatment plan we generate a histogram of alternative model results by computing the model predicted outcome for each parameter set in the bootstrap list. Residual uncertainty ('noise') is accounted for by adding a random component to the computed outcome values. The residual noise distribution is estimated from the original fit between model predictions and patient data. Results: The method is demonstrated using a continuous-endpoint model to predict long-term salivary function for head-and-neck cancer patients. Histograms represent the probabilities for the level of posttreatment salivary function based on the input clinical data, the salivary function model, and the three-dimensional dose distribution. For some patients there is significant uncertainty in the prediction of xerostomia, whereas for other patients the predictions are expected to be more reliable. In contrast, TCP and NTCP endpoints are dichotomous, and parameter uncertainties should be folded directly into the estimated probabilities, thereby improving the accuracy of the estimates. Using bootstrap parameter estimates, competing treatment
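
    The general bootstrap recipe described here can be sketched compactly: resample the fitted patient data, refit the outcome model for each resample, evaluate the new plan under every refitted parameter set, and add residual noise drawn from the original fit. The exponential dose-response model and all numbers below are invented placeholders, not the paper's salivary-function data.

      # Hedged sketch of bootstrap-based uncertainty for a plan-specific prediction.
      import numpy as np

      rng = np.random.default_rng(42)
      dose = rng.uniform(5, 60, 80)                         # fake mean-dose metric [Gy]
      outcome = np.clip(np.exp(-0.05*dose) + 0.1*rng.standard_normal(80), 1e-3, None)

      def fit_k(d, y):
          """Fit y = exp(-k*d) by linearising (assumed model form)."""
          return -np.polyfit(d, np.log(y), 1)[0]

      k_hat = fit_k(dose, outcome)
      resid_sd = np.std(outcome - np.exp(-k_hat*dose))      # residual 'noise' level

      new_plan_dose = 35.0                                  # plan under review
      preds = []
      for _ in range(2000):                                 # bootstrap over patients
          idx = rng.integers(0, dose.size, dose.size)
          k_b = fit_k(dose[idx], outcome[idx])
          preds.append(np.exp(-k_b*new_plan_dose) + resid_sd*rng.standard_normal())
      lo, hi = np.percentile(preds, [2.5, 97.5])
      print(f"predicted outcome: {np.mean(preds):.2f} (95% band {lo:.2f} to {hi:.2f})")

    The histogram of preds is exactly the kind of plan-specific distribution the authors propose reporting alongside the point prediction.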

  15. Rasch model based analysis of the Force Concept Inventory

    Directory of Open Access Journals (Sweden)

    Maja Planinic

    2010-03-01

    Full Text Available The Force Concept Inventory (FCI) is an important diagnostic instrument which is widely used in the field of physics education research. It is therefore very important to evaluate and monitor its functioning using different tools for statistical analysis. One such tool is the stochastic Rasch model, which enables construction of linear measures for persons and items from raw test scores and which can provide important insight into the structure and functioning of the test (how item difficulties are distributed within the test, how well the items fit the model, and how well the items work together to define the underlying construct). The data for the Rasch analysis come from the large-scale research conducted in 2006-07, which investigated Croatian high school students’ conceptual understanding of mechanics on a representative sample of 1676 students (age 17–18 years). The instrument used in the research was the FCI. The average FCI score for the whole sample was found to be (27.7±0.4)%, indicating that most of the students were still non-Newtonians at the end of high school, despite the fact that physics is a compulsory subject in Croatian schools. The large set of obtained data was analyzed with the Rasch measurement computer software WINSTEPS 3.66. Since the FCI is routinely used as pretest and post-test on two very different types of population (non-Newtonian and predominantly Newtonian), an additional predominantly Newtonian sample (N=141, average FCI score of 64.5%) of first-year students enrolled in an introductory physics course at the University of Zagreb was also analyzed. The Rasch-model-based analysis suggests that the FCI has succeeded in defining a sufficiently unidimensional construct for each population. The analysis of fit of data to the model found no grossly misfitting items which would degrade measurement. Some items with larger misfit and items with significantly different difficulties in the two samples of students do require further
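
    For readers unfamiliar with the model, the dichotomous Rasch model places persons (ability theta) and items (difficulty b) on one logit scale via P(X = 1) = exp(theta - b) / (1 + exp(theta - b)). The sketch below simulates FCI-like responses and recovers item difficulties with a crude joint maximum-likelihood loop; it only illustrates the model's form, not the WINSTEPS estimation or its bias corrections.

      # Hedged sketch of the dichotomous Rasch model (simulated data, crude JML fit).
      import numpy as np

      rng = np.random.default_rng(3)
      n_persons, n_items = 1676, 30
      theta = rng.normal(-1.0, 1.0, n_persons)        # low-ability (non-Newtonian-like) sample
      b = np.linspace(-2, 2, n_items)                 # true item difficulties [logits]
      p = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))
      X = (rng.random((n_persons, n_items)) < p).astype(float)

      th_hat, b_hat = np.zeros(n_persons), np.zeros(n_items)
      for _ in range(50):                             # alternating Newton steps
          pr = 1 / (1 + np.exp(-(th_hat[:, None] - b_hat[None, :])))
          info = np.maximum(pr*(1 - pr), 1e-6)
          th_hat = np.clip(th_hat + (X.sum(1) - pr.sum(1)) / info.sum(1), -6, 6)
          pr = 1 / (1 + np.exp(-(th_hat[:, None] - b_hat[None, :])))
          info = np.maximum(pr*(1 - pr), 1e-6)
          b_hat = np.clip(b_hat - (X.sum(0) - pr.sum(0)) / info.sum(0), -6, 6)
          b_hat -= b_hat.mean()                       # fix the origin of the logit scale
      print("correlation of recovered vs true item difficulties:",
            round(float(np.corrcoef(b_hat, b)[0, 1]), 3))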

  16. Building an Understanding of How Model-Based Inquiry Is Implemented in the High School Chemistry Classroom

    Science.gov (United States)

    Dass, Katarina; Head, Michelle L.; Rushton, Gregory T.

    2015-01-01

    Modeling as a scientific practice in K-12 classrooms has received a wealth of attention in the U.S. and abroad due to the advent of revised national science education standards. The study described herein investigated how a group of high school chemistry teachers developed their understanding of the nature and function of models in the precollege…

  17. Experiences with formal engineering: model-based specification, implementation and testing of a software bus at Neopost

    NARCIS (Netherlands)

    Sijtema, M.; Salaün, G.; Schätz, B.; Belinfante, Axel; Stoelinga, Mariëlle Ida Antoinette; Marinelli, L.

    2014-01-01

    We report on the actual industrial use of formal methods during the development of a software bus. During an internship at Neopost Inc., of 14 weeks, we developed the server component of a software bus, called the XBus, using formal methods during the design, validation and testing phase: we modeled

  18. Experiences with Formal Engineering : Model-Based Specification, Implementation and Testing of a Software Bus at Neopost

    NARCIS (Netherlands)

    Sijtema, Marten; Stoelinga, Mariëlle Ida Antoinette; Belinfante, Axel; Marinelli, Lawrence; Salaün, Gwen; Schätz, Bernhard

    We report on the actual industrial use of formal methods during the development of a software bus. At Neopost Inc., we developed the server component of a software bus, called the XBus, using formal methods during the design, validation and testing phase: We modeled our design of the XBus in the

  19. Implementation of ECIRR model based on virtual simulation media to reduce students’ misconception on kinetic theory of gases

    Science.gov (United States)

    Prastiwi, A. C.; Kholiq, A.; Setyarsih, W.

    2018-03-01

    The purpose of this study is to analyse the reduction of students’ misconceptions after receiving ECIRR instruction with virtual simulation. The research design is a pre-experimental One Group Pretest-Posttest Design. The subjects of this research were 36 students of class XI MIA-5, SMAN 1 Driyorejo Gresik, in the 2015/2016 school year. Students’ misconceptions were determined by a Three-tier Diagnostic Test. The results show that the average percentages by which misconceptions were reduced on the topics of the ideal gas law, the equation of ideal gases and the kinetic theory of gases are 38%, 34% and 38%, respectively.

  20. Construction and Experimental Implementation of a Model-Based Inverse Filter to Attenuate Hysteresis in Ferroelectric Transducers

    National Research Council Canada - National Science Library

    Hatch, Andrew G; Smith, Ralph C; De, Tathagata; Salapaka, Murti V

    2005-01-01

    .... In this paper, we illustrate the construction of inverse filters, based on homogenized energy models, which can be used to approximately linearize the piezoceramic transducer behavior for linear...

  1. Proteomics Analysis Reveals Previously Uncharacterized Virulence Factors in Vibrio proteolyticus

    Directory of Open Access Journals (Sweden)

    Ann Ray

    2016-07-01

    Full Text Available Members of the genus Vibrio include many pathogens of humans and marine animals that share genetic information via horizontal gene transfer. Hence, the Vibrio pan-genome carries the potential to establish new pathogenic strains by sharing virulence determinants, many of which have yet to be characterized. Here, we investigated the virulence properties of Vibrio proteolyticus, a Gram-negative marine bacterium previously identified as part of the Vibrio consortium isolated from diseased corals. We found that V. proteolyticus causes actin cytoskeleton rearrangements followed by cell lysis in HeLa cells in a contact-independent manner. In search of the responsible virulence factor involved, we determined the V. proteolyticus secretome. This proteomics approach revealed various putative virulence factors, including active type VI secretion systems and effectors with virulence toxin domains; however, these type VI secretion systems were not responsible for the observed cytotoxic effects. Further examination of the V. proteolyticus secretome led us to hypothesize and subsequently demonstrate that a secreted hemolysin, belonging to a previously uncharacterized clan of the leukocidin superfamily, was the toxin responsible for the V. proteolyticus-mediated cytotoxicity in both HeLa cells and macrophages. Clearly, there remains an armory of yet-to-be-discovered virulence factors in the Vibrio pan-genome that will undoubtedly provide a wealth of knowledge on how a pathogen can manipulate host cells.

  2. Incidence of Acneform Lesions in Previously Chemically Damaged Persons-2004

    Directory of Open Access Journals (Sweden)

    N Dabiri

    2008-04-01

    Full Text Available ABSTRACT: Introduction & Objective: Chemical gas weapons, especially nitrogen mustard, which was used against Iranian troops in the Iraq-Iran war, have several harmful effects on the skin. Some other chemical agents can also cause acneform lesions on the skin. The purpose of this study was to compare the incidence of acneform lesions in previously chemically damaged soldiers and non-chemically damaged persons. Materials & Methods: In this descriptive and analytical study, 180 chemically damaged soldiers, who had been referred to the dermatology clinic between 2000 and 2004, and forty non-chemically damaged people were chosen randomly and examined for acneform lesions. SPSS software was used for statistical analysis of the data. Results: The mean age of the experimental group was 37.5 ± 5.2 and that of the control group was 38.7 ± 5.9 years. The mean percentage of chemical damage in cases was 31 percent and the time after the chemical damage was 15.2 ± 1.1 years. Ninety-seven cases (53.9 percent) of the subjects and 19 people (47.5 percent) of the control group had some degree of acne. No significant difference was found in incidence, degree of lesions, site of lesions and age of subjects between the two groups. No significant correlation was noted between the percentage of chemical damage and the incidence and degree of lesions in the case group. Conclusion: The incidence of acneform lesions among previously chemically injured people was not higher than in normal cases.

  3. Relationship of deer and moose populations to previous winters' snow

    Science.gov (United States)

    Mech, L.D.; McRoberts, R.E.; Peterson, R.O.; Page, R.E.

    1987-01-01

    (1) Linear regression was used to relate snow accumulation during single and consecutive winters with white-tailed deer (Odocoileus virginianus) fawn:doe ratios, moose (Alces alces) twinning rates and calf:cow ratios, and annual changes in deer and moose populations. Significant relationships were found between snow accumulation during individual winters and these dependent variables during the following year. However, the strongest relationships were between the dependent variables and the sums of the snow accumulations over the previous three winters. The percentage of the variability explained was 36 to 51%. (2) Significant relationships were also found between winter vulnerability of moose calves and the sum of the snow accumulations in the current, and up to seven previous, winters, with about 49% of the variability explained. (3) No relationship was found between wolf numbers and the above dependent variables. (4) These relationships imply that winter influences on maternal nutrition can accumulate for several years and that this cumulative effect strongly determines fecundity and/or calf and fawn survivability. Although wolf (Canis lupus L.) predation is the main direct mortality agent on fawns and calves, wolf density itself appears to be secondary to winter weather in influencing the deer and moose populations.

  4. Kidnapping Detection and Recognition in Previous Unknown Environment

    Directory of Open Access Journals (Sweden)

    Yang Tian

    2017-01-01

    Full Text Available An unnoticed event referred to as kidnapping makes the localization estimate incorrect. In a previously unknown environment, an incorrect localization result caused by kidnapping leads to an incorrect mapping result in Simultaneous Localization and Mapping (SLAM). In this situation, the explored area and the unexplored area become divided, which makes kidnapping recovery difficult. To provide sufficient information on kidnapping, a framework to judge whether kidnapping has occurred and to identify the type of kidnapping with filter-based SLAM is proposed. The framework is called double kidnapping detection and recognition (DKDR) and works by performing two checks before and after the “update” process with different metrics in real time. To explain one of the principles of DKDR, we describe a property of filter-based SLAM that corrects the mapping result of the environment using the current observations after the “update” process. Two classical filter-based SLAM algorithms, Extended Kalman Filter (EKF) SLAM and Particle Filter (PF) SLAM, are modified to show that DKDR can be simply and widely applied in existing filter-based SLAM algorithms. Furthermore, a technique to determine the adapted thresholds of the metrics in real time without previous data is presented. Both simulated and experimental results demonstrate the validity and accuracy of the proposed method.

  5. [ANTITHROMBOTIC MEDICATION IN PREGNANT WOMEN WITH PREVIOUS INTRAUTERINE GROWTH RESTRICTION].

    Science.gov (United States)

    Neykova, K; Dimitrova, V; Dimitrov, R; Vakrilova, L

    2016-01-01

    To analyze pregnancy outcome in patients who were on antithrombotic medication (AM) because of previous pregnancy with fetal intrauterine growth restriction (IUGR). The studied group (SG) included 21 pregnancies in 15 women with history of previous IUGR. The patients were on low dose aspirin (LDA) and/or low molecular weight heparin (LMWH). Pregnancy outcome was compared to the one in two more groups: 1) primary group (PG) including the previous 15 pregnancies with IUGR of the same women; 2) control group (CG) including 45 pregnancies of women matched for parity with the ones in the SG, with no history of IUGR and without medication. The SG, PG and CG were compared for the following: mean gestational age (g.a.) at birth, mean birth weight (BW), proportion of cases with early preeclampsia (PE), IUGR (total, moderate, and severe), intrauterine fetal death (IUFD), neonatal death (NND), admission to NICU, cesarean section (CS) because of chronic or acute fetal distress (FD) related to IUGR, PE or placental abruption. Student's t-test was applied to assess differences between the groups. P values < 0.05 were considered statistically significant. The differences between the SG and the PG regarding mean g.a. at delivery (33.7 and 29.8 w.g. respectively) and the proportion of babies admitted to NICU (66.7% vs. 71.4%) were not statistically significant. The mean BW in the SG (2114.7 g) was significantly higher than in the PG (1090.8 g). In the SG compared with the PG there were significantly fewer cases of IUFD (14.3% and 53.3% respectively), early PE (9.5% vs. 46.7%), and moderate and severe IUGR (10.5% and 36.8% vs. 41.7% and 58.3%). Neonatal mortality in the SG (5.6%) was significantly lower than in the PG (57.1%). The proportion of CS for FD was not significantly different: 53.3% in the SG and 57.1% in the PG. On the other hand, comparison between the SG and the CG demonstrated significantly lower g.a. at delivery in the SG (33.7 vs. 38 w.g.) and lower BW (2114 vs. 3094 g)

  6. Model Based Analysis of Ethnic Differences in Type 2 Diabetes

    DEFF Research Database (Denmark)

    Møller, Jonas Bech

    the applicability of stochastic differential equations (SDEs) and non-linear mixed effects (NLME) models for such an assessment. One way to perform such an investigation is to characterise the pathophysiology of the two groups at different stages of disease progression. For T2D this involves a characterisation...... equations (SDEs) or another improved description of residuals. For characterising disease progression in Caucasian and Japanese, established models that include parameters for insulin sensitivity and beta-cell function were implemented in a non-linear mixed-effects setting with ODEs. Based on the ACF......-cell function, measured by simple insulin based measures, could be explained by difference in body size (BMI). This was supported by Forest plots of covariate effects obtained from population models, in general indicating that race had no clinically relevant effect on either the insulin sensitivity or the beta

  7. Intelligent harmonic load model based on neural networks

    Science.gov (United States)

    Ji, Pyeong-Shik; Lee, Dae-Jong; Lee, Jong-Pil; Park, Jae-Won; Lim, Jae-Yoon

    2007-12-01

    In this study, we developed an RBFN (Radial Basis Function Network) based load modeling method that includes harmonic components. The developed method is implemented using harmonic information as well as the fundamental frequency and voltage, which are the essential input factors in the conventional method. Thus, the proposed method makes it possible to effectively estimate load characteristics in power lines with harmonics. RBFNs have certain advantages, such as a simple structure and rapid computation, compared with the multilayer perceptron, which is extensively applied for load modeling. To show its effectiveness, the proposed method has been intensively tested with various datasets acquired under different frequencies and voltages and compared with conventional methods such as the second-order polynomial equation method, MLP and RBF without considering harmonic components.
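
    To make the idea concrete, the sketch below builds a small Gaussian RBF network whose inputs are the fundamental voltage, the frequency and a couple of harmonic magnitudes, with output weights obtained by linear least squares. The feature set, network size and synthetic load relationship are assumptions for illustration only.

      # Hedged RBFN sketch for harmonic-aware load modeling (synthetic data).
      import numpy as np

      class RBFN:
          def __init__(self, centers, width):
              self.c, self.s = centers, width
          def _phi(self, X):
              d2 = ((X[:, None, :] - self.c[None, :, :]) ** 2).sum(-1)
              return np.exp(-d2 / (2 * self.s ** 2))
          def fit(self, X, y):
              Phi = np.column_stack([self._phi(X), np.ones(len(X))])   # bias column
              self.w = np.linalg.lstsq(Phi, y, rcond=None)[0]
              return self
          def predict(self, X):
              return np.column_stack([self._phi(X), np.ones(len(X))]) @ self.w

      rng = np.random.default_rng(7)
      # columns: voltage [pu], frequency [Hz], 3rd and 5th harmonic magnitudes [pu]
      X = np.column_stack([rng.uniform(0.9, 1.1, 500), rng.uniform(59.5, 60.5, 500),
                           rng.uniform(0.0, 0.10, 500), rng.uniform(0.0, 0.08, 500)])
      P = 1.2*X[:, 0]**1.5 + 0.05*(X[:, 1] - 60) - 2.0*X[:, 2] - 1.0*X[:, 3]
      Xn = (X - X.mean(0)) / X.std(0)                        # normalise the features
      model = RBFN(Xn[rng.choice(400, 40, replace=False)], width=1.0).fit(Xn[:400], P[:400])
      print("mean absolute test error:",
            round(float(np.abs(model.predict(Xn[400:]) - P[400:]).mean()), 4))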

  8. Ionospheric scintillation forecasting model based on NN-PSO technique

    Science.gov (United States)

    Sridhar, M.; Venkata Ratnam, D.; Padma Raju, K.; Sai Praharsha, D.; Saathvika, K.

    2017-09-01

    The forecasting and modeling of ionospheric scintillation effects are crucial for precise satellite positioning and navigation applications. In this paper, a Neural Network model, trained using Particle Swarm Optimization (PSO) algorithm, has been implemented for the prediction of amplitude scintillation index (S4) observations. The Global Positioning System (GPS) and Ionosonde data available at Darwin, Australia (12.4634° S, 130.8456° E) during 2013 has been considered. The correlation analysis between GPS S4 and Ionosonde drift velocities (hmf2 and fof2) data has been conducted for forecasting the S4 values. The results indicate that forecasted S4 values closely follow the measured S4 values for both the quiet and disturbed conditions. The outcome of this work will be useful for understanding the ionospheric scintillation phenomena over low latitude regions.
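
    As a schematic of the NN-PSO combination, the sketch below trains a tiny feedforward network to map ionosonde-derived features (foF2, hmF2, local time) to S4 by particle swarm optimisation of the weight vector rather than backpropagation. The network size, swarm parameters, features and synthetic target are all assumptions and not the configuration used in the study.

      # Hedged NN-PSO sketch: PSO searches the weight space of a small regressor.
      import numpy as np

      rng = np.random.default_rng(11)
      X = np.column_stack([rng.uniform(4, 14, 600),        # foF2 [MHz]
                           rng.uniform(250, 450, 600),     # hmF2 [km]
                           rng.uniform(0, 24, 600)])       # local time [h]
      S4 = (0.2 + 0.3*(X[:, 0]/14)*np.maximum(np.cos((X[:, 2] - 22)*np.pi/12), 0)
            + 0.05*rng.standard_normal(600))               # synthetic scintillation index
      Xn = (X - X.mean(0)) / X.std(0)

      H = 6                                                # hidden units
      n_w = 3*H + H + H + 1                                # sizes of W1, b1, W2, b2

      def predict(w):
          W1, b1 = w[:3*H].reshape(3, H), w[3*H:4*H]
          W2, b2 = w[4*H:5*H], w[5*H]
          return np.tanh(Xn @ W1 + b1) @ W2 + b2

      def mse(w):
          return float(np.mean((predict(w) - S4) ** 2))

      n_particles = 40
      pos = rng.standard_normal((n_particles, n_w))
      vel = np.zeros((n_particles, n_w))
      pbest, pbest_f = pos.copy(), np.array([mse(p) for p in pos])
      gbest = pbest[np.argmin(pbest_f)].copy()
      for _ in range(300):                                 # standard PSO velocity update
          r1, r2 = rng.random((n_particles, n_w)), rng.random((n_particles, n_w))
          vel = 0.7*vel + 1.5*r1*(pbest - pos) + 1.5*r2*(gbest - pos)
          pos = pos + vel
          f = np.array([mse(p) for p in pos])
          improved = f < pbest_f
          pbest[improved], pbest_f[improved] = pos[improved], f[improved]
          gbest = pbest[np.argmin(pbest_f)].copy()
      print("training MSE of the best particle:", round(mse(gbest), 4))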

  9. Modifying EPA radiation risk models based on BEIR VII

    International Nuclear Information System (INIS)

    Pawel, D.; Puskin, J.

    2007-01-01

    This paper summarizes a 'draft White Paper' that provides details on proposed changes in EPA's methodology for estimating radiogenic cancer risks. Many of the changes are based on the contents of a recent National Academy of Sciences (NAS) report (BEIR VII), that addresses cancer and genetic risks from low doses of low-LET radiation. The draft White Paper was prepared for a meeting with the EPA's Science Advisory Board's Radiation Advisory Committee (RAC) in September for seeking advice on the application of BEIR VII and on issues relating to these modifications and expansions. After receiving the Advisory review, we plan to implement the changes by publishing the new methodology in an EPA report, which we expect to submit to the RAC for final review. The revised methodology could then be applied to update the cancer risk coefficients for over 800 radionuclides that are published in EPA's Federal Guidance Report 13. (author)

  10. Model-based microwave image reconstruction: simulations and experiments

    International Nuclear Information System (INIS)

    Ciocan, Razvan; Jiang Huabei

    2004-01-01

    We describe an integrated microwave imaging system that can provide spatial maps of the dielectric properties of heterogeneous media from tomographically collected data. The hardware system (800-1200 MHz) was built around a lock-in amplifier with 16 fixed antennas. The reconstruction algorithm was implemented using a Newton iterative method with combined Marquardt-Tikhonov regularization. System performance was evaluated using heterogeneous media mimicking human breast tissue. The finite element method coupled with the Bayliss and Turkel radiation boundary conditions was applied to compute the electric field distribution in the heterogeneous media of interest. The results show that inclusions embedded in a 76 mm diameter background medium can be quantitatively reconstructed from both simulated and experimental data. Quantitative analysis of the microwave images obtained suggests that an inclusion of 14 mm in diameter is the smallest object that can be fully characterized presently using experimental data, while objects as small as 10 mm in diameter can be quantitatively resolved with simulated data.
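
    The Newton iteration with combined Marquardt-Tikhonov regularization has the generic form dx = (J^T J + lambda*I)^(-1) J^T r, where J is the Jacobian of the forward model and r the data misfit. The sketch below applies that update to a toy nonlinear forward model; it stands in for, and is much simpler than, the FEM-based electromagnetic model used in the paper.

      # Generic regularised Gauss-Newton (Marquardt-Tikhonov) sketch on a toy problem.
      import numpy as np

      rng = np.random.default_rng(5)

      def forward(x):
          """Toy nonlinear 'measurement' operator standing in for the FEM solver."""
          A = np.linspace(1, 2, 20)[:, None] ** np.arange(4)[None, :]
          return np.tanh(A @ x)

      def jacobian(x, eps=1e-6):
          f0 = forward(x)
          return np.column_stack([(forward(x + eps*e) - f0) / eps for e in np.eye(len(x))])

      x_true = np.array([0.4, -0.3, 0.2, 0.1])              # stand-in "dielectric" parameters
      data = forward(x_true) + 0.01 * rng.standard_normal(20)

      x, lam = np.zeros(4), 1e-2                             # regularisation weight
      for it in range(20):
          r = data - forward(x)
          J = jacobian(x)
          dx = np.linalg.solve(J.T @ J + lam*np.eye(4), J.T @ r)   # regularised step
          x += dx
          lam *= 0.7                                         # relax as the fit improves
      print("recovered parameters:", np.round(x, 3), "true:", x_true)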

  11. Model based controls and the AGS booster controls system architecture

    International Nuclear Information System (INIS)

    Casella, R.A.

    1987-01-01

    For the past three years the Accelerator Controls Section has been responsible for the development of the Heavy Ion Transfer Line (HITL) used to inject heavy ions created at the Tandem Van de Graaff into the Alternating Gradient Synchrotron (AGS). This was recognized as an opportunity to test new ideas for control of a beam line, which, if successful, could be implemented in an upgrade of the existing control system for the AGS. The in-place control system for the AGS consisted of a DEC PDP10 computer serving as the primary computer interface to the accelerator via three control room consoles, and as keeper of the device database. For the HITL project it was decided to make the control system a true distributed network, putting more computing power down at the device level via intelligent subsystems. A network of Apollo workstations was added at the host level. Apollos run a distributed operating system and are connected to each other by the Domain Token Ring Network. The Apollos were seen as the new primary computers for consoles, with each console containing at least one Apollo. These hosts and all other subsystems are connected to each other via an in-house developed LAN (RELWAY). The design of the control system developed for HITL was mostly successful. The proposed AGS Booster is designed to be a synchrotron injector for the AGS. With the forthcoming development of the Booster for the AGS an opportunity has again arisen to implement new ideas for accelerator control. One weakness of the HITL control system is the limited CPU power and poor debugging facilities of the stations.

  12. Model-based design approaches for plug-in hybrid vehicle design

    Energy Technology Data Exchange (ETDEWEB)

    Mendes, C.J. [CrossChasm Technologies, Cambridge, ON (Canada); Stevens, M.B.; Fowler, M.W. [Waterloo Univ., ON (Canada). Dept. of Chemical Engineering; Fraser, R.A. [Waterloo Univ., ON (Canada). Dept. of Mechanical Engineering; Wilhelm, E.J. [Paul Scherrer Inst., Villigen (Switzerland). Energy Systems Analysis

    2007-07-01

    A model-based design process for plug-in hybrid vehicles (PHEVs) was presented. The paper discussed steps between the initial design concept and a working vehicle prototype, and focused on an investigation of the software-in-the-loop (SIL), hardware-in-the-loop (HIL), and component-in-the-loop (CIL) design phases. The role and benefits of using simulation were also reviewed. A method for mapping and identifying components was provided along with a hybrid control strategy and component-level control optimization process. The role of simulation in component evaluation, architecture design, and de-bugging procedures was discussed, as well as the role simulation networks can play in speeding deployment times. The simulations focused on work performed on a 2005 Chevrolet Equinox converted to a fuel cell hybrid electric vehicle (FCHEV). Components were aggregated to create a complete virtual vehicle. A simplified vehicle model was implemented onto the on-board vehicle control hardware. Optimization metrics were estimated at 10 alpha values during each control loop iteration. The simulation was then used to tune the control system under a variety of drive cycles and conditions. A CIL technique was used to place a physical hybrid electric vehicle (HEV) component under the control of a real time HEV/PHEV simulation. It was concluded that controllers should have a standardized component description that supports integration into advanced testing procedures. 4 refs., 9 figs.

  13. A model-based approach to on-line process disturbance management

    International Nuclear Information System (INIS)

    Kim, I.S.

    1988-01-01

    The methodology developed can be applied to the design of a real-time expert system to aid control-room operators in coping with process abnormalities. The approach encompasses diverse functional aspects required for an effective on-line process disturbance management: (1) intelligent process monitoring and alarming, (2) on-line sensor data validation, (3) on-line sensor and hardware (except sensors) fault diagnosis, and (4) real-time corrective measure synthesis. Accomplishment of these functions is made possible through the application of various models, goal-tree success-tree, process monitor-tree, sensor failure diagnosis, and hardware failure diagnosis models. The models used in the methodology facilitate not only the knowledge-acquisition process - a bottleneck in the development of an expert system - but also the reasoning process of the knowledge-based system. These transparent models and model-based reasoning significantly enhance the maintainability of the real-time expert systems. The proposed approach was applied to the feedwater control system of a nuclear power plant, and implemented into a real-time expert system, MOAS II, using the expert system shell, PICON, on the LMI machine

  14. Building spatio-temporal database model based on ontological approach using relational database environment

    International Nuclear Information System (INIS)

    Mahmood, N.; Burney, S.M.A.

    2017-01-01

    Everything in this world is bounded by space and time. Our daily activities are closely linked and related to other objects in our vicinity; therefore, a strong relationship exists with our current location and time (including past, present and future), and the events through which we move as objects also affect our activities in life. Ontology development and its integration with databases are vital for a true understanding of complex systems involving both spatial and temporal dimensions. In this paper we propose a conceptual framework for building a spatio-temporal database model based on an ontological approach. We have used the relational data model for modelling spatio-temporal data content and present our methodology with spatio-temporal ontological aspects and their transformation into a spatio-temporal database model. We illustrate the implementation of our conceptual model through a case study related to a cultivated land parcel used for agriculture, to exhibit the spatio-temporal behaviour of agricultural land and related entities. Moreover, it provides a generic approach for designing spatio-temporal databases based on ontology. The proposed model is capable of capturing the ontological and, to some extent, epistemological commitments, and of building a spatio-temporal ontology and transforming it into a spatio-temporal data model. Finally, we highlight existing and future research challenges. (author)

  15. Toward a Model-Based Approach to Flight System Fault Protection

    Science.gov (United States)

    Day, John; Murray, Alex; Meakin, Peter

    2012-01-01

    Fault Protection (FP) is a distinct and separate systems engineering sub-discipline that is concerned with the off-nominal behavior of a system. Flight system fault protection is an important part of the overall flight system systems engineering effort, with its own products and processes. As with other aspects of systems engineering, the FP domain is highly amenable to expression and management in models. However, while there are standards and guidelines for performing FP related analyses, there are no standards or guidelines for formally relating the FP analyses to each other or to the system hardware and software design. As a result, the material generated for these analyses effectively creates separate models that are only loosely related to the system being designed. Development of approaches that enable modeling of FP concerns in the same model as the system hardware and software design enables establishment of formal relationships that have great potential for improving the efficiency, correctness, and verification of the implementation of flight system FP. This paper begins with an overview of the FP domain, and then continues with a presentation of a SysML/UML model of the FP domain and the particular analyses that it contains, by way of showing a potential model-based approach to flight system fault protection, and an exposition of the use of the FP models in FSW engineering. The analyses are small examples, inspired by current real-project examples of FP analyses.

  16. Model-based control of the temporal patterns of intracellular signaling in silico

    Science.gov (United States)

    Murakami, Yohei; Koyama, Masanori; Oba, Shigeyuki; Kuroda, Shinya; Ishii, Shin

    2017-01-01

    The functions of intracellular signal transduction systems are determined by the temporal behavior of intracellular molecules and their interactions. Of the many dynamical properties of the system, the relationship between the dynamics of upstream molecules and downstream molecules is particularly important. A useful tool in understanding this relationship is a methodology to control the dynamics of intracellular molecules with an extracellular stimulus. However, this is a difficult task because the relationship between the levels of upstream molecules and those of downstream molecules is often not only stochastic, but also time-inhomogeneous, nonlinear, and not one-to-one. In this paper, we present an easy-to-implement model-based control method that makes the target downstream molecule trace a desired time course by changing the concentration of a controllable upstream molecule. Our method uses predictions from Monte Carlo simulations of the model to decide the strength of the stimulus, while using a particle-based approach to make inferences regarding unobservable states. We applied our method to in silico control problems of an insulin-dependent AKT pathway model and an EGF-dependent Akt pathway model with system noise. We show that our method can robustly control the dynamics of the intracellular molecules against unknown system noise of various strengths, even in the absence of complete knowledge of the true model of the target system. PMID:28275530
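
    The combination the abstract describes (Monte Carlo prediction to choose the stimulus, a particle approximation to track the hidden state) can be illustrated with a deliberately toy stochastic pathway. Everything below, from the one-state kinetics to the noise levels and candidate stimulus grid, is invented for illustration and is not the insulin/EGF-Akt model of the paper.

      # Toy sketch: particle-tracked hidden state + Monte Carlo choice of stimulus.
      import numpy as np

      rng = np.random.default_rng(9)
      K_ACT, K_DEG, DT = 0.05, 0.1, 1.0          # invented activation/degradation rates

      def step(x, stimulus):
          """One stochastic Euler step of dX = k_act*u - k_deg*X + noise."""
          return np.maximum(x + DT*(K_ACT*stimulus - K_DEG*x)
                            + 0.2*np.sqrt(DT)*rng.standard_normal(np.shape(x)), 0.0)

      def choose_stimulus(particles, target, candidates, horizon=5):
          """Pick the stimulus whose Monte Carlo mean best follows the target."""
          best, best_cost = candidates[0], np.inf
          for u in candidates:
              sim, cost = particles.copy(), 0.0
              for _ in range(horizon):
                  sim = step(sim, u)
                  cost += (sim.mean() - target) ** 2
              if cost < best_cost:
                  best, best_cost = u, cost
          return best

      n_part, particles, true_x = 500, np.zeros(500), 0.0
      desired = np.concatenate([np.full(60, 3.0), np.full(60, 1.0)])  # target course
      for target in desired:
          u = choose_stimulus(particles, target, np.linspace(0, 20, 11))
          true_x = step(true_x, u)                            # the "real" pathway
          obs = true_x + 0.3*rng.standard_normal()            # noisy readout
          particles = step(particles, u)                      # predict each particle
          w = np.exp(-0.5*((obs - particles)/0.3)**2) + 1e-12  # likelihood weights
          particles = particles[rng.choice(n_part, n_part, p=w/w.sum())]  # resample
      print("final tracking error:", round(float(abs(particles.mean() - desired[-1])), 3))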

  17. Natural Model based Design in Context: an Effective Method for Environmental Problems

    Directory of Open Access Journals (Sweden)

    Eric D. Kameni

    2017-10-01

    Full Text Available Analyzing complex problem domains is not easy. Simulation tools support decision makers in finding the best policies. Model-based system development is an approach where a model of the application domain is the central driving force when designing simulation tools. State-of-the-art techniques, however, still require both expert knowledge of the application domain and the implementation techniques as provided by ICT (such as multilevel agent technology). Domain experts, however, usually do not master ICT sufficiently. Modeling is more insightful for the domain expert when its goal is to formalize the language being used in that domain as a semi-natural language. At the meta level, this language describes the main concepts of the type of application domain. The model then is a concretization of this meta model. The main focus of this article is (1) to propose a natural-language-based approach to modeling application domains, (2) to show how these models can be transformed systematically into computational models, and (3) to propose the tool TiC (Tool in Context) that supports the domain expert when developing a model and generating a simulation tool. Our research methodology is based on Design Science. We verify our approach by describing the various transformation steps in detail, and by demonstrating the way of working via a sample session applying a real problem of Laf Forest Reserve deforestation in North Cameroon.

  18. Bridge Deterioration Prediction Model Based On Hybrid Markov-System Dynamic

    Directory of Open Access Journals (Sweden)

    Widodo Soetjipto Jojok

    2017-01-01

    Full Text Available Instantaneous bridge failures tend to increase in Indonesia. To mitigate this condition, Indonesia’s Bridge Management System (I-BMS) has been applied to continuously monitor the condition of bridges. However, I-BMS only implements visual inspection for maintenance priority of the bridge structure components instead of the bridge structure system. This paper proposes a new bridge failure prediction model based on a hybrid Markov-System Dynamic (MSD) approach. System dynamics is used to represent the correlation among bridge structure components, while a Markov chain is used to calculate the temporal probability of bridge failure. Data on around 235 bridges in Indonesia were collected from the Directorate of Bridge of the Ministry of Public Works and Housing to calculate the transition probabilities of the model. To validate the model, a medium-span concrete bridge was used as a case study. The result shows that the proposed model can accurately predict the bridge condition. Besides predicting the probability of bridge failure, this model can also be used as an early warning system for bridge monitoring activity.
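
    The Markov-chain half of the hybrid can be sketched in a few lines: a transition matrix over discrete condition states, estimated from inspection records, is used to propagate the probability distribution of a bridge's condition year by year. The transition probabilities below are invented placeholders, not the values estimated from the Indonesian bridge data.

      # Sketch of the Markov-chain deterioration component (invented probabilities).
      import numpy as np

      # condition states 0 (good) ... 4 (failed); each row sums to 1
      T = np.array([[0.85, 0.15, 0.00, 0.00, 0.00],
                    [0.00, 0.80, 0.20, 0.00, 0.00],
                    [0.00, 0.00, 0.75, 0.22, 0.03],
                    [0.00, 0.00, 0.00, 0.70, 0.30],
                    [0.00, 0.00, 0.00, 0.00, 1.00]])

      state = np.array([1.0, 0.0, 0.0, 0.0, 0.0])        # a newly built bridge
      for year in range(1, 31):
          state = state @ T                              # one year of deterioration
          if year % 10 == 0:
              print(f"year {year:2d}: P(failed) = {state[-1]:.3f}")
      # an early-warning rule could flag the first year where P(failed) exceeds, say, 0.2

    In the hybrid model the system-dynamics layer would adjust these transition probabilities to reflect interactions among structural components; that coupling is omitted here.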

  19. Aligning ERP systems with companies' real needs: an `Operational Model Based' method

    Science.gov (United States)

    Mamoghli, Sarra; Goepp, Virginie; Botta-Genoulaz, Valérie

    2017-02-01

    Enterprise Resource Planning (ERP) systems offer standard functionalities that have to be configured and customised by a specific company depending on its own requirements. A consistent alignment is therefore an essential success factor of ERP projects. To manage this alignment, an 'Operational Model Based' method is proposed. It is based on the design and the matching of models, and conforms to the modelling views and constructs of the ISO 19439 and 19440 enterprise-modelling standards. It is characterised by: (1) a predefined design and matching order of the models; (2) the formalisation, in terms of modelling constructs, of alignment and misalignment situations; and (3) their association with a set of decisions in order to mitigate the misalignment risk. Thus, a comprehensive understanding of the alignment management during ERP projects is given. Unlike existing methods, this one includes decisions related to the organisational changes an ERP system can induce, as well as criteria on which the best decision can be based. In this way, it provides effective support and guidance to companies implementing ERP systems, as the alignment process is detailed and structured. The method is applied on the ERP project of a Small and Medium Enterprise, showing that it can be used even in contexts where the ERP project expertise level is low.

  20. Model-Based Learning Environment Based on The Concept IPS School-Based Management

    Directory of Open Access Journals (Sweden)

    Hamid Darmadi

    2017-03-01

    Full Text Available The results showed: (1) the environment-based IPS learning model can foster love of the cultural values of the area as a basis for the development of national culture; (2) community participation and the role of government in implementing the environment-based IPS learning model have a positive impact on the management of school resources; (3) the environment-based IPS learning model is effective in creating a peaceful way of living together and increases the intensity of togetherness and mutual respect; (4) the environment-based IPS learning model can improve student learning outcomes; (5) there are differences in the expression of attitudes and in learning results between students located in the conflict area and students outside the conflict area; (6) the attitude-scale analysis among junior and senior high school students shows high school students' regard for the values of national unity, respect for diversity and peaceful coexistence. It is recommended that the Department of Education, as the institution responsible for fostering and developing social and cultural values in the province, apply the environment-based IPS learning model.

  1. A Model-Based Approach to Engineering Behavior of Complex Aerospace Systems

    Science.gov (United States)

    Ingham, Michel; Day, John; Donahue, Kenneth; Kadesch, Alex; Kennedy, Andrew; Khan, Mohammed Omair; Post, Ethan; Standley, Shaun

    2012-01-01

    One of the most challenging yet poorly defined aspects of engineering a complex aerospace system is behavior engineering, including definition, specification, design, implementation, and verification and validation of the system's behaviors. This is especially true for behaviors of highly autonomous and intelligent systems. Behavior engineering is more of an art than a science. As a process it is generally ad-hoc, poorly specified, and inconsistently applied from one project to the next. It uses largely informal representations, and results in system behavior being documented in a wide variety of disparate documents. To address this problem, JPL has undertaken a pilot project to apply its institutional capabilities in Model-Based Systems Engineering to the challenge of specifying complex spacecraft system behavior. This paper describes the results of the work in progress on this project. In particular, we discuss our approach to modeling spacecraft behavior including 1) requirements and design flowdown from system-level to subsystem-level, 2) patterns for behavior decomposition, 3) allocation of behaviors to physical elements in the system, and 4) patterns for capturing V&V activities associated with behavioral requirements. We provide examples of interesting behavior specification patterns, and discuss findings from the pilot project.

  2. Deepwater Gulf of Mexico more profitable than previously thought

    International Nuclear Information System (INIS)

    Craig, M.J.K.; Hyde, S.T.

    1997-01-01

    Economic evaluations and recent experience show that the deepwater Gulf of Mexico (GOM) is much more profitable than previously thought. Four factors contributing to the changed viewpoint are: First, deepwater reservoirs have proved to have excellent productive capacity, distribution, and continuity when compared to correlative-age shelf deltaic sands. Second, improved technologies and lower perceived risks have lowered the cost of floating production systems (FPSs). Third, projects now get on-line quicker. Fourth, a collection of other important factors are: Reduced geologic risk and associated high success rates for deepwater GOM wells due primarily to improved seismic imaging and processing tools (3D, AVO, etc.); absence of any political risk in the deepwater GOM (common overseas, and very significant in some international areas); and positive impact of deepwater federal royalty relief. This article uses hypothetical reserve distributions and price forecasts to illustrate indicative economics of deepwater prospects. Economics of Shell Oil Co.'s three deepwater projects are also discussed

  3. Corneal perforation after conductive keratoplasty with previous refractive surgery.

    Science.gov (United States)

    Kymionis, George D; Titze, Patrik; Markomanolakis, Marinos M; Aslanides, Ioannis M; Pallikaris, Ioannis G

    2003-12-01

    A 56-year-old woman had conductive keratoplasty (CK) for residual hyperopia and astigmatism. Three years before the procedure, the patient had arcuate keratotomy, followed by laser in situ keratomileusis 2 years later for high astigmatism correction in both eyes. During CK, a corneal perforation occurred in the right eye; during the postoperative examination, an iris perforation and anterior subcapsule opacification were seen beneath the perforation site. The perforation was managed with a bandage contact lens and an antibiotic-steroid ointment; it had a negative Seidel sign by the third day. The surgery in the left eye was uneventful. Three months after the procedure, the uncorrected visual acuity was 20/32 and the best corrected visual acuity 20/20 in both eyes with a significant improvement in corneal topography. Care must be taken to prevent CK-treated spots from coinciding with areas in the corneal stroma that might have been altered by previous refractive procedures.

  4. Interference from previous distraction disrupts older adults' memory.

    Science.gov (United States)

    Biss, Renée K; Campbell, Karen L; Hasher, Lynn

    2013-07-01

    Previously relevant information can disrupt the ability of older adults to remember new information. Here, the researchers examined whether prior irrelevant information, or distraction, can also interfere with older adults' memory for new information. Younger and older adults first completed a 1-back task on pictures that were superimposed with distracting words. After a delay, participants learned picture-word paired associates and memory was tested using picture-cued recall. In 1 condition (high interference), some pairs included pictures from the 1-back task now paired with new words. In a low-interference condition, the transfer list used all new items. Older adults had substantially lower cued-recall performance in the high- compared with the low-interference condition. In contrast, younger adults' performance did not vary across conditions. These findings suggest that even never-relevant information from the past can disrupt older adults' memory for new associations.

  5. The long-term consequences of previous hyperthyroidism

    DEFF Research Database (Denmark)

    Hjelm Brandt Kristensen, Frans

    2015-01-01

    Thyroid hormones affect every cell in the human body, and the cardiovascular changes associated with increased levels of thyroid hormones are especially well described. As an example, short-term hyperthyroidism has positive chronotropic and inotropic effects on the heart, leading to a hyperdynamic...... with CVD, LD and DM both before and after the diagnosis of hyperthyroidism. Although the design used does not allow a stringent distinction between cause and effect, the findings indicate a possible direct association between hyperthyroidism and these morbidities, or vice versa....... vascular state. While it is biologically plausible that these changes may induce long-term consequences, the insight into morbidity as well as mortality in patients with previous hyperthyroidism is limited. The reasons for this are a combination of inadequately powered studies, varying definitions...

  6. Drivers of Adoption and Implementation of Internet-Based Marketing Channels

    DEFF Research Database (Denmark)

    Nielsen, Jørn Flohr; Mols, Niels Peter; Høst, Viggo

    2007-01-01

    This chapter analyses factors influencing manufacturers' adoption and implementation of Internet-based marketing channels, using models based on marketing channel and organisational innovation theory. Survey data from 1163 Danish, Finnish, and Swedish manufacturers form the empirical basis for te

  7. Is Previous Respiratory Disease a Risk Factor for Lung Cancer?

    Science.gov (United States)

    Denholm, Rachel; Schüz, Joachim; Straif, Kurt; Stücker, Isabelle; Jöckel, Karl-Heinz; Brenner, Darren R.; De Matteis, Sara; Boffetta, Paolo; Guida, Florence; Brüske, Irene; Wichmann, Heinz-Erich; Landi, Maria Teresa; Caporaso, Neil; Siemiatycki, Jack; Ahrens, Wolfgang; Pohlabeln, Hermann; Zaridze, David; Field, John K.; McLaughlin, John; Demers, Paul; Szeszenia-Dabrowska, Neonila; Lissowska, Jolanta; Rudnai, Peter; Fabianova, Eleonora; Dumitru, Rodica Stanescu; Bencko, Vladimir; Foretova, Lenka; Janout, Vladimir; Kendzia, Benjamin; Peters, Susan; Behrens, Thomas; Vermeulen, Roel; Brüning, Thomas; Kromhout, Hans

    2014-01-01

    Rationale: Previous respiratory diseases have been associated with increased risk of lung cancer. Respiratory conditions often co-occur and few studies have investigated multiple conditions simultaneously. Objectives: Investigate lung cancer risk associated with chronic bronchitis, emphysema, tuberculosis, pneumonia, and asthma. Methods: The SYNERGY project pooled information on previous respiratory diseases from 12,739 case subjects and 14,945 control subjects from 7 case–control studies conducted in Europe and Canada. Multivariate logistic regression models were used to investigate the relationship between individual diseases adjusting for co-occurring conditions, and patterns of respiratory disease diagnoses and lung cancer. Analyses were stratified by sex, and adjusted for age, center, ever-employed in a high-risk occupation, education, smoking status, cigarette pack-years, and time since quitting smoking. Measurements and Main Results: Chronic bronchitis and emphysema were positively associated with lung cancer, after accounting for other respiratory diseases and smoking (e.g., in men: odds ratio [OR], 1.33; 95% confidence interval [CI], 1.20–1.48 and OR, 1.50; 95% CI, 1.21–1.87, respectively). A positive relationship was observed between lung cancer and pneumonia diagnosed 2 years or less before lung cancer (OR, 3.31; 95% CI, 2.33–4.70 for men), but not longer. Co-occurrence of chronic bronchitis and emphysema and/or pneumonia had a stronger positive association with lung cancer than chronic bronchitis “only.” Asthma had an inverse association with lung cancer, the association being stronger with an asthma diagnosis 5 years or more before lung cancer compared with shorter. Conclusions: Findings from this large international case–control consortium indicate that after accounting for co-occurring respiratory diseases, chronic bronchitis and emphysema continue to have a positive association with lung cancer. PMID:25054566

  8. Twelve previously unknown phage genera are ubiquitous in global oceans.

    Science.gov (United States)

    Holmfeldt, Karin; Solonenko, Natalie; Shah, Manesh; Corrier, Kristen; Riemann, Lasse; Verberkmoes, Nathan C; Sullivan, Matthew B

    2013-07-30

    Viruses are fundamental to ecosystems ranging from oceans to humans, yet our ability to study them is bottlenecked by the lack of ecologically relevant isolates, resulting in "unknowns" dominating culture-independent surveys. Here we present genomes from 31 phages infecting multiple strains of the aquatic bacterium Cellulophaga baltica (Bacteroidetes) to provide data for an underrepresented and environmentally abundant bacterial lineage. Comparative genomics delineated 12 phage groups that (i) each represent a new genus, and (ii) represent one novel and four well-known viral families. This diversity contrasts the few well-studied marine phage systems, but parallels the diversity of phages infecting human-associated bacteria. Although all 12 Cellulophaga phages represent new genera, the podoviruses and icosahedral, nontailed ssDNA phages were exceptional, with genomes up to twice as large as those previously observed for each phage type. Structural novelty was also substantial, requiring experimental phage proteomics to identify 83% of the structural proteins. The presence of uncommon nucleotide metabolism genes in four genera likely underscores the importance of scavenging nutrient-rich molecules as previously seen for phages in marine environments. Metagenomic recruitment analyses suggest that these particular Cellulophaga phages are rare and may represent a first glimpse into the phage side of the rare biosphere. However, these analyses also revealed that these phage genera are widespread, occurring in 94% of 137 investigated metagenomes. Together, this diverse and novel collection of phages identifies a small but ubiquitous fraction of unknown marine viral diversity and provides numerous environmentally relevant phage-host systems for experimental hypothesis testing.

  9. Urethrotomy has a much lower success rate than previously reported.

    Science.gov (United States)

    Santucci, Richard; Eisenberg, Lauren

    2010-05-01

    We evaluated the success rate of direct vision internal urethrotomy as a treatment for simple male urethral strictures. A retrospective chart review was performed on 136 patients who underwent urethrotomy from January 1994 through March 2009. The Kaplan-Meier method was used to analyze stricture-free probability after the first, second, third, fourth and fifth urethrotomy. Patients with complex strictures (36) were excluded from the study for reasons including previous urethroplasty, neophallus or previous radiation, and 24 patients were lost to followup. Data were available for 76 patients. The stricture-free rate after the first urethrotomy was 8% with a median time to recurrence of 7 months. For the second urethrotomy stricture-free rate was 6% with a median time to recurrence of 9 months. For the third urethrotomy stricture-free rate was 9% with a median time to recurrence of 3 months. For procedures 4 and 5 stricture-free rate was 0% with a median time to recurrence of 20 and 8 months, respectively. Urethrotomy is a popular treatment for male urethral strictures. However, the performance characteristics are poor. Success rates were no higher than 9% in this series for first or subsequent urethrotomy during the observation period. Most of the patients in this series will be expected to experience failure with longer followup and the expected long-term success rate from any (1 through 5) urethrotomy approach is 0%. Urethrotomy should be considered a temporizing measure until definitive curative reconstruction can be planned. 2010 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
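
    The stricture-free probabilities above come from Kaplan-Meier estimation; the following minimal sketch shows that calculation with the lifelines package on fabricated follow-up data (the durations and censoring flags are placeholders, not the study's records).

```python
# Kaplan-Meier sketch for stricture-free probability after a first urethrotomy.
from lifelines import KaplanMeierFitter

# months until stricture recurrence (or last follow-up), and whether
# recurrence was observed (1) or the patient was censored (0)
months_to_event = [3, 5, 7, 7, 9, 12, 14, 20, 24, 30]
recurrence_observed = [1, 1, 1, 1, 1, 1, 1, 1, 0, 0]

kmf = KaplanMeierFitter()
kmf.fit(months_to_event, event_observed=recurrence_observed,
        label="first urethrotomy")

print(kmf.median_survival_time_)   # median time to recurrence
print(kmf.survival_function_)      # stricture-free probability over time
```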

  10. Typing DNA profiles from previously enhanced fingerprints using direct PCR.

    Science.gov (United States)

    Templeton, Jennifer E L; Taylor, Duncan; Handt, Oliva; Linacre, Adrian

    2017-07-01

    Fingermarks are a source of human identification both through the ridge patterns and DNA profiling. Typing nuclear STR DNA markers from previously enhanced fingermarks provides an alternative method of utilising the limited fingermark deposit that can be left behind during a criminal act. Dusting with fingerprint powders is a standard method used in classical fingermark enhancement and can affect DNA data. The ability to generate informative DNA profiles from powdered fingerprints using direct PCR swabs was investigated. Direct PCR was used because the opportunity to generate usable DNA profiles after performing any of the standard DNA extraction processes is minimal. Omitting the extraction step will, for many samples, be the key to success if there is limited sample DNA. DNA profiles were generated by direct PCR from 160 fingermarks after treatment with one of the following dactyloscopic fingerprint powders: white hadonite; silver aluminium; HiFi Volcano silk black; or black magnetic fingerprint powder. This was achieved by a combination of an optimised double-swabbing technique and swab media, omission of the extraction step to minimise loss of critical low-template DNA, and additional AmpliTaq Gold ® DNA polymerase to boost the PCR. Ninety-eight out of 160 samples (61%) were considered 'up-loadable' to the Australian National Criminal Investigation DNA Database (NCIDD). The method described required a minimum of working steps, equipment and reagents, and was completed within 4 h. Direct PCR allows the generation of DNA profiles from enhanced prints without the need to increase PCR cycle numbers beyond the manufacturer's recommendations. Particular emphasis was placed on preventing contamination by applying strict protocols and avoiding the use of previously used fingerprint brushes. Based on this extensive survey, the data provided indicate minimal effects of any of these four powders on the chance of obtaining DNA profiles from enhanced fingermarks. Copyright © 2017

  11. Model-Based Extracted Water Desalination System for Carbon Sequestration

    Energy Technology Data Exchange (ETDEWEB)

    Dees, Elizabeth M. [General Electric Global Research Center, Niskayuna, NY (United States); Moore, David Roger [General Electric Global Research Center, Niskayuna, NY (United States); Li, Li [Pennsylvania State Univ., University Park, PA (United States); Kumar, Manish [Pennsylvania State Univ., University Park, PA (United States)

    2017-05-28

    Over the last 1.5 years, GE Global Research and Pennsylvania State University defined a model-based, scalable, and multi-stage extracted water desalination system that yields clean water, concentrated brine, and, optionally, salt. The team explored saline brines that ranged across the expected range for extracted water for carbon sequestration reservoirs (40,000 up to 220,000 ppm total dissolved solids, TDS). In addition, the team validated the system performance at pilot scale with field-sourced water using GE’s pre-pilot and lab facilities. This project encompassed four principal tasks, in addition to Project Management and Planning: 1) identify a deep saline formation carbon sequestration site and a partner that are suitable for supplying extracted water; 2) conduct a techno-economic assessment and down-selection of pre-treatment and desalination technologies to identify a cost-effective system for extracted water recovery; 3) validate the downselected processes at the lab/pre-pilot scale; and 4) define the scope of the pilot desalination project. Highlights from each task are described below. Deep saline formation characterization: the deep saline formations associated with the five DOE NETL 1260 Phase 1 projects were characterized with respect to their mineralogy and formation water composition. Sources of high TDS feed water other than extracted water were explored for high TDS desalination applications, including unconventional oil and gas and seawater reverse osmosis concentrate. Techno-economic analysis of desalination technologies: techno-economic evaluations of alternate brine concentration technologies, including humidification-dehumidification (HDH), membrane distillation (MD), forward osmosis (FO), turboexpander-freeze, solvent extraction and high pressure reverse osmosis (HPRO), were conducted. These technologies were evaluated against conventional falling film-mechanical vapor recompression (FF-MVR) as a baseline desalination process. Furthermore, a

  12. Fuzzy GML Modeling Based on Vague Soft Sets

    Directory of Open Access Journals (Sweden)

    Bo Wei

    2017-01-01

    Full Text Available The Open Geospatial Consortium (OGC) Geography Markup Language (GML) explicitly represents geographical spatial knowledge in text mode. All kinds of fuzzy problems will inevitably be encountered in spatial knowledge expression. Especially for those expressions in text mode, this fuzziness will be broader. Describing and representing fuzziness in GML seems necessary. Three kinds of fuzziness in GML can be found: element fuzziness, chain fuzziness, and attribute fuzziness. Both element fuzziness and chain fuzziness belong to the reflection of the fuzziness between GML elements and, then, the representation of chain fuzziness can be replaced by the representation of element fuzziness in GML. On the basis of vague soft set theory, two kinds of modeling, vague soft set GML Document Type Definition (DTD) modeling and vague soft set GML schema modeling, are proposed for fuzzy modeling in GML DTD and GML schema, respectively. Five elements or pairs, associated with vague soft sets, are introduced. Then, the DTDs and the schemas of the five elements are correspondingly designed and presented according to their different chains and different fuzzy data types. While the introduction of the five elements or pairs is the basis of vague soft set GML modeling, the corresponding DTD and schema modifications are key for implementation of modeling. The establishment of vague soft set GML enables GML to represent fuzziness and solves the problem of lack of fuzzy information expression in GML.
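
    For readers unfamiliar with the underlying structure, the toy sketch below represents a vague soft set in Python: each parameter maps elements of a universe to a membership interval [t, 1 - f]. The element and parameter names are invented for illustration and are not tied to the paper's GML elements.

```python
# Toy representation of a vague soft set (the structure behind the fuzzy GML
# modelling above); names and values are invented for illustration.
from dataclasses import dataclass

@dataclass
class VagueMembership:
    truth: float   # t(x): evidence for membership
    false: float   # f(x): evidence against membership

    def interval(self):
        # vague membership interval [t, 1 - f], with t + f <= 1
        return (self.truth, 1.0 - self.false)

# A vague soft set: parameter -> {element -> vague membership}
vague_soft_set = {
    "near_river": {
        "parcel_A": VagueMembership(0.7, 0.1),
        "parcel_B": VagueMembership(0.2, 0.6),
    },
    "flood_prone": {
        "parcel_A": VagueMembership(0.5, 0.3),
        "parcel_B": VagueMembership(0.1, 0.8),
    },
}

for param, mapping in vague_soft_set.items():
    for element, m in mapping.items():
        print(param, element, m.interval())
```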

  13. Customer-centered careflow modeling based on guidelines.

    Science.gov (United States)

    Huang, Biqing; Zhu, Peng; Wu, Cheng

    2012-10-01

    In contemporary society, customer-centered health care, which stresses customer participation and long-term tailored care, is inevitably becoming a trend. Compared with the hospital or physician-centered healthcare process, the customer-centered healthcare process requires more knowledge and modeling such a process is extremely complex. Thus, building a care process model for a special customer is cost prohibitive. In addition, during the execution of a care process model, the information system should have flexibility to modify the model so that it adapts to changes in the healthcare process. Therefore, supporting the process in a flexible, cost-effective way is a key challenge for information technology. To meet this challenge, first, we analyze various kinds of knowledge used in process modeling, illustrate their characteristics, and detail their roles and effects in careflow modeling. Secondly, we propose a methodology to manage a lifecycle of the healthcare process modeling, with which models could be built gradually with convenience and efficiency. In this lifecycle, different levels of process models are established based on the kinds of knowledge involved, and the diffusion strategy of these process models is designed. Thirdly, architecture and prototype of the system supporting the process modeling and its lifecycle are given. This careflow system also considers the compatibility of legacy systems and authority problems. Finally, an example is provided to demonstrate implementation of the careflow system.

  14. Sensitivity analysis practices: Strategies for model-based inference

    International Nuclear Information System (INIS)

    Saltelli, Andrea; Ratto, Marco; Tarantola, Stefano; Campolongo, Francesca

    2006-01-01

    Fourteen years after Science's review of sensitivity analysis (SA) methods in 1989 (System analysis at molecular scale, by H. Rabitz) we search Science Online to identify and then review all recent articles having 'sensitivity analysis' as a keyword. In spite of the considerable developments which have taken place in this discipline, of the good practices which have emerged, and of existing guidelines for SA issued on both sides of the Atlantic, we could not find in our review other than very primitive SA tools, based on 'one-factor-at-a-time' (OAT) approaches. In the context of model corroboration or falsification, we demonstrate that this use of OAT methods is illicit and unjustified, unless the model under analysis is proved to be linear. We show that available good practices, such as variance based measures and others, are able to overcome OAT shortcomings and easy to implement. These methods also allow the concept of factors importance to be defined rigorously, thus making the factors importance ranking univocal. We analyse the requirements of SA in the context of modelling, and present best available practices on the basis of an elementary model. We also point the reader to available recipes for a rigorous SA
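
    As a concrete counterpart to the variance-based measures recommended above, the sketch below estimates first-order Sobol indices for an elementary linear test model with a plain Monte Carlo (Saltelli-style) estimator; the model and sample sizes are illustrative choices, not taken from the review.

```python
# Variance-based (Sobol) first-order sensitivity indices for a toy model.
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    # elementary test model: unequal weights give unequal sensitivities
    return 4.0 * x[:, 0] + 2.0 * x[:, 1] + 1.0 * x[:, 2]

n, k = 100_000, 3
A = rng.uniform(size=(n, k))
B = rng.uniform(size=(n, k))
fA, fB = model(A), model(B)
var_y = np.var(np.concatenate([fA, fB]))

for i in range(k):
    AB_i = A.copy()
    AB_i[:, i] = B[:, i]          # resample only factor i
    fAB = model(AB_i)
    # Saltelli-style estimator of the first-order index S_i
    S_i = np.mean(fB * (fAB - fA)) / var_y
    print(f"S_{i + 1} ~ {S_i:.3f}")
```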

  15. A global geomagnetic model based on historical and paleomagnetic data

    Science.gov (United States)

    Arneitz, P.; Leonhardt, R.; Fabian, K.

    2015-12-01

    Two main types of data are available to reconstruct the temporal and spatial geomagnetic field evolution. Historical instrumental measurements (direct data) extend from the present day to the late Middle Ages and, prior to the 19th century, consist mainly of declination values. Further back in the past, field reconstructions rely exclusively on the magnetization acquired by archaeological artefacts and rocks or sediments (indirect data). The major challenges for a reliable inversion approach are the inhomogeneous data distribution, the highly variable data quality, and inconsistent quality parameters. Available historical, archeomagnetic and volcanic records have been integrated into a single database together with the corresponding metadata. This combination of compilations enables a joint evaluation of geomagnetic field records of different origins. In particular, the data reliability and quality of indirect records are investigated through a detailed comparison with their direct counterparts. The collection forms the basis for combined inverse modeling of the geomagnetic field evolution. The iterative Bayesian inversion approach targets the implementation of reliable error treatments, which allow data from different sources to be combined. Furthermore, a verification method scrutinizing the limitations of the applied inversion scheme and the datasets used is developed. Here, we will present strategies for the integration of different data types into the modeling procedure. The obtained modeling results and their validity will be discussed.

  16. Improved spring model-based collaborative indoor visible light positioning

    Science.gov (United States)

    Luo, Zhijie; Zhang, WeiNan; Zhou, GuoFu

    2016-06-01

    Gaining accuracy in the indoor positioning of individuals is important, as many location-based services rely on the user's current position to provide useful services. Many researchers have studied indoor positioning techniques based on WiFi and Bluetooth. However, these have disadvantages such as low accuracy or high cost. In this paper, we propose an indoor positioning system in which visible light radiated from light-emitting diodes is used to locate the position of receivers. Compared with existing methods using light-emitting diode light, we present a high-precision, simple-to-implement collaborative indoor visible light positioning system based on an improved spring model. We first estimate coordinate position information using the visible light positioning system, and then use the spring model to correct positioning errors. The system can be deployed easily because it does not require additional sensors, and the occlusion problem of visible light is alleviated. We also describe simulation experiments, which confirm the feasibility of our proposed method.
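
    One plausible reading of the spring-based correction step is sketched below: initial position estimates are iteratively nudged so that pairwise distances between receivers approach separately measured ranges. The coordinates, ranges and gain are invented values and the update rule is a simplification, not the paper's improved spring model.

```python
# Illustrative spring-model correction step for collaborative positioning.
import numpy as np

est = np.array([[0.0, 0.0], [2.3, 0.1], [1.1, 2.2]])   # initial VLP estimates
measured_dist = np.array([[0.0, 2.0, 2.0],
                          [2.0, 0.0, 2.0],
                          [2.0, 2.0, 0.0]])             # measured inter-node ranges
k, steps = 0.2, 200                                      # spring gain, iterations

for _ in range(steps):
    forces = np.zeros_like(est)
    for i in range(len(est)):
        for j in range(len(est)):
            if i == j:
                continue
            d_vec = est[j] - est[i]
            d = np.linalg.norm(d_vec) + 1e-9
            # spring force proportional to the distance error, along the link
            forces[i] += k * (d - measured_dist[i, j]) * d_vec / d
    est += forces

print(np.round(est, 3))   # corrected positions
```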

  17. Sensitivity analysis practices: Strategies for model-based inference

    Energy Technology Data Exchange (ETDEWEB)

    Saltelli, Andrea [Institute for the Protection and Security of the Citizen (IPSC), European Commission, Joint Research Centre, TP 361, 21020 Ispra (VA) (Italy)]. E-mail: andrea.saltelli@jrc.it; Ratto, Marco [Institute for the Protection and Security of the Citizen (IPSC), European Commission, Joint Research Centre, TP 361, 21020 Ispra (VA) (Italy); Tarantola, Stefano [Institute for the Protection and Security of the Citizen (IPSC), European Commission, Joint Research Centre, TP 361, 21020 Ispra (VA) (Italy); Campolongo, Francesca [Institute for the Protection and Security of the Citizen (IPSC), European Commission, Joint Research Centre, TP 361, 21020 Ispra (VA) (Italy)

    2006-10-15

    Fourteen years after Science's review of sensitivity analysis (SA) methods in 1989 (System analysis at molecular scale, by H. Rabitz) we search Science Online to identify and then review all recent articles having 'sensitivity analysis' as a keyword. In spite of the considerable developments which have taken place in this discipline, of the good practices which have emerged, and of existing guidelines for SA issued on both sides of the Atlantic, we could not find in our review other than very primitive SA tools, based on 'one-factor-at-a-time' (OAT) approaches. In the context of model corroboration or falsification, we demonstrate that this use of OAT methods is illicit and unjustified, unless the model under analysis is proved to be linear. We show that available good practices, such as variance based measures and others, are able to overcome OAT shortcomings and easy to implement. These methods also allow the concept of factors importance to be defined rigorously, thus making the factors importance ranking univocal. We analyse the requirements of SA in the context of modelling, and present best available practices on the basis of an elementary model. We also point the reader to available recipes for a rigorous SA.

  18. Correction tool for Active Shape Model based lumbar muscle segmentation.

    Science.gov (United States)

    Valenzuela, Waldo; Ferguson, Stephen J; Ignasiak, Dominika; Diserens, Gaelle; Vermathen, Peter; Boesch, Chris; Reyes, Mauricio

    2015-08-01

    In the clinical environment, the accuracy and speed of the image segmentation process play a key role in the analysis of pathological regions. Despite advances in anatomic image segmentation, time-effective correction tools are commonly needed to improve segmentation results. Such tools must provide fast corrections with a low number of interactions and a user-independent solution. In this work we present a new interactive method for correcting image segmentations. Given an initial segmentation and the original image, our tool provides a 2D/3D environment that enables 3D shape correction through simple 2D interactions. Our scheme is based on direct manipulation of free-form deformation adapted to a 2D environment. This approach enables an intuitive and natural correction of 3D segmentation results. The developed method has been implemented in a software tool and evaluated for the task of lumbar muscle segmentation from Magnetic Resonance Images. Experimental results show that full segmentation correction could be performed within an average correction time of 6±4 minutes and an average of 68±37 interactions, while maintaining the quality of the final segmentation result within an average Dice coefficient of 0.92±0.03.
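
    The Dice coefficient quoted above is straightforward to compute; a minimal sketch on toy binary masks follows (mask shapes and contents are illustrative only).

```python
# Dice overlap between an automatic and a corrected segmentation mask.
import numpy as np

def dice(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    intersection = np.logical_and(a, b).sum()
    denom = a.sum() + b.sum()
    return 2.0 * intersection / denom if denom else 1.0

auto_seg = np.zeros((64, 64), dtype=bool)
auto_seg[10:40, 10:40] = True
corrected = np.zeros((64, 64), dtype=bool)
corrected[12:42, 10:40] = True

print(f"Dice = {dice(auto_seg, corrected):.3f}")
```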

  19. Forest height estimation from mountain forest areas using general model-based decomposition for polarimetric interferometric synthetic aperture radar images

    Science.gov (United States)

    Minh, Nghia Pham; Zou, Bin; Cai, Hongjun; Wang, Chengyi

    2014-01-01

    The estimation of forest parameters over mountain forest areas using polarimetric interferometric synthetic aperture radar (PolInSAR) images is of great interest in remote sensing applications. For mountain forest areas, scattering mechanisms are strongly affected by ground topography variations. Most previous studies modeling the microwave backscattering signatures of forest areas have been carried out over relatively flat terrain. Therefore, a new algorithm for forest height estimation over mountain forest areas using the general model-based decomposition (GMBD) for PolInSAR images is proposed. This algorithm enables the retrieval of not only the forest parameters, but also the magnitude associated with each mechanism. In addition, general double- and single-bounce scattering models are proposed to fit the cross-polarization and off-diagonal terms by separating their independent orientation angles, which was not achieved in previous model-based decompositions. The efficiency of the proposed approach is demonstrated with simulated data from PolSARProSim software and ALOS-PALSAR spaceborne PolInSAR datasets over the Kalimantan area, Indonesia. Experimental results indicate that forest height can be effectively estimated by GMBD.

  20. A Multi-Agent Traffic Control Model Based on Distributed System

    Directory of Open Access Journals (Sweden)

    Qian WU

    2014-06-01

    Full Text Available With the development of urbanization, urban travel has become a thorny and pressing problem. Previous research on large urban traffic systems often runs into NP-complete problems. We propose a multi-agent inductive control model based on a distributed approach. To describe real traffic scenes, the model defines four types of intelligent agents: each lane, route, intersection and traffic region is treated as a different type of agent. Each agent can obtain real-time traffic data from its neighbor agents, and decision-making agents establish real-time traffic signal plans through communication between local agents and their neighbors. To evaluate the traffic system, this paper takes the average delay, the stopped time and the average speed as performance parameters. Finally, the distributed multi-agent system is simulated on the VISSIM simulation platform; the simulation results show that the multi-agent system is more effective than an adaptive control system in relieving traffic congestion.

  1. A model-based exploration of the role of pattern generating circuits during locomotor adaptation.

    Science.gov (United States)

    Marjaninejad, Ali; Finley, James M

    2016-08-01

    In this study, we used a model-based approach to explore the potential contributions of central pattern generating circuits (CPGs) to adaptation to external perturbations during locomotion. We constructed a neuromechanical model of locomotion using a reduced-phase CPG controller and an inverted pendulum mechanical model. Two different forms of locomotor adaptation were examined in this study: split-belt treadmill adaptation and adaptation to a unilateral, elastic force field. For each simulation, we first examined the effects of phase resetting and of varying the model's initial conditions on the resulting adaptation. After evaluating the effect of phase resetting on the adaptation of step length symmetry, we examined the extent to which the results from these simple models could explain previous experimental observations. We found that adaptation of step length symmetry during split-belt treadmill walking could be reproduced using our model, but this model failed to replicate patterns of adaptation observed in response to force field perturbations. Given that spinal animal models can adapt to both of these types of perturbations, our findings suggest that there may be distinct features of pattern generating circuits that mediate each form of adaptation.
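
    To illustrate the kind of reduced-phase CPG dynamics referred to above, the sketch below couples two phase oscillators with a frequency mismatch standing in for a split-belt perturbation and tracks the adapted phase difference. The coupling function and constants are assumptions for illustration, not the authors' model.

```python
# Two coupled phase oscillators as a toy split-belt CPG.
import numpy as np

dt, T = 0.001, 30.0
steps = int(T / dt)
omega = np.array([2.0 * np.pi * 1.0, 2.0 * np.pi * 1.3])  # left/right belt rates
K = 4.0                                                    # coupling strength
phi = np.array([0.0, np.pi])                               # start in anti-phase

phase_diff = np.empty(steps)
for t in range(steps):
    # each oscillator is pulled toward an anti-phase relation with the other
    coupling = K * np.sin(phi[::-1] - phi - np.pi)
    phi = phi + dt * (omega + coupling)
    phase_diff[t] = (phi[1] - phi[0]) % (2.0 * np.pi)

# a steady offset from pi reflects the adapted asymmetry between the two legs
print(f"final phase difference: {phase_diff[-1]:.2f} rad (anti-phase = {np.pi:.2f})")
```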

  2. A Labeling Model Based on the Region of Movability for Point-Feature Label Placement

    Directory of Open Access Journals (Sweden)

    Lin Li

    2016-09-01

    Full Text Available Automatic point-feature label placement (PFLP) is a fundamental task in map visualization. As the dominant solutions to the PFLP problem, fixed-position and slider models have been widely studied in previous research. However, the candidate labels generated with these models are set to certain fixed positions or to a specified track line for sliding. Thus, the whole space surrounding a point feature is not fully used for labeling. Hence, this paper proposes a novel label model based on the region of movability, which comes from plane collision detection theory. The model defines a complete conflict-free search space for label placement. On the premise of no conflict with point, line, and area features, the proposed model utilizes the surrounding zone of the point feature to generate candidate label positions. Combined with a heuristic search method, the model achieves high-quality label placement. In addition, the flexibility of the proposed model enables the placement of arbitrarily shaped labels.
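
    The idea of exploiting the whole conflict-free zone around a point can be illustrated with a toy candidate-generation routine: label rectangles are sampled all around the feature and kept only if they avoid existing obstacles. The geometry and sampling below are invented and far simpler than the paper's region-of-movability construction.

```python
# Toy candidate-label generation around a point feature.
import math

def rects_overlap(a, b):
    # rectangles as (xmin, ymin, xmax, ymax)
    return not (a[2] <= b[0] or b[2] <= a[0] or a[3] <= b[1] or b[3] <= a[1])

def candidate_labels(point, w, h, obstacles, n_angles=36, gap=0.1):
    px, py = point
    out = []
    for i in range(n_angles):
        ang = 2 * math.pi * i / n_angles
        # anchor the label just outside the point symbol, in direction ang
        cx = px + (gap + w / 2) * math.cos(ang)
        cy = py + (gap + h / 2) * math.sin(ang)
        rect = (cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2)
        if not any(rects_overlap(rect, ob) for ob in obstacles):
            out.append(rect)
    return out

obstacles = [(1.0, -0.5, 3.0, 0.5)]          # e.g. another feature's label
print(len(candidate_labels((0.0, 0.0), 1.2, 0.4, obstacles)))
```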

  3. Model-Based Normalization of a Fractional-Crystal Collimator for Small-Animal PET Imaging.

    Science.gov (United States)

    Li, Yusheng; Matej, Samuel; Karp, Joel S; Metzler, Scott D

    2017-05-01

    Previously, we proposed using a coincidence collimator to achieve fractional-crystal resolution in PET imaging. We have designed and fabricated a collimator prototype for a small-animal PET scanner, A-PET. To compensate for imperfections in the fabricated collimator prototype, collimator normalization, as well as scanner normalization, is required to reconstruct quantitative and artifact-free images. In this study, we develop a normalization method for the collimator prototype based on the A-PET normalization using a uniform cylinder phantom. We performed data acquisition without the collimator for scanner normalization first, and then with the collimator from eight different rotation views for collimator normalization. After a reconstruction without correction, we extracted the cylinder parameters, from which we generated expected emission sinograms. Single scatter simulation was used to generate the scattered sinograms. We used the least-squares method to generate the normalization coefficient for each LOR based on the measured, expected and scattered sinograms. The scanner and collimator normalization coefficients were factorized by performing the two normalizations separately. The normalization methods were also verified using experimental data acquired from A-PET with and without the collimator. In summary, we developed a model-based collimator normalization that can significantly reduce variance and produce collimator normalization of adequate statistical quality within a feasible scan time.
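
    One plausible form of the per-LOR least-squares step described above is sketched below: for each LOR, a scalar is chosen that best maps the modeled (expected plus scattered) counts onto the measured counts across rotation views. The array shapes, noise model and the reciprocal-efficiency convention are assumptions, not the published processing chain.

```python
# Per-LOR least-squares normalization sketch on synthetic sinograms.
import numpy as np

rng = np.random.default_rng(1)
views, n_lor = 8, 5000
expected = rng.uniform(50, 100, size=(views, n_lor))   # from fitted cylinder
scattered = 0.1 * expected                             # single-scatter estimate
true_eff = rng.uniform(0.7, 1.3, size=n_lor)           # unknown LOR efficiencies
measured = rng.poisson(true_eff * (expected + scattered))

model = expected + scattered
# least-squares scalar per LOR: c = sum_v m*model / sum_v model^2
c = (measured * model).sum(axis=0) / (model ** 2).sum(axis=0)

# normalization coefficients taken as the reciprocal efficiencies
norm_coeff = 1.0 / np.clip(c, 1e-6, None)
print(norm_coeff[:5], true_eff[:5])
```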

  4. Model-Based Design and Evaluation of a Brachiating Monkey Robot with an Active Waist

    Directory of Open Access Journals (Sweden)

    Alex Kai-Yuan Lo

    2017-09-01

    Full Text Available We report on the model-based development of a monkey robot that is capable of performing continuous brachiation locomotion on a swingable rod, as an intermediate step toward studying brachiation on a soft rope or on horizontal ropes with both ends fixed. The work differs from previous studies in which the model or the robot swings on fixed bars. The model, which is composed of two rigid links, was inspired by the dynamic motion of primates. The model further served as the design guideline for a robot that has five degrees of freedom: two on each arm for rod changing and one at the waist to initiate a swing motion. The model was quantitatively formulated, and its dynamic behavior was analyzed in simulation. Further, a two-stage controller was developed within the simulation environment, where the first stage used the natural dynamics of a two-link pendulum-like model, and the second stage used angular velocity feedback to regulate the waist motion. Finally, the robot was built and evaluated empirically. The experimental results confirm that the robot can perform model-like swing behavior and continuous brachiation locomotion on rods.

  5. Model-based T2 relaxometry using undersampled magnetic resonance imaging

    Energy Technology Data Exchange (ETDEWEB)

    Sumpf, Tilman

    2013-11-01

    T2 relaxometry refers to the quantitative determination of spin-spin relaxation times in magnetic resonance imaging (MRI). Particularly in clinical diagnostics, the method provides important information about tissue structures and respective pathologic alterations. Unfortunately, it also requires comparatively long measurement times which preclude widespread practical applications. To overcome such limitations, a so-called model-based reconstruction concept has recently been proposed. The method allows for the estimation of spin-density and T2 parameter maps from only a fraction of the usually required data. So far, promising results have been reported for a radial data acquisition scheme. However, due to technical reasons, radial imaging is only available on a very limited number of MRI systems. The present work deals with the realization and evaluation of different model-based T2 reconstruction methods that are applicable for the most widely available Cartesian (rectilinear) acquisition scheme. The initial implementation is based on the conventional assumption of a mono-exponential T2 signal decay. A suitable sampling scheme as well as an automatic scaling procedure are developed, which remove the necessity of manual parameter tuning. As demonstrated for human brain MRI data, the technique allows for a more than 5-fold acceleration of the underlying data acquisition. Furthermore, general limitations and specific error sources are identified and suitable simulation programs are developed for their detailed analysis. In addition to phase variations in image space, the simulations reveal truncation effects as a relevant cause of reconstruction artifacts. To reduce the latter, an alternative model formulation is developed and tested. For noise-free simulated data, the method yields an almost complete suppression of associated artifacts. Residual problems in the reconstruction of experimental MRI data point to the predominant influence of other
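
    The mono-exponential signal model mentioned above is S(TE) = ρ·exp(-TE/T2); the sketch below fits it to synthetic echo data with scipy as a pixel-wise illustration (echo times, noise level and starting values are arbitrary choices).

```python
# Mono-exponential T2 fit for a single pixel's echo train (synthetic data).
import numpy as np
from scipy.optimize import curve_fit

def spin_echo(te, rho, t2):
    # S(TE) = rho * exp(-TE / T2)
    return rho * np.exp(-te / t2)

te = np.arange(10, 170, 10, dtype=float)                     # echo times in ms
true_rho, true_t2 = 1000.0, 80.0
signal = spin_echo(te, true_rho, true_t2)
signal += np.random.default_rng(0).normal(0, 5, te.size)     # measurement noise

(rho_hat, t2_hat), _ = curve_fit(spin_echo, te, signal, p0=(signal[0], 50.0))
print(f"estimated spin density ~ {rho_hat:.1f}, T2 ~ {t2_hat:.1f} ms")
```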

  6. The incidence of congenital heart disease: Previous findings and perspectives

    Directory of Open Access Journals (Sweden)

    Miranović Vesna

    2014-01-01

    Full Text Available Congenital heart defects (CHD) are the most common of all congenital anomalies and represent a significant global health problem. The involvement of medical professionals of different profiles has led to drastic changes in the survival and quality of life of children with CHD. The motivation for the first large population studies on this subject was not only to obtain answers to the question of the incidence of CHD, but also to harmonize the criteria and protocols for monitoring and treating particular defects and to plan the medical staff dealing with children with CHD. Data on the incidence vary from 4-10/1000 live births. Fetal echocardiography can have a potential impact on decreasing CHD incidence. An increase in incidence may be due to the possibility that children with CHD will grow up and have offspring. Owing to the progress that has been made, an increasing number of patients reaches adulthood, creating an entirely new and growing population of patients: patients with “adult” CHD. Survivors suffer morbidity resulting from their circulatory abnormalities as well as from the medical and surgical therapies they have been subjected to. Application of the achievements of human genome projects will in time lead to drastic changes in the approach to patients with CHD. Until then, the goal will be further improvement of the existing system of care: networking in a unique, multicenter clinical registry of patients with CHD, as well as upgrading of the technical and non-technical conditions for the treatment of patients with CHD. We are in an unprecedented time of change, but are actually only at the end of the beginning of making pediatric cardiac care a highly reliable institution.

  7. Radon anomalies prior to earthquakes (1). Review of previous studies

    International Nuclear Information System (INIS)

    Ishikawa, Tetsuo; Tokonami, Shinji; Yasuoka, Yumi; Shinogi, Masaki; Nagahama, Hiroyuki; Omori, Yasutaka; Kawada, Yusuke

    2008-01-01

    The relationship between radon anomalies and earthquakes has been studied for more than 30 years. However, most of the studies dealt with radon in soil gas or in groundwater. Before the 1995 Hyogoken-Nanbu earthquake, an anomalous increase of atmospheric radon was observed at Kobe Pharmaceutical University. The increase was well fitted with a mathematical model related to earthquake fault dynamics. This paper reports the significance of this observation, reviewing previous studies on radon anomalies before earthquakes. Groundwater/soil radon measurements for earthquake prediction began in the 1970s in Japan as well as in other countries. One of the most famous studies in Japan is the groundwater radon anomaly before the 1978 Izu-Oshima-kinkai earthquake. The significance of radon in earthquake prediction research has long been recognized, but recently its limitations have also been pointed out. Some researchers are looking for better precursor indicators; simultaneous measurements of radon and other gases are new trials in recent studies. In contrast to soil/groundwater radon, little attention has been paid to atmospheric radon before earthquakes. However, it might be possible to detect precursors in atmospheric radon before a large earthquake. In the next issues, we will discuss the details of the anomalous atmospheric radon data observed before the Hyogoken-Nanbu earthquake. (author)

  8. Mediastinal involvement in lymphangiomatosis: a previously unreported MRI sign

    Energy Technology Data Exchange (ETDEWEB)

    Shah, Vikas; Shah, Sachit; Barnacle, Alex; McHugh, Kieran [Great Ormond Street Hospital for Children, Department of Radiology, London (United Kingdom); Sebire, Neil J. [Great Ormond Street Hospital for Children, Department of Histopathology, London (United Kingdom); Brock, Penelope [Great Ormond Street Hospital for Children, Department of Oncology, London (United Kingdom); Harper, John I. [Great Ormond Street Hospital for Children, Department of Dermatology, London (United Kingdom)

    2011-08-15

    Multifocal lymphangiomatosis is a rare systemic disorder affecting children. Due to its rarity and wide spectrum of clinical, histological and imaging features, establishing the diagnosis of multifocal lymphangiomatosis can be challenging. The purpose of this study was to describe a new imaging sign in this disorder: paraspinal soft tissue and signal abnormality at MRI. We retrospectively reviewed the imaging, clinical and histopathological findings in a cohort of eight children with thoracic involvement from this condition. Evidence of paraspinal chest disease was identified at MRI and CT in all eight of these children. The changes comprise heterogeneous intermediate-to-high signal parallel to the thoracic vertebrae on T2-weighted sequences at MRI, with abnormal paraspinal soft tissue at CT and plain radiography. Multifocal lymphangiomatosis is a rare disorder with a broad range of clinicopathological and imaging features. MRI allows complete evaluation of disease extent without the use of ionising radiation and has allowed us to describe a previously unreported imaging sign in this disorder, namely, heterogeneous hyperintense signal in abnormal paraspinal tissue on T2-weighted images. (orig.)

  9. Cerebral Metastasis from a Previously Undiagnosed Appendiceal Adenocarcinoma

    Directory of Open Access Journals (Sweden)

    Antonio Biroli

    2012-01-01

    Full Text Available Brain metastases arise in 10%–40% of all cancer patients. Up to one third of these patients have no previous cancer history. We report the case of a 67-year-old male patient who presented with confusion, tremor, and apraxia. A brain MRI revealed an isolated right temporal lobe lesion. A thorax-abdomen-pelvis CT scan showed no primary lesion. The patient underwent a craniotomy with gross-total resection. Histopathology revealed an intestinal-type adenocarcinoma. A colonoscopy found no primary lesion, but a PET-CT scan showed elevated FDG uptake in the appendiceal nodule. A right hemicolectomy was performed, and the specimen showed a moderately differentiated mucinous appendiceal adenocarcinoma. Whole brain radiotherapy was administered. A subsequent thorax-abdomen CT scan revealed multiple lung and hepatic metastases. Seven months later, the patient died of disease progression. In cases of undiagnosed primary lesions, patients present in better general condition, but overall survival does not change, and eventual identification of the primary tumor does not affect survival. PET/CT might be a helpful tool in detecting lesions of the appendiceal region. To the best of our knowledge, such a case has never been reported in the literature, and an appendiceal malignancy should be suspected in patients with brain metastasis from an undiagnosed primary tumor.

  10. Coronary collateral vessels in patients with previous myocardial infarction

    International Nuclear Information System (INIS)

    Nakatsuka, M.; Matsuda, Y.; Ozaki, M.

    1987-01-01

    To assess the degree of collateral vessels after myocardial infarction, coronary angiograms, left ventriculograms, and exercise thallium-201 myocardial scintigrams of 36 patients with previous myocardial infarction were reviewed. All 36 patients had total occlusion of infarct-related coronary artery and no more than 70% stenosis in other coronary arteries. In 19 of 36 patients with transient reduction of thallium-201 uptake in the infarcted area during exercise (Group A), good collaterals were observed in 10 patients, intermediate collaterals in 7 patients, and poor collaterals in 2 patients. In 17 of 36 patients without transient reduction of thallium-201 uptake in the infarcted area during exercise (Group B), good collaterals were seen in 2 patients, intermediate collaterals in 7 patients, and poor collaterals in 8 patients (p less than 0.025). Left ventricular contractions in the infarcted area were normal or hypokinetic in 10 patients and akinetic or dyskinetic in 9 patients in Group A. In Group B, 1 patient had hypokinetic contraction and 16 patients had akinetic or dyskinetic contraction (p less than 0.005). Thus, patients with transient reduction of thallium-201 uptake in the infarcted area during exercise had well developed collaterals and preserved left ventricular contraction, compared to those in patients without transient reduction of thallium-201 uptake in the infarcted area during exercise. These results suggest that the presence of viable myocardium in the infarcted area might be related to the degree of collateral vessels

  11. High-Grade Leiomyosarcoma Arising in a Previously Replanted Limb

    Directory of Open Access Journals (Sweden)

    Tiffany J. Pan

    2015-01-01

    Full Text Available Sarcoma development has been associated with genetics, irradiation, viral infections, and immunodeficiency. Reports of sarcomas arising in the setting of prior trauma, as in burn scars or fracture sites, are rare. We report a case of a leiomyosarcoma arising in an arm that had previously been replanted at the level of the elbow joint following traumatic amputation when the patient was eight years old. He presented twenty-four years later with a 10.8 cm mass in the replanted arm located on the volar forearm. The tumor was completely resected and pathology examination showed a high-grade, subfascial spindle cell sarcoma diagnosed as a grade 3 leiomyosarcoma with stage pT2bNxMx. The patient underwent treatment with brachytherapy, reconstruction with a free flap, and subsequently chemotherapy. To the best of our knowledge, this is the first case report of leiomyosarcoma developing in a replanted extremity. Development of leiomyosarcoma in this case could be related to revascularization, scar formation, or chronic injury after replantation. The patient remains healthy without signs of recurrence at three-year follow-up.

  12. Global functional atlas of Escherichia coli encompassing previously uncharacterized proteins.

    Directory of Open Access Journals (Sweden)

    Pingzhao Hu

    2009-04-01

    Full Text Available One-third of the 4,225 protein-coding genes of Escherichia coli K-12 remain functionally unannotated (orphans). Many map to distant clades such as Archaea, suggesting involvement in basic prokaryotic traits, whereas others appear restricted to E. coli, including pathogenic strains. To elucidate the orphans' biological roles, we performed an extensive proteomic survey using affinity-tagged E. coli strains and generated comprehensive genomic context inferences to derive a high-confidence compendium for virtually the entire proteome consisting of 5,993 putative physical interactions and 74,776 putative functional associations, most of which are novel. Clustering of the respective probabilistic networks revealed putative orphan membership in discrete multiprotein complexes and functional modules together with annotated gene products, whereas a machine-learning strategy based on network integration implicated the orphans in specific biological processes. We provide additional experimental evidence supporting orphan participation in protein synthesis, amino acid metabolism, biofilm formation, motility, and assembly of the bacterial cell envelope. This resource provides a "systems-wide" functional blueprint of a model microbe, with insights into the biological and evolutionary significance of previously uncharacterized proteins.

  13. Global functional atlas of Escherichia coli encompassing previously uncharacterized proteins.

    Science.gov (United States)

    Hu, Pingzhao; Janga, Sarath Chandra; Babu, Mohan; Díaz-Mejía, J Javier; Butland, Gareth; Yang, Wenhong; Pogoutse, Oxana; Guo, Xinghua; Phanse, Sadhna; Wong, Peter; Chandran, Shamanta; Christopoulos, Constantine; Nazarians-Armavil, Anaies; Nasseri, Negin Karimi; Musso, Gabriel; Ali, Mehrab; Nazemof, Nazila; Eroukova, Veronika; Golshani, Ashkan; Paccanaro, Alberto; Greenblatt, Jack F; Moreno-Hagelsieb, Gabriel; Emili, Andrew

    2009-04-28

    One-third of the 4,225 protein-coding genes of Escherichia coli K-12 remain functionally unannotated (orphans). Many map to distant clades such as Archaea, suggesting involvement in basic prokaryotic traits, whereas others appear restricted to E. coli, including pathogenic strains. To elucidate the orphans' biological roles, we performed an extensive proteomic survey using affinity-tagged E. coli strains and generated comprehensive genomic context inferences to derive a high-confidence compendium for virtually the entire proteome consisting of 5,993 putative physical interactions and 74,776 putative functional associations, most of which are novel. Clustering of the respective probabilistic networks revealed putative orphan membership in discrete multiprotein complexes and functional modules together with annotated gene products, whereas a machine-learning strategy based on network integration implicated the orphans in specific biological processes. We provide additional experimental evidence supporting orphan participation in protein synthesis, amino acid metabolism, biofilm formation, motility, and assembly of the bacterial cell envelope. This resource provides a "systems-wide" functional blueprint of a model microbe, with insights into the biological and evolutionary significance of previously uncharacterized proteins.

  14. Influence of Previous Knowledge in Torrance Tests of Creative Thinking

    Directory of Open Access Journals (Sweden)

    María Aranguren

    2015-07-01

    Full Text Available The aim of this work is to analyze the influence of study field, expertise and recreational activities participation on Torrance Tests of Creative Thinking (TTCT, 1974) performance. Several hypotheses were postulated to explore the possible effects of previous knowledge on the TTCT verbal and TTCT figural outcomes of university students. Participants in this study included 418 students from five study fields: Psychology; Philosophy and Literature; Music; Engineering; and Journalism and Advertising (Communication Sciences). The results seem to indicate that there is no influence of study field, expertise or recreational activities participation on either of the TTCT tests. Instead, the findings seem to suggest some kind of interaction between certain skills needed to succeed in specific study fields and performance on creativity tests such as the TTCT. These results imply that the TTCT is a useful and valid instrument to measure creativity and that some cognitive processes involved in innovative thinking can be promoted using different intervention programs in schools and universities regardless of the students' study field.

  15. Gastrointestinal tolerability with ibandronate after previous weekly bisphosphonate treatment.

    Science.gov (United States)

    Derman, Richard; Kohles, Joseph D; Babbitt, Ann

    2009-01-01

    Data from two open-label trials (PRIOR and CURRENT) of women with postmenopausal osteoporosis or osteopenia were evaluated to assess whether monthly oral and quarterly intravenous (IV) ibandronate dosing improved self-reported gastrointestinal (GI) tolerability for patients who had previously experienced GI irritation with bisphosphonate (BP) use. In PRIOR, women who had discontinued daily or weekly BP treatment due to GI intolerance received monthly oral or quarterly IV ibandronate for 12 months. The CURRENT subanalysis included women receiving weekly BP treatment who switched to monthly oral ibandronate for six months. GI symptom severity and frequency were assessed using the Osteoporosis Patient Satisfaction Questionnaire. In PRIOR, mean GI tolerability scores increased significantly at month 1 from screening for both treatment groups (oral: 79.3 versus 54.1; IV: 84.4 versus 51.0) and remained above 90% at month 10. In the CURRENT subanalysis >60% of patients reported improvements in heartburn or acid reflux and >70% indicated improvement in other stomach upset at month 6. Postmenopausal women with GI irritability on daily or weekly BPs experienced improvement in symptoms with extended-interval dosing of monthly or quarterly ibandronate compared with baseline.

  16. Pertussis-associated persistent cough in previously vaccinated children.

    Science.gov (United States)

    Principi, Nicola; Litt, David; Terranova, Leonardo; Picca, Marina; Malvaso, Concetta; Vitale, Cettina; Fry, Norman K; Esposito, Susanna

    2017-11-01

    To evaluate the role of Bordetella pertussis infection, 96 otherwise healthy 7- to 17-year-old subjects who were suffering from a cough lasting from 2 to 8 weeks were prospectively recruited. At enrolment, a nasopharyngeal swab and an oral fluid sample were obtained to search for pertussis infection by the detection of B. pertussis DNA and/or an elevated titre of anti-pertussis toxin IgG. Evidence of pertussis infection was found in 18 (18.7 %; 95 % confidence interval, 11.5-28.0) cases. In 15 cases, the disease occurred despite booster administration. In two cases, pertussis was diagnosed less than 2 years after the booster injection, whereas in the other cases it was diagnosed between 2 and 9 years after the booster dose. This study used non-invasive testing to show that pertussis is one of the most important causes of long-lasting cough in school-age subjects. Moreover, the protection offered by acellular pertussis vaccines currently wanes more rapidly than previously thought.

  17. Multispecies Coevolution Particle Swarm Optimization Based on Previous Search History

    Directory of Open Access Journals (Sweden)

    Danping Wang

    2017-01-01

    Full Text Available A hybrid coevolution particle swarm optimization algorithm with a dynamic multispecies strategy based on K-means clustering and a nonrevisit strategy based on a Binary Space Partitioning fitness tree (called MCPSO-PSH) is proposed. Previous search history memorized in the Binary Space Partitioning fitness tree can effectively restrain the individuals' revisit phenomenon. The whole population is partitioned into several subspecies, and cooperative coevolution is realized by an information communication mechanism between subspecies, which can enhance the global search ability of particles and avoid premature convergence to local optima. To demonstrate the power of the method, the proposed algorithm is compared with state-of-the-art algorithms on 10 basic benchmark functions (10-dimensional and 30-dimensional), 10 CEC2005 benchmark functions (30-dimensional), and a real-world problem (multilevel image segmentation). Experimental results show that MCPSO-PSH displays competitive performance compared to other swarm-based or evolutionary algorithms in terms of solution accuracy and statistical tests.
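
    A much-simplified sketch of the multispecies idea is given below: a standard PSO swarm is periodically re-partitioned with k-means and each particle is guided by its own best and its subspecies' best. The non-revisit BSP fitness tree and the cooperative communication mechanism of MCPSO-PSH are omitted, and all constants are illustrative.

```python
# Simplified multispecies PSO: k-means subspecies with per-species guides.
import numpy as np
from sklearn.cluster import KMeans

def sphere(x):                      # simple benchmark objective
    return np.sum(x ** 2, axis=1)

rng = np.random.default_rng(0)
n, dim, n_species, iters = 60, 10, 4, 200
pos = rng.uniform(-5, 5, size=(n, dim))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), sphere(pos)

w, c1, c2 = 0.72, 1.5, 1.5
for it in range(iters):
    if it % 20 == 0:                # periodically re-form the subspecies
        labels = KMeans(n_clusters=n_species, n_init=5,
                        random_state=0).fit_predict(pos)
    # best particle within each subspecies acts as its local guide
    sbest = np.empty_like(pos)
    for s in range(n_species):
        members = np.where(labels == s)[0]
        sbest[members] = pbest[members[np.argmin(pbest_val[members])]]
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (sbest - pos)
    pos = pos + vel
    val = sphere(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]

print(f"best value found: {pbest_val.min():.2e}")
```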

  18. An improved resource management model based on MDS

    Science.gov (United States)

    Yuan, Man; Sun, Changying; Li, Pengfei; Sun, Yongdong; He, Rui

    2005-11-01

    GRID technology provides a convenient way of managing GRID resources through the so-called Monitoring and Discovery Service (MDS) proposed by the Globus Alliance. In this GRID environment, all kinds of resources, such as computational resources, storage resources and other resources, can be organized according to MDS specifications. However, MDS is a theoretical framework; in a small-world intranet with limited resources, it has its own limitations. Based on MDS, an improved lightweight method for managing corporate computational and storage resources in an intranet (IMDS) is proposed. Firstly, in MDS all resource description information is stored in LDAP; although LDAP is a lightweight directory access protocol, in practice programmers rarely master how to access and store resource information in an LDAP store, which limits the use of MDS. In an intranet, this resource description information can instead be stored in an RDBMS, so that programmers and users can access it with standard SQL. Secondly, in MDS it is not transparent to programmers and users how the various resources in the GRID are monitored, which limits its application scope; since resource monitoring based on SNMP is widely employed in intranets, an SNMP-based resource monitoring method is integrated into MDS. Finally, all kinds of resources in the intranet can be described by XML, their description information is stored in an RDBMS such as MySQL and retrieved by standard SQL, and dynamic information for the resources is sent to the resource store via SNMP. A prototype for resource description and monitoring is designed and implemented in the intranet.
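
    The RDBMS-backed resource store can be illustrated with a small sqlite sketch: resource descriptions live in a relational table, dynamic state (as an SNMP-based poller might report it) updates the same table, and discovery becomes an ordinary SQL query. The schema, table and column names are invented for this sketch.

```python
# Toy RDBMS-backed resource store queried with plain SQL.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE grid_resource (
        name        TEXT PRIMARY KEY,
        kind        TEXT,      -- 'compute' or 'storage'
        cpu_count   INTEGER,
        free_gb     REAL,
        last_seen   TEXT
    )
""")
conn.executemany(
    "INSERT INTO grid_resource VALUES (?, ?, ?, ?, ?)",
    [("node01", "compute", 16, 120.0, "2005-10-01T10:00"),
     ("nas01",  "storage",  2, 900.0, "2005-10-01T10:00")],
)

# dynamic state reported (e.g. via an SNMP poller) updates the same table
conn.execute("UPDATE grid_resource SET free_gb = ? WHERE name = ?", (95.5, "node01"))

# discovery becomes an ordinary SQL query instead of an LDAP search
for row in conn.execute(
        "SELECT name, cpu_count, free_gb FROM grid_resource WHERE kind = 'compute'"):
    print(row)
```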

  19. Systematic review of model-based cervical screening evaluations.

    Science.gov (United States)

    Mendes, Diana; Bains, Iren; Vanni, Tazio; Jit, Mark

    2015-05-01

    Optimising population-based cervical screening policies is becoming more complex due to the expanding range of screening technologies available and the interplay with vaccine-induced changes in epidemiology. Mathematical models are increasingly being applied to assess the impact of cervical cancer screening strategies. We systematically reviewed the MEDLINE®, Embase, Web of Science®, EconLit, Health Economic Evaluation Database, and The Cochrane Library databases in order to identify the mathematical models of human papillomavirus (HPV) infection and cervical cancer progression used to assess the effectiveness and/or cost-effectiveness of cervical cancer screening strategies. Key model features and conclusions relevant to decision-making were extracted. We found 153 articles meeting our eligibility criteria published up to May 2013. Most studies (72/153) evaluated the introduction of a new screening technology, with particular focus on the comparison of HPV DNA testing and cytology (n = 58). Twenty-eight of forty of these analyses supported the implementation of HPV DNA primary screening. A few studies analysed more recent technologies - rapid HPV DNA testing (n = 3), HPV DNA self-sampling (n = 4), and genotyping (n = 1) - and were also supportive of their introduction. However, no study was found on emerging molecular markers and their potential utility in future screening programmes. Most evaluations (113/153) were based on models simulating aggregate groups of women at risk of cervical cancer over time without accounting for HPV infection transmission. Calibration to country-specific outcome data is becoming more common, but has not yet become standard practice. Models of cervical screening are increasingly used, and allow extrapolation of trial data to project the population-level health and economic impact of different screening policies. However, post-vaccination analyses have rarely incorporated transmission dynamics. Model calibration to country

  20. Rice production model based on the concept of ecological footprint

    Science.gov (United States)

    Faiz, S. A.; Wicaksono, A. D.; Dinanti, D.

    2017-06-01

    As stated in the Region Spatial Planning (RTRW) of Malang Regency for the period 2010-2030, Malang Regency is considered the center of agricultural development, including the districts bordering Malang City. To protect the region's function as a provider of rice production, the sustainable food farming-land (LP2B) policy was introduced with the aim of protecting rice-land. In the existing condition, the LP2B system has not been fully implemented, which has limited the extent of rice-land available to deliver rice production. One cause is the development of settlements and industries driven by the influence of Malang City, which has converted land use. The research focused on 30 villages directly bordering Malang City. A model was developed relating farming production output to ecological footprint variables: rice-land area (X1), built-land percentage (X2), and number of farmers (X3). The analysis technique was regression. The regression result gives the rice production model Y = -207,983 + 10.246X1. Rice-land area (X1) was the most influential independent variable. Of the villages directly bordering Malang City, 11 villages have higher production potential because their rice production yield exceeds 1,000 tons/year, while 12 villages are threatened with low production output because their rice production yield reaches only about 500 tons/year. Based on the model and the spatial direction of the RTRW, the farming development policy must be redesigned to maintain rice-land area in the regions where agricultural activity is still dominant, because rice-land area is the most influential factor in farming production: the wider the rice-land, the higher the rice production output of each village.
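
    The single-variable regression reported above can be reproduced in form with a few lines of least-squares fitting; the sketch below uses fabricated village data, so the fitted coefficients will not match the paper's Y = -207,983 + 10.246X1.

```python
# Simple least-squares fit of production against rice-land area (toy data).
import numpy as np

rice_land_ha = np.array([55, 80, 120, 150, 210, 260])         # X1, hypothetical
production_t = np.array([380, 620, 1050, 1330, 1940, 2450])    # Y, hypothetical

slope, intercept = np.polyfit(rice_land_ha, production_t, deg=1)
print(f"Y = {intercept:.3f} + {slope:.3f} * X1")

predicted = intercept + slope * rice_land_ha
r2 = 1 - np.sum((production_t - predicted) ** 2) / np.sum(
    (production_t - production_t.mean()) ** 2)
print(f"R^2 = {r2:.3f}")
```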

  1. Model-based Extracted Water Desalination System for Carbon Sequestration

    Energy Technology Data Exchange (ETDEWEB)

    Gettings, Rachel; Dees, Elizabeth

    2017-03-23

    The focus of this research effort centered on water recovery from high Total Dissolved Solids (TDS) extracted waters (180,000 mg/L) using a combination of water recovery (partial desalination) technologies. The research goals of this project were as follows: 1. Define the scope and test location for pilot-scale implementation of the desalination system; 2. Define a scalable, multi-stage extracted water desalination system that yields clean water, concentrated brine, and salt from saline brines; and 3. Validate overall system performance with field-sourced water using GE pre-pilot lab facilities. Conventional falling film-mechanical vapor recompression (FF-MVR) technology was established as a baseline desalination process. A quality function deployment (QFD) method was used to compare alternate high TDS desalination technologies to the base case FF-MVR technology, including but not limited to membrane distillation (MD), forward osmosis (FO), and high pressure reverse osmosis (HPRO). A techno-economic analysis of high pressure reverse osmosis (HPRO) was performed comparing the following two cases, each achieving the same total pure water recovery rate: 1. a hybrid seawater RO (SWRO) plus HPRO system and 2. a 2x standard seawater RO system. Pre-pilot-scale tests were conducted using field production water to validate key process steps for extracted water pretreatment. Approximately 5,000 gallons of field-produced water was processed through microfiltration, ultrafiltration, and steam-regenerable sorbent operations. Improvements in membrane materials of construction were considered a necessary next step to achieving further improvement in element performance at high pressure. Several modifications showed promising results in their ability to withstand close to 5,000 PSI without gross failure.

  2. Incentive Model Based on Cooperative Relationship in Sustainable Construction Projects

    Directory of Open Access Journals (Sweden)

    Guangdong Wu

    2017-07-01

    Full Text Available Considering the cooperative relationship between owners and contractors in sustainable construction projects, as well as the synergistic effects created by cooperative behaviors, a cooperative incentive model was developed using game theory. The model was formulated and analyzed under both non-moral hazard and moral hazard situations. A numerical simulation and example were then used to verify the conclusions derived from the model. The results showed that the synergistic effect translates the intensity of one party's resource input into an increase in the marginal utility of the other party, so both the owner and the contractor are willing to enhance their levels of effort. One party's optimal benefit allocation coefficient is positively affected by its own output efficiency, and negatively affected by the other party's output efficiency. The effort levels and expected benefits of the owner and contractor can be improved by enhancing the cooperative relationship between the two parties, which also enhances the net benefits of a sustainable construction project. The synergistic effect cannot lower the negative effect of moral hazard behaviors during the implementation of sustainable construction projects. On the contrary, the higher the level of the cooperative relationship, the wider the gaps between the optimal values of effort levels, expected benefits and net project benefits under the non-moral hazard and moral hazard situations. Since few studies to date have emphasized the effects of the cooperative relationship on sustainable construction projects, this study constructed a game-based incentive model to bridge this gap. This study contributes significant theoretical and practical insights into the management of cooperation amongst stakeholders, and into the enhancement of the overall benefits of sustainable construction projects.

  3. A new Expert Finding model based on Term Correlation Matrix

    Directory of Open Access Journals (Sweden)

    Ehsan Pornour

    2015-09-01

    Full Text Available Due to the enormous volume of unstructured information available on the Web and inside organizations, finding an answer to a knowledge need in a short time is difficult. For this reason, besides search engines, which do not consider users' individual characteristics, recommender systems were created that use a user's previous activities and other individual characteristics to help users find the knowledge they need. Recommender system usage is increasing every day. Expert finder systems, by recommending expert people rather than information, give users the possibility to ask their questions of experts directly. Interacting with experts not only transfers information but, through shared experience and insight, also transfers knowledge. In this paper we use university professors' academic resumes as expert profiles and propose a new expert finding model that recommends experts in response to a user's query. We combine a Term Correlation Matrix, the Vector Space Model and the PageRank algorithm into a new hybrid model that outperforms conventional methods. The model can be used on the Internet and in organizations and universities where a resume dataset for the experts is available.
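    A hedged sketch of how the three ingredients named above (a term correlation matrix, vector-space similarity, and PageRank) might be combined into a hybrid expert score is given below. The resume corpus, the link graph, the query-expansion weight (0.3) and the blending weights (0.5/0.5) are all invented for illustration and are not the paper's formulation.

        import numpy as np

        # Toy resume corpus (hypothetical), one string per expert
        resumes = [
            "machine learning data mining recommender systems",
            "information retrieval search engines ranking",
            "databases data mining query optimization",
        ]
        query = "data mining ranking"

        # Vocabulary and term-frequency matrix (experts x terms)
        vocab = sorted({t for doc in resumes for t in doc.split()})
        idx = {t: i for i, t in enumerate(vocab)}
        TF = np.zeros((len(resumes), len(vocab)))
        for d, doc in enumerate(resumes):
            for t in doc.split():
                TF[d, idx[t]] += 1.0

        # Term correlation matrix (term co-occurrence across resumes), used here to expand the query
        C = TF.T @ TF
        q = np.zeros(len(vocab))
        for t in query.split():
            if t in idx:
                q[idx[t]] += 1.0
        q_expanded = q + 0.3 * (C @ q) / (C.max() + 1e-9)   # 0.3 is an arbitrary expansion weight

        # Vector-space (cosine) similarity between the expanded query and each resume
        def cosine(a, b):
            return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)
        vsm_score = np.array([cosine(q_expanded, TF[d]) for d in range(len(resumes))])

        # PageRank over a hypothetical expert-expert link graph (e.g., co-authorship), power iteration
        links = np.array([[0, 1, 1],
                          [1, 0, 0],
                          [1, 1, 0]], dtype=float)
        M = links / links.sum(axis=1, keepdims=True)
        pr = np.full(len(resumes), 1.0 / len(resumes))
        for _ in range(50):
            pr = 0.15 / len(resumes) + 0.85 * M.T @ pr

        # Hybrid ranking: blend the two normalized signals (equal weights, purely illustrative)
        hybrid = 0.5 * vsm_score / (vsm_score.max() + 1e-9) + 0.5 * pr / pr.max()
        print("expert ranking (best first):", np.argsort(-hybrid))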

  4. Cognitive performance modeling based on general systems performance theory.

    Science.gov (United States)

    Kondraske, George V

    2010-01-01

    General Systems Performance Theory (GSPT) was initially motivated by problems associated with quantifying different aspects of human performance. It has proved to be invaluable for measurement development and understanding quantitative relationships between human subsystem capacities and performance in complex tasks. It is now desired to bring focus to the application of GSPT to modeling of cognitive system performance. Previous studies involving two complex tasks (i.e., driving and performing laparoscopic surgery) and incorporating measures that are clearly related to cognitive performance (information processing speed and short-term memory capacity) were revisited. A GSPT-derived method of task analysis and performance prediction termed Nonlinear Causal Resource Analysis (NCRA) was employed to determine the demand on basic cognitive performance resources required to support different levels of complex task performance. This approach is presented as a means to determine a cognitive workload profile and the subsequent computation of a single number measure of cognitive workload (CW). Computation of CW may be a viable alternative to measuring it. Various possible "more basic" performance resources that contribute to cognitive system performance are discussed. It is concluded from this preliminary exploration that a GSPT-based approach can contribute to defining cognitive performance models that are useful for both individual subjects and specific groups (e.g., military pilots).

  5. Interactive object modelling based on piecewise planar surface patches.

    Science.gov (United States)

    Prankl, Johann; Zillich, Michael; Vincze, Markus

    2013-06-01

    Detecting elements such as planes in 3D is essential to describe objects for applications such as robotics and augmented reality. While plane estimation is well studied, table-top scenes exhibit a large number of planes and methods often lock onto a dominant plane or do not estimate 3D object structure but only homographies of individual planes. In this paper we introduce MDL to the problem of incrementally detecting multiple planar patches in a scene using tracked interest points in image sequences. Planar patches are reconstructed and stored in a keyframe-based graph structure. In case different motions occur, separate object hypotheses are modelled from currently visible patches and patches seen in previous frames. We evaluate our approach on a standard data set published by the Visual Geometry Group at the University of Oxford [24] and on our own data set containing table-top scenes. Results indicate that our approach significantly improves over the state-of-the-art algorithms.
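    As a much simplified stand-in for the multi-patch, MDL- and keyframe-based machinery described above, the sketch below fits a single planar patch to 3D points with RANSAC. The synthetic point cloud, the inlier threshold and the iteration count are all assumptions for illustration only.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical 3D interest points: most lie near the plane z = 0.2x - 0.1y + 1, plus outliers
        n = 200
        xy = rng.uniform(-1, 1, size=(n, 2))
        z = 0.2 * xy[:, 0] - 0.1 * xy[:, 1] + 1.0 + rng.normal(0, 0.01, n)
        pts = np.column_stack([xy, z])
        pts[:30] += rng.uniform(-1, 1, size=(30, 3))       # outliers (e.g., points on other objects)

        def plane_from_3pts(p):
            # Plane through three points: unit normal n_vec and offset d with n_vec.x + d = 0
            n_vec = np.cross(p[1] - p[0], p[2] - p[0])
            n_vec /= np.linalg.norm(n_vec) + 1e-12
            return n_vec, -n_vec @ p[0]

        best_inliers, best_model = None, None
        for _ in range(500):                               # RANSAC iterations
            sample = pts[rng.choice(n, 3, replace=False)]
            n_vec, d = plane_from_3pts(sample)
            dist = np.abs(pts @ n_vec + d)
            inliers = dist < 0.02                          # inlier threshold (metres, assumed)
            if best_inliers is None or inliers.sum() > best_inliers.sum():
                best_inliers, best_model = inliers, (n_vec, d)

        n_vec, d = best_model
        print(f"plane normal ~ {np.round(n_vec, 3)}, offset {d:.3f}, inliers {best_inliers.sum()}/{n}")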

  7. Model-Based Battery Management Systems: From Theory to Practice

    Science.gov (United States)

    Pathak, Manan

    Lithium-ion batteries are now extensively being used as the primary storage source. Capacity and power fade, and slow recharging times are key issues that restrict its use in many applications. Battery management systems are critical to address these issues, along with ensuring its safety. This dissertation focuses on exploring various control strategies using detailed physics-based electrochemical models developed previously for lithium-ion batteries, which could be used in advanced battery management systems. Optimal charging profiles for minimizing capacity fade based on SEI-layer formation are derived and the benefits of using such control strategies are shown by experimentally testing them on a 16 Ah NMC-based pouch cell. This dissertation also explores different time-discretization strategies for non-linear models, which gives an improved order of convergence for optimal control problems. Lastly, this dissertation also explores a physics-based model for predicting the linear impedance of a battery, and develops a freeware that is extremely robust and computationally fast. Such a code could be used for estimating transport, kinetic and material properties of the battery based on the linear impedance spectra.
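    The linear impedance mentioned above can be illustrated with a far simpler equivalent-circuit stand-in (an ohmic resistance in series with a parallel resistance-capacitance pair), not the physics-based model developed in the dissertation. All parameter values below are invented for illustration.

        import numpy as np

        # Hypothetical equivalent-circuit parameters (ohmic resistance, charge-transfer resistance,
        # double-layer capacitance); illustrative values, not fitted to any real cell.
        R0, Rct, Cdl = 0.020, 0.015, 1.5      # ohm, ohm, farad

        freqs = np.logspace(-2, 4, 61)        # 10 mHz to 10 kHz
        omega = 2 * np.pi * freqs
        Z = R0 + Rct / (1 + 1j * omega * Rct * Cdl)

        # Nyquist-style output: real part vs negative imaginary part at a few frequencies
        for f, z in zip(freqs[::15], Z[::15]):
            print(f"{f:10.3f} Hz   Re(Z)={z.real*1e3:6.2f} mOhm   -Im(Z)={-z.imag*1e3:6.2f} mOhm")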

  8. Sacrococcygeal pilonidal disease: analysis of previously proposed risk factors

    Directory of Open Access Journals (Sweden)

    Ali Harlak

    2010-01-01

    Full Text Available PURPOSE: Sacrococcygeal pilonidal disease is a source of one of the most common surgical problems among young adults. While male gender, obesity, occupations requiring sitting, deep natal clefts, excessive body hair, poor body hygiene and excessive sweating are described as the main risk factors for this disease, most of these have yet to be verified in a clinical trial. The present study aimed to evaluate the value and effect of these factors on pilonidal disease. METHOD: Previously proposed main risk factors were evaluated in a prospective case-control study that included 587 patients with pilonidal disease and 2,780 healthy control patients. RESULTS: Stiffness of body hair, number of baths and time spent seated per day were the three most predictive risk factors. Adjusted odds ratios were 9.23, 6.33 and 4.03, respectively (p<0.001). With an adjusted odds ratio of 1.3 (p<0.001), body mass index was another risk factor. Family history was not statistically different between the groups and there was no specific occupation associated with the disease. CONCLUSIONS: Hairy people who sit for more than six hours a day and who bathe two or fewer times per week are at a 219-fold increased risk of sacrococcygeal pilonidal disease compared with those without these risk factors. People with a great deal of hair have a greater need to clean the intergluteal sulcus. People whose work requires sitting for long periods should choose more comfortable seats and should also try to stand whenever possible.
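    The adjusted odds ratios reported above come from a multivariable model on the study's data. The sketch below only shows how a crude (unadjusted) odds ratio and its 95% confidence interval are computed from a 2x2 exposure table; the counts are hypothetical, not the study's.

        import math

        # Hypothetical 2x2 table (exposed/unexposed vs case/control); not the study's counts
        a, b = 120, 467     # cases: exposed, unexposed
        c, d = 210, 2570    # controls: exposed, unexposed

        or_crude = (a * d) / (b * c)
        se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)       # Woolf standard error of log(OR)
        lo = math.exp(math.log(or_crude) - 1.96 * se_log_or)
        hi = math.exp(math.log(or_crude) + 1.96 * se_log_or)
        print(f"crude OR = {or_crude:.2f} (95% CI {lo:.2f}-{hi:.2f})")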

  9. Impact of Students’ Class Attendance on Recalling Previously Acquired Information

    Directory of Open Access Journals (Sweden)

    Camellia Hemyari

    2018-03-01

    Full Text Available Background: In recent years, the availability of class material, including typed lectures, the professor's PowerPoint slides, sound recordings, and even videos, has made a group of students feel that it is unnecessary to attend classes. These students usually read and memorize typed lectures within two or three days prior to the exams and usually pass the tests even with a low attendance rate. The question, then, is how effective this learning approach is and how long one-night memorized lessons may last. Methods: A group of medical students (62 out of 106 students), whose class attendance and educational achievements in the Medical Mycology and Parasitology course had been recorded two years earlier, was selected, and their knowledge of this course was tested with multiple-choice questions (MCQ) designed from the previous lectures. Results: Although the mean re-exam score of the students at the end of the externship was lower than the corresponding final score, a significant association was found between the scores of the students in these two exams (r=0.48, P=0.01). Moreover, a significant negative association was found between the number of absences and re-exam scores (r=-0.26, P=0.037). Conclusion: As our findings show, recall of the acquired lessons is preserved over a long period of time and is associated with the students' attendance. Many factors, including the generation effect (from taking notes) and cued recall (via slide pictures), might play a significant role in the better recall of learned information in students with good class attendance. Keywords: STUDENT, MEMORY, LONG-TERM, RECALL, ABSENTEEISM, LEARNING
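    The associations above are reported as Pearson correlations with p-values. A minimal sketch of that computation on hypothetical absence counts and re-exam scores (not the study's data) is shown below, assuming SciPy is available.

        import numpy as np
        from scipy import stats

        # Hypothetical paired data: number of absences and re-exam score per student
        absences = np.array([0, 1, 0, 3, 5, 2, 8, 1, 4, 6])
        re_exam = np.array([17, 16, 18, 14, 12, 15, 9, 16, 13, 11])

        r, p = stats.pearsonr(absences, re_exam)
        print(f"r = {r:.2f}, p = {p:.3f}")   # a negative r mirrors the reported direction of association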

  10. Repeat immigration: A previously unobserved source of heterogeneity?

    Science.gov (United States)

    Aradhya, Siddartha; Scott, Kirk; Smith, Christopher D

    2017-07-01

    Register data allow for nuanced analyses of heterogeneities between sub-groups which are not observable in other data sources. One heterogeneity for which register data is particularly useful is in identifying unique migration histories of immigrant populations, a group of interest across disciplines. Years since migration is a commonly used measure of integration in studies seeking to understand the outcomes of immigrants. This study constructs detailed migration histories to test whether misclassified migrations may mask important heterogeneities. In doing so, we identify a previously understudied group of migrants called repeat immigrants, and show that they differ systematically from permanent immigrants. In addition, we quantify the degree to which migration information is misreported in the registers. The analysis is carried out in two steps. First, we estimate income trajectories for repeat immigrants and permanent immigrants to understand the degree to which they differ. Second, we test data validity by cross-referencing migration information with changes in income to determine whether there are inconsistencies indicating misreporting. From the first part of the analysis, the results indicate that repeat immigrants systematically differ from permanent immigrants in terms of income trajectories. Furthermore, income trajectories differ based on the way in which years since migration is calculated. The second part of the analysis suggests that misreported migration events, while present, are negligible. Repeat immigrants differ in terms of income trajectories, and may differ in terms of other outcomes as well. Furthermore, this study underlines that Swedish registers provide a reliable data source to analyze groups which are unidentifiable in other data sources.

  11. Gastrointestinal tolerability with ibandronate after previous weekly bisphosphonate treatment

    Directory of Open Access Journals (Sweden)

    Richard Derman

    2009-09-01

    Full Text Available Richard Derman1, Joseph D Kohles2, Ann Babbitt3 (1Department of Obstetrics and Gynecology, Christiana Hospital, Newark, DE, USA; 2Roche, Nutley, NJ, USA; 3Greater Portland Bone and Joint Specialists, Portland, ME, USA). Abstract: Data from two open-label trials (PRIOR and CURRENT) of women with postmenopausal osteoporosis or osteopenia were evaluated to assess whether monthly oral and quarterly intravenous (IV) ibandronate dosing improved self-reported gastrointestinal (GI) tolerability for patients who had previously experienced GI irritation with bisphosphonate (BP) use. In PRIOR, women who had discontinued daily or weekly BP treatment due to GI intolerance received monthly oral or quarterly IV ibandronate for 12 months. The CURRENT subanalysis included women receiving weekly BP treatment who switched to monthly oral ibandronate for six months. GI symptom severity and frequency were assessed using the Osteoporosis Patient Satisfaction Questionnaire™. In PRIOR, mean GI tolerability scores increased significantly at month 1 from screening for both treatment groups (oral: 79.3 versus 54.1; IV: 84.4 versus 51.0; p < 0.001 for both). Most patients reported improvement in GI symptom severity and frequency from baseline at all post-screening assessments (>90% at month 10). In the CURRENT subanalysis, >60% of patients reported improvements in heartburn or acid reflux and >70% indicated improvement in other stomach upset at month 6. Postmenopausal women with GI irritability on daily or weekly BPs experienced improvement in symptoms with extended-interval monthly or quarterly ibandronate dosing compared with baseline. Keywords: ibandronate, osteoporosis, bisphosphonate, gastrointestinal

  12. Implementation of resource-negotiating agents in telemanufacturing

    CSIR Research Space (South Africa)

    Van Zyl, TL

    2010-01-01

    Full Text Available the negotiations should be stored to allow future agents to benefit from previous negotiations. The components that are needed to implement the above and the results of the implementation are discussed in this paper....

  13. Comparison of cloud optical depth and cloud mask applying BRDF model-based background surface reflectance

    Science.gov (United States)

    Kim, H. W.; Yeom, J. M.; Woo, S. H.

    2017-12-01

    Over thin cloud regions, a satellite simultaneously detects reflectance from the thin cloud and from the land surface. Since this mixed reflectance does not represent the cloud alone, the background surface reflectance should be eliminated to accurately distinguish thin cloud such as cirrus. In previous research, Kim et al. (2017) developed a cloud masking algorithm using the Geostationary Ocean Color Imager (GOCI), one of the main instruments on the Communication, Ocean and Meteorological Satellite (COMS). Although GOCI has only 8 spectral channels, covering the visible and near-infrared spectral ranges, the cloud masking gave quantitatively reasonable results when compared with the MODIS cloud mask (Collection 6 MYD35). In particular, we noticed that this cloud masking algorithm is especially effective for thin cloud detection, as shown by validation against Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) data, because the method concentrates on eliminating background surface effects from the top-of-atmosphere (TOA) reflectance. By applying the difference between the TOA reflectance and the bi-directional reflectance distribution function (BRDF) model-based background surface reflectance, both thick and thin cloud areas can be discriminated without the infrared channels usually used for cloud detection. Moreover, when the cloud mask result was used as input for simulating the BRDF model, and the optimized BRDF model-based surface reflectance was then used for an optimized cloud mask, the probability of detection (POD) was higher than the POD of the original cloud mask. In this study, we examine the correlation between cloud optical depth (COD) and the cloud mask result. Cloud optical depth depends mostly on cloud thickness, the characteristics of the cloud contents, and the size of the cloud contents. COD ranges from less than 0.1 for thin clouds to over 1000 for large cumulus due to scattering by droplets. With
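    A minimal sketch of the core idea, flagging cloud where TOA reflectance exceeds a BRDF-model-based background surface reflectance by more than a threshold, is given below. The reflectance grids are synthetic and the threshold value is an assumption; the actual algorithm's BRDF modelling and channel handling are not reproduced.

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical reflectance grids (0-1): a BRDF-modelled clear-sky surface reflectance and a
        # TOA reflectance in which one region is brightened by (thin) cloud.
        surface = 0.08 + 0.02 * rng.random((100, 100))      # BRDF-model-based background reflectance
        toa = surface + rng.normal(0, 0.005, (100, 100))    # clear pixels: TOA ~ surface
        toa[30:60, 40:80] += 0.06                           # cloud-contaminated pixels

        threshold = 0.03                                    # illustrative value
        cloud_mask = (toa - surface) > threshold            # True where excess reflectance suggests cloud
        print("cloud fraction:", cloud_mask.mean())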

  14. Riparian erosion vulnerability model based on environmental features.

    Science.gov (United States)

    Botero-Acosta, Alejandra; Chu, Maria L; Guzman, Jorge A; Starks, Patrick J; Moriasi, Daniel N

    2017-12-01

    Riparian erosion is one of the major causes of sediment and contaminant load to streams, degradation of riparian wildlife habitats, and land loss hazards. Land and soil management practices are implemented as conservation and restoration measures to mitigate the environmental problems brought about by riparian erosion. This, however, requires the identification of areas vulnerable to soil erosion. Because of the complex interactions between the different mechanisms that govern soil erosion and the inherent uncertainties involved in quantifying these processes, assessing erosion vulnerability at the watershed scale is challenging. The main objective of this study was to develop a methodology to identify areas along the riparian zone that are susceptible to erosion. The methodology was developed by integrating the physically-based watershed model MIKE-SHE, to simulate water movement, and a habitat suitability model, MaxEnt, to quantify the probability of presence of elevation changes (i.e., erosion) across the watershed. The presence of elevation changes was estimated based on two LiDAR-based elevation datasets taken in 2009 and 2012. The changes in elevation were grouped into four categories: low (0.5 - 0.7 m), medium (0.7 - 1.0 m), high (1.0 - 1.7 m) and very high (1.7 - 5.9 m), considering each category as a studied "species". The categories' locations were then used as the "species location" maps in MaxEnt. The environmental features used as constraints to the presence of erosion were land cover, soil, stream power index, overland flow, lateral inflow, and discharge. The modeling framework was evaluated in the Fort Cobb Reservoir Experimental watershed in southcentral Oklahoma. Results showed that the most vulnerable areas for erosion were located at the upper riparian zones of the Cobb and Lake sub-watersheds. The main waterways of these sub-watersheds were also found to be prone to streambank erosion. Approximately 80% of the riparian zone (streambank
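    The categorization step described above, binning LiDAR elevation-change magnitudes into the four classes used as "species" for MaxEnt, can be sketched as follows. The elevation-change raster is synthetic; only the category boundaries are taken from the abstract, and the export of category cells as MaxEnt presence points is not shown.

        import numpy as np

        rng = np.random.default_rng(2)

        # Hypothetical elevation-change magnitudes (m) between two LiDAR surveys
        dz = np.abs(rng.normal(0.4, 0.6, size=(200, 200)))

        bounds = [0.5, 0.7, 1.0, 1.7, 5.9]                  # category boundaries from the abstract
        labels = ["none/<0.5", "low", "medium", "high", "very high", ">5.9 (excluded)"]
        cat = np.digitize(dz, bounds)                       # 0 below 0.5 m ... 5 above 5.9 m

        for k, name in enumerate(labels):
            print(f"{name:>16}: {(cat == k).sum():6d} cells")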

  15. Model-based wear measurements in total knee arthroplasty : development and validation of novel radiographic techniques

    NARCIS (Netherlands)

    IJsseldijk, van E.A.

    2016-01-01

    The primary aim of this work was to develop novel model-based mJSW measurement methods using a 3D reconstruction and compare the accuracy and precision of these methods to conventional mJSW measurement. This thesis contributed to the development, validation and clinical application of model-based

  16. Model-based design of supervisory controllers for baggage handling systems

    NARCIS (Netherlands)

    Swartjes, L.; van Beek, D.A.; Fokkink, W.J.; van Eekelen, J.A.W.M.

    2017-01-01

    The complexity of airport baggage handling systems in combination with the required high level of robustness makes designing supervisory controllers for these systems a challenging task. We show how a state of the art, formal, model-based design framework has been successfully used for model-based

  17. The Use of Modeling-Based Text to Improve Students' Modeling Competencies

    Science.gov (United States)

    Jong, Jing-Ping; Chiu, Mei-Hung; Chung, Shiao-Lan

    2015-01-01

    This study investigated the effects of a modeling-based text on 10th graders' modeling competencies. Fifteen 10th graders read a researcher-developed modeling-based science text on the ideal gas law that included explicit descriptions and representations of modeling processes (i.e., model selection, model construction, model validation, model…

  18. Architectural design thinking as a form of model-based reasoning

    NARCIS (Netherlands)

    Pauwels, P.; Bod, R.

    2014-01-01

    Model-based reasoning can be considered central in very diverse domains of practice. Recently considered domains of practice are political discourse, social intercourse, language learning, archaeology, collaboration and conversation, and so forth. In this paper, we explore features of model-based

  19. A model-based systems approach to pharmaceutical product-process design and analysis

    DEFF Research Database (Denmark)

    Gernaey, Krist; Gani, Rafiqul

    2010-01-01

    This is a perspective paper highlighting the need for systematic model-based design and analysis in pharmaceutical product-process development. A model-based framework is presented and the role, development and use of models of various types are discussed together with the structure of the models...

  20. Integration of supervisory control synthesis in model-based systems engineering

    NARCIS (Netherlands)

    Baeten, J.C.M.; Mortel - Fronczak, van de J.M.; Rooda, J.E.

    2011-01-01

    Due to increasing system complexity, time-to-market and development costs reduction, there are higher demands on engineering processes. Model-based engineering can play a role here because it supports system development by enabling the use of various model-based analysis techniques and tools. As a

  1. Impact of vocational interests, previous academic experience, gender and age on Situational Judgement Test performance.

    Science.gov (United States)

    Schripsema, Nienke R; van Trigt, Anke M; Borleffs, Jan C C; Cohen-Schotanus, Janke

    2017-05-01

    Situational Judgement Tests (SJTs) are increasingly implemented in medical school admissions. In this paper, we investigate the effects of vocational interests, previous academic experience, gender and age on SJT performance. The SJT was part of the selection process for the Bachelor's degree programme in Medicine at the University of Groningen, the Netherlands. All applicants for the academic year 2015-2016 were included and had to choose between the learning communities Global Health (n = 126), Sustainable Care (n = 149), Intramural Care (n = 225), and Molecular Medicine (n = 116). This choice was used as a proxy for vocational interest. In addition, all graduate-entry applicants for academic year 2015-2016 (n = 213) were included to examine the effect of previous academic experience on performance. We used MANCOVA analyses with Bonferroni post hoc multiple comparisons tests for applicant performance on a six-scenario SJT. The MANCOVA analyses showed that, for all scenarios, the independent variables were significantly related to performance (Pillai's Trace: 0.02-0.47). Vocational interest was related to performance on several scenarios, as was previous academic experience, and gender and age were related to performance on SJT scenarios in different settings. Especially the first effect might be helpful in selecting appropriate candidates for areas of health care in which more professionals are needed.

  2. Model-based crosstalk compensation in simultaneous dual isotope SPECT

    International Nuclear Information System (INIS)

    Frey, E.C.; Tsui, B.M.W.; Du, A.Y.; Song, X.Y.

    2002-01-01

    Simultaneous dual isotope imaging has the potential of allowing imaging of two different physiological processes at the same time. Two examples are Tc-99m stress and Tl-201 rest myocardial perfusion imaging and Tc-99m perfusion and I-123 neuroreceptor brain imaging. However, for both of these cases crosstalk is a significant problem that results in degradation of the simultaneously acquired images. For the Tc-99m and Tl-201 case, the crosstalk includes downscatter and the generation of Pb x-rays detected in the Tl-201 energy window. For the Tc-99m/I-123 case, the crosstalk includes overlap of the two photopeaks, downscatter, especially of the 159 keV photons into the Tc-99m energy window, and contamination of the images of both isotopes by low abundance, high-energy I-123 photons. We have developed methods to accurately model the crosstalk in both cases. For the Tc-99m/Tl-201 case, the crosstalk model uses a previously described method for modeling scatter, effective source scatter estimation (ESSE). The scatter estimates are combined with a parameterization of the Pb x-ray response of the collimator to estimate the crosstalk. For the I-123/Tc-99m case we combine ESSE with Monte Carlo simulated collimator scatter and penetration responses to model the contamination due to the high-energy photons from I-123. In both cases the crosstalk models have been incorporated into an iterative reconstruction procedure that allows simultaneous reconstruction of the activity distributions from two isotopes including crosstalk compensation. We have evaluated these methods both by Monte Carlo simulation studies and using physical phantom experiments. We find that the methods perform well and produce images with a quality and contrast approaching that of separately acquired images. However, the compensated simultaneously acquired images do have an increase in image noise. To reduce the noise we have applied ideal observer methodology to determine optimal energy windows and relative

  3. Forecasting the Success of Implementing Sensors Advanced Manufacturing Technology

    OpenAIRE

    Cheng-Shih Su; Shu-Chen Hsu

    2014-01-01

    This paper presents a fuzzy preference relations approach to forecasting the success of implementing sensors advanced manufacturing technology (AMT). In the manufacturing environment, performance measurement is based on different quantitative and qualitative factors. This study proposes an analytic hierarchical prediction model based on fuzzy preference relations to help organizations become aware of the essential factors affecting AMT implementation and to forecast the chance of successf...

  4. Model-based fault diagnosis approach on external short circuit of lithium-ion battery used in electric vehicles

    International Nuclear Information System (INIS)

    Chen, Zeyu; Xiong, Rui; Tian, Jinpeng; Shang, Xiong; Lu, Jiahuan

    2016-01-01

    Highlights: • The characteristics of the ESC fault of lithium-ion batteries are investigated experimentally. • The proposed method to simulate the electrical behavior of the ESC fault is viable. • Ten parameters in the presented fault model were optimized using a DPSO algorithm. • A two-layer model-based fault diagnosis approach for battery ESC is proposed. • The effectiveness and robustness of the proposed algorithm have been evaluated. - Abstract: This study investigates the external short circuit (ESC) fault characteristics of lithium-ion batteries experimentally. An experiment platform is established and ESC tests are implemented on ten 18650-type lithium cells at different states of charge (SOCs). Based on the experiment results, several efforts have been made. (1) The ESC process can be divided into two periods, and the electrical and thermal behaviors within these two periods are analyzed. (2) A modified first-order RC model is employed to simulate the electrical behavior of the lithium cell in the ESC fault process. The model parameters are re-identified by a dynamic-neighborhood particle swarm optimization algorithm. (3) A two-layer model-based ESC fault diagnosis algorithm is proposed. The first layer conducts preliminary fault detection and the second layer gives a precise model-based diagnosis. Four new cells are short-circuited to evaluate the proposed algorithm. The results show that the ESC fault can be diagnosed within 5 s and that the error between the model and measured data is less than 0.36 V. The effectiveness of the fault diagnosis algorithm is not sensitive to the precision of battery SOC. The proposed algorithm can still make the correct diagnosis even if there is 10% error in SOC estimation.
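    A simplified sketch of the kind of first-order RC equivalent-circuit simulation and coarse first-layer detection rule described above is given below. All parameter values, the currents and the voltage threshold are invented for illustration; the DPSO parameter identification and the precise model-based second layer are not reproduced.

        import numpy as np

        # Hypothetical first-order RC equivalent-circuit parameters (not the paper's identified values)
        R0, R1, C1 = 0.012, 0.020, 2400.0      # ohm, ohm, farad
        ocv = 3.9                              # open-circuit voltage at the test SOC (V), held constant here
        dt = 0.1                               # time step (s)

        def simulate(current, steps):
            """Terminal voltage v = OCV - I*R0 - U1 with dU1/dt = I/C1 - U1/(R1*C1)."""
            u1, v = 0.0, []
            for _ in range(steps):
                u1 += dt * (current / C1 - u1 / (R1 * C1))
                v.append(ocv - current * R0 - u1)
            return np.array(v)

        v_normal = simulate(current=5.0, steps=100)     # ordinary ~5 A discharge
        v_esc = simulate(current=120.0, steps=100)      # external short circuit drives a very large current

        # First-layer (preliminary) detection: terminal voltage collapsing below a coarse threshold
        threshold = 2.5                                 # V, illustrative
        print("normal flagged:", bool((v_normal < threshold).any()))
        print("ESC flagged:   ", bool((v_esc < threshold).any()))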

  5. Design and implementation of FPGA-based phase modulation ...

    Indian Academy of Sciences (India)

    Section 3 deals with the implementation process of the digital con- .... To aid the analysis of the SRC, a simulation model based on the state-space technique is ... compensator designed using 'SISO' tool in Matlab, is a PI compensator which ...

  6. Empirical wind retrieval model based on SAR spectrum measurements

    Science.gov (United States)

    Panfilova, Maria; Karaev, Vladimir; Balandina, Galina; Kanevsky, Mikhail; Portabella, Marcos; Stoffelen, Ad

    ambiguity from polarimetric SAR. A criterion based on the sign of the complex correlation coefficient between the VV and VH signals is applied to select the wind direction. An additional quality control is applied to the wind speed value retrieved with the spectral method: here, we use the direction obtained with the spectral method and the backscattered signal for the CMOD wind speed estimate. The algorithm described above may be refined by the use of larger volumes of SAR data and wind measurements. In the present preliminary work, the first results of processing SAR images combined with in situ data are presented. Our results are compared with the results obtained using the previously developed models CMOD and C-2PO (for VH polarization) and statistical wind retrieval approaches [1]. Acknowledgments. This work is supported by the Russian Foundation for Basic Research (grant 13-05-00852-a). [1] M. Portabella, A. Stoffelen, J. A. Johannessen, Toward an optimal inversion method for synthetic aperture radar wind retrieval, Journal of Geophysical Research, V. 107, N C8, 2002.

  7. [Barriers to the normalization of telemedicine in a healthcare system model based on purchasing of healthcare services using providers' contracts].

    Science.gov (United States)

    Roig, Francesc; Saigí, Francesc

    2011-01-01

    Despite the clear political will to promote telemedicine and the large number of initiatives, the incorporation of this modality in clinical practice remains limited. The objective of this study was to identify the barriers perceived by key professionals who actively participate in the design and implementation of telemedicine in a healthcare system model based on purchasing of healthcare services using providers' contracts. We performed a qualitative study based on data from semi-structured interviews with 17 key informants belonging to distinct Catalan health organizations. The barriers identified were grouped in four areas: technological, organizational, human and economic. The main barriers identified were changes in the healthcare model caused by telemedicine, problems with strategic alignment, resistance to change in the (re)definition of roles, responsibilities and new skills, and lack of a business model that incorporates telemedicine in the services portfolio to ensure its sustainability. In addition to suitable management of change and of the necessary strategic alignment, the definitive normalization of telemedicine in a mixed healthcare model based on purchasing of healthcare services using providers' contracts requires a clear and stable business model that incorporates this modality in the services portfolio and allows healthcare organizations to obtain reimbursement from the payer. 2010 SESPAS. Published by Elsevier Espana. All rights reserved.

  8. Interface design and human factors considerations for model-based tight glycemic control in critical care.

    Science.gov (United States)

    Ward, Logan; Steel, James; Le Compte, Aaron; Evans, Alicia; Tan, Chia-Siong; Penning, Sophie; Shaw, Geoffrey M; Desaive, Thomas; Chase, J Geoffrey

    2012-01-01

    Tight glycemic control (TGC) has shown benefits but has been difficult to implement. Model-based methods and computerized protocols offer the opportunity to improve TGC quality and compliance. This research presents an interface design to maximize compliance, minimize real and perceived clinical effort, and minimize error, based on simple human factors and end-user input. The graphical user interface (GUI) design is presented by construction, following a series of simple, short design criteria grounded in fundamental human factors engineering, and it incorporates user feedback and focus groups comprising nursing staff at Christchurch Hospital. The overall design maximizes ease of use and minimizes (unnecessary) interaction and use. It is coupled to a protocol that allows nursing staff to select measurement intervals and thus self-manage workload. The overall GUI design is presented and requires only one data entry point per intervention cycle. The design and main interface are heavily focused on the nurse end users, who are the predominant users, while additional detailed and longitudinal data, which are of interest to doctors guiding overall patient care, are available via tabs. This dichotomy of needs and interests based on the end user's immediate focus and goals shows how interfaces must adapt to offer different information to multiple types of users. The interface is designed to minimize real and perceived clinical effort, and ongoing pilot trials have reported high levels of acceptance. The overall design principles, approach, and testing methods are based on fundamental human factors principles designed to reduce user effort and error and are readily generalizable. © 2012 Diabetes Technology Society.

  9. Model-Based Comprehensive Analysis of School Closure Policies for Mitigating Influenza Epidemics and Pandemics.

    Science.gov (United States)

    Fumanelli, Laura; Ajelli, Marco; Merler, Stefano; Ferguson, Neil M; Cauchemez, Simon

    2016-01-01

    School closure policies are among the non-pharmaceutical measures taken into consideration to mitigate the spread of influenza epidemics and pandemics. However, a systematic review of the effectiveness of alternative closure policies has yet to emerge. Here we perform a model-based analysis of four types of school closure, ranging from the nationwide closure of all schools at the same time to reactive gradual closure, starting from class-by-class, then grades and finally the whole school. We consider policies based on triggers that are feasible to monitor, such as school absenteeism and national ILI surveillance systems. We found that, under specific constraints on the average number of weeks lost per student, reactive school-by-school, gradual, and county-wide closures give comparable outcomes in terms of optimal infection attack rate reduction, peak incidence reduction or peak delay. Optimal implementations generally require short closures of one week each; this duration is long enough to break the transmission chain without leading to unnecessarily long periods of class interruption. Moreover, we found that gradual and county closures may be slightly more easily applicable in practice, as they are less sensitive to the value of the excess absenteeism threshold triggering the start of the intervention. These findings suggest that policy makers could consider school closure policies more widely as a response strategy to influenza epidemics and pandemics, and the fact that some countries already have experience of gradual or regional closures for seasonal influenza outbreaks demonstrates that the logistic and feasibility challenges of school closure strategies can to some extent be overcome.

  10. Model-based software engineering for an optical navigation system for spacecraft

    Science.gov (United States)

    Franz, T.; Lüdtke, D.; Maibaum, O.; Gerndt, A.

    2018-06-01

    The project Autonomous Terrain-based Optical Navigation (ATON) at the German Aerospace Center (DLR) is developing an optical navigation system for future landing missions on celestial bodies such as the moon or asteroids. Image data obtained by optical sensors can be used for autonomous determination of the spacecraft's position and attitude. Camera-in-the-loop experiments in the Testbed for Robotic Optical Navigation (TRON) laboratory and flight campaigns with unmanned aerial vehicle (UAV) are performed to gather flight data for further development and to test the system in a closed-loop scenario. The software modules are executed in the C++ Tasking Framework that provides the means to concurrently run the modules in separated tasks, send messages between tasks, and schedule task execution based on events. Since the project is developed in collaboration with several institutes in different domains at DLR, clearly defined and well-documented interfaces are necessary. Preventing misconceptions caused by differences between various development philosophies and standards turned out to be challenging. After the first development cycles with manual Interface Control Documents (ICD) and manual implementation of the complex interactions between modules, we switched to a model-based approach. The ATON model covers a graphical description of the modules, their parameters and communication patterns. Type and consistency checks on this formal level help to reduce errors in the system. The model enables the generation of interfaces and unified data types as well as their documentation. Furthermore, the C++ code for the exchange of data between the modules and the scheduling of the software tasks is created automatically. With this approach, changing the data flow in the system or adding additional components (e.g., a second camera) have become trivial.

  12. A simple model-based control for Pichia pastoris allows a more efficient heterologous protein production bioprocess.

    Science.gov (United States)

    Cos, Oriol; Ramon, Ramon; Montesinos, José Luis; Valero, Francisco

    2006-09-05

    A predictive control algorithm coupled with a PI feedback controller has been satisfactorily implemented for heterologous Rhizopus oryzae lipase production by the Pichia pastoris methanol utilization slow (Mut(s)) phenotype. This control algorithm allowed the study of the effect of methanol concentration, ranging from 0.5 to 1.75 g/L, on heterologous protein production. The maximal lipolytic activity (490 UA/mL), specific yield (11,236 UA/g(biomass)), productivity (4,901 UA/L·h), and specific productivity (112 UA/g(biomass)·h) were reached at a methanol concentration of 1 g/L. These parameters are almost double those obtained with manual control at a similar methanol set-point. The study of the specific growth, consumption, and production rates showed different patterns for these rates depending on the methanol concentration set-point. The results obtained show the need to implement a robust control scheme when reproducible quality and productivity are sought. It has been demonstrated that the model-based control proposed here is a very efficient, robust, and easy-to-implement strategy from an industrial application point of view. (c) 2006 Wiley Periodicals, Inc.

  13. Milky Way Past Was More Turbulent Than Previously Known

    Science.gov (United States)

    2004-04-01

    Results of 1001 observing nights shed new light on our Galaxy [1] Summary A team of astronomers from Denmark, Switzerland and Sweden [2] has achieved a major breakthrough in our understanding of the Milky Way, the galaxy in which we live. After more than 1,000 nights of observations spread over 15 years, they have determined the spatial motions of more than 14,000 solar-like stars residing in the neighbourhood of the Sun. For the first time, the changing dynamics of the Milky Way since its birth can now be studied in detail and with a stellar sample sufficiently large to allow a sound analysis. The astronomers find that our home galaxy has led a much more turbulent and chaotic life than previously assumed. PR Photo 10a/04: Distribution on the sky of the observed stars. PR Photo 10b/04: Stars in the solar neigbourhood and the Milky Way galaxy (artist's view). PR Video Clip 04/04: The motions of the observed stars during the past 250 million years. Unknown history Home is the place we know best. But not so in the Milky Way - the galaxy in which we live. Our knowledge of our nearest stellar neighbours has long been seriously incomplete and - worse - skewed by prejudice concerning their behaviour. Stars were generally selected for observation because they were thought to be "interesting" in some sense, not because they were typical. This has resulted in a biased view of the evolution of our Galaxy. The Milky Way started out just after the Big Bang as one or more diffuse blobs of gas of almost pure hydrogen and helium. With time, it assembled into the flattened spiral galaxy which we inhabit today. Meanwhile, generation after generation of stars were formed, including our Sun some 4,700 million years ago. But how did all this really happen? Was it a rapid process? Was it violent or calm? When were all the heavier elements formed? How did the Milky Way change its composition and shape with time? Answers to these and many other questions are 'hot' topics for the

  14. Load management: Model-based control of aggregate power for populations of thermostatically controlled loads

    International Nuclear Information System (INIS)

    Perfumo, Cristian; Kofman, Ernesto; Braslavsky, Julio H.; Ward, John K.

    2012-01-01

    Highlights: ► Characterisation of power response of a population of air conditioners. ► Implementation of demand side management on a group of air conditioners. ► Design of a controller for the power output of a group of air conditioners. ► Quantification of comfort impact of demand side management. - Abstract: Large groups of electrical loads can be controlled as a single entity to reduce their aggregate power demand in the electricity network. This approach, known as load management (LM) or demand response, offers an alternative to the traditional paradigm in the electricity market, where matching supply and demand is achieved solely by regulating how much generation is dispatched. Thermostatically controlled loads (TCLs), such as air conditioners (ACs) and fridges, are particularly suitable for LM, which can be implemented using feedback control techniques to regulate their aggregate power. To achieve high performance, such feedback control techniques require an accurate mathematical model of the TCL aggregate dynamics. Although such models have been developed, they appear too complex to be effectively used in control design. In this paper we develop a mathematical model aimed at the design of a model-based feedback control strategy. The proposed model analytically characterises the aggregate power response of a population of ACs to a simultaneous step change in temperature set points. Based on this model, we then derive, and completely parametrise in terms of the ACs ensemble properties, a reduced-order mathematical model to design an internal-model controller that regulates aggregate power by broadcasting temperature set-point offset changes. The proposed controller achieves high LM performance provided the ACs are equipped with high resolution thermostats. With coarser resolution thermostats, which are typical in present commercial and residential ACs, performance deteriorates significantly. This limitation is overcome by subdividing the population
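    The aggregate step response that the abstract characterizes analytically can be illustrated with a synthetic population of thermostatically controlled air conditioners, each modelled as a first-order thermal system with a thermostat deadband. All parameter distributions, the outdoor temperature and the time of the set-point step below are assumptions for illustration, not the paper's model or data.

        import numpy as np

        rng = np.random.default_rng(3)

        # Hypothetical population of air conditioners
        N = 1000
        C = rng.normal(2.0, 0.2, N)           # thermal capacitance (kWh/degC)
        R = rng.normal(2.0, 0.2, N)           # thermal resistance (degC/kW)
        P = rng.normal(5.6, 0.5, N)           # cooling power when ON (kW)
        T_out = 30.0                          # outdoor temperature (degC)
        setpoint = np.full(N, 22.5)
        deadband = 0.5
        dt = 1.0 / 60.0                       # hours per step (1 min)

        T = rng.uniform(setpoint - deadband, setpoint + deadband)   # initial indoor temperatures
        on = rng.random(N) < 0.5
        agg_power = []

        for step in range(240):               # simulate 4 hours
            if step == 120:                   # broadcast a simultaneous set-point offset at t = 2 h
                setpoint += 0.5
            # thermostat switching at the edges of the deadband
            on = np.where(T > setpoint + deadband, True,
                          np.where(T < setpoint - deadband, False, on))
            # first-order thermal dynamics: dT/dt = (T_out - T)/(R*C) - on*P/C
            T += dt * ((T_out - T) / (R * C) - on * P / C)
            agg_power.append((on * P).sum())

        print("aggregate power before step (kW):", round(np.mean(agg_power[100:120]), 1))
        print("aggregate power just after step :", round(np.mean(agg_power[120:140]), 1))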

  15. Large-scale model-based assessment of deer-vehicle collision risk.

    Directory of Open Access Journals (Sweden)

    Torsten Hothorn

    Full Text Available Ungulates, in particular the Central European roe deer Capreolus capreolus and the North American white-tailed deer Odocoileus virginianus, are economically and ecologically important. The two species are risk factors for deer-vehicle collisions and as browsers of palatable trees have implications for forest regeneration. However, no large-scale management systems for ungulates have been implemented, mainly because of the high efforts and costs associated with attempts to estimate population sizes of free-living ungulates living in a complex landscape. Attempts to directly estimate population sizes of deer are problematic owing to poor data quality and lack of spatial representation on larger scales. We used data on >74,000 deer-vehicle collisions observed in 2006 and 2009 in Bavaria, Germany, to model the local risk of deer-vehicle collisions and to investigate the relationship between deer-vehicle collisions and both environmental conditions and browsing intensities. An innovative modelling approach for the number of deer-vehicle collisions, which allows nonlinear environment-deer relationships and assessment of spatial heterogeneity, was the basis for estimating the local risk of collisions for specific road types on the scale of Bavarian municipalities. Based on this risk model, we propose a new "deer-vehicle collision index" for deer management. We show that the risk of deer-vehicle collisions is positively correlated to browsing intensity and to harvest numbers. Overall, our results demonstrate that the number of deer-vehicle collisions can be predicted with high precision on the scale of municipalities. In the densely populated and intensively used landscapes of Central Europe and North America, a model-based risk assessment for deer-vehicle collisions provides a cost-efficient instrument for deer management on the landscape scale. The measures derived from our model provide valuable information for planning road protection and defining
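    The study uses a flexible nonlinear, spatially heterogeneous model; as a much simpler stand-in, the sketch below fits a Poisson regression for collision counts with road length as an exposure offset, which shows only the general structure of such a count-based risk model. The municipality data are synthetic and statsmodels is assumed to be available.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(4)

        # Hypothetical municipality-level data: collision counts, browsing intensity,
        # harvest numbers, and road length used as an exposure offset.
        n = 300
        browsing = rng.uniform(0, 1, n)
        harvest = rng.uniform(0, 5, n)
        road_km = rng.uniform(10, 200, n)
        lam = np.exp(-3.0 + 1.2 * browsing + 0.15 * harvest) * road_km
        collisions = rng.poisson(lam)

        X = sm.add_constant(np.column_stack([browsing, harvest]))
        model = sm.GLM(collisions, X, family=sm.families.Poisson(),
                       offset=np.log(road_km)).fit()
        print(model.params)   # positive coefficients mirror the reported positive correlations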

  16. Model-based versus specific dosimetry in diagnostic context: Comparison of three dosimetric approaches

    Energy Technology Data Exchange (ETDEWEB)

    Marcatili, S., E-mail: sara.marcatili@inserm.fr; Villoing, D.; Mauxion, T.; Bardiès, M. [Inserm, UMR1037 CRCT, Toulouse F-31000, France and Université Toulouse III-Paul Sabatier, UMR1037 CRCT, Toulouse F-31000 (France); McParland, B. J. [Imaging Technology Group, GE Healthcare, Life Sciences, B22U The Grove Centre, White Lion Road, Amersham, England HP7 9LL (United Kingdom)

    2015-03-15

    Purpose: The dosimetric assessment of novel radiotracers represents a legal requirement in most countries. While the techniques for the computation of internal absorbed dose in a therapeutic context have made huge progresses in recent years, in a diagnostic scenario the absorbed dose is usually extracted from model-based lookup tables, most often derived from International Commission on Radiological Protection (ICRP) or Medical Internal Radiation Dose (MIRD) Committee models. The level of approximation introduced by these models may impact the resulting dosimetry. The aim of this work is to establish whether a more refined approach to dosimetry can be implemented in nuclear medicine diagnostics, by analyzing a specific case. Methods: The authors calculated absorbed doses to various organs in six healthy volunteers administered with flutemetamol ({sup 18}F) injection. Each patient underwent from 8 to 10 whole body 3D PET/CT scans. This dataset was analyzed using a Monte Carlo (MC) application developed in-house using the toolkit GATE that is capable to take into account patient-specific anatomy and radiotracer distribution at the voxel level. They compared the absorbed doses obtained with GATE to those calculated with two commercially available software: OLINDA/EXM and STRATOS implementing a dose voxel kernel convolution approach. Results: Absorbed doses calculated with GATE were higher than those calculated with OLINDA. The average ratio between GATE absorbed doses and OLINDA’s was 1.38 ± 0.34 σ (from 0.93 to 2.23). The discrepancy was particularly high for the thyroid, with an average GATE/OLINDA ratio of 1.97 ± 0.83 σ for the six patients. Differences between STRATOS and GATE were found to be higher. The average ratio between GATE and STRATOS absorbed doses was 2.51 ± 1.21 σ (from 1.09 to 6.06). Conclusions: This study demonstrates how the choice of the absorbed dose calculation algorithm may introduce a bias when gamma radiations are of importance, as is

  17. Technique for sparing previously irradiated critical normal structures in salvage proton craniospinal irradiation

    International Nuclear Information System (INIS)

    McDonald, Mark W; Wolanski, Mark R; Simmons, Joseph W; Buchsbaum, Jeffrey C

    2013-01-01

    Cranial reirradiation is clinically appropriate in some cases but cumulative radiation dose to critical normal structures remains a practical concern. The authors developed a simple technique in 3D conformal proton craniospinal irradiation (CSI) to block organs at risk (OAR) while minimizing underdosing of adjacent target brain tissue. Two clinical cases illustrate the use of proton therapy to provide salvage CSI when a previously irradiated OAR required sparing from additional radiation dose. The prior radiation plan was coregistered to the treatment planning CT to create a planning organ at risk volume (PRV) around the OAR. Right and left lateral cranial whole brain proton apertures were created with a small block over the PRV. Then right and left lateral “inverse apertures” were generated, creating an aperture opening in the shape of the area previously blocked and blocking the area previously open. The inverse aperture opening was made one millimeter smaller than the original block to minimize the risk of dose overlap. The inverse apertures were used to irradiate the target volume lateral to the PRV, selecting a proton beam range to abut the 50% isodose line against either lateral edge of the PRV. Together, the 4 cranial proton fields created a region of complete dose avoidance around the OAR. Comparative photon treatment plans were generated with opposed lateral X-ray fields with custom blocks and coplanar intensity modulated radiation therapy optimized to avoid the PRV. Cumulative dose volume histograms were evaluated. Treatment plans were developed and successfully implemented to provide sparing of previously irradiated critical normal structures while treating target brain lateral to these structures. The absence of dose overlapping during irradiation through the inverse apertures was confirmed by film. Compared to the lateral X-ray and IMRT treatment plans, the proton CSI technique improved coverage of target brain tissue while providing the least

  18. The Importance of Business Model Factors for Cloud Computing Adoption: Role of Previous Experiences

    Directory of Open Access Journals (Sweden)

    Bogataj Habjan Kristina

    2017-08-01

    Full Text Available Background and Purpose: Bringing several opportunities for more effective and efficient IT governance and service exploitation, cloud computing is expected to impact the European and global economies significantly. Market data show that despite many advantages and promised benefits the adoption of cloud computing is not as fast and widespread as foreseen. This situation shows the need for further exploration of the potentials of cloud computing and its implementation on the market. The purpose of this research was to identify individual business model factors with the highest impact on cloud computing adoption. In addition, the aim was to identify the differences in opinion regarding the importance of business model factors on cloud computing adoption according to companies’ previous experiences with cloud computing services.

  19. A Two-Factor Autoregressive Moving Average Model Based on Fuzzy Fluctuation Logical Relationships

    Directory of Open Access Journals (Sweden)

    Shuang Guan

    2017-10-01

    Full Text Available Many of the existing autoregressive moving average (ARMA) forecast models are based on one main factor. In this paper, we proposed a new two-factor first-order ARMA forecast model based on the fuzzy fluctuation logical relationships of both a main factor and a secondary factor of a historical training time series. Firstly, we generated a fluctuation time series (FTS) for each of the two factors by calculating the difference of each data point with its previous day, then finding the absolute means of the two FTSs. We then constructed a fuzzy fluctuation time series (FFTS) according to the defined linguistic sets. The next step was establishing fuzzy fluctuation logical relation groups (FFLRGs) for a two-factor first-order autoregressive (AR(1)) model and forecasting the training data with the AR(1) model. Then we built FFLRGs for a two-factor first-order autoregressive moving average (ARMA(1,m)) model. Lastly, we forecasted test data with the ARMA(1,m) model. To illustrate the performance of our model, we used the real Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) and Dow Jones datasets, with the Dow Jones as a secondary factor, to forecast TAIEX. The experiment results indicate that the proposed two-factor fluctuation ARMA method outperformed the one-factor method based on real historic data. The secondary factor may have some effects on the main factor and thereby impact the forecasting results. Using fuzzified fluctuations rather than fuzzified real data avoids the influence of extreme values in the historic data, which affects forecasting negatively. To verify the accuracy and effectiveness of the model, we also employed our method to forecast the Shanghai Stock Exchange Composite Index (SHSECI) from 2001 to 2015 and the international gold price from 2000 to 2010.
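    The preprocessing steps named above, building a fluctuation time series, taking its absolute mean, and fuzzifying into linguistic values, can be sketched as follows. The price series is hypothetical (not TAIEX), and the plus/minus half-mean cut points used to assign the three linguistic values are a common convention assumed here, not necessarily the paper's exact linguistic sets.

        import numpy as np

        # Hypothetical daily closing prices for a main factor; not the TAIEX data
        prices = np.array([100.0, 101.5, 100.8, 102.2, 103.0, 102.1, 104.0, 103.6, 105.2, 104.9])

        # Fluctuation time series: difference of each point with the previous day
        fts = np.diff(prices)
        g = np.mean(np.abs(fts))                   # absolute mean of the fluctuations

        # Fuzzify into three linguistic values (1 = down, 2 = equal, 3 = up)
        def fuzzify(x):
            if x < -g / 2:
                return 1
            if x > g / 2:
                return 3
            return 2

        ffts = [fuzzify(x) for x in fts]
        print("fluctuations:", np.round(fts, 2))
        print("fuzzified   :", ffts)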

  20. A general U-block model-based design procedure for nonlinear polynomial control systems

    Science.gov (United States)

    Zhu, Q. M.; Zhao, D. Y.; Zhang, Jianhua

    2016-10-01

    The proposition of U-model concept (in terms of 'providing concise and applicable solutions for complex problems') and a corresponding basic U-control design algorithm was originated in the first author's PhD thesis. The term of U-model appeared (not rigorously defined) for the first time in the first author's other journal paper, which established a framework for using linear polynomial control system design approaches to design nonlinear polynomial control systems (in brief, linear polynomial approaches → nonlinear polynomial plants). This paper represents the next milestone work - using linear state-space approaches to design nonlinear polynomial control systems (in brief, linear state-space approaches → nonlinear polynomial plants). The overall aim of the study is to establish a framework, defined as the U-block model, which provides a generic prototype for using linear state-space-based approaches to design the control systems with smooth nonlinear plants/processes described by polynomial models. For analysing the feasibility and effectiveness, sliding mode control design approach is selected as an exemplary case study. Numerical simulation studies provide a user-friendly step-by-step procedure for the readers/users with interest in their ad hoc applications. In formality, this is the first paper to present the U-model-oriented control system design in a formal way and to study the associated properties and theorems. The previous publications, in the main, have been algorithm-based studies and simulation demonstrations. In some sense, this paper can be treated as a landmark for the U-model-based research from intuitive/heuristic stage to rigour/formal/comprehensive studies.